Sample records for computer model packages

  1. THE USE OF COMPUTER MODELING PACKAGES TO ILLUSTRATE UNCERTAINTY IN RISK ASSESSMENTS: AN EASE OF USE AND INTERPRETATION COMPARISON

    EPA Science Inventory

    Consistent improvements in processor speed and computer access have substantially increased the use of computer modeling by experts and non-experts alike. Several new computer modeling packages operating under graphical operating systems (e.g., Microsoft Windows or Macintosh) m...

  2. Optimal segmentation and packaging process

    DOEpatents

    Kostelnik, Kevin M.; Meservey, Richard H.; Landon, Mark D.

    1999-01-01

    A process for improving packaging efficiency uses three-dimensional, computer-simulated models with various optimization algorithms to determine the optimal segmentation process and packaging configurations based on constraints including container limitations. The present invention is applied to a process for decontaminating, decommissioning (D&D), and remediating a nuclear facility involving the segmentation and packaging of contaminated items in waste containers in order to minimize the number of cuts, maximize packaging density, and reduce worker radiation exposure. A three-dimensional, computer-simulated facility model of the contaminated items is created. The contaminated items are differentiated. The optimal location, orientation, and sequence of the segmentation and packaging of the contaminated items are determined using the simulated model, the algorithms, and various constraints including container limitations. The cut locations and orientations are transposed to the simulated model. The contaminated items are actually segmented and packaged. The segmentation and packaging may be simulated beforehand. In addition, the contaminated items may be cataloged and recorded.
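
    The record above describes constraint-driven packing optimization only in general terms. As a hedged illustration of that style of computation (not the patented algorithm), the sketch below packs segmented items into containers with a first-fit-decreasing heuristic; the item volumes and container capacity are invented for the example.

    ```python
    # Minimal sketch (not the patented algorithm): first-fit-decreasing packing
    # of segmented items into waste containers with a fixed volume capacity.
    # Item volumes and the capacity value are illustrative assumptions.

    def pack_items(volumes, capacity):
        """Greedily assign item volumes to containers, opening new ones as needed."""
        containers = []  # each entry is the remaining free volume of one container
        assignment = []  # container index chosen for each item (in sorted order)
        for v in sorted(volumes, reverse=True):  # place largest items first
            for i, free in enumerate(containers):
                if v <= free:
                    containers[i] -= v
                    assignment.append(i)
                    break
            else:  # no existing container fits: open a new one
                containers.append(capacity - v)
                assignment.append(len(containers) - 1)
        return len(containers), assignment

    n_containers, _ = pack_items([0.6, 0.8, 0.3, 0.5, 0.2, 0.4], capacity=1.0)
    print(n_containers)  # number of containers needed by the greedy heuristic
    ```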

  3. Implementation and use of direct-flow connections in a coupled ground-water and surface-water model

    USGS Publications Warehouse

    Swain, Eric D.

    1994-01-01

    The U.S. Geological Survey's MODFLOW finite-difference ground-water flow model has been coupled with three surface-water packages - the MODBRANCH, River, and Stream packages - to simulate surface water and its interaction with ground water. Prior to the development of the coupling packages, the only interaction between these modeling packages was that leakage values could be passed between MODFLOW and the three surface-water packages. To facilitate wider and more flexible uses of the models, a computer program was developed and added to MODFLOW to allow direct flows or stages to be passed between any of the packages and MODFLOW. The flows or stages calculated in one package can be set as boundary discharges or stages to be used in another package. Several modeling packages can be used in the same simulation depending upon the level of sophistication needed in the various reaches being modeled. This computer program is especially useful when any of the River, Stream, or MODBRANCH packages are used to model a river flowing directly into or out of wetlands in direct connection with the aquifer and represented in the model as an aquifer block. A field case study is shown to illustrate an application.
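
    A toy sketch of the direct-flow coupling idea described above: one model's computed stage is handed to another as a boundary value and the resulting leakage is passed back until the exchange stabilizes. This is not MODFLOW code; the function names and linear responses are stand-ins invented only to make the loop runnable.

    ```python
    # Toy sketch of the coupling idea (not MODFLOW code): one model's computed
    # stage becomes the other's boundary value, and the resulting leakage is
    # passed back, iterating until the exchange stabilizes.

    def river_stage(inflow, leakage):
        return 2.0 + 0.01 * (inflow - leakage)  # stage falls as leakage grows

    def aquifer_leakage(stage, head=1.5, conductance=5.0):
        return conductance * (stage - head)     # Darcy-type exchange term

    stage, leakage = 2.0, 0.0
    for _ in range(50):
        stage = river_stage(inflow=100.0, leakage=leakage)
        leakage = aquifer_leakage(stage)
    print(round(stage, 4), round(leakage, 4))   # converged exchange values
    ```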

  4. Open-source Software for Exoplanet Atmospheric Modeling

    NASA Astrophysics Data System (ADS)

    Cubillos, Patricio; Blecic, Jasmina; Harrington, Joseph

    2018-01-01

    I will present a suite of self-standing open-source tools to model and retrieve exoplanet spectra, implemented in Python. These include: (1) a Bayesian-statistical package to run Levenberg-Marquardt optimization and Markov-chain Monte Carlo posterior sampling, (2) a package to compress line-transition data from HITRAN or Exomol without loss of information, (3) a package to compute partition functions for HITRAN molecules, (4) a package to compute collision-induced absorption, and (5) a package to produce radiative-transfer spectra of transit and eclipse exoplanet observations and atmospheric retrievals.
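
    As a hedged illustration of the kind of MCMC posterior sampling such a Bayesian-statistical package performs, the sketch below fits a toy linear model with the third-party emcee sampler; it is not the authors' package, and the data, noise level, and starting guesses are invented.

    ```python
    # Toy MCMC posterior sampling with emcee (illustrative, not the authors'
    # package): recover slope and intercept of a noisy line.
    import numpy as np
    import emcee

    rng = np.random.default_rng(0)
    x = np.linspace(0, 1, 50)
    y = 2.0 * x + 1.0 + rng.normal(0, 0.1, x.size)  # synthetic data

    def log_prob(theta):
        slope, intercept = theta
        model = slope * x + intercept
        return -0.5 * np.sum(((y - model) / 0.1) ** 2)  # Gaussian log-likelihood

    ndim, nwalkers = 2, 16
    p0 = rng.normal([2.0, 1.0], 0.1, size=(nwalkers, ndim))  # start near a guess
    sampler = emcee.EnsembleSampler(nwalkers, ndim, log_prob)
    sampler.run_mcmc(p0, 2000)
    samples = sampler.get_chain(discard=500, flat=True)  # posterior samples
    print(samples.mean(axis=0))  # posterior means for slope and intercept
    ```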

  5. Revealing Neurocomputational Mechanisms of Reinforcement Learning and Decision-Making With the hBayesDM Package

    PubMed Central

    Ahn, Woo-Young; Haines, Nathaniel; Zhang, Lei

    2017-01-01

    Reinforcement learning and decision-making (RLDM) provide a quantitative framework and computational theories with which we can disentangle psychiatric conditions into the basic dimensions of neurocognitive functioning. RLDM offer a novel approach to assessing and potentially diagnosing psychiatric patients, and there is growing enthusiasm for both RLDM and computational psychiatry among clinical researchers. Such a framework can also provide insights into the brain substrates of particular RLDM processes, as exemplified by model-based analysis of data from functional magnetic resonance imaging (fMRI) or electroencephalography (EEG). However, researchers often find the approach too technical and have difficulty adopting it for their research. Thus, a critical need remains to develop a user-friendly tool for the wide dissemination of computational psychiatric methods. We introduce an R package called hBayesDM (hierarchical Bayesian modeling of Decision-Making tasks), which offers computational modeling of an array of RLDM tasks and social exchange games. The hBayesDM package offers state-of-the-art hierarchical Bayesian modeling, in which both individual and group parameters (i.e., posterior distributions) are estimated simultaneously in a mutually constraining fashion. At the same time, the package is extremely user-friendly: users can perform computational modeling, output visualization, and Bayesian model comparisons, each with a single line of coding. Users can also extract the trial-by-trial latent variables (e.g., prediction errors) required for model-based fMRI/EEG. With the hBayesDM package, we anticipate that anyone with minimal knowledge of programming can take advantage of cutting-edge computational-modeling approaches to investigate the underlying processes of and interactions between multiple decision-making (e.g., goal-directed, habitual, and Pavlovian) systems. In this way, we expect that the hBayesDM package will contribute to the dissemination of advanced modeling approaches and enable a wide range of researchers to easily perform computational psychiatric research within different populations. PMID:29601060
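
    A minimal sketch of the trial-by-trial latent variables the package can extract: a Rescorla-Wagner update whose prediction errors could serve as model-based fMRI/EEG regressors. The learning rate and reward sequence are illustrative, and this is plain Python rather than the package's hierarchical Bayesian machinery.

    ```python
    # Rescorla-Wagner sketch: compute trial-by-trial values and prediction
    # errors, the kind of latent variables used as fMRI/EEG regressors.
    import numpy as np

    def rescorla_wagner(rewards, alpha=0.3):
        """Return value estimates and prediction errors for one option."""
        v = 0.0
        values, pes = [], []
        for r in rewards:
            pe = r - v          # prediction error: outcome minus expectation
            v += alpha * pe     # value update scaled by the learning rate
            values.append(v)
            pes.append(pe)
        return np.array(values), np.array(pes)

    rewards = np.array([1, 1, 0, 1, 0, 0, 1, 1])
    values, prediction_errors = rescorla_wagner(rewards)
    print(prediction_errors)    # one latent regressor per trial
    ```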

  6. Optimal segmentation and packaging process

    DOEpatents

    Kostelnik, K.M.; Meservey, R.H.; Landon, M.D.

    1999-08-10

    A process for improving packaging efficiency uses three-dimensional, computer-simulated models with various optimization algorithms to determine the optimal segmentation process and packaging configurations based on constraints including container limitations. The present invention is applied to a process for decontaminating, decommissioning (D&D), and remediating a nuclear facility involving the segmentation and packaging of contaminated items in waste containers in order to minimize the number of cuts, maximize packaging density, and reduce worker radiation exposure. A three-dimensional, computer-simulated facility model of the contaminated items is created. The contaminated items are differentiated. The optimal location, orientation, and sequence of the segmentation and packaging of the contaminated items are determined using the simulated model, the algorithms, and various constraints including container limitations. The cut locations and orientations are transposed to the simulated model. The contaminated items are actually segmented and packaged. The segmentation and packaging may be simulated beforehand. In addition, the contaminated items may be cataloged and recorded. 3 figs.

  7. 10 CFR 431.92 - Definitions concerning commercial air conditioners and heat pumps.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... expressed in identical units of measurement. Commercial package air-conditioning and heating equipment means... application. Computer Room Air Conditioner means a basic model of commercial package air-conditioning and heating equipment (packaged or split) that is: Used in computer rooms, data processing rooms, or other...

  8. Comparison of Computer Based Instruction to Behavior Skills Training for Teaching Staff Implementation of Discrete-Trial Instruction with an Adult with Autism

    ERIC Educational Resources Information Center

    Nosik, Melissa R.; Williams, W. Larry; Garrido, Natalia; Lee, Sarah

    2013-01-01

    In the current study, behavior skills training (BST) is compared to a computer-based training package for teaching staff to implement discrete-trial instruction with an adult with autism. The computer-based training package consisted of instructions, video modeling, and feedback. BST consisted of instructions, modeling, rehearsal, and feedback. Following…

  9. Spin wave Feynman diagram vertex computation package

    NASA Astrophysics Data System (ADS)

    Price, Alexander; Javernick, Philip; Datta, Trinanjan

    Spin wave theory is a well-established theoretical technique that can correctly predict the physical behavior of ordered magnetic states. However, computing the effects of an interacting spin wave theory incorporating magnons involves a laborious by-hand derivation of Feynman diagram vertices. The process is tedious and time-consuming. Hence, to improve productivity and have another means to check the analytical calculations, we have devised a Feynman Diagram Vertex Computation package. In this talk, we will describe our research group's effort to implement a Mathematica-based symbolic Feynman diagram vertex computation package that computes spin wave vertices. Utilizing the non-commutative algebra package NCAlgebra as an add-on to Mathematica, symbolic expressions for the Feynman diagram vertices of a Heisenberg quantum antiferromagnet are obtained. Our existing code reproduces the well-known expressions of a nearest-neighbor square lattice Heisenberg model. We also discuss the case of a triangular lattice Heisenberg model where non-collinear terms contribute to the vertex interactions.
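
    A Python analogue (via SymPy) of the non-commutative symbolic manipulation described above; it only demonstrates that operator products can be expanded without assuming commutativity, not the NCAlgebra vertex derivation itself.

    ```python
    # Non-commutative symbols in SymPy: products keep their operator order,
    # the basic capability a symbolic vertex derivation relies on.
    import sympy as sp

    a, ad = sp.symbols("a a_dag", commutative=False)

    expr = sp.expand((a + ad) * (a * ad))  # operator order is preserved
    print(expr)                            # a*a*a_dag + a_dag*a*a_dag

    commutator = sp.expand(a * ad - ad * a)
    print(commutator)                      # stays symbolic: a*a_dag - a_dag*a
    ```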

  10. I-deas TMG to NX Space Systems Thermal Model Conversion and Computational Performance Comparison

    NASA Technical Reports Server (NTRS)

    Somawardhana, Ruwan

    2011-01-01

    CAD/CAE packages change on a continuous basis as the power of the tools increases to meet demands. End-users must adapt to new products as they come to market and replace legacy packages. CAE modeling has continued to evolve and is constantly becoming more detailed and complex, though this comes at the cost of increased computing requirements; parallel processing coupled with appropriate hardware can minimize computation time. Users of Maya Thermal Model Generator (TMG) are faced with transitioning from NX I-deas to NX Space Systems Thermal (SST). It is important to understand what differences there are when changing software packages; we are looking for consistency in results.

  11. Space-Shuttle Emulator Software

    NASA Technical Reports Server (NTRS)

    Arnold, Scott; Askew, Bill; Barry, Matthew R.; Leigh, Agnes; Mermelstein, Scott; Owens, James; Payne, Dan; Pemble, Jim; Sollinger, John; Thompson, Hiram

    2007-01-01

    A package of software has been developed to execute a raw binary image of the space shuttle flight software for simulation of the computational effects of operation of space shuttle avionics. This software can be run on inexpensive computer workstations. Heretofore, it was necessary to use real flight computers to perform such tests and simulations. The package includes a program that emulates the space shuttle orbiter general-purpose computer [consisting of a central processing unit (CPU), input/output processor (IOP), master sequence controller, and bus-control elements]; an emulator of the orbiter display electronics unit and models of the associated cathode-ray tubes, keyboards, and switch controls; computational models of the data-bus network; computational models of the multiplexer-demultiplexer components; an emulation of the pulse-code modulation master unit; an emulation of the payload data interleaver; a model of the master timing unit; a model of the mass memory unit; and a software component that ensures compatibility of telemetry and command services between the simulated space shuttle avionics and a mission control center. The software package is portable to several host platforms.

  12. Faster than Real-Time Dynamic Simulation for Large-Size Power System with Detailed Dynamic Models using High-Performance Computing Platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Renke; Jin, Shuangshuang; Chen, Yousu

    This paper presents a faster-than-real-time dynamic simulation software package that is designed for large-size power system dynamic simulation. It was developed on the GridPACK™ high-performance computing (HPC) framework. The key features of the developed software package include (1) faster-than-real-time dynamic simulation for a WECC system (17,000 buses) with different types of detailed generator, controller, and relay dynamic models, (2) a decoupled parallel dynamic simulation algorithm with optimized computation architecture to better leverage HPC resources and technologies, (3) options for HPC-based linear and iterative solvers, (4) hidden HPC details, such as data communication and distribution, to enable development centered on mathematical models and algorithms rather than on computational details for power system researchers, and (5) easy integration of new dynamic models and related algorithms into the software package.

  13. EQS Goes R: Simulations for SEM Using the Package REQS

    ERIC Educational Resources Information Center

    Mair, Patrick; Wu, Eric; Bentler, Peter M.

    2010-01-01

    The REQS package is an interface between the R environment of statistical computing and the EQS software for structural equation modeling. The package consists of 3 main functions that read EQS script files and import the results into R, call EQS script files from R, and run EQS script files from R and import the results after EQS computations.…

  14. High-Performance Computing for the Electromagnetic Modeling and Simulation of Interconnects

    NASA Technical Reports Server (NTRS)

    Schutt-Aine, Jose E.

    1996-01-01

    The electromagnetic modeling of packages and interconnects plays a very important role in the design of high-speed digital circuits, and is most efficiently performed by using computer-aided design algorithms. In recent years, packaging has become a critical area in the design of high-speed communication systems and fast computers, and the importance of the software support for their development has increased accordingly. Throughout this project, our efforts have focused on the development of modeling and simulation techniques and algorithms that permit the fast computation of the electrical parameters of interconnects and the efficient simulation of their electrical performance.

  15. Use of computer modeling to investigate a dynamic interaction problem in the Skylab TACS quad-valve package

    NASA Technical Reports Server (NTRS)

    Hesser, R. J.; Gershman, R.

    1975-01-01

    A valve opening-response problem encountered during development of a control valve for the Skylab thruster attitude control system (TACS) is described. The problem involved effects of dynamic interaction among valves in the quad-redundant valve package. Also described is a detailed computer simulation of the quad-valve package which was helpful in resolving the problem.

  16. A New Streamflow-Routing (SFR1) Package to Simulate Stream-Aquifer Interaction with MODFLOW-2000

    USGS Publications Warehouse

    Prudic, David E.; Konikow, Leonard F.; Banta, Edward R.

    2004-01-01

    The increasing concern for water and its quality requires improved methods to evaluate the interaction between streams and aquifers and the strong influence that streams can have on the flow and transport of contaminants through many aquifers. For this reason, a new Streamflow-Routing (SFR1) Package was written for use with the U.S. Geological Survey's MODFLOW-2000 ground-water flow model. The SFR1 Package is linked to the Lake (LAK3) Package, and both have been integrated with the Ground-Water Transport (GWT) Process of MODFLOW-2000 (MODFLOW-GWT). SFR1 replaces the previous Stream (STR1) Package, with the most important difference being that stream depth is computed at the midpoint of each reach instead of at the beginning of each reach, as was done in the original Stream Package. This approach allows for the addition and subtraction of water from runoff, precipitation, and evapotranspiration within each reach. Because the SFR1 Package computes stream depth differently than that for the original package, a different name was used to distinguish it from the original Stream (STR1) Package. The SFR1 Package has five options for simulating stream depth and four options for computing diversions from a stream. The options for computing stream depth are: a specified value; Manning's equation (using a wide rectangular channel or an eight-point cross section); a power equation; or a table of values that relate flow to depth and width. Each stream segment can have a different option. Outflow from lakes can be computed using the same options. Because the wetted perimeter is computed for the eight-point cross section and width is computed for the power equation and table of values, the streambed conductance term no longer needs to be calculated externally whenever the area of streambed changes as a function of flow. The concentration of solute is computed in a stream network when MODFLOW-GWT is used in conjunction with the SFR1 Package. The concentration of a solute in a stream reach is based on a mass-balance approach and accounts for exchanges with (inputs from or losses to) ground-water systems. Two test examples are used to illustrate some of the capabilities of the SFR1 Package. The first test simulation was designed to illustrate how pumping of ground water from an aquifer connected to streams can affect streamflow, depth, width, and streambed conductance using the different options. The second test simulation was designed to illustrate solute transport through interconnected lakes, streams, and aquifers. Because of the need to examine time series results from the model simulations, the Gage Package first described in the LAK3 documentation was revised to include time series results of selected variables (streamflows, stream depth and width, streambed conductance, solute concentrations, and solute loads) for specified stream reaches. The mass-balance or continuity approach for routing flow and solutes through a stream network may not be applicable for all interactions between streams and aquifers. The SFR1 Package is best suited for modeling long-term changes (months to hundreds of years) in ground-water flow and solute concentrations using averaged flows in streams. The Package is not recommended for modeling the transient exchange of water between streams and aquifers when the objective is to examine short-term (minutes to days) effects caused by rapidly changing streamflows.
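
    One of the depth options named above, Manning's equation for a wide rectangular channel (hydraulic radius approximately equal to depth), can be inverted numerically for the depth at a given flow. The sketch below is a generic illustration with invented roughness, width, and slope values, not SFR1 code.

    ```python
    # Manning's equation for a wide rectangular channel, solved for the depth
    # that carries a target flow. Parameter values are illustrative.
    from scipy.optimize import brentq

    def manning_flow(depth, width=10.0, slope=0.001, n=0.035):
        """Flow (m^3/s) for a wide rectangular channel of the given depth (m)."""
        area = width * depth
        return (1.0 / n) * area * depth ** (2.0 / 3.0) * slope ** 0.5

    target_q = 5.0  # m^3/s
    depth = brentq(lambda d: manning_flow(d) - target_q, 1e-6, 10.0)
    print(round(depth, 3))  # midpoint-reach depth consistent with the flow
    ```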

  17. Documentation of a computer program to simulate aquifer-system compaction using the modular finite-difference ground-water flow model

    USGS Publications Warehouse

    Leake, S.A.; Prudic, David E.

    1991-01-01

    Removal of ground water by pumping from aquifers may result in compaction of compressible fine-grained beds that are within or adjacent to the aquifers. Compaction of the sediments and resulting land subsidence may be permanent if the head declines result in vertical stresses beyond the previous maximum stress. The process of permanent compaction is not routinely included in simulations of ground-water flow. To simulate storage changes from both elastic and inelastic compaction, a computer program was written for use with the U.S. Geological Survey modular finite-difference ground-water flow model. The new program, the Interbed-Storage Package, is designed to be incorporated into this model. In the Interbed-Storage Package, elastic compaction or expansion is assumed to be proportional to change in head. The constant of proportionality is the product of the skeletal component of elastic specific storage and the thickness of the sediments. Similarly, inelastic compaction is assumed to be proportional to decline in head. The constant of proportionality is the product of the skeletal component of inelastic specific storage and the thickness of the sediments. Storage changes are incorporated into the ground-water flow model by adding an additional term to the right-hand side of the flow equation. Within a model time step, the package appropriately apportions storage changes between elastic and inelastic components on the basis of the relation of simulated head to the previous minimum (preconsolidation) head. Two tests were performed to verify that the package works correctly. The first test compared model-calculated storage and compaction changes to hand-calculated values for a three-dimensional simulation. Model and hand-calculated values were essentially equal. The second test was performed to compare the results of the Interbed-Storage Package with results of the one-dimensional Helm compaction model. This test problem simulated compaction in doubly draining confining beds stressed by head changes in adjacent aquifers. The Interbed-Storage Package and the Helm model computed essentially equal values of compaction. Documentation of the Interbed-Storage Package includes data input instructions, flow charts, narratives, and listings for each of the five modules included in the package. The documentation also includes an appendix describing input instructions and a listing of a computer program for time-variant specified-head boundaries. That package was developed to reduce the amount of data input and output associated with one of the Interbed-Storage Package test problems.
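
    A minimal sketch of the apportioning rule described above: head decline below the previous minimum (preconsolidation) head compacts inelastically, while changes above it are elastic. The specific-storage values and bed thickness are illustrative, and the function assumes the starting head is at or above the preconsolidation head.

    ```python
    # Elastic/inelastic apportioning of compaction for one time step, keyed to
    # the preconsolidation head. Parameter values are illustrative.
    def compaction_step(h_old, h_new, h_min, sske=1e-6, sskv=1e-4, thickness=10.0):
        """Return (elastic, inelastic) compaction and the updated minimum head."""
        elastic = inelastic = 0.0
        if h_new >= h_min:                       # change stays in elastic range
            elastic = sske * thickness * (h_old - h_new)
        else:                                    # part elastic, part inelastic
            elastic = sske * thickness * (h_old - h_min)
            inelastic = sskv * thickness * (h_min - h_new)
            h_min = h_new                        # preconsolidation head lowered
        return elastic, inelastic, h_min

    print(compaction_step(h_old=50.0, h_new=48.0, h_min=49.0))
    ```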

  18. Software package for modeling spin-orbit motion in storage rings

    NASA Astrophysics Data System (ADS)

    Zyuzin, D. V.

    2015-12-01

    A software package providing a graphical user interface for computer experiments on the motion of charged particle beams in accelerators, as well as analysis of obtained data, is presented. The software package was tested in the framework of the international project on electric dipole moment measurement JEDI (Jülich Electric Dipole moment Investigations). The specific features of particle spin motion imply the requirement to use a cyclic accelerator (storage ring) consisting of electrostatic elements, which makes it possible to preserve horizontal polarization for a long time. Computer experiments study the dynamics of 10^6-10^9 particles in a beam during 10^9 turns in an accelerator (about 10^12-10^15 integration steps for the equations of motion). For designing an optimal accelerator structure, a large number of computer experiments on polarized beam dynamics are required. The numerical core of the package is COSY Infinity, a program for modeling spin-orbit dynamics.

  19. Computational modelling of a thermoforming process for thermoplastic starch

    NASA Astrophysics Data System (ADS)

    Szegda, D.; Song, J.; Warby, M. K.; Whiteman, J. R.

    2007-05-01

    Plastic packaging waste currently forms a significant part of municipal solid waste and as such is causing increasing environmental concerns. Such packaging is largely non-biodegradable and is particularly difficult to recycle or to reuse due to its complex composition. Apart from limited recycling of some easily identifiable packaging wastes, such as bottles, most packaging waste ends up in landfill sites. In recent years, in an attempt to address this problem in the case of plastic packaging, the development of packaging materials from renewable plant resources has received increasing attention and a wide range of bioplastic materials based on starch are now available. Environmentally these bioplastic materials also reduce reliance on oil resources and have the advantage that they are biodegradable and can be composted upon disposal to reduce the environmental impact. Many food packaging containers are produced by thermoforming processes in which thin sheets are inflated under pressure into moulds to produce the required thin wall structures. Hitherto these thin sheets have almost exclusively been made of oil-based polymers and it is for these that computational models of thermoforming processes have been developed. Recently, in the context of bioplastics, commercial thermoplastic starch sheet materials have been developed. The behaviour of such materials is influenced both by temperature and, because of the inherent hydrophilic characteristics of the materials, by moisture content. Both of these aspects affect the behaviour of bioplastic sheets during the thermoforming process. This paper describes experimental work and work on the computational modelling of thermoforming processes for thermoplastic starch sheets in an attempt to address the combined effects of temperature and moisture content. After a discussion of the background of packaging and biomaterials, a mathematical model for the deformation of a membrane into a mould is presented, together with its finite element discretisation. This model depends on material parameters of the thermoplastic and details of tests undertaken to determine these and the results produced are given. Finally the computational model is applied for a thin sheet of commercially available thermoplastic starch material which is thermoformed into a specific mould. Numerical results of thickness and shape for this problem are given.

  20. Logistic Regression with Multiple Random Effects: A Simulation Study of Estimation Methods and Statistical Packages.

    PubMed

    Kim, Yoonsang; Choi, Young-Ku; Emery, Sherry

    2013-08-01

    Several statistical packages are capable of estimating generalized linear mixed models and these packages provide one or more of three estimation methods: penalized quasi-likelihood, Laplace, and Gauss-Hermite. Many studies have investigated these methods' performance for the mixed-effects logistic regression model. However, the authors focused on models with one or two random effects and assumed a simple covariance structure between them, which may not be realistic. When there are multiple correlated random effects in a model, the computation becomes intensive, and often an algorithm fails to converge. Moreover, in our analysis of smoking status and exposure to anti-tobacco advertisements, we have observed that when a model included multiple random effects, parameter estimates varied considerably from one statistical package to another even when using the same estimation method. This article presents a comprehensive review of the advantages and disadvantages of each estimation method. In addition, we compare the performances of the three methods across statistical packages via simulation, which involves two- and three-level logistic regression models with at least three correlated random effects. We apply our findings to a real dataset. Our results suggest that two packages (SAS GLIMMIX Laplace and SuperMix Gaussian quadrature) perform well in terms of accuracy, precision, convergence rates, and computing speed. We also discuss the strengths and weaknesses of the two packages in regard to sample sizes.
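
    A sketch of the kind of data such a simulation study generates: a two-level logistic model with one random intercept per cluster. The effect sizes, cluster counts, and variance are invented; each package and estimation method under comparison would then be fit to datasets like this.

    ```python
    # Simulate a random-intercept logistic dataset of the type used in the
    # simulation study. All parameter values are illustrative.
    import numpy as np

    rng = np.random.default_rng(1)
    n_clusters, n_per = 100, 20
    u = rng.normal(0.0, 1.0, n_clusters)          # random intercepts, var = 1
    x = rng.normal(size=(n_clusters, n_per))      # subject-level covariate
    eta = -0.5 + 0.8 * x + u[:, None]             # linear predictor
    p = 1.0 / (1.0 + np.exp(-eta))
    y = rng.binomial(1, p)                        # binary outcomes

    print(y.mean())  # overall event rate in the simulated dataset
    ```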

  1. An Innovative Learning Model for Computation in First Year Mathematics

    ERIC Educational Resources Information Center

    Tonkes, E. J.; Loch, B. I.; Stace, A. W.

    2005-01-01

    MATLAB is a sophisticated software tool for numerical analysis and visualization. The University of Queensland has adopted MATLAB as its official teaching package across large first year mathematics courses. In the past, the package has met severe resistance from students who have not appreciated their computational experience. Several main…

  2. Surface Modeling, Solid Modeling and Finite Element Modeling. Analysis Capabilities of Computer-Assisted Design and Manufacturing Systems.

    ERIC Educational Resources Information Center

    Nee, John G.; Kare, Audhut P.

    1987-01-01

    Explores several concepts in computer-assisted design/computer-assisted manufacturing (CAD/CAM). Defines, evaluates, reviews and compares advanced computer-aided geometric modeling and analysis techniques. Presents the results of a survey to establish the capabilities of minicomputer-based systems with the CAD/CAM packages evaluated. (CW)

  3. Ku-Band rendezvous radar performance computer simulation model

    NASA Technical Reports Server (NTRS)

    Magnusson, H. G.; Goff, M. F.

    1984-01-01

    All work performed on the Ku-band rendezvous radar performance computer simulation model program since the release of the preliminary final report is summarized. Developments on the program fall into three distinct categories: (1) modifications to the existing Ku-band radar tracking performance computer model; (2) the addition of a highly accurate, nonrealtime search and acquisition performance computer model to the total software package developed on this program; and (3) development of radar cross section (RCS) computation models for three additional satellites. All changes in the tracking model involved improvements in the automatic gain control (AGC) and the radar signal strength (RSS) computer models. Although the search and acquisition computer models were developed under the auspices of the Hughes Aircraft Company Ku-Band Integrated Radar and Communications Subsystem program office, they have been supplied to NASA as part of the Ku-band radar performance computer model package. Their purpose is to predict Ku-band acquisition performance for specific satellite targets on specific missions. The RCS models were developed for three satellites: the Long Duration Exposure Facility (LDEF) spacecraft, the Solar Maximum Mission (SMM) spacecraft, and the Space Telescopes.

  4. Ku-Band rendezvous radar performance computer simulation model

    NASA Astrophysics Data System (ADS)

    Magnusson, H. G.; Goff, M. F.

    1984-06-01

    All work performed on the Ku-band rendezvous radar performance computer simulation model program since the release of the preliminary final report is summarized. Developments on the program fall into three distinct categories: (1) modifications to the existing Ku-band radar tracking performance computer model; (2) the addition of a highly accurate, nonrealtime search and acquisition performance computer model to the total software package developed on this program; and (3) development of radar cross section (RCS) computation models for three additional satellites. All changes in the tracking model involved improvements in the automatic gain control (AGC) and the radar signal strength (RSS) computer models. Although the search and acquisition computer models were developed under the auspices of the Hughes Aircraft Company Ku-Band Integrated Radar and Communications Subsystem program office, they have been supplied to NASA as part of the Ku-band radar performance computer model package. Their purpose is to predict Ku-band acquisition performance for specific satellite targets on specific missions. The RCS models were developed for three satellites: the Long Duration Exposure Facility (LDEF) spacecraft, the Solar Maximum Mission (SMM) spacecraft, and the Space Telescopes.

  5. An Ada Linear-Algebra Software Package Modeled After HAL/S

    NASA Technical Reports Server (NTRS)

    Klumpp, Allan R.; Lawson, Charles L.

    1990-01-01

    New avionics software written more easily. Software package extends Ada programming language to include linear-algebra capabilities similar to those of HAL/S programming language. Designed for such avionics applications as Space Station flight software. In addition to built-in functions of HAL/S, package incorporates quaternion functions used in Space Shuttle and Galileo projects and routines from LINPAK solving systems of equations involving general square matrices. Contains two generic programs: one for floating-point computations and one for integer computations. Written on IBM/AT personal computer running under PC DOS, v.3.1.
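
    A hedged Python illustration (the package itself is Ada) of the quaternion product such a linear-algebra library provides; the (w, x, y, z) component order is an assumption for the example.

    ```python
    # Hamilton quaternion product, the operation underlying the package's
    # quaternion functions; component order (w, x, y, z) is assumed here.
    import math

    def quat_mul(q, r):
        w1, x1, y1, z1 = q
        w2, x2, y2, z2 = r
        return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
                w1*x2 + x1*w2 + y1*z2 - z1*y2,
                w1*y2 - x1*z2 + y1*w2 + z1*x2,
                w1*z2 + x1*y2 - y1*x2 + z1*w2)

    # A 90-degree rotation about z applied twice equals a 180-degree rotation.
    h = math.sqrt(0.5)
    q90 = (h, 0.0, 0.0, h)
    print(quat_mul(q90, q90))  # approx (0, 0, 0, 1): the 180-degree rotation
    ```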

  6. EFTofPNG: a package for high precision computation with the effective field theory of post-Newtonian gravity

    NASA Astrophysics Data System (ADS)

    Levi, Michele; Steinhoff, Jan

    2017-12-01

    We present a novel public package ‘EFTofPNG’ for high precision computation in the effective field theory of post-Newtonian (PN) gravity, including spins. We created this package in view of the timely need to publicly share automated computation tools, which integrate the various types of physics manifested in the expected increasing influx of gravitational wave (GW) data. Hence, we created a free and open source package, which is self-contained, modular, all-inclusive, and accessible to the classical gravity community. The ‘EFTofPNG’ Mathematica package also uses the power of the ‘xTensor’ package, suited for complicated tensor computation, where our coding also strategically approaches the generic generation of Feynman contractions, which is universal to all perturbation theories in physics, by efficiently treating n-point functions as tensors of rank n. The package currently contains four independent units, which serve as subsidiaries to the main one. Its final unit serves as a pipeline chain for obtaining the final GW templates, and provides the full computation of derivatives and physical observables of interest. The upcoming ‘EFTofPNG’ package version 1.0 should cover the point mass sector, and all the spin sectors, up to the fourth PN order, and the two-loop level. We expect and strongly encourage public development of the package to improve its efficiency, and to extend it to further PN sectors, and observables useful for the waveform modelling.

  7. 'spup' - an R package for uncertainty propagation in spatial environmental modelling

    NASA Astrophysics Data System (ADS)

    Sawicka, Kasia; Heuvelink, Gerard

    2016-04-01

    Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Currently, advances in uncertainty propagation and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition for universal applicability, including case studies with spatial models and spatial model inputs. Due to the growing popularity and applicability of the open source R programming language we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining the uncertainty propagation starting from input data and model parameters, via the environmental model onto model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo (MC) techniques, as well as several uncertainty visualization functions. Uncertain environmental variables are represented in the package as objects whose attribute values may be uncertain and described by probability distributions. Both numerical and categorical data types are handled. Spatial auto-correlation within an attribute and cross-correlation between attributes are also accommodated. For uncertainty propagation the package has implemented the MC approach with efficient sampling algorithms, i.e. stratified random sampling and Latin hypercube sampling. The design includes facilitation of parallel computing to speed up MC computation. The MC realizations may be used as an input to the environmental models called from R, or externally. Selected static and interactive visualization methods that are understandable by non-experts with limited background in statistics can be used to summarize and visualize uncertainty about the measured input, model parameters and output of the uncertainty propagation. We demonstrate that the 'spup' package is an effective and easy tool to apply and can be used in multi-disciplinary research and model-based decision support.
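
    A generic Python sketch of the Monte Carlo propagation idea (the package itself is for R): draw uncertain inputs with Latin hypercube sampling, run the model on each realization, and summarize the output spread. The toy model and distribution parameters are invented.

    ```python
    # Monte Carlo uncertainty propagation with Latin hypercube sampling.
    # The "environmental model" here is a trivial stand-in for illustration.
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(42)

    def latin_hypercube(n, dims):
        """One stratified draw per equal-probability bin, shuffled per dimension."""
        u = (rng.random((n, dims)) + np.arange(n)[:, None]) / n
        for j in range(dims):
            u[:, j] = rng.permutation(u[:, j])
        return u

    n = 1000
    u = latin_hypercube(n, 2)
    rain = norm.ppf(u[:, 0], loc=50.0, scale=10.0)         # uncertain input 1
    runoff_coef = norm.ppf(u[:, 1], loc=0.3, scale=0.05)   # uncertain input 2
    runoff = runoff_coef * rain                            # run model per draw
    print(runoff.mean(), runoff.std())                     # propagated spread
    ```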

  8. 'spup' - an R package for uncertainty propagation analysis in spatial environmental modelling

    NASA Astrophysics Data System (ADS)

    Sawicka, Kasia; Heuvelink, Gerard

    2017-04-01

    Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Currently, advances in uncertainty propagation and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition for universal applicability and being able to deal with case studies with spatial models and spatial model inputs. Due to the growing popularity and applicability of the open source R programming language we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining the uncertainty propagation starting from input data and model parameters, via the environmental model onto model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo (MC) techniques, as well as several uncertainty visualization functions. Uncertain environmental variables are represented in the package as objects whose attribute values may be uncertain and described by probability distributions. Both numerical and categorical data types are handled. Spatial auto-correlation within an attribute and cross-correlation between attributes are also accommodated. For uncertainty propagation the package has implemented the MC approach with efficient sampling algorithms, i.e. stratified random sampling and Latin hypercube sampling. The design includes facilitation of parallel computing to speed up MC computation. The MC realizations may be used as an input to the environmental models called from R, or externally. Selected visualization methods that are understandable by non-experts with limited background in statistics can be used to summarize and visualize uncertainty about the measured input, model parameters and output of the uncertainty propagation. We demonstrate that the 'spup' package is an effective and easy tool to apply and can be used in multi-disciplinary research and model-based decision support.

  9. The `TTIME' Package: Performance Evaluation in a Cluster Computing Environment

    NASA Astrophysics Data System (ADS)

    Howe, Marico; Berleant, Daniel; Everett, Albert

    2011-06-01

    The objective of translating developmental event time across mammalian species is to gain an understanding of the timing of human developmental events based on the known times of those events in animals. The potential benefits include improvements to diagnostic and intervention capabilities. The CRAN `ttime' package provides the functionality to infer unknown event timings and investigate phylogenetic proximity utilizing hierarchical clustering of both known and predicted event timings. The original generic mammalian model included nine eutherian mammals: Felis domestica (cat), Mustela putorius furo (ferret), Mesocricetus auratus (hamster), Macaca mulatta (monkey), Homo sapiens (humans), Mus musculus (mouse), Oryctolagus cuniculus (rabbit), Rattus norvegicus (rat), and Acomys cahirinus (spiny mouse). However, the data for this model are expected to grow as more data about developmental events are identified and incorporated into the analysis. Evaluating the performance of the `ttime' package in a cluster computing environment against a comparable analysis in a serial computing environment provides an important computational performance assessment. A theoretical analysis is the first stage of a process in which the second stage, if justified by the theoretical analysis, is to investigate an actual implementation of the `ttime' package in a cluster computing environment and to understand the parallelization process that underlies implementation.

  10. 10 CFR 431.92 - Definitions concerning commercial air conditioners and heat pumps.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... measurement. Commercial package air-conditioning and heating equipment means air-cooled, water-cooled... Conditioner means a basic model of commercial package air-conditioning and heating equipment (packaged or split) that is: Used in computer rooms, data processing rooms, or other information technology cooling...

  11. Application of GA package in functional packaging

    NASA Astrophysics Data System (ADS)

    Belousova, D. A.; Noskova, E. E.; Kapulin, D. V.

    2018-05-01

    An approach is presented for an application program, based on a genetic algorithm, that configures the elements of a commutation circuit in the design of radio-electronic equipment. The efficiency of the approach is shown for commutation circuits with different characteristics in computer-aided design for radio-electronic manufacturing. A prototype of the computer-aided design subsystem has been programmed on the basis of the GA package for R, with a set of general functions for the optimization of multivariate models.
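
    A stripped-down sketch of the evolutionary idea (mutation-only for brevity, in Python rather than the authors' R setup): evolve one-dimensional placements of circuit elements to shorten total connection length. The netlist and settings are invented.

    ```python
    # Toy evolutionary placement: minimize total wire length of a 1D ordering.
    import random

    random.seed(5)
    nets = [(0, 3), (1, 4), (2, 5), (0, 5), (1, 2)]   # connected element pairs

    def wirelength(order):
        pos = {e: i for i, e in enumerate(order)}
        return sum(abs(pos[a] - pos[b]) for a, b in nets)

    def mutate(order):
        a, b = random.sample(range(len(order)), 2)    # swap two slots
        child = order[:]
        child[a], child[b] = child[b], child[a]
        return child

    pop = [random.sample(range(6), 6) for _ in range(20)]
    for _ in range(200):
        pop.sort(key=wirelength)                      # keep the fittest half
        pop = pop[:10] + [mutate(random.choice(pop[:10])) for _ in range(10)]
    print(min(map(wirelength, pop)))                  # best placement cost found
    ```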

  12. Software package for modeling spin–orbit motion in storage rings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zyuzin, D. V., E-mail: d.zyuzin@fz-juelich.de

    2015-12-15

    A software package providing a graphical user interface for computer experiments on the motion of charged particle beams in accelerators, as well as analysis of obtained data, is presented. The software package was tested in the framework of the international project on electric dipole moment measurement JEDI (Jülich Electric Dipole moment Investigations). The specific features of particle spin motion imply the requirement to use a cyclic accelerator (storage ring) consisting of electrostatic elements, which makes it possible to preserve horizontal polarization for a long time. Computer experiments study the dynamics of 10^6–10^9 particles in a beam during 10^9 turns in an accelerator (about 10^12–10^15 integration steps for the equations of motion). For designing an optimal accelerator structure, a large number of computer experiments on polarized beam dynamics are required. The numerical core of the package is COSY Infinity, a program for modeling spin–orbit dynamics.

  13. Alloy Design Workbench-Surface Modeling Package Developed

    NASA Technical Reports Server (NTRS)

    Abel, Phillip B.; Noebe, Ronald D.; Bozzolo, Guillermo H.; Good, Brian S.; Daugherty, Elaine S.

    2003-01-01

    NASA Glenn Research Center's Computational Materials Group has integrated a graphical user interface with in-house-developed surface modeling capabilities, with the goal of using computationally efficient atomistic simulations to aid the development of advanced aerospace materials, through the modeling of alloy surfaces, surface alloys, and segregation. The software is also ideal for modeling nanomaterials, since surface and interfacial effects can dominate material behavior and properties at this level. Through the combination of an accurate atomistic surface modeling methodology and an efficient computational engine, it is now possible to directly model these types of surface phenomena and metallic nanostructures without a supercomputer. Fulfilling a High Operating Temperature Propulsion Components (HOTPC) project level-I milestone, a graphical user interface was created for a suite of quantum approximate atomistic materials modeling Fortran programs developed at Glenn. The resulting "Alloy Design Workbench-Surface Modeling Package" (ADW-SMP) is the combination of proven quantum approximate Bozzolo-Ferrante-Smith (BFS) algorithms (refs. 1 and 2) with a productivity-enhancing graphical front end. Written in the portable, platform independent Java programming language, the graphical user interface calls on extensively tested Fortran programs running in the background for the detailed computational tasks. Designed to run on desktop computers, the package has been deployed on PC, Mac, and SGI computer systems. The graphical user interface integrates two modes of computational materials exploration. One mode uses Monte Carlo simulations to determine lowest energy equilibrium configurations. The second approach is an interactive "what if" comparison of atomic configuration energies, designed to provide real-time insight into the underlying drivers of alloying processes.
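
    A hedged sketch of the Monte Carlo mode described above: a generic Metropolis acceptance loop over a toy one-dimensional energy function, standing in for the BFS configuration energies the package computes.

    ```python
    # Generic Metropolis loop finding a low-energy state of a toy energy
    # function; temperature and step size are illustrative assumptions.
    import math, random

    random.seed(3)

    def energy(state):
        return (state - 2.0) ** 2   # toy energy surface with minimum at 2.0

    state, kT = 10.0, 0.5
    for _ in range(5000):
        trial = state + random.uniform(-0.5, 0.5)      # propose a nearby config
        dE = energy(trial) - energy(state)
        if dE <= 0 or random.random() < math.exp(-dE / kT):
            state = trial                              # Metropolis acceptance
    print(round(state, 2))  # near the lowest-energy configuration
    ```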

  14. Logistic Regression with Multiple Random Effects: A Simulation Study of Estimation Methods and Statistical Packages

    PubMed Central

    Kim, Yoonsang; Emery, Sherry

    2013-01-01

    Several statistical packages are capable of estimating generalized linear mixed models and these packages provide one or more of three estimation methods: penalized quasi-likelihood, Laplace, and Gauss-Hermite. Many studies have investigated these methods’ performance for the mixed-effects logistic regression model. However, the authors focused on models with one or two random effects and assumed a simple covariance structure between them, which may not be realistic. When there are multiple correlated random effects in a model, the computation becomes intensive, and often an algorithm fails to converge. Moreover, in our analysis of smoking status and exposure to anti-tobacco advertisements, we have observed that when a model included multiple random effects, parameter estimates varied considerably from one statistical package to another even when using the same estimation method. This article presents a comprehensive review of the advantages and disadvantages of each estimation method. In addition, we compare the performances of the three methods across statistical packages via simulation, which involves two- and three-level logistic regression models with at least three correlated random effects. We apply our findings to a real dataset. Our results suggest that two packages—SAS GLIMMIX Laplace and SuperMix Gaussian quadrature—perform well in terms of accuracy, precision, convergence rates, and computing speed. We also discuss the strengths and weaknesses of the two packages in regard to sample sizes. PMID:24288415

  15. MODFLOW-2005 : the U.S. Geological Survey modular ground-water model--the ground-water flow process

    USGS Publications Warehouse

    Harbaugh, Arlen W.

    2005-01-01

    This report presents MODFLOW-2005, which is a new version of the finite-difference ground-water model commonly called MODFLOW. Ground-water flow is simulated using a block-centered finite-difference approach. Layers can be simulated as confined or unconfined. Flow associated with external stresses, such as wells, areal recharge, evapotranspiration, drains, and rivers, also can be simulated. The report includes detailed explanations of physical and mathematical concepts on which the model is based, an explanation of how those concepts are incorporated in the modular structure of the computer program, instructions for using the model, and details of the computer code. The modular structure consists of a MAIN Program and a series of highly independent subroutines. The subroutines are grouped into 'packages.' Each package deals with a specific feature of the hydrologic system that is to be simulated, such as flow from rivers or flow into drains, or with a specific method of solving the set of simultaneous equations resulting from the finite-difference method. Several solution methods are incorporated, including the Preconditioned Conjugate-Gradient method. The division of the program into packages permits the user to examine specific hydrologic features of the model independently. This also facilitates development of additional capabilities because new packages can be added to the program without modifying the existing packages. The input and output systems of the computer program also are designed to permit maximum flexibility. The program is designed to allow other capabilities, such as transport and optimization, to be incorporated, but this report is limited to describing the ground-water flow capability. The program is written in Fortran 90 and will run without modification on most computers that have a Fortran 90 compiler.
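
    A minimal sketch of the block-centered finite-difference idea MODFLOW builds on: steady one-dimensional flow between two specified heads with uniform recharge reduces to a linear system. The grid size and parameter values are illustrative.

    ```python
    # Steady 1D ground-water flow by finite differences: T h'' + R = 0 with
    # specified heads at both ends, assembled as a linear system.
    import numpy as np

    n, dx = 50, 100.0          # cells and cell width (m)
    T = 500.0                  # transmissivity (m^2/d)
    recharge = 0.002           # recharge rate (m/d)
    h_left, h_right = 20.0, 15.0

    A = np.zeros((n, n))
    b = np.full(n, -recharge * dx * dx / T)   # source term per cell
    for i in range(n):
        A[i, i] = -2.0
        if i > 0:
            A[i, i - 1] = 1.0
        if i < n - 1:
            A[i, i + 1] = 1.0
    b[0] -= h_left             # fold boundary heads into the right-hand side
    b[-1] -= h_right

    head = np.linalg.solve(A, b)
    print(head[:5].round(3))   # simulated heads in the first few cells
    ```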

  16. A multimedia adult literacy program: Combining NASA technology, instructional design theory, and authentic literacy concepts

    NASA Technical Reports Server (NTRS)

    Willis, Jerry W.

    1993-01-01

    For a number of years, the Software Technology Branch of the Information Systems Directorate has been involved in the application of cutting-edge hardware and software technologies to instructional tasks related to NASA projects. The branch has developed intelligent computer-aided training shells, instructional applications of virtual reality and multimedia, and computer-based instructional packages that use fuzzy logic for both instructional and diagnostic decision making. One outcome of the work on space-related technology-supported instruction has been the creation of a significant pool of human talent in the branch with current expertise on the cutting edges of instructional technologies. When the human talent is combined with advanced technologies for graphics, sound, video, CD-ROM, and high-speed computing, the result is a powerful research and development group that both contributes to the applied foundations of instructional technology and creates effective instructional packages that take advantage of a range of advanced technologies. Several branch projects are currently underway that apply NASA-developed expertise to significant instructional problems in public education. The branch, for example, has developed intelligent computer-aided software to help high school students learn physics, and staff are currently working on a project to produce educational software for young children with language deficits. This report deals with another project, the adult literacy tutor. Unfortunately, while there are a number of computer-based instructional packages available for adult literacy instruction, most of them are based on the same instructional models that failed these students when they were in school. The teacher-centered, skill-and-drill-oriented instructional strategies that form the foundation of most computer-based literacy packages currently on the market, even when supported by color computer graphics and animation, may not be the most effective or most desirable way to use computer technology in literacy programs. This project is developing a series of instructional packages that are based on a different instructional model: authentic instruction. The instructional development model used to create these packages is also different. Instead of using the traditional five-stage linear, sequential model based on behavioral learning theory, the project uses the recursive, reflective design and development model (R2D2) that is based on cognitive learning theory, particularly the social constructivism of Vygotsky, and an epistemology based on critical theory. Using alternative instructional and instructional development theories, the result of the summer faculty fellowship is LiteraCity, a multimedia adult literacy instructional package that is a simulation of finding and applying for a job. The program, which is about 120 megabytes, is distributed on CD-ROM.

  17. ARM Data-Oriented Metrics and Diagnostics Package for Climate Model Evaluation Value-Added Product

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Chengzhu; Xie, Shaocheng

    A Python-based metrics and diagnostics package is currently being developed by the U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Infrastructure Team at Lawrence Livermore National Laboratory (LLNL) to facilitate the use of long-term, high-frequency measurements from the ARM Facility in evaluating the regional climate simulation of clouds, radiation, and precipitation. This metrics and diagnostics package computes climatological means of targeted climate model simulations and generates tables and plots for comparing the model simulation with ARM observational data. The Coupled Model Intercomparison Project (CMIP) model data sets are also included in the package to enable model intercomparison, as demonstrated in Zhang et al. (2017). The mean of the CMIP models can serve as a reference for individual models. Basic performance metrics are computed to measure the accuracy of the mean state and variability of climate models. The evaluated physical quantities include cloud fraction, temperature, relative humidity, cloud liquid water path, total column water vapor, precipitation, sensible and latent heat fluxes, and radiative fluxes, with plans to extend to more fields, such as aerosol and microphysics properties. Process-oriented diagnostics focusing on individual cloud- and precipitation-related phenomena are also being developed for the evaluation and development of specific model physical parameterizations. The version 1.0 package is designed based on data collected at ARM's Southern Great Plains (SGP) Research Facility, with the plan to extend to other ARM sites. The metrics and diagnostics package is currently built upon standard Python libraries and additional Python packages developed by DOE (such as CDMS and CDAT). The ARM metrics and diagnostic package is available publicly with the hope that it can serve as an easy entry point for climate modelers to compare their models with ARM data. In this report, we first present the input data, which constitutes the core content of the metrics and diagnostics package, in section 2, and a user's guide documenting the workflow/structure of the version 1.0 codes and including step-by-step instructions for running the package in section 3.

  18. Documentation of a computer program to simulate aquifer-system compaction using the modular finite-difference ground-water flow model

    USGS Publications Warehouse

    Leake, S.A.; Prudic, David E.

    1988-01-01

    The process of permanent compaction is not routinely included in simulations of groundwater flow. To simulate storage changes from both elastic and inelastic compaction, a computer program was written for use with the U. S. Geological Survey modular finite-difference groundwater flow model. The new program is called the Interbed-Storage Package. In the Interbed-Storage Package, elastic compaction or expansion is assumed to be proportional to change in head. The constant of proportionality is the product of skeletal component of elastic specific storage and thickness of the sediments. Similarly, inelastic compaction is assumed to be proportional to decline in head. The constant of proportionality is the product of the skeletal component of inelastic specific storage and the thickness of the sediments. Storage changes are incorporated into the groundwater flow model by adding an additional term to the flow equation. Within a model time step, the package appropriately apportions storage changes between elastic and inelastic components on the basis of the relation of simulated head to the previous minimum head. Another package that allows for a time-varying specified-head boundary is also documented. This package was written to reduce the data requirements for test simulations of the Interbed-Storage Package. (USGS)

  19. LakeMetabolizer: An R package for estimating lake metabolism from free-water oxygen using diverse statistical models

    USGS Publications Warehouse

    Winslow, Luke; Zwart, Jacob A.; Batt, Ryan D.; Dugan, Hilary; Woolway, R. Iestyn; Corman, Jessica; Hanson, Paul C.; Read, Jordan S.

    2016-01-01

    Metabolism is a fundamental process in ecosystems that crosses multiple scales of organization from individual organisms to whole ecosystems. To improve sharing and reuse of published metabolism models, we developed LakeMetabolizer, an R package for estimating lake metabolism from in situ time series of dissolved oxygen, water temperature, and, optionally, additional environmental variables. LakeMetabolizer implements 5 different metabolism models with diverse statistical underpinnings: bookkeeping, ordinary least squares, maximum likelihood, Kalman filter, and Bayesian. Each of these 5 metabolism models can be combined with 1 of 7 models for computing the coefficient of gas exchange across the air–water interface (k). LakeMetabolizer also features a variety of supporting functions that compute conversions and implement calculations commonly applied to raw data prior to estimating metabolism (e.g., oxygen saturation and optical conversion models). These tools have been organized into an R package that contains example data, example use-cases, and function documentation. The release package version is available on the Comprehensive R Archive Network (CRAN), and the full open-source GPL-licensed code is freely available for examination and extension online. With this unified, open-source, and freely available package, we hope to improve access and facilitate the application of metabolism in studies and management of lentic ecosystems.
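
    A sketch of the simplest ("bookkeeping") idea the package implements: nighttime oxygen decline estimates respiration, and daytime change corrected for that respiration estimates gross primary production. The synthetic DO series and the zero-gas-exchange assumption are illustrative simplifications.

    ```python
    # Bookkeeping-style metabolism estimate from a dissolved-oxygen series,
    # ignoring gas exchange for brevity. Values are synthetic.
    import numpy as np

    do = np.array([8.0, 8.3, 8.7, 9.0, 8.8, 8.5, 8.2, 8.0])  # DO (mg/L), hourly
    is_day = np.array([True, True, True, True, False, False, False, False])

    ddo = np.diff(do)                       # per-interval DO change
    r_hourly = ddo[~is_day[:-1]].mean()     # respiration rate (negative)
    day = ddo[is_day[:-1]]
    nep_day = day.sum()                     # net production over daylight
    gpp = nep_day - r_hourly * day.size     # add back daytime respiration
    print(round(gpp, 2), round(r_hourly, 2))
    ```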

  20. GRAVTool, a Package to Compute Geoid Model by Remove-Compute-Restore Technique

    NASA Astrophysics Data System (ADS)

    Marotta, G. S.; Blitzkow, D.; Vidotti, R. M.

    2015-12-01

    Currently, there are several methods to determine geoid models. They can be based on terrestrial gravity data, geopotential coefficients, astro-geodetic data, or a combination of them. Among the techniques to compute a precise geoid model, the Remove-Compute-Restore (RCR) technique has been widely applied. It considers short, medium, and long wavelengths derived from altitude data provided by Digital Terrain Models (DTM), terrestrial gravity data, and global geopotential coefficients, respectively. In order to apply this technique, it is necessary to create procedures that compute gravity anomalies and geoid models by the integration of different wavelengths, and that adjust these models to a local vertical datum. This research presents a package called GRAVTool, developed in MATLAB, to compute local geoid models by the RCR technique, and its application in a study area. The study area comprises the Federal District of Brazil, about 6000 km² of wavy relief with heights varying from 600 m to 1340 m, located between the coordinates 48.25ºW, 15.45ºS and 47.33ºW, 16.06ºS. The numerical example shows the local geoid model computed by the GRAVTool package using 1377 terrestrial gravity observations, SRTM data at 3 arc-second resolution, and geopotential coefficients of the EIGEN-6C4 model to degree 360. The accuracy of the computed model (σ = ± 0.071 m, RMS = 0.069 m, maximum = 0.178 m, and minimum = -0.123 m) matches the uncertainty (σ = ± 0.073 m) of 21 randomly spaced points where the geoid was determined by geometric leveling supported by GNSS positioning. The results were also better than those of the official Brazilian regional geoid model (σ = ± 0.099 m, RMS = 0.208 m, maximum = 0.419 m, and minimum = -0.040 m).
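
    In standard notation, the RCR steps described above can be summarized as follows (a sketch of the usual textbook decomposition; the exact terms GRAVTool evaluates may differ in detail):

        % remove step: residual gravity anomalies after subtracting the
        % global-model and terrain contributions
        \Delta g_{\mathrm{res}} = \Delta g_{\mathrm{obs}} - \Delta g_{\mathrm{GGM}} - \Delta g_{\mathrm{DTM}}
        % restore step: the geoid height recombines the three wavelength bands
        N = N_{\mathrm{GGM}} + N_{\mathrm{res}} + N_{\mathrm{DTM}}

    where N_res is obtained from the residual anomalies via Stokes's integral (the "compute" step).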

  1. Application of CEDA and ASPIC computer packages to the hairtail (Trichiurus japonicus) fishery in the East China Sea

    NASA Astrophysics Data System (ADS)

    Wang, Yu; Liu, Qun

    2013-01-01

    Surplus-production models are widely used in fish stock assessment and fisheries management due to their simplicity and lower data demands than age-structured models such as Virtual Population Analysis. The CEDA (catch-effort data analysis) and ASPIC (a surplus-production model incorporating covariates) computer packages are data-fitting or parameter estimation tools that have been developed to analyze catch-and-effort data using non-equilibrium surplus production models. We applied CEDA and ASPIC to the hairtail (Trichiurus japonicus) fishery in the East China Sea. Both packages produced robust results and yielded similar estimates. In CEDA, the Schaefer surplus production model with log-normal error assumption produced results close to those of ASPIC. CEDA is sensitive to the choice of initial proportion, while ASPIC is not. However, CEDA produced higher R² values than ASPIC.
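
    The non-equilibrium Schaefer dynamics that both packages fit can be sketched in a few lines; the parameter values below are illustrative only, not estimates for the hairtail stock:

        # Hedged sketch of the non-equilibrium Schaefer surplus-production
        # model fitted by CEDA and ASPIC; illustrative parameters only.
        def schaefer_biomass(r, K, B0, catches):
            """Project biomass: B_{t+1} = B_t + r*B_t*(1 - B_t/K) - C_t."""
            B = [B0]
            for C in catches:
                B_next = B[-1] + r * B[-1] * (1.0 - B[-1] / K) - C
                B.append(max(B_next, 0.0))   # biomass cannot go negative
            return B

        # Illustrative run; for the Schaefer model MSY = r*K/4.
        traj = schaefer_biomass(r=0.4, K=1000.0, B0=800.0, catches=[80.0] * 10)
        print([round(b, 1) for b in traj], "MSY =", 0.4 * 1000.0 / 4)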

  2. GillesPy: A Python Package for Stochastic Model Building and Simulation.

    PubMed

    Abel, John H; Drawert, Brian; Hellander, Andreas; Petzold, Linda R

    2016-09-01

    GillesPy is an open-source Python package for model construction and simulation of stochastic biochemical systems. GillesPy consists of a Python framework for model building and an interface to the StochKit2 suite of efficient simulation algorithms based on the Gillespie stochastic simulation algorithms (SSA). To enable intuitive model construction and seamless integration into the scientific Python stack, we present an easy to understand, action-oriented programming interface. Here, we describe the components of this package and provide a detailed example relevant to the computational biology community.
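
    Rather than reproduce GillesPy's interface from memory, the sketch below shows the Gillespie direct method itself (the algorithm the package drives through StochKit2) for a simple birth-death process:

        import random, math

        # The Gillespie direct method underlying SSA packages, shown for a
        # birth-death process; this is the algorithm, not GillesPy's API.
        def ssa_birth_death(k_birth, k_death, x0, t_end, seed=1):
            random.seed(seed)
            t, x, history = 0.0, x0, [(0.0, x0)]
            while t < t_end:
                a1, a2 = k_birth, k_death * x        # reaction propensities
                a0 = a1 + a2
                if a0 == 0.0:
                    break
                t += -math.log(1.0 - random.random()) / a0   # exponential wait
                if random.random() * a0 < a1:
                    x += 1                            # birth event
                else:
                    x -= 1                            # death event
                history.append((t, x))
            return history

        print(ssa_birth_death(k_birth=10.0, k_death=0.1, x0=0, t_end=50.0)[-1])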

  3. GillesPy: A Python Package for Stochastic Model Building and Simulation

    PubMed Central

    Abel, John H.; Drawert, Brian; Hellander, Andreas; Petzold, Linda R.

    2017-01-01

    GillesPy is an open-source Python package for model construction and simulation of stochastic biochemical systems. GillesPy consists of a Python framework for model building and an interface to the StochKit2 suite of efficient simulation algorithms based on the Gillespie stochastic simulation algorithms (SSA). To enable intuitive model construction and seamless integration into the scientific Python stack, we present an easy to understand, action-oriented programming interface. Here, we describe the components of this package and provide a detailed example relevant to the computational biology community. PMID:28630888

  4. Quantum lattice model solver HΦ

    NASA Astrophysics Data System (ADS)

    Kawamura, Mitsuaki; Yoshimi, Kazuyoshi; Misawa, Takahiro; Yamaji, Youhei; Todo, Synge; Kawashima, Naoki

    2017-08-01

    HΦ [aitch-phi] is a program package based on the Lanczos-type eigenvalue solution applicable to a broad range of quantum lattice models, i.e., arbitrary quantum lattice models with two-body interactions, including the Heisenberg model, the Kitaev model, the Hubbard model, and the Kondo-lattice model. While it works well on PCs and PC clusters, HΦ also runs efficiently on massively parallel computers, which considerably extends the tractable range of system sizes. In addition, unlike most existing packages, HΦ supports finite-temperature calculations through the method of thermal pure quantum (TPQ) states. In this paper, we explain the theoretical background and user interface of HΦ. We also show benchmark results of HΦ on supercomputers such as the K computer at RIKEN Advanced Institute for Computational Science (AICS) and SGI ICE XA (Sekirei) at the Institute for Solid State Physics (ISSP).
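
    A bare-bones version of the Lanczos iteration at the core of such solvers can be written down directly; the sketch below applies it to a dense random symmetric matrix standing in for a lattice Hamiltonian (illustrative only; HΦ works with sparse many-body Hamiltonians and massive parallelism):

        import numpy as np

        # Plain Lanczos iteration: project H onto a Krylov subspace and take
        # the lowest Ritz value as the ground-state energy estimate.
        def lanczos_lowest(H, m=50, seed=0):
            rng = np.random.default_rng(seed)
            v = rng.standard_normal(H.shape[0])
            v /= np.linalg.norm(v)
            v_prev, beta = np.zeros_like(v), 0.0
            alphas, betas = [], []
            for _ in range(m):
                w = H @ v - beta * v_prev
                alpha = v @ w
                alphas.append(alpha)
                w -= alpha * v
                beta = np.linalg.norm(w)
                if beta < 1e-12:
                    break                      # invariant subspace reached
                betas.append(beta)
                v_prev, v = v, w / beta
            betas = betas[:len(alphas) - 1]    # T is len(alphas) x len(alphas)
            T = np.diag(alphas) + np.diag(betas, 1) + np.diag(betas, -1)
            return np.linalg.eigvalsh(T)[0]    # lowest Ritz value

        H = np.random.default_rng(1).standard_normal((400, 400))
        H = 0.5 * (H + H.T)                    # symmetrize
        print(lanczos_lowest(H), "vs exact", np.linalg.eigvalsh(H)[0])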

  5. Motivation, values, and work design as drivers of participation in the R open source project for statistical computing

    PubMed Central

    Mair, Patrick; Hofmann, Eva; Gruber, Kathrin; Hatzinger, Reinhold; Zeileis, Achim; Hornik, Kurt

    2015-01-01

    One of the cornerstones of the R system for statistical computing is the multitude of packages contributed by numerous package authors. This amount of packages makes an extremely broad range of statistical techniques and other quantitative methods freely available. Thus far, no empirical study has investigated psychological factors that drive authors to participate in the R project. This article presents a study of R package authors, collecting data on different types of participation (number of packages, participation in mailing lists, participation in conferences), three psychological scales (types of motivation, psychological values, and work design characteristics), and various socio-demographic factors. The data are analyzed using item response models and subsequent generalized linear models, showing that the most important determinants for participation are a hybrid form of motivation and the social characteristics of the work design. Other factors are found to have less impact or influence only specific aspects of participation. PMID:26554005

  6. Motivation, values, and work design as drivers of participation in the R open source project for statistical computing.

    PubMed

    Mair, Patrick; Hofmann, Eva; Gruber, Kathrin; Hatzinger, Reinhold; Zeileis, Achim; Hornik, Kurt

    2015-12-01

    One of the cornerstones of the R system for statistical computing is the multitude of packages contributed by numerous package authors. This amount of packages makes an extremely broad range of statistical techniques and other quantitative methods freely available. Thus far, no empirical study has investigated psychological factors that drive authors to participate in the R project. This article presents a study of R package authors, collecting data on different types of participation (number of packages, participation in mailing lists, participation in conferences), three psychological scales (types of motivation, psychological values, and work design characteristics), and various socio-demographic factors. The data are analyzed using item response models and subsequent generalized linear models, showing that the most important determinants for participation are a hybrid form of motivation and the social characteristics of the work design. Other factors are found to have less impact or influence only specific aspects of participation.

  7. Models and techniques for evaluating the effectiveness of aircraft computing systems

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.

    1978-01-01

    Progress in the development of system models and techniques for the formulation and evaluation of aircraft computer system effectiveness is reported. Topics covered include: analysis of functional dependence; a prototype software package, METAPHOR, developed to aid the evaluation of performability; and a comprehensive performability modeling and evaluation exercise involving the SIFT computer.

  8. Use of symbolic computation in robotics education

    NASA Technical Reports Server (NTRS)

    Vira, Naren; Tunstel, Edward

    1992-01-01

    An application of symbolic computation in robotics education is described. A software package is presented which combines generality, user interaction, and user-friendliness with the systematic usage of symbolic computation and artificial intelligence techniques. The software utilizes MACSYMA, a LISP-based symbolic algebra language, to automatically generate closed-form expressions representing forward and inverse kinematics solutions, the Jacobian transformation matrices, robot pose error-compensation model equations, and the Lagrange dynamics formulation for N-degree-of-freedom, open-chain robotic manipulators. The goal of such a package is to aid faculty and students in the robotics course by removing the burdensome tasks of mathematical manipulation. The software package has been successfully tested for its accuracy using commercially available robots.
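
    The kind of closed-form derivation such a package automates can be illustrated with a modern symbolic library; the sketch below derives the Jacobian of a planar two-link arm in SymPy rather than MACSYMA:

        import sympy as sp

        # Symbolic forward kinematics and Jacobian for a planar 2-link arm,
        # a small illustration of the derivations the package automates.
        th1, th2, l1, l2 = sp.symbols('theta1 theta2 l1 l2')

        x = l1 * sp.cos(th1) + l2 * sp.cos(th1 + th2)   # end-effector x
        y = l1 * sp.sin(th1) + l2 * sp.sin(th1 + th2)   # end-effector y

        # Jacobian of end-effector position with respect to joint angles.
        J = sp.Matrix([x, y]).jacobian(sp.Matrix([th1, th2]))
        print(sp.simplify(J))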

  9. Documentation of a computer program to simulate stream-aquifer relations using a modular, finite-difference, ground-water flow model

    USGS Publications Warehouse

    Prudic, David E.

    1989-01-01

    Computer models are widely used to simulate groundwater flow for evaluating and managing the groundwater resource of many aquifers, but few are designed to also account for surface flow in streams. A computer program was written for use in the US Geological Survey modular finite difference groundwater flow model to account for the amount of flow in streams and to simulate the interaction between surface streams and groundwater. The new program is called the Streamflow-Routing Package. The Streamflow-Routing Package is not a true surface water flow model, but rather is an accounting program that tracks the flow in one or more streams which interact with groundwater. The program limits the amount of groundwater recharge to the available streamflow. It permits two or more streams to merge into one with flow in the merged stream equal to the sum of the tributary flows. The program also permits diversions from streams. The groundwater flow model with the Streamflow-Routing Package has an advantage over the analytical solution in simulating the interaction between aquifer and stream because it can be used to simulate complex systems that cannot be readily solved analytically. The Streamflow-Routing Package does not include a time function for streamflow but rather streamflow entering the modeled area is assumed to be instantly available to downstream reaches during each time period. This assumption is generally reasonable because of the relatively slow rate of groundwater flow. Another assumption is that leakage between streams and aquifers is instantaneous. This assumption may not be reasonable if the streams and aquifers are separated by a thick unsaturated zone. Documentation of the Streamflow-Routing Package includes data input instructions; flow charts, narratives, and listings of the computer program for each of four modules; and input data sets and printed results for two test problems, and one example problem. (Lantz-PTT)
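
    The accounting idea is easy to state in code. The sketch below, with illustrative names, caps the leakage each reach can lose to the aquifer at the streamflow entering it:

        # Sketch of the Streamflow-Routing Package's accounting idea:
        # leakage to the aquifer in a reach is limited by available flow.
        def route_stream(inflow, reach_demands):
            """reach_demands: desired leakage per reach (positive = stream
            loses water to the aquifer). Returns actual leakage and outflow."""
            flow = inflow
            actual = []
            for demand in reach_demands:
                # Gaining reaches (negative demand) simply add water.
                leak = min(demand, flow) if demand > 0 else demand
                flow -= leak
                actual.append(leak)
            return actual, flow

        leaks, outflow = route_stream(inflow=10.0, reach_demands=[4.0, 8.0, -2.0])
        print(leaks, outflow)    # -> [4.0, 6.0, -2.0] 2.0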

  10. Bird impact analysis package for turbine engine fan blades

    NASA Technical Reports Server (NTRS)

    Hirschbein, M. S.

    1982-01-01

    A computer program has been developed to analyze the gross structural response of turbine engine fan blades subjected to bird strikes. The program couples a NASTRAN finite element model and modal analysis of a fan blade with a multi-mode bird impact analysis computer program. The impact analysis uses the NASTRAN blade model and a fluid jet model of the bird to interactively calculate blade loading during a bird strike event. The analysis package is computationally efficient, easy to use, and provides a comprehensive history of the gross structural blade response. Example cases are presented for a representative fan blade.

  11. Teacher's Corner: Structural Equation Modeling with the Sem Package in R

    ERIC Educational Resources Information Center

    Fox, John

    2006-01-01

    R is free, open-source, cooperatively developed software that implements the S statistical programming language and computing environment. The current capabilities of R are extensive, and it is in wide use, especially among statisticians. The sem package provides basic structural equation modeling facilities in R, including the ability to fit…

  12. spMC: an R-package for 3D lithological reconstructions based on spatial Markov chains

    NASA Astrophysics Data System (ADS)

    Sartore, Luca; Fabbri, Paolo; Gaetan, Carlo

    2016-09-01

    The paper presents the spatial Markov Chains (spMC) R package and a case study of subsoil simulation/prediction at a plain site in Northeastern Italy. spMC is a fairly complete collection of advanced methods for data inspection; in addition, it implements Markov chain models to estimate experimental transition probabilities of categorical lithological data. Simulation methods based on well-known prediction methods (such as indicator kriging and cokriging) are also implemented in the spMC package, and other more advanced methods are available for simulation, e.g., path methods and Bayesian procedures that exploit maximum entropy. Since the spMC package was developed for intensive geostatistical computations, part of the code is implemented for parallel computation via OpenMP constructs. A final analysis compares the computational efficiency of the simulation/prediction algorithms for different numbers of CPU cores, using the example data set of the case study included in the package.
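
    The basic computation behind the package's embedded Markov-chain analysis, estimating a one-step transition probability matrix from a coded lithological sequence, can be sketched in a few lines (illustrative Python, not the R package's API):

        import numpy as np

        # Estimate one-step transition probabilities from a categorical
        # (lithological) sequence coded as integers 0..n_states-1.
        def transition_matrix(sequence, n_states):
            counts = np.zeros((n_states, n_states))
            for a, b in zip(sequence[:-1], sequence[1:]):
                counts[a, b] += 1
            rows = counts.sum(axis=1, keepdims=True)
            # Row-normalize, leaving all-zero rows at zero.
            return np.divide(counts, rows, out=np.zeros_like(counts),
                             where=rows > 0)

        borehole = [0, 0, 1, 1, 2, 1, 0, 0, 2, 2, 1]   # coded lithologies
        print(transition_matrix(borehole, n_states=3))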

  13. The Amber Biomolecular Simulation Programs

    PubMed Central

    CASE, DAVID A.; CHEATHAM, THOMAS E.; DARDEN, TOM; GOHLKE, HOLGER; LUO, RAY; MERZ, KENNETH M.; ONUFRIEV, ALEXEY; SIMMERLING, CARLOS; WANG, BING; WOODS, ROBERT J.

    2006-01-01

    We describe the development, current features, and some directions for future development of the Amber package of computer programs. This package evolved from a program that was constructed in the late 1970s to do Assisted Model Building with Energy Refinement, and now contains a group of programs embodying a number of powerful tools of modern computational chemistry, focused on molecular dynamics and free energy calculations of proteins, nucleic acids, and carbohydrates. PMID:16200636

  14. Documentation of a computer program to simulate transient leakage from confining units using the modular finite-difference, ground-water flow model

    USGS Publications Warehouse

    Leake, S.A.; Leahy, P.P.; Navoy, A.S.

    1994-01-01

    Transient leakage into or out of a compressible fine-grained confining unit results from ground-water storage changes within the unit. The computer program described in this report provides a new method of simulating transient leakage using the U.S. Geological Survey modular finite-difference ground-water flow model (MODFLOW). The new program is referred to as the Transient-Leakage Package. The Transient-Leakage Package solves integrodifferential equations that describe flow across the upper and lower boundaries of confining units. For each confining unit, vertical hydraulic conductivity, thickness, and specific storage are specified in input arrays. These properties can vary from cell to cell, and the confining unit need not be present at all locations in the grid; however, the confining units must be bounded above and below by model layers in which head is calculated or specified. The package was used in an example problem to simulate drawdown around a pumping well in a system with two aquifers separated by a confining unit. For drawdown values in excess of 1 centimeter, the solution using the new package closely matched an exact analytical solution. The problem also was simulated without the new package by using a separate model layer to represent the confining unit. That simulation was refined by using two model layers to represent the confining unit. The simulation using the Transient-Leakage Package was faster and more accurate than either of the simulations using model layers to represent the confining unit.

  15. Computer Applications in the Design Process.

    ERIC Educational Resources Information Center

    Winchip, Susan

    Computer Assisted Design (CAD) and Computer Assisted Manufacturing (CAM) are emerging technologies now being used in home economics and interior design applications. A microcomputer in a computer network system is capable of executing computer graphic functions such as three-dimensional modeling, as well as utilizing office automation packages to…

  16. Comparison of computer based instruction to behavior skills training for teaching staff implementation of discrete-trial instruction with an adult with autism.

    PubMed

    Nosik, Melissa R; Williams, W Larry; Garrido, Natalia; Lee, Sarah

    2013-01-01

    In the current study, behavior skills training (BST) is compared to a computer-based training package for teaching staff to deliver discrete-trial instruction to an adult with autism. The computer-based training package consisted of instructions, video modeling, and feedback. BST consisted of instructions, modeling, rehearsal, and feedback. Following training, participants were evaluated on their accuracy in completing the critical skills for running a discrete-trial program. Six participants completed training; three received behavior skills training and three received the computer-based training. Participants in the BST group performed better overall after training and during six-week probes than those in the computer-based training group. There were differences across both groups between research-assistant and natural-environment competency levels.

  17. GRAVTool, Advances on the Package to Compute Geoid Model path by the Remove-Compute-Restore Technique, Following Helmert's Condensation Method

    NASA Astrophysics Data System (ADS)

    Marotta, G. S.

    2017-12-01

    Currently, there are several methods to determine geoid models. They can be based on terrestrial gravity data, geopotential coefficients, astro-geodetic data, or a combination of them. Among the techniques to compute a precise geoid model, the Remove-Compute-Restore (RCR) technique has been widely applied. It considers short, medium, and long wavelengths derived from altitude data provided by Digital Terrain Models (DTM), terrestrial gravity data, and a Global Geopotential Model (GGM), respectively. In order to apply this technique, it is necessary to create procedures that compute gravity anomalies and geoid models by the integration of different wavelengths, and that adjust these models to a local vertical datum. This research presents advances to the GRAVTool package for computing geoid models by the RCR technique, following Helmert's condensation method, and its application in a study area. The study area comprises the Federal District of Brazil, about 6000 km² of wavy relief with heights varying from 600 m to 1340 m, located between the coordinates 48.25ºW, 15.45ºS and 47.33ºW, 16.06ºS. The numerical example shows a geoid model computed by the GRAVTool package after selecting the density, DTM, and GGM values most consistent with the reference values for the study area. The accuracy of the computed model (σ = ± 0.058 m, RMS = 0.067 m, maximum = 0.124 m, and minimum = -0.155 m), using a density value of 2.702 ± 0.024 g/cm³, the DTM SRTM Void Filled at 3 arc-second resolution, and the GGM EIGEN-6C4 up to degree and order 250, matches the uncertainty (σ = ± 0.073 m) of 26 randomly spaced points where the geoid was determined by geometric leveling supported by GNSS positioning. The results were also better than those of the official Brazilian regional geoid model (σ = ± 0.076 m, RMS = 0.098 m, maximum = 0.320 m, and minimum = -0.061 m).

  18. A Modular Three-Dimensional Finite-Difference Ground-Water Flow Model

    USGS Publications Warehouse

    McDonald, Michael G.; Harbaugh, Arlen W.; Guo, Weixing; Lu, Guoping

    1988-01-01

    This report presents a finite-difference model and its associated modular computer program. The model simulates flow in three dimensions. The report includes detailed explanations of physical and mathematical concepts on which the model is based and an explanation of how those concepts are incorporated in the modular structure of the computer program. The modular structure consists of a Main Program and a series of highly independent subroutines called 'modules.' The modules are grouped into 'packages.' Each package deals with a specific feature of the hydrologic system which is to be simulated, such as flow from rivers or flow into drains, or with a specific method of solving linear equations which describe the flow system, such as the Strongly Implicit Procedure or Slice-Successive Overrelaxation. The division of the program into modules permits the user to examine specific hydrologic features of the model independently. This also facilitates development of additional capabilities because new packages can be added to the program without modifying the existing packages. The input and output systems of the computer program are also designed to permit maximum flexibility. Ground-water flow within the aquifer is simulated using a block-centered finite-difference approach. Layers can be simulated as confined, unconfined, or a combination of confined and unconfined. Flow associated with external stresses, such as wells, areal recharge, evapotranspiration, drains, and streams, can also be simulated. The finite-difference equations can be solved using either the Strongly Implicit Procedure or Slice-Successive Overrelaxation. The program is written in FORTRAN 77 and will run without modification on most computers that have a FORTRAN 77 compiler. For each program module, this report includes a narrative description, a flow chart, a list of variables, and a module listing.
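
    As a flavor of the numerics such a modular program assembles, the toy solve below iterates a block-centered finite-difference approximation of steady-state flow in a single uniform layer, with fixed heads on the left and right columns and no-flow top and bottom (a sketch, not the program's solvers):

        import numpy as np

        # Jacobi iteration of the 5-point Laplacian: steady flow between two
        # fixed-head boundaries; 'edge' padding imposes no-flow elsewhere.
        n = 20
        h = np.full((n, n), 7.5)                       # initial head guess
        for _ in range(5000):
            hp = np.pad(h, 1, mode='edge')             # zero-gradient outside
            h = 0.25 * (hp[:-2, 1:-1] + hp[2:, 1:-1] +
                        hp[1:-1, :-2] + hp[1:-1, 2:])
            h[:, 0], h[:, -1] = 10.0, 5.0              # reassert fixed heads
        print(h[n // 2, ::5].round(2))                 # near-linear head profile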

  19. Computing and software

    USGS Publications Warehouse

    White, Gary C.; Hines, J.E.

    2004-01-01

    The reality is that the statistical methods used for analysis of data depend upon the availability of software. Analysis of marked animal data is no different from the rest of the statistical field. The methods used for analysis are those that are available in reliable software packages. Thus, the critical importance of having reliable, up-to-date software available to biologists is obvious. Statisticians have continued to develop more robust models, ever expanding the suite of potential analysis methods available. But without software to implement these newer methods, they will languish in the abstract and not be applied to the problems deserving them. In the Computers and Software Session, two new software packages are described, a comparison of implementations of methods for the estimation of nest survival is provided, and a more speculative paper about how the next generation of software might be structured is presented. Rotella et al. (2004) compare nest survival estimation with different software packages: SAS logistic regression, SAS non-linear mixed models, and Program MARK. Nests are assumed to be visited at various, possibly infrequent, intervals. All of the approaches described compute nest survival with the same likelihood, and require that the age of the nest is known to account for nests that eventually hatch. However, each approach offers advantages and disadvantages, explored by Rotella et al. (2004). Efford et al. (2004) present a new software package called DENSITY. The package computes population abundance and density from trapping arrays and other detection methods with a new and unique approach. DENSITY represents the first major addition to the analysis of trapping arrays in 20 years. Barker & White (2004) discuss how existing software such as Program MARK requires that each new model's likelihood be programmed specifically for that model. They wishfully think that future software might allow the user to combine pieces of likelihood functions together to generate estimates. The idea is interesting, and maybe some bright young statistician can work out the specifics to implement the procedure. Choquet et al. (2004) describe MSURGE, a software package that implements multistate capture-recapture models. The unique feature of MSURGE is that the design matrix is constructed with an interpreted language called GEMACO. Because MSURGE is limited to just multistate models, the special requirements of these likelihoods can be provided for. The software and methods presented in these papers give biologists and wildlife managers an expanding range of possibilities for data analysis. Although ease of use is generally getting better, it does not replace the need for understanding of the requirements and structure of the models being computed. The internet provides access to many free software packages as well as user-discussion groups to share knowledge and ideas. A starting point for wildlife-related applications is http://www.phidot.org.

  20. Scientific computations section monthly report, November 1993

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buckner, M.R.

    1993-12-30

    This progress report from the Savannah River Technology Center contains abstracts of papers from the computational modeling, applied statistics, applied physics, experimental thermal hydraulics, and packaging and transportation groups. Specific topics covered include engineering modeling and process simulation, criticality methods and analysis, and plutonium disposition.

  1. MODFLOW-2000, the U.S. Geological Survey Modular Ground-Water Model -Documentation of the Hydrogeologic-Unit Flow (HUF) Package

    USGS Publications Warehouse

    Anderman, E.R.; Hill, M.C.

    2000-01-01

    This report documents the Hydrogeologic-Unit Flow (HUF) Package for the groundwater modeling computer program MODFLOW-2000. The HUF Package is an alternative internal flow package that allows the vertical geometry of the system hydrogeology to be defined explicitly within the model, using hydrogeologic units whose geometry can differ from that of the model layers. The HUF Package works with all the processes of MODFLOW-2000. For the Ground-Water Flow Process, the HUF Package calculates effective hydraulic properties for the model layers based on the hydraulic properties of the hydrogeologic units, which are defined by the user using parameters. The hydraulic properties are used to calculate the conductance coefficients and other terms needed to solve the ground-water flow equation. The sensitivity of the model to the parameters defined within the HUF Package input file can be calculated using the Sensitivity Process, using observations defined with the Observation Process. Optimal values of the parameters can be estimated by using the Parameter-Estimation Process. The HUF Package is nearly identical to the Layer-Property Flow (LPF) Package, the major difference being the definition of the vertical geometry of the system hydrogeology. Use of the HUF Package is illustrated in two test cases, which also serve to verify the performance of the package by showing that the Parameter-Estimation Process produces the true parameter values when exact observations are used.
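
    The effective-property calculation can be illustrated with thickness-weighted averaging over the units that overlap a layer; the sketch below assumes arithmetic averaging for horizontal conductivity and harmonic for vertical, which is a common convention rather than a statement of the HUF Package's exact scheme:

        # Thickness-weighted averaging of unit properties onto a model layer
        # (simplified sketch; names and conventions are illustrative).
        def effective_k(units, layer_top, layer_bot):
            """units: list of (unit_top, unit_bot, K). Returns (Kh, Kv)."""
            pieces = []
            for top, bot, K in units:
                thick = min(layer_top, top) - max(layer_bot, bot)
                if thick > 0:                       # unit overlaps the layer
                    pieces.append((thick, K))
            total = sum(t for t, _ in pieces)
            kh = sum(t * K for t, K in pieces) / total     # arithmetic mean
            kv = total / sum(t / K for t, K in pieces)     # harmonic mean
            return kh, kv

        units = [(100.0, 80.0, 50.0), (80.0, 40.0, 0.1), (40.0, 0.0, 20.0)]
        print(effective_k(units, layer_top=90.0, layer_bot=30.0))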

  2. MODFLOW-2000, the U.S. Geological Survey modular ground-water model : user guide to the LMT6 package, the linkage with MT3DMS for multi-species mass transport modeling

    USGS Publications Warehouse

    Zheng, Chunmiao; Hill, Mary Catherine; Hsieh, Paul A.

    2001-01-01

    MODFLOW-2000, the newest version of MODFLOW, is a computer program that numerically solves the three-dimensional ground-water flow equation for a porous medium using a finite-difference method. MT3DMS, the successor to MT3D, is a computer program for modeling multi-species solute transport in three-dimensional ground-water systems using multiple solution techniques, including the finite-difference method, the method of characteristics (MOC), and the total-variation-diminishing (TVD) method. This report documents a new version of the Link-MT3DMS Package, which enables MODFLOW-2000 to produce the information needed by MT3DMS, and also discusses new visualization software for MT3DMS. Unlike the Link-MT3D Packages that coordinated previous versions of MODFLOW and MT3D, the new Link-MT3DMS Package requires an input file that, among other things, provides enhanced support for additional MODFLOW sink/source packages and allows list-directed (free) format for the flow-transport link file produced by the flow model. The report contains four parts: (a) documentation of the Link-MT3DMS Package Version 6 for MODFLOW-2000; (b) discussion of several issues related to simulation setup and input data preparation for running MT3DMS with MODFLOW-2000; (c) description of two test example problems, with comparison to results obtained using another MODFLOW-based transport program; and (d) overview of post-simulation visualization and animation using the U.S. Geological Survey's Model Viewer.

  3. Sampling and sensitivity analyses tools (SaSAT) for computational modelling

    PubMed Central

    Hoare, Alexander; Regan, David G; Wilson, David P

    2008-01-01

    SaSAT (Sampling and Sensitivity Analysis Tools) is a user-friendly software package for applying uncertainty and sensitivity analyses to mathematical and computational models of arbitrary complexity and context. The toolbox is built in Matlab®, a numerical mathematical software package, and utilises algorithms contained in the Matlab® Statistics Toolbox. However, Matlab® is not required to use SaSAT as the software package is provided as an executable file with all the necessary supplementary files. The SaSAT package is also designed to work seamlessly with Microsoft Excel but no functionality is forfeited if that software is not available. A comprehensive suite of tools is provided to enable the following tasks to be easily performed: efficient and equitable sampling of parameter space by various methodologies; calculation of correlation coefficients; regression analysis; factor prioritisation; and graphical output of results, including response surfaces, tornado plots, and scatterplots. Use of SaSAT is exemplified by application to a simple epidemic model. To our knowledge, a number of the methods available in SaSAT for performing sensitivity analyses have not previously been used in epidemiological modelling and their usefulness in this context is demonstrated. PMID:18304361
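
    Two of the steps SaSAT packages up, Latin hypercube sampling of parameter space and rank correlation of model outputs, can be sketched as follows (SaSAT itself is built in Matlab; this illustration uses SciPy and a toy model):

        import numpy as np
        from scipy.stats import qmc, spearmanr

        # Latin hypercube sample of a 3-parameter space, then Spearman rank
        # correlation of a toy model output against each parameter.
        sampler = qmc.LatinHypercube(d=3, seed=42)
        u = sampler.random(n=200)                        # samples in [0, 1)^3
        params = qmc.scale(u, l_bounds=[0.1, 1.0, 0.0],
                              u_bounds=[0.5, 5.0, 2.0])

        # Toy model output; in practice this is the simulation under study.
        y = params[:, 0] * np.exp(params[:, 1]) + 0.1 * params[:, 2]

        for i, name in enumerate(["beta", "gamma", "delta"]):   # illustrative names
            rho, p = spearmanr(params[:, i], y)
            print(f"{name}: Spearman rho = {rho:.2f} (p = {p:.1e})")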

  4. PyBoolNet: a python package for the generation, analysis and visualization of boolean networks.

    PubMed

    Klarner, Hannes; Streck, Adam; Siebert, Heike

    2017-03-01

    The goal of this project is to provide a simple interface for working with Boolean networks. Emphasis is put on easy access to a large number of common tasks, including the generation and manipulation of networks, attractor and basin computation, model checking and trap space computation, execution of established graph algorithms, as well as graph drawing and layouts. PyBoolNet is a Python package for working with Boolean networks that supports simple access to model checking via NuSMV, standard graph algorithms via NetworkX, and visualization via dot. In addition, state-of-the-art attractor computation exploiting Potassco ASP is implemented. The package is function-based and uses only native Python and NetworkX data types. https://github.com/hklarner/PyBoolNet. hannes.klarner@fu-berlin.de.
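
    The kind of task the package automates, synchronous state-transition and attractor exploration, looks like this for a toy three-node network (plain Python, not PyBoolNet's own interface):

        from itertools import product

        # Synchronous update rules of a toy Boolean network: the next value
        # of each node is a function of the current state.
        rules = {
            "A": lambda s: s["B"],
            "B": lambda s: s["A"] and not s["C"],
            "C": lambda s: not s["A"],
        }

        def step(state):
            return {node: f(state) for node, f in rules.items()}

        # Enumerate all 2^3 states and follow each trajectory to its attractor.
        for bits in product([False, True], repeat=3):
            state, seen = dict(zip("ABC", bits)), []
            while state not in seen:
                seen.append(state)
                state = step(state)
            cycle = seen[seen.index(state):]       # the attractor reached
            print(bits, "-> attractor of length", len(cycle))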

  5. Engaging Undergraduate Math Majors in Geoscience Research using Interactive Simulations and Computer Art

    NASA Astrophysics Data System (ADS)

    Matott, L. S.; Hymiak, B.; Reslink, C. F.; Baxter, C.; Aziz, S.

    2012-12-01

    As part of the NSF-sponsored 'URGE (Undergraduate Research Group Experiences) to Compute' program, Dr. Matott has been collaborating with talented Math majors to explore the design of cost-effective systems to safeguard groundwater supplies from contaminated sites. Such activity is aided by a combination of groundwater modeling, simulation-based optimization, and high-performance computing - disciplines largely unfamiliar to the students at the outset of the program. To help train and engage the students, a number of interactive and graphical software packages were utilized. Examples include: (1) a tutorial for exploring the behavior of evolutionary algorithms and other heuristic optimizers commonly used in simulation-based optimization; (2) an interactive groundwater modeling package for exploring alternative pump-and-treat containment scenarios at a contaminated site in Billings, Montana; (3) the R software package for visualizing various concepts related to subsurface hydrology; and (4) a job visualization tool for exploring the behavior of numerical experiments run on a large distributed computing cluster. Further engagement and excitement in the program was fostered by entering (and winning) a computer art competition run by the Coalition for Academic Scientific Computation (CASC). The winning submission visualizes an exhaustively mapped optimization cost surface and dramatically illustrates the phenomenon of artificial minima - valley locations that correspond to designs whose costs are only partially optimal.

  6. Seismic waveform modeling over cloud

    NASA Astrophysics Data System (ADS)

    Luo, Cong; Friederich, Wolfgang

    2016-04-01

    With fast-growing computational technologies, numerical simulation of seismic wave propagation has achieved great success, and obtaining synthetic waveforms through numerical simulation receives increasing attention from seismologists. However, computational seismology is a data-intensive research field, and the numerical packages usually come with a steep learning curve. Users are expected to master a considerable amount of computer knowledge and data-processing skills. Training users to use the numerical packages and to access and utilize computational resources correctly is a troublesome task; in addition, access to HPC is a common difficulty for many users. To solve these problems, a cloud-based solution dedicated to shallow seismic waveform modeling has been developed with state-of-the-art web technologies. It is a web platform integrating both software and hardware in a multilayer architecture: a well-designed SQL database serves as the data layer, while HPC and a dedicated pipeline form the business layer. Through this platform, users no longer need to compile and manipulate various packages on a local machine within a local network to perform a simulation. By providing professional access to the computational code through its interfaces and delivering computational resources over the cloud, the platform lets users customize simulations at expert level and submit and run jobs through it.

  7. Theory and Programs for Dynamic Modeling of Tree Rings from Climate

    Treesearch

    Paul C. van Deusen; Jennifer Koretz

    1988-01-01

    Computer programs written in GAUSS(TM) for IBM compatible personal computers are described that perform dynamic tree ring modeling with climate data; the underlying theory is also described. The programs and a separate users manual are available from the authors, although users must have the GAUSS software package on their personal computer. An example application of...

  8. A graph-based computational framework for simulation and optimisation of coupled infrastructure networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jalving, Jordan; Abhyankar, Shrirang; Kim, Kibaek

    Here, we present a computational framework that facilitates the construction, instantiation, and analysis of large-scale optimization and simulation applications of coupled energy networks. The framework integrates the optimization modeling package PLASMO and the simulation package DMNetwork (built around PETSc). These tools use a common graph-based abstraction that enables us to achieve compatibility between data structures and to build applications that use network models of different physical fidelity. We also describe how to embed these tools within complex computational workflows using SWIFT, which is a tool that facilitates parallel execution of multiple simulation runs and management of input and output data. We discuss how to use these capabilities to target coupled natural gas and electricity systems.

  9. A graph-based computational framework for simulation and optimisation of coupled infrastructure networks

    DOE PAGES

    Jalving, Jordan; Abhyankar, Shrirang; Kim, Kibaek; ...

    2017-04-24

    Here, we present a computational framework that facilitates the construction, instantiation, and analysis of large-scale optimization and simulation applications of coupled energy networks. The framework integrates the optimization modeling package PLASMO and the simulation package DMNetwork (built around PETSc). These tools use a common graph-based abstraction that enables us to achieve compatibility between data structures and to build applications that use network models of different physical fidelity. We also describe how to embed these tools within complex computational workflows using SWIFT, which is a tool that facilitates parallel execution of multiple simulation runs and management of input and output data. We discuss how to use these capabilities to target coupled natural gas and electricity systems.

  10. AstroBlend: Visualization package for use with Blender

    NASA Astrophysics Data System (ADS)

    Naiman, J. P.

    2015-12-01

    AstroBlend is a visualization package for use in the three dimensional animation and modeling software, Blender. It reads data in via a text file or can use pre-fab isosurface files stored as OBJ or Wavefront files. AstroBlend supports a variety of codes such as FLASH (ascl:1010.082), Enzo (ascl:1010.072), and Athena (ascl:1010.014), and combines artistic 3D models with computational astrophysics datasets to create models and animations.

  11. Computer Aided Design Parameters for Forward Basing

    DTIC Science & Technology

    1988-12-01

    ...21 meters. Systematic errors within the limits stated for absolute accuracy are tolerated at this level. DEM data acquired photogrammetrically using manual ... This is a professional drawing package capable of the manipulation required for this project. With the AutoLISP programming language (a variation on ... Table 2). GWN System's Digital Terrain Modeling (DTM) package was used for data conversion. This AutoLISP-based third-party software is ...

  12. An Object-Oriented Serial DSMC Simulation Package

    NASA Astrophysics Data System (ADS)

    Liu, Hongli; Cai, Chunpei

    2011-05-01

    A newly developed three-dimensional direct simulation Monte Carlo (DSMC) simulation package, named GRASP ("Generalized Rarefied gAs Simulation Package"), is reported in this paper. This package utilizes the concept of a simulation engine, many C++ features, and software design patterns. The package has an open architecture that can benefit further development and maintenance of the code. In order to reduce the engineering time for three-dimensional models, a hybrid grid scheme, combined with a flexible data structure written in C++, is implemented in this package. This scheme utilizes a local data structure based on the computational cell to achieve high performance on workstation processors. This data structure allows the DSMC algorithm to be very efficiently parallelized with domain decomposition, and it provides much flexibility in terms of grid types. This package can utilize traditional structured, unstructured, or hybrid grids within the framework of a single code to model arbitrarily complex geometries and to simulate rarefied gas flows. Benchmark test cases indicate that this package has satisfactory accuracy for complex rarefied gas flows.

  13. MODFLOW Ground-Water Model - User Guide to the Subsidence and Aquifer-System Compaction Package (SUB-WT) for Water-Table Aquifers

    USGS Publications Warehouse

    Leake, S.A.; Galloway, D.L.

    2007-01-01

    A new computer program was developed to simulate vertical compaction in models of regional ground-water flow. The program simulates ground-water storage changes and compaction in discontinuous interbeds or in extensive confining units, accounting for stress-dependent changes in storage properties. The new program is a package for MODFLOW, the U.S. Geological Survey modular finite-difference ground-water flow model. Several features of the program make it useful for application in shallow, unconfined flow systems. Geostatic stress can be treated as a function of water-table elevation, and compaction is a function of computed changes in effective stress at the bottom of a model layer. Thickness of compressible sediments in an unconfined model layer can vary in proportion to saturated thickness.

  14. Modeling unsaturated zone flow and runoff processes by integrating MODFLOW-LGR and VSF, and creating the new CFL package

    USGS Publications Warehouse

    Borsia, I.; Rossetto, R.; Schifani, C.; Hill, Mary C.

    2013-01-01

    In this paper two modifications to the MODFLOW code are presented. One concerns an extension of Local Grid Refinement (LGR) to the Variably Saturated Flow (VSF) process capability. This modification allows the user to solve the 3D Richards' equation only in selected parts of the model domain. The second modification introduces a new package, named CFL (Cascading Flow), which improves the computation of overland flow when ground-surface saturation is simulated using either VSF or the Unsaturated Zone Flow (UZF) package. The modeling concepts are presented and demonstrated. Programmer documentation is included in appendices.

  15. From Numerical Problem Solving to Model-Based Experimentation Incorporating Computer-Based Tools of Various Scales into the ChE Curriculum

    ERIC Educational Resources Information Center

    Shacham, Mordechai; Cutlip, Michael B.; Brauner, Neima

    2009-01-01

    A continuing challenge to the undergraduate chemical engineering curriculum is the time-effective incorporation and use of computer-based tools throughout the educational program. Computing skills in academia and industry require some proficiency in programming and effective use of software packages for solving 1) single-model, single-algorithm…

  16. Computer Facilitated Mathematical Methods in Chemical Engineering--Similarity Solution

    ERIC Educational Resources Information Center

    Subramanian, Venkat R.

    2006-01-01

    High-performance computers coupled with highly efficient numerical schemes and user-friendly software packages have helped instructors to teach numerical solutions and analysis of various nonlinear models more efficiently in the classroom. One of the main objectives of a model is to provide insight about the system of interest. Analytical…

  17. Open Source Molecular Modeling

    PubMed Central

    Pirhadi, Somayeh; Sunseri, Jocelyn; Koes, David Ryan

    2016-01-01

    The success of molecular modeling and computational chemistry efforts is, by definition, dependent on quality software applications. Open source software development provides many advantages to users of modeling applications, not the least of which is that the software is free and completely extendable. In this review we categorize, enumerate, and describe available open source software packages for molecular modeling and computational chemistry. PMID:27631126

  18. An Algorithm and R Program for Fitting and Simulation of Pharmacokinetic and Pharmacodynamic Data.

    PubMed

    Li, Jijie; Yan, Kewei; Hou, Lisha; Du, Xudong; Zhu, Ping; Zheng, Li; Zhu, Cairong

    2017-06-01

    Pharmacokinetic/pharmacodynamic link models are widely used in dose-finding studies. By applying such models, the results of initial pharmacokinetic/pharmacodynamic studies can be used to predict the potential therapeutic dose range. This knowledge can improve the design of later comparative large-scale clinical trials by reducing the number of participants and saving time and resources. However, the modeling process can be challenging, time consuming, and costly, even when using cutting-edge, powerful pharmacological software. Here, we provide a freely available R program for expediently analyzing pharmacokinetic/pharmacodynamic data, including data importation, parameter estimation, simulation, and model diagnostics. First, we explain the theory related to the establishment of the pharmacokinetic/pharmacodynamic link model. Subsequently, we present the algorithms used for parameter estimation and potential therapeutic dose computation. The implementation of the R program is illustrated by a clinical example. The software package is then validated by comparing the model parameters and the goodness-of-fit statistics generated by our R package with those generated by the widely used pharmacological software WinNonlin. The pharmacokinetic and pharmacodynamic parameters as well as the potential recommended therapeutic dose can be acquired with the R package. The validation process shows that the parameters estimated using our package are satisfactory. The R program developed and presented here provides pharmacokinetic researchers with a simple and easy-to-access tool for pharmacokinetic/pharmacodynamic analysis on personal computers.
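
    The model family behind such analyses, one-compartment oral absorption linked to an Emax effect model, can be sketched directly; all parameter values below are illustrative, not estimates from the paper:

        import numpy as np

        # One-compartment oral PK (first-order absorption and elimination)
        # linked to an Emax PD model; illustrative parameters throughout.
        def concentration(t, dose, F, V, ka, ke):
            """C(t) via the Bateman function (requires ka != ke)."""
            return (F * dose * ka) / (V * (ka - ke)) * (
                np.exp(-ke * t) - np.exp(-ka * t))

        def effect(C, E0, Emax, EC50):
            return E0 + Emax * C / (EC50 + C)      # hyperbolic Emax model

        t = np.linspace(0, 24, 97)                 # hours after dosing
        C = concentration(t, dose=100.0, F=0.9, V=30.0, ka=1.2, ke=0.15)
        E = effect(C, E0=5.0, Emax=40.0, EC50=1.5)
        print(f"Cmax = {C.max():.2f} mg/L at t = {t[C.argmax()]:.2f} h; "
              f"peak effect = {E.max():.1f}")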

  19. Permeability of model porous medium formed by random discs

    NASA Astrophysics Data System (ADS)

    Gubaidullin, A. A.; Gubkin, A. S.; Igoshin, D. E.; Ignatev, P. A.

    2018-03-01

    A two-dimensional model of a porous medium whose skeleton consists of randomly located overlapping discs is proposed. The geometry and computational grid are built in the open package Salome. The flow of a Newtonian liquid in the longitudinal and transverse directions is calculated and its flow rate determined. The numerical solution of the Navier-Stokes equations for a given pressure drop at the boundaries of the domain is realized in the open package OpenFOAM. The calculated flow rate is used to determine the permeability coefficient on the basis of Darcy's law. To evaluate the representativeness of the computational domain, the permeability coefficients in the longitudinal and transverse directions are compared.
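
    The final step, recovering permeability from the computed flow rate, is a one-line application of Darcy's law; the values below are illustrative SI quantities, not the paper's results:

        # Back-calculating permeability from a computed flow rate with
        # Darcy's law for 1D flow: k = Q * mu * L / (A * dP).
        def permeability(Q, mu, L, A, dP):
            return Q * mu * L / (A * dP)

        k = permeability(Q=2.0e-9,      # volumetric flow rate, m^3/s
                         mu=1.0e-3,     # dynamic viscosity of water, Pa*s
                         L=0.01,        # sample length along the flow, m
                         A=1.0e-4,      # cross-sectional area, m^2
                         dP=100.0)      # imposed pressure drop, Pa
        print(f"k = {k:.3e} m^2")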

  20. Hydrological analysis in R: Topmodel and beyond

    NASA Astrophysics Data System (ADS)

    Buytaert, W.; Reusser, D.

    2011-12-01

    R is quickly gaining popularity in the hydrological sciences community. The wide range of statistical and mathematical functionality makes it an excellent tool for data analysis, modelling and uncertainty analysis. Topmodel was one of the first hydrological models to be implemented as an R package and distributed through R's own distribution network, CRAN. This facilitated pre- and postprocessing of data, such as parameter sampling, calculation of prediction bounds, and advanced visualisation. However, apart from these basic functionalities, the package did not use many of the more advanced features of the R environment, especially R's object-oriented functionality. With R's increasing expansion into arenas such as high-performance computing, big-data analysis, and cloud services, we revisit the topmodel package and use it as an example of how to build and deploy the next generation of hydrological models. R provides a convenient environment and attractive features to build and couple hydrological - and by extension other environmental - models, to develop flexible and effective data assimilation strategies, and to take the model beyond the individual computer by linking into cloud services for both data provision and computing. However, in order to maximise the benefit of these approaches, it will be necessary to adopt standards and ontologies for model interaction and information exchange. Some of these are currently being developed, such as the OGC web processing standards, while others will still need to be developed.
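
    The index at the heart of Topmodel, the topographic wetness index ln(a / tan β), is easy to evaluate for a one-dimensional hillslope profile (a deliberate simplification of the package's grid-based routines):

        import numpy as np

        # Topographic wetness index ln(a / tan(beta)) for a 1-D hillslope;
        # upslope area simply accumulates downslope in this toy setting.
        dx = 10.0                                    # cell size, m
        z = np.array([50.0, 46.0, 41.0, 35.0, 30.0, 26.0, 24.0, 23.0])
        slope = np.abs(np.gradient(z, dx))           # tan(beta) approximation
        area = np.arange(1, z.size + 1) * dx         # upslope area per unit width
        twi = np.log(area / np.maximum(slope, 1e-6)) # guard against flat cells
        print(twi.round(2))                          # larger (wetter) downslope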

  1. MELCOR computer code manuals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Summers, R.M.; Cole, R.K. Jr.; Smith, R.C.

    1995-03-01

    MELCOR is a fully integrated, engineering-level computer code that models the progression of severe accidents in light water reactor nuclear power plants. MELCOR is being developed at Sandia National Laboratories for the U.S. Nuclear Regulatory Commission as a second-generation plant risk assessment tool and the successor to the Source Term Code Package. A broad spectrum of severe accident phenomena in both boiling and pressurized water reactors is treated in MELCOR in a unified framework. These include: thermal-hydraulic response in the reactor coolant system, reactor cavity, containment, and confinement buildings; core heatup, degradation, and relocation; core-concrete attack; hydrogen production, transport, and combustion; fission product release and transport; and the impact of engineered safety features on thermal-hydraulic and radionuclide behavior. Current uses of MELCOR include estimation of severe accident source terms and their sensitivities and uncertainties in a variety of applications. This publication of the MELCOR computer code manuals corresponds to MELCOR 1.8.3, released to users in August, 1994. Volume 1 contains a primer that describes MELCOR's phenomenological scope, organization (by package), and documentation. The remainder of Volume 1 contains the MELCOR Users Guides, which provide the input instructions and guidelines for each package. Volume 2 contains the MELCOR Reference Manuals, which describe the phenomenological models that have been implemented in each package.

  2. Wide-field Imaging System and Rapid Direction of Optical Zoom (WOZ)

    DTIC Science & Technology

    2010-09-25

    commercial software packages: SolidWorks, COMSOL Multiphysics, and ZEMAX optical design. SolidWorks is a computer-aided design package, which has a live...interface to COMSOL. COMSOL is a finite element analysis/partial differential equation solver. ZEMAX is an optical design package. Both COMSOL and... ZEMAX have live interfaces to MatLab. Our initial investigations have enabled a model in SolidWorks to be updated in COMSOL, an FEA calculation

  3. A Python Implementation of an Intermediate-Level Tropical Circulation Model and Implications for How Modeling Science is Done

    NASA Astrophysics Data System (ADS)

    Lin, J. W. B.

    2015-12-01

    Historically, climate models have been developed incrementally and in compiled languages like Fortran. While the use of legacy compiled languages results in fast, time-tested code, the resulting model is limited in its modularity and cannot take advantage of functionality available with modern computer languages. Here we describe an effort at using the open-source, object-oriented language Python to create more flexible climate models: the package qtcm, a Python implementation of the intermediate-level Neelin-Zeng Quasi-Equilibrium Tropical Circulation model (QTCM1) of the atmosphere. The qtcm package retains the core numerics of QTCM1, written in Fortran, to optimize model performance but uses Python structures and utilities to wrap the QTCM1 Fortran routines and manage model execution. The resulting "mixed language" modeling package allows order and choice of subroutine execution to be altered at run time, and model analysis and visualization to be integrated interactively with model execution at run time. This flexibility facilitates more complex scientific analysis using less complex code than would be possible using traditional languages alone and provides tools to transform the traditional "formulate hypothesis → write and test code → run model → analyze results" sequence into a feedback loop that can be executed automatically by the computer.
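
    The mixed-language pattern the abstract describes can be sketched with NumPy's f2py tool; the file name, module name, and kernel below are illustrative stand-ins, not qtcm's actual sources:

        # Compile a Fortran kernel with f2py, then drive it from Python.
        #
        #   ! step.f90 (toy kernel: relax u toward 1)
        #   subroutine step(u, n, dt)
        #     integer, intent(in) :: n
        #     real(8), intent(inout) :: u(n)
        #     real(8), intent(in) :: dt
        #     u = u + dt * (1.0d0 - u)
        #   end subroutine
        #
        # Build once:  python -m numpy.f2py -c step.f90 -m fkernel
        import numpy as np
        import fkernel                    # the f2py-generated extension module

        u = np.zeros(10)
        for _ in range(100):              # Python controls execution order,
            fkernel.step(u, 0.1)          # Fortran does the numerics
        print(u.round(3))                 # values relaxing toward 1.0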

  4. Personalized cloud-based bioinformatics services for research and education: use cases and the elasticHPC package

    PubMed Central

    2012-01-01

    Background: Bioinformatics services have been traditionally provided in the form of a web-server that is hosted at institutional infrastructure and serves multiple users. This model, however, is not flexible enough to cope with the increasing number of users, increasing data size, and new requirements in terms of speed and availability of service. The advent of cloud computing suggests a new service model that provides an efficient solution to these problems, based on the concepts of "resources-on-demand" and "pay-as-you-go". However, cloud computing has not yet been introduced within bioinformatics servers due to the lack of usage scenarios and software layers that address the requirements of the bioinformatics domain. Results: In this paper, we provide different use case scenarios for providing cloud computing based services, considering both the technical and financial aspects of the cloud computing service model. These scenarios are for individual users seeking computational power as well as bioinformatics service providers aiming at provision of personalized bioinformatics services to their users. We also present elasticHPC, a software package and a library that facilitates the use of high performance cloud computing resources in general and the implementation of the suggested bioinformatics scenarios in particular. Concrete examples that demonstrate the suggested use case scenarios with whole bioinformatics servers and major sequence analysis tools like BLAST are presented. Experimental results with large datasets are also included to show the advantages of the cloud model. Conclusions: Our use case scenarios and the elasticHPC package are steps towards the provision of cloud based bioinformatics services, which would help in overcoming the data challenge of recent biological research. All resources related to elasticHPC and its web-interface are available at http://www.elasticHPC.org. PMID:23281941

  5. Personalized cloud-based bioinformatics services for research and education: use cases and the elasticHPC package.

    PubMed

    El-Kalioby, Mohamed; Abouelhoda, Mohamed; Krüger, Jan; Giegerich, Robert; Sczyrba, Alexander; Wall, Dennis P; Tonellato, Peter

    2012-01-01

    Bioinformatics services have been traditionally provided in the form of a web-server that is hosted at institutional infrastructure and serves multiple users. This model, however, is not flexible enough to cope with the increasing number of users, increasing data size, and new requirements in terms of speed and availability of service. The advent of cloud computing suggests a new service model that provides an efficient solution to these problems, based on the concepts of "resources-on-demand" and "pay-as-you-go". However, cloud computing has not yet been introduced within bioinformatics servers due to the lack of usage scenarios and software layers that address the requirements of the bioinformatics domain. In this paper, we provide different use case scenarios for providing cloud computing based services, considering both the technical and financial aspects of the cloud computing service model. These scenarios are for individual users seeking computational power as well as bioinformatics service providers aiming at provision of personalized bioinformatics services to their users. We also present elasticHPC, a software package and a library that facilitates the use of high performance cloud computing resources in general and the implementation of the suggested bioinformatics scenarios in particular. Concrete examples that demonstrate the suggested use case scenarios with whole bioinformatics servers and major sequence analysis tools like BLAST are presented. Experimental results with large datasets are also included to show the advantages of the cloud model. Our use case scenarios and the elasticHPC package are steps towards the provision of cloud based bioinformatics services, which would help in overcoming the data challenge of recent biological research. All resources related to elasticHPC and its web-interface are available at http://www.elasticHPC.org.

  6. Meeting the needs of an ever-demanding market.

    PubMed

    Rigby, Richard

    2002-04-01

    Balancing cost and performance in packaging is critical. This article outlines techniques to help strike that balance while delivering added value and product differentiation: a rigorous statistical process capable of delivering cost reduction and improved quality, and a computer modelling process that can save time when validating new packaging options.

  7. qtcm 0.1.2: A Python Implementation of the Neelin-Zeng Quasi-Equilibrium Tropical Circulation model

    NASA Astrophysics Data System (ADS)

    Lin, J. W.-B.

    2008-10-01

    Historically, climate models have been developed incrementally and in compiled languages like Fortran. While the use of legacy compiled languages results in fast, time-tested code, the resulting model is limited in its modularity and cannot take advantage of functionality available with modern computer languages. Here we describe an effort at using the open-source, object-oriented language Python to create more flexible climate models: the package qtcm, a Python implementation of the intermediate-level Neelin-Zeng Quasi-Equilibrium Tropical Circulation model (QTCM1) of the atmosphere. The qtcm package retains the core numerics of QTCM1, written in Fortran to optimize model performance, but uses Python structures and utilities to wrap the QTCM1 Fortran routines and manage model execution. The resulting "mixed language" modeling package allows the order and choice of subroutine execution to be altered at run time, and model analysis and visualization to be integrated interactively with model execution. This flexibility facilitates more complex scientific analysis using less complex code than would be possible using traditional languages alone, and provides tools to transform the traditional "formulate hypothesis → write and test code → run model → analyze results" sequence into a feedback loop that can be executed automatically by the computer.

  8. qtcm 0.1.2: a Python implementation of the Neelin-Zeng Quasi-Equilibrium Tropical Circulation Model

    NASA Astrophysics Data System (ADS)

    Lin, J. W.-B.

    2009-02-01

    Historically, climate models have been developed incrementally and in compiled languages like Fortran. While the use of legacy compiled languages results in fast, time-tested code, the resulting model is limited in its modularity and cannot take advantage of functionality available with modern computer languages. Here we describe an effort at using the open-source, object-oriented language Python to create more flexible climate models: the package qtcm, a Python implementation of the intermediate-level Neelin-Zeng Quasi-Equilibrium Tropical Circulation model (QTCM1) of the atmosphere. The qtcm package retains the core numerics of QTCM1, written in Fortran to optimize model performance, but uses Python structures and utilities to wrap the QTCM1 Fortran routines and manage model execution. The resulting "mixed language" modeling package allows the order and choice of subroutine execution to be altered at run time, and model analysis and visualization to be integrated interactively with model execution. This flexibility facilitates more complex scientific analysis using less complex code than would be possible using traditional languages alone, and provides tools to transform the traditional "formulate hypothesis → write and test code → run model → analyze results" sequence into a feedback loop that can be executed automatically by the computer.
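
    The run-time flexibility described above is easy to picture with a minimal sketch: hold the per-timestep routines in a mutable list so that their order and membership can be changed between runs. The sketch below is illustrative only; the class and routine names are hypothetical and do not reproduce qtcm's actual API, whose numerics live in compiled Fortran.

        # Minimal sketch of a run-time-reconfigurable model loop (hypothetical
        # names; not the actual qtcm API).
        class ToyModel:
            def __init__(self):
                self.state = {"u": 0.0, "T": 300.0}
                # Per-timestep routines in a plain list: order and membership
                # can be altered at run time without recompiling anything.
                self.routines = [self.advect, self.radiate]

            def advect(self):
                self.state["u"] += 0.1                # stand-in for compiled numerics

            def radiate(self):
                self.state["T"] -= 0.05 * self.state["u"]

            def run(self, nsteps):
                for _ in range(nsteps):
                    for routine in self.routines:
                        routine()

        model = ToyModel()
        model.run(10)
        model.routines.reverse()                      # alter subroutine order at run time
        model.run(10)
        print(model.state)                            # inspect results interactively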

  9. XCAT/DRASIM: a realistic CT/human-model simulation package

    NASA Astrophysics Data System (ADS)

    Fung, George S. K.; Stierstorfer, Karl; Segars, W. Paul; Taguchi, Katsuyuki; Flohr, Thomas G.; Tsui, Benjamin M. W.

    2011-03-01

    The aim of this research is to develop a complete CT/human-model simulation package by integrating the 4D eXtended CArdiac-Torso (XCAT) phantom, a computer-generated NURBS-surface-based phantom that provides a realistic model of human anatomy and of respiratory and cardiac motions, and the DRASIM (Siemens Healthcare) CT-data simulation program. Unlike other CT simulation tools, which are based on simple mathematical primitives or voxelized phantoms, this new simulation package has the advantages of utilizing a realistic model of human anatomy and physiological motions without voxelization and with accurate modeling of the characteristics of clinical Siemens CT systems. First, we incorporated the 4D XCAT anatomy and motion models into DRASIM by implementing a new library which consists of functions to read in the NURBS surfaces of anatomical objects and their overlapping order and material properties in the XCAT phantom. Second, we incorporated an efficient ray-tracing algorithm for line integral calculation in DRASIM by computing the intersection points of the rays cast from the x-ray source to the detector elements through the NURBS surfaces of the multiple XCAT anatomical objects along the ray paths. Third, we evaluated the integrated simulation package by performing a number of sample simulations of multiple x-ray projections from different views followed by image reconstruction. The initial simulation results were found to be promising by qualitative evaluation. In conclusion, we have developed a unique CT/human-model simulation package which has great potential as a tool in the design and optimization of CT scanners, and in the development of scanning protocols and image reconstruction methods for improving CT image quality and reducing radiation dose.
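
    For each source-to-detector ray, the ray-tracing step described above amounts to finding where the ray enters and exits each surface and accumulating attenuation times path length. A minimal sketch follows, with analytic spheres standing in for the XCAT NURBS surfaces and with the overlap ordering handled by the real package simply ignored; all names and values are illustrative.

        import numpy as np

        # Line integral of attenuation along one ray through spherical "organs".
        def ray_sphere_path_length(origin, direction, center, radius):
            """Chord length a unit-direction ray cuts through a sphere."""
            oc = origin - center
            b = np.dot(oc, direction)
            c = np.dot(oc, oc) - radius**2
            disc = b * b - c
            if disc <= 0.0:
                return 0.0                      # ray misses (or grazes) the sphere
            t1 = max(-b - np.sqrt(disc), 0.0)   # clip the part behind the source
            t2 = max(-b + np.sqrt(disc), 0.0)
            return t2 - t1

        source = np.array([0.0, 0.0, -10.0])
        detector = np.array([0.0, 0.0, 10.0])
        d = detector - source
        d /= np.linalg.norm(d)

        # (center, radius, attenuation coefficient) per object -- illustrative
        objects = [(np.array([0.0, 0.0, 0.0]), 3.0, 0.02),
                   (np.array([0.5, 0.0, 0.0]), 1.0, 0.05)]

        line_integral = sum(mu * ray_sphere_path_length(source, d, c, r)
                            for c, r, mu in objects)
        print("integral of mu dl along the ray:", line_integral)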

  10. MC-GenomeKey: a multicloud system for the detection and annotation of genomic variants.

    PubMed

    Elshazly, Hatem; Souilmi, Yassine; Tonellato, Peter J; Wall, Dennis P; Abouelhoda, Mohamed

    2017-01-20

    Next-generation genome sequencing has become affordable for massive sequencing efforts devoted to the clinical characterization of human diseases. However, the cost of providing cloud-based data analysis of the mounting datasets remains a concerning bottleneck for providing cost-effective clinical services. To address this computational problem, it is important to optimize the variant analysis workflow and the analysis tools used, reducing the overall processing time and, concomitantly, the processing cost. It is also important to capitalize on recent developments in the cloud computing market, which has seen more providers competing in terms of products and prices. In this paper, we present a new package called MC-GenomeKey (Multi-Cloud GenomeKey) that efficiently executes the variant analysis workflow for detecting and annotating mutations using cloud resources from different commercial cloud providers. Our package supports the Amazon, Google, and Azure clouds, as well as any other cloud platform based on OpenStack. It allows different scenarios of execution with different levels of sophistication, up to one where a workflow is executed using a cluster whose nodes come from different clouds. MC-GenomeKey also supports scenarios that exploit the spot instance model of Amazon in combination with the use of other cloud platforms to provide significant cost reduction. To the best of our knowledge, this is the first solution that optimizes the execution of the workflow using computational resources from different cloud providers. MC-GenomeKey provides an efficient multicloud-based solution to detect and annotate mutations. The package can run on different commercial cloud platforms, which enables the user to seize the best offers. It also provides a reliable means to make use of the low-cost spot instance model of Amazon, as it offers an efficient solution to the sudden termination of spot machines as a result of a sudden price increase. The package has a web-interface and is available for free for academic use.

  11. Efficient Predictions of Excited State for Nanomaterials Using Aces 3 and 4

    DTIC Science & Technology

    2017-12-20

    Excited states of nanomaterials are computed by first-principles methods in the software package ACES using large parallel computers, scaling toward the exascale. Subject terms: computer modeling, excited states, optical properties, structure, stability, activation barriers, first-principles methods, parallel computing. The report also covers progress with new density functional methods.

  12. --No Title--

    Science.gov Websites

    ATLAS computing resources: emerging infrastructure for interoperability and data management on computational grids; software packages and services; management and steering bodies (Computing Management Board, Software Project Management Board, Database Model Group); Computing TDR sections on event data (4.5), database and data management services (4.8), and production (6.3.4).

  13. Chemically Aware Model Builder (camb): an R package for property and bioactivity modelling of small molecules.

    PubMed

    Murrell, Daniel S; Cortes-Ciriano, Isidro; van Westen, Gerard J P; Stott, Ian P; Bender, Andreas; Malliavin, Thérèse E; Glen, Robert C

    2015-01-01

    In silico predictive models have proved to be valuable for the optimisation of compound potency, selectivity and safety profiles in the drug discovery process. camb is an R package that provides an environment for the rapid generation of quantitative Structure-Property and Structure-Activity models for small molecules (including QSAR, QSPR, QSAM, PCM) and is aimed at both advanced and beginner R users. camb's capabilities include the standardisation of chemical structure representation, computation of 905 one-dimensional and 14 fingerprint-type descriptors for small molecules, 8 types of amino acid descriptors, 13 whole-protein sequence descriptors, filtering methods for feature selection, generation of predictive models (using an interface to the R package caret), and techniques to create model ensembles (using the R package caretEnsemble). Results can be visualised through high-quality, customisable plots (R package ggplot2). Overall, camb constitutes an open-source framework to perform the following steps: (1) compound standardisation, (2) molecular and protein descriptor calculation, (3) descriptor pre-processing and model training, visualisation and validation, and (4) bioactivity/property prediction for new molecules. camb aims to speed up model generation and to provide reproducibility and tests of robustness. QSPR and proteochemometric case studies are included to demonstrate camb's application. Graphical abstract: From compounds and data to models: a complete model building workflow in one package.

  14. New generation of exploration tools: interactive modeling software and microcomputers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krajewski, S.A.

    1986-08-01

    Software packages offering interactive modeling techniques are now available for use on microcomputer hardware systems. These packages are reasonably priced for both company and independent explorationists; they do not require users to have high levels of computer literacy; they are capable of rapidly completing complex ranges of sophisticated geologic and geophysical modeling tasks; and they can produce presentation-quality output for comparison with real-world data. For example, interactive packages are available for mapping, log analysis, seismic modeling, reservoir studies, and financial projects as well as for applying a variety of statistical and geostatistical techniques to analysis of exploration data. More importantly, these packages enable explorationists to directly apply their geologic expertise when developing and fine-tuning models for identifying new prospects and for extending producing fields. As a result of these features, microcomputers and interactive modeling software are becoming common tools in many exploration offices. Gravity and magnetics software programs illustrate some of the capabilities of such exploration tools.

  15. Development of the Symbolic Manipulator Laboratory modeling package for the kinematic design and optimization of the Future Armor Rearm System robot. Ammunition Logistics Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    March-Leuba, S.; Jansen, J.F.; Kress, R.L.

    1992-08-01

    A new program package, Symbolic Manipulator Laboratory (SML), for the automatic generation of both kinematic and static manipulator models in symbolic form is presented. Critical design parameters may be identified and optimized using symbolic models, as shown in the sample application presented for the Future Armor Rearm System (FARS) arm. The computer-aided development of the symbolic models yields equations with reduced numerical complexity. Particular attention has been paid to simplifying the closed-form solutions and to user-friendly operation. The main emphasis of this research is the development of a methodology, implemented in a computer program, capable of generating symbolic kinematic and static-force models of manipulators. The fact that the models are obtained trigonometrically reduced is among the most significant results of this work and the most difficult to implement. Mathematica, a commercial program that allows symbolic manipulation, is used to implement the program package. SML is written such that the user can change any of the subroutines or create new ones easily. To assist the user, an on-line help system has been written to make SML a user-friendly package. Some sample applications are presented. The design and optimization of the 5-degrees-of-freedom (DOF) FARS manipulator using SML is discussed. Finally, the kinematic and static models of two different 7-DOF manipulators are calculated symbolically.

  16. System and Method for Providing a Climate Data Persistence Service

    NASA Technical Reports Server (NTRS)

    Schnase, John L. (Inventor); Ripley, III, William David (Inventor); Duffy, Daniel Q. (Inventor); Thompson, John H. (Inventor); Strong, Savannah L. (Inventor); McInerney, Mark (Inventor); Sinno, Scott (Inventor); Tamkin, Glenn S. (Inventor); Nadeau, Denis (Inventor)

    2018-01-01

    A system, method and computer-readable storage devices for providing a climate data persistence service. A system configured to provide the service can include a climate data server that performs data and metadata storage and management functions for climate data objects, a compute-storage platform that provides the resources needed to support a climate data server, provisioning software that allows climate data server instances to be deployed as virtual climate data servers in a cloud computing environment, and a service interface, wherein persistence service capabilities are invoked by software applications running on a client device. The climate data objects can be in various formats, such as International Organization for Standardization (ISO) Open Archival Information System (OAIS) Reference Model Submission Information Packages, Archive Information Packages, and Dissemination Information Packages. The climate data server can enable scalable, federated storage, management, discovery, and access, and can be tailored for particular use cases.

  17. Pse-Analysis: a python package for DNA/RNA and protein/peptide sequence analysis based on pseudo components and kernel methods.

    PubMed

    Liu, Bin; Wu, Hao; Zhang, Deyuan; Wang, Xiaolong; Chou, Kuo-Chen

    2017-02-21

    To expedite the pace of genome/proteome analysis, we have developed a Python package called Pse-Analysis. The package can automatically complete the following five procedures: (1) sample feature extraction, (2) optimal parameter selection, (3) model training, (4) cross validation, and (5) evaluation of prediction quality. All a user needs to do is input a benchmark dataset along with the query biological sequences concerned. Based on the benchmark dataset, Pse-Analysis automatically constructs an ideal predictor and then yields the predicted results for the submitted query samples; all of these tedious jobs are done by the computer. Moreover, the multiprocessing technique was adopted to enhance computational speed by about six-fold. The Pse-Analysis Python package is freely accessible to the public at http://bioinformatics.hitsz.edu.cn/Pse-Analysis/, and can be run directly on Windows, Linux, and Unix.
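
    The five automated procedures map onto a generic supervised-learning pipeline. The sketch below illustrates that workflow with scikit-learn and toy 2-mer count features; it is not Pse-Analysis code, whose features are based on pseudo components.

        # Generic five-step pipeline: (1) feature extraction, (2) parameter
        # selection, (3) training, (4) cross validation, (5) evaluation.
        # Illustrative stand-in only -- not Pse-Analysis's internals.
        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.model_selection import GridSearchCV, cross_val_score
        from sklearn.svm import SVC

        seqs = ["ACGTACGT", "TTGGAACC", "ACGTTGCA", "GGTTCCAA"]   # toy benchmark
        labels = [1, 0, 1, 0]

        # (1) feature extraction: 2-mer counts stand in for pseudo components
        vec = CountVectorizer(analyzer="char", ngram_range=(2, 2))
        X = vec.fit_transform(seqs)

        # (2) optimal parameter selection and (3) model training
        grid = GridSearchCV(SVC(), {"C": [0.1, 1, 10]}, cv=2)
        grid.fit(X, labels)

        # (4) cross validation and (5) prediction-quality evaluation
        print(cross_val_score(grid.best_estimator_, X, labels, cv=2).mean())

        # predict a query sequence with the trained model
        print(grid.predict(vec.transform(["ACGTGGTT"])))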

  18. Reviews, Software.

    ERIC Educational Resources Information Center

    Science Teacher, 1988

    1988-01-01

    Reviews two computer software packages for use in physical science, physics, and chemistry classes. Includes "Physics of Model Rocketry" for Apple II, and "Black Box" for Apple II and IBM compatible computers. "Black Box" is designed to help students understand the concept of indirect evidence. (CW)

  19. XDesign: an open-source software package for designing X-ray imaging phantoms and experiments.

    PubMed

    Ching, Daniel J; Gürsoy, Doğa

    2017-03-01

    The development of new methods or utilization of current X-ray computed tomography methods is impeded by the substantial amount of expertise required to design an X-ray computed tomography experiment from beginning to end. In an attempt to make material models, data acquisition schemes and reconstruction algorithms more accessible to researchers lacking expertise in some of these areas, a software package is described here which can generate complex simulated phantoms and quantitatively evaluate new or existing data acquisition schemes and image reconstruction algorithms for targeted applications.

  20. XDesign: An open-source software package for designing X-ray imaging phantoms and experiments

    DOE PAGES

    Ching, Daniel J.; Gürsoy, Doğa

    2017-02-21

    Here, the development of new methods or utilization of current X-ray computed tomography methods is impeded by the substantial amount of expertise required to design an X-ray computed tomography experiment from beginning to end. In an attempt to make material models, data acquisition schemes and reconstruction algorithms more accessible to researchers lacking expertise in some of these areas, a software package is described here which can generate complex simulated phantoms and quantitatively evaluate new or existing data acquisition schemes and image reconstruction algorithms for targeted applications.
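
    The simulation idea is straightforward to prototype: build a two-dimensional phantom from geometric primitives and generate parallel-beam projections by rotating and summing the image. The sketch below uses plain NumPy/SciPy rather than the XDesign API, and the phantom is deliberately trivial.

        import numpy as np
        from scipy.ndimage import rotate

        # Simple 2-D phantom: a large disc with one off-centre insert.
        n = 64
        y, x = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]
        phantom = ((x**2 + y**2 < 0.8**2) * 1.0
                   + ((x - 0.3)**2 + y**2 < 0.15**2) * 1.0)

        # Parallel-beam "acquisition": rotate the phantom and sum columns.
        angles = np.linspace(0.0, 180.0, 90, endpoint=False)
        sinogram = np.stack([
            rotate(phantom, ang, reshape=False, order=1).sum(axis=0)
            for ang in angles
        ])
        print(sinogram.shape)    # (projection angles, detector bins)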

  1. Parachute Dynamics Investigations Using a Sensor Package Airdropped from a Small-Scale Airplane

    NASA Technical Reports Server (NTRS)

    Dooley, Jessica; Lorenz, Ralph D.

    2005-01-01

    We explore the utility of various sensors by recovering parachute-probe dynamics information from a package released from a small-scale, remote-controlled airplane. The airdrops aid in the development of datasets for the exploration of planetary probe trajectory recovery algorithms, supplementing data collected from instrumented, full-scale tests and computer models.

  2. Manipulatives and the Computer: A Powerful Partnership for Learners of All Ages.

    ERIC Educational Resources Information Center

    Perl, Teri

    1990-01-01

    Discussed is the concept of mirroring in which computer programs are used to enhance the use of mathematics manipulatives. The strengths and weaknesses of this approach are presented. The uses of the computer in modeling and as a manipulative are also described. Several software packages are suggested. (CW)

  3. Computer Aided Drafting Packages for Secondary Education. Edition 2. PC DOS Compatible Programs. A MicroSIFT Quarterly Report.

    ERIC Educational Resources Information Center

    Pollard, Jim

    This report reviews eight IBM-compatible software packages that are available to secondary schools to teach computer-aided drafting (CAD). Software packages to be considered were selected following reviews of CAD periodicals, computers in education periodicals, advertisements, and recommendations of teachers. The packages were then rated by…

  4. MELCOR computer code manuals: Primer and user's guides, Version 1.8.3 September 1994. Volume 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Summers, R.M.; Cole, R.K. Jr.; Smith, R.C.

    1995-03-01

    MELCOR is a fully integrated, engineering-level computer code that models the progression of severe accidents in light water reactor nuclear power plants. MELCOR is being developed at Sandia National Laboratories for the US Nuclear Regulatory Commission as a second-generation plant risk assessment tool and the successor to the Source Term Code Package. A broad spectrum of severe accident phenomena in both boiling and pressurized water reactors is treated in MELCOR in a unified framework. These include: thermal-hydraulic response in the reactor coolant system, reactor cavity, containment, and confinement buildings; core heatup, degradation, and relocation; core-concrete attack; hydrogen production, transport, and combustion; fission product release and transport; and the impact of engineered safety features on thermal-hydraulic and radionuclide behavior. Current uses of MELCOR include estimation of severe accident source terms and their sensitivities and uncertainties in a variety of applications. This publication of the MELCOR computer code manuals corresponds to MELCOR 1.8.3, released to users in August 1994. Volume 1 contains a primer that describes MELCOR's phenomenological scope, organization (by package), and documentation. The remainder of Volume 1 contains the MELCOR Users' Guides, which provide the input instructions and guidelines for each package. Volume 2 contains the MELCOR Reference Manuals, which describe the phenomenological models that have been implemented in each package.

  5. DNA packaging in viral capsids with peptide arms.

    PubMed

    Cao, Qianqian; Bachmann, Michael

    2017-01-18

    Strong chain rigidity and electrostatic self-repulsion of packed double-stranded DNA in viruses require a molecular motor to pull the DNA into the capsid. But what role do the electrostatic interactions between the different charged components play in the packaging process? Though various theories and computer simulation models have been developed for understanding viral assembly and the packaging dynamics of the genome, long-range electrostatic interactions and capsid structure have typically been neglected or oversimplified. By means of molecular dynamics simulations, we explore the effects of electrostatic interactions on the packaging dynamics of DNA based on a coarse-grained DNA and capsid model, explicitly including peptide arms (PAs) linked to the inner surface of the capsid, and counterions. Our results indicate that the electrostatic interactions between PAs, DNA, and counterions have a significant influence on the packaging dynamics. We also find that the packed DNA conformations are largely affected by the structure of the PA layer, but the packaging rate is insensitive to the layer structure.

  6. Model Package Report: Hanford Soil Inventory Model SIM v.2 Build 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nichols, Will E.; Zaher, U.; Mehta, S.

    The Hanford Soil Inventory Model (SIM) is a tool for estimating the inventory of contaminants that were released to soil from liquid discharges during the U.S. Department of Energy’s Hanford Site operations. This model package report documents the construction and development of a second version of SIM (SIM-v2) to support the needs of the Hanford Site Composite Analysis. SIM-v2 is implemented using GoldSim Pro® software with a new model architecture that preserves the uncertainty in inventory estimates while reducing the computational burden (compared to the previous version) and allowing more traceability and transparency in the calculation methodology. The calculation architecture is designed in such a manner that future updates to the waste stream composition, along with the addition or deletion of waste sites, can be performed with relative ease. In addition, the new computational platform allows for continued hardware upgrades.

  7. Computational models for the viscous/inviscid analysis of jet aircraft exhaust plumes

    NASA Astrophysics Data System (ADS)

    Dash, S. M.; Pergament, H. S.; Thorpe, R. D.

    1980-05-01

    Computational models which analyze viscous/inviscid flow processes in jet aircraft exhaust plumes are discussed. These models are component parts of a NASA-LaRC method for the prediction of nozzle afterbody drag. Inviscid/shock processes are analyzed by the SCIPAC code, which is a compact version of a generalized shock-capturing, inviscid plume code (SCIPPY). The SCIPAC code analyzes underexpanded jet exhaust gas mixtures with a self-contained thermodynamic package for hydrocarbon exhaust products and air. A detailed and automated treatment of the embedded subsonic zones behind Mach discs is provided in this analysis. Mixing processes along the plume interface are analyzed by two upgraded versions of an overlaid, turbulent mixing code (BOAT) developed previously for calculating nearfield jet entrainment. The BOATAC program is a frozen-chemistry version of BOAT containing the same aircraft thermodynamic package as SCIPAC; BOATAB is an afterburning version with a self-contained aircraft (hydrocarbon/air) finite-rate chemistry package. The coupling of viscous and inviscid flow processes is achieved by an overlaid procedure, with interactive effects accounted for by a displacement-thickness-type correction to the inviscid plume interface.

  8. Computational models for the viscous/inviscid analysis of jet aircraft exhaust plumes. [predicting afterbody drag

    NASA Technical Reports Server (NTRS)

    Dash, S. M.; Pergament, H. S.; Thorpe, R. D.

    1980-01-01

    Computational models which analyze viscous/inviscid flow processes in jet aircraft exhaust plumes are discussed. These models are component parts of a NASA-LaRC method for the prediction of nozzle afterbody drag. Inviscid/shock processes are analyzed by the SCIPAC code, which is a compact version of a generalized shock-capturing, inviscid plume code (SCIPPY). The SCIPAC code analyzes underexpanded jet exhaust gas mixtures with a self-contained thermodynamic package for hydrocarbon exhaust products and air. A detailed and automated treatment of the embedded subsonic zones behind Mach discs is provided in this analysis. Mixing processes along the plume interface are analyzed by two upgraded versions of an overlaid, turbulent mixing code (BOAT) developed previously for calculating nearfield jet entrainment. The BOATAC program is a frozen-chemistry version of BOAT containing the same aircraft thermodynamic package as SCIPAC; BOATAB is an afterburning version with a self-contained aircraft (hydrocarbon/air) finite-rate chemistry package. The coupling of viscous and inviscid flow processes is achieved by an overlaid procedure, with interactive effects accounted for by a displacement-thickness-type correction to the inviscid plume interface.

  9. FIT: statistical modeling tool for transcriptome dynamics under fluctuating field conditions

    PubMed Central

    Iwayama, Koji; Aisaka, Yuri; Kutsuna, Natsumaro

    2017-01-01

    Motivation: Considerable attention has been given to the quantification of environmental effects on organisms. In natural conditions, environmental factors are continuously changing in a complex manner. To reveal the effects of such environmental variations on organisms, transcriptome data in field environments have been collected and analyzed. Nagano et al. proposed a model that describes the relationship between transcriptomic variation and environmental conditions and demonstrated the capability to predict transcriptome variation in rice plants. However, the computational cost of parameter optimization has prevented its wide application. Results: We propose a new statistical model and efficient parameter optimization based on the previous study. We developed and released FIT, an R package that offers functions for parameter optimization and transcriptome prediction. The proposed method achieves comparable or better prediction performance within a shorter computational time than the previous method. The package will facilitate the study of the environmental effects on transcriptomic variation in field conditions. Availability and Implementation: Freely available from CRAN (https://cran.r-project.org/web/packages/FIT/). Contact: anagano@agr.ryukoku.ac.jp Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28158396

  10. Scilab software as an alternative low-cost computing in solving the linear equations problem

    NASA Astrophysics Data System (ADS)

    Agus, Fahrul; Haviluddin

    2017-02-01

    Numerical computation packages are widely used in both teaching and research. These packages may be proprietary (licensed) or open-source software. One reason to use such a package is the complexity of the mathematical functions involved (e.g., linear problems), together with the growing number of variables in linear and non-linear functions. The aim of this paper is to reflect on key aspects of method, didactics and creative praxis in the teaching of linear equations in higher education; if implemented, this could contribute to better learning in mathematics (i.e., solving simultaneous linear equations), which is essential for future engineers. The focus of this study is to introduce the numerical computation package Scilab as an alternative, low-cost computing environment. In this paper, Scilab is applied to activities related to mathematical models. In the experiment, four numerical methods were implemented: Gaussian elimination, Gauss-Jordan, the inverse-matrix method, and lower-upper (LU) decomposition. The results show that routines for these numerical methods can be created and explored using Scilab procedures and then exploited as course teaching material.
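
    For illustration, the same four methods can be sketched in a few lines of any numerical language; NumPy/SciPy is used below, whereas the study itself used Scilab routines. The explicit elimination function is the kind of teaching routine the paper describes; in practice one calls the library solver.

        import numpy as np
        from scipy.linalg import lu_factor, lu_solve

        def gauss_solve(A, b):
            """Gaussian elimination with partial pivoting (teaching version).
            (Gauss-Jordan would continue eliminating above each pivot too.)"""
            A = A.astype(float).copy()
            b = b.astype(float).copy()
            n = len(b)
            for k in range(n - 1):
                p = k + np.argmax(np.abs(A[k:, k]))          # pivot row
                A[[k, p]], b[[k, p]] = A[[p, k]], b[[p, k]]  # swap rows
                for i in range(k + 1, n):
                    m = A[i, k] / A[k, k]
                    A[i, k:] -= m * A[k, k:]
                    b[i] -= m * b[k]
            x = np.zeros(n)
            for i in range(n - 1, -1, -1):                   # back substitution
                x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
            return x

        A = np.array([[2.0, 1.0, -1.0],
                      [-3.0, -1.0, 2.0],
                      [-2.0, 1.0, 2.0]])
        b = np.array([8.0, -11.0, -3.0])

        print(gauss_solve(A, b))            # [ 2.  3. -1.]
        print(np.linalg.inv(A) @ b)         # inverse-matrix method (wasteful)
        print(lu_solve(lu_factor(A), b))    # LU: factor once, reuse for many b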

  12. An ARM data-oriented diagnostics package to evaluate the climate model simulation

    NASA Astrophysics Data System (ADS)

    Zhang, C.; Xie, S.

    2016-12-01

    A set of diagnostics that utilize long-term high frequency measurements from the DOE Atmospheric Radiation Measurement (ARM) program is developed for evaluating the regional simulation of clouds, radiation and precipitation in climate models. The diagnostics results are computed and visualized automatically in a python-based package that aims to serve as an easy entry point for evaluating climate simulations using the ARM data, as well as the CMIP5 multi-model simulations. Basic performance metrics are computed to measure the accuracy of mean state and variability of simulated regional climate. The evaluated physical quantities include vertical profiles of clouds, temperature, relative humidity, cloud liquid water path, total column water vapor, precipitation, sensible and latent heat fluxes, radiative fluxes, aerosol and cloud microphysical properties. Process-oriented diagnostics focusing on individual cloud and precipitation-related phenomena are developed for the evaluation and development of specific model physical parameterizations. Application of the ARM diagnostics package will be presented in the AGU session. This work is performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344, IM release number is: LLNL-ABS-698645.

  13. Investigation of different modeling approaches for computational fluid dynamics simulation of high-pressure rocket combustors

    NASA Astrophysics Data System (ADS)

    Ivancic, B.; Riedmann, H.; Frey, M.; Knab, O.; Karl, S.; Hannemann, K.

    2016-07-01

    The paper summarizes technical results and first highlights of the cooperation between DLR and Airbus Defence and Space (DS) within the work package "CFD Modeling of Combustion Chamber Processes" conducted in the frame of the Propulsion 2020 Project. Within the addressed work package, DLR Göttingen and Airbus DS Ottobrunn have identified several test cases where adequate test data are available and which can be used for proper validation of the computational fluid dynamics (CFD) tools. In this paper, the first test case, the Penn State chamber (RCM1), is discussed. Presenting the simulation results from three different tools, it is shown that the test case can be computed properly with steady-state Reynolds-averaged Navier-Stokes (RANS) approaches. The achieved simulation results reproduce the measured wall heat flux as an important validation parameter very well but also reveal some inconsistencies in the test data which are addressed in this paper.

  14. Effects of Computer-Assisted STAD, LTM and ICI Cooperative Learning Strategies on Nigerian Secondary School Students' Achievement, Gender and Motivation in Physics

    ERIC Educational Resources Information Center

    Gambrari, Isiaka Amosa; Yusuf, Mudasiru Olalere; Thomas, David Akpa

    2015-01-01

    This study examined the effectiveness of computer-assisted instruction with the Student Team Achievement Division (STAD) and Learning Together Model (LTM) cooperative learning strategies on Nigerian secondary students' achievement and motivation in physics. The efficacy of the authors' computer-assisted instructional package (CAI) for teaching…

  15. High-throughput migration modelling for estimating exposure to chemicals in food packaging in screening and prioritization tools.

    PubMed

    Ernstoff, Alexi S; Fantke, Peter; Huang, Lei; Jolliet, Olivier

    2017-11-01

    Specialty software and simplified models are often used to estimate migration of potentially toxic chemicals from packaging into food. Current models, however, are not suitable for emerging applications in decision-support tools, e.g. in Life Cycle Assessment and risk-based screening and prioritization, which require rapid computation of accurate estimates for diverse scenarios. To fulfil this need, we develop an accurate and rapid (high-throughput) model that estimates the fraction of organic chemicals migrating from polymeric packaging materials into foods. Several hundred step-wise simulations optimised the model coefficients to cover a range of user-defined scenarios (e.g. temperature). The developed model, operationalised in a spreadsheet for future dissemination, nearly instantaneously estimates chemical migration, and has improved performance over commonly used model simplifications. When using measured diffusion coefficients the model accurately predicted (R² = 0.9, standard error (Se) = 0.5) hundreds of empirical data points for various scenarios. Diffusion coefficient modelling, which determines the speed of chemical transfer from package to food, was a major contributor to uncertainty and dramatically decreased model performance (R² = 0.4, Se = 1). In all, this study provides a rapid migration modelling approach to estimate exposure to chemicals in food packaging for emerging screening and prioritization approaches. Copyright © 2017 Elsevier Ltd. All rights reserved.
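
    The physics being approximated can be sketched with an explicit finite-difference solution of one-dimensional Fickian diffusion out of a polymer film into a well-mixed food that acts as a perfect sink. All parameter values below are purely illustrative, and this is a sketch of the underlying physics, not the authors' optimized high-throughput model.

        import numpy as np

        D = 1e-15                  # diffusion coefficient in polymer, m^2/s (illustrative)
        L = 50e-6                  # film thickness, m
        nx = 50
        dx = L / nx
        dt = 0.4 * dx**2 / D       # explicit scheme is stable for D*dt/dx^2 <= 0.5
        r = D * dt / dx**2

        c = np.ones(nx)            # initial migrant concentration, normalised to 1
        t_end = 10 * 24 * 3600.0   # ten days of food contact
        for _ in range(int(t_end / dt)):
            new = c.copy()
            new[1:-1] = c[1:-1] + r * (c[2:] - 2.0 * c[1:-1] + c[:-2])
            new[0] = c[0] + 2.0 * r * (c[1] - c[0])   # sealed back surface (no flux)
            new[-1] = 0.0                             # food contact: perfect sink
            c = new

        print(f"fraction migrated after 10 days: {1.0 - c.mean():.2f}")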

  16. Lumped Parameter Model (LPM) for Light-Duty Vehicles

    EPA Pesticide Factsheets

    EPA’s Lumped Parameter Model (LPM) is a free, desktop computer application that estimates the effectiveness (CO2 Reduction) of various technology combinations or “packages,” in a manner that accounts for synergies between technologies.
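
    The synergy point generalizes: individual effectiveness values cannot simply be added, because each technology acts on the fuel consumption left over by the others. The sketch below shows a common first-order multiplicative combination purely to illustrate the issue; it is not the LPM's actual, physics-based method, and the values are invented.

        # Why CO2-reduction benefits don't simply add across a package.
        effectiveness = {"engine friction": 0.03,      # illustrative values
                         "8-speed transmission": 0.06,
                         "mild hybrid": 0.10}

        naive_sum = sum(effectiveness.values())

        remaining = 1.0
        for e in effectiveness.values():
            remaining *= (1.0 - e)          # each tech scales what is left
        multiplicative = 1.0 - remaining

        print(f"additive: {naive_sum:.3f}, multiplicative: {multiplicative:.3f}")
        # additive: 0.190 vs multiplicative: 0.179 -- overlap shrinks the total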

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sepke, Scott M.

    In this note, the laser focal plane intensity profile for a beam modeled using the 3D ray trace package in HYDRA is determined. First, the analytical model is developed, followed by a practical numerical model for evaluating the resulting computationally intensive normalization factor for all possible input parameters.

  18. Validation of a RANS transition model using a high-order weighted compact nonlinear scheme

    NASA Astrophysics Data System (ADS)

    Tu, GuoHua; Deng, XiaoGang; Mao, MeiLiang

    2013-04-01

    A modified transition model is given based on the shear stress transport (SST) turbulence model and an intermittency transport equation. The energy gradient term in the original model is replaced by the flow strain rate to save computational costs. The model employs local variables only, so it can be conveniently implemented in modern computational fluid dynamics codes. The fifth-order weighted compact nonlinear scheme and the fourth-order staggered scheme are applied to discretize the governing equations in order to minimize discretization errors, so as to mitigate the confusion between numerical errors and transition model errors. The high-order package is compared with a second-order TVD method in simulating the transitional flow over a flat plate. Numerical results indicate that the high-order package gives better grid-convergence properties than the second-order method. Validation of the transition model is performed for transitional flows ranging from low speed to hypersonic speed.

  19. Computer Simulation in Social Science.

    ERIC Educational Resources Information Center

    Garson, G. David

    From a base in military models, computer simulation has evolved to provide a wide variety of applications in social science. General purpose simulation packages and languages such as FIRM, DYNAMO, and others have made significant contributions toward policy discussion in the social sciences and have well-documented efficacy in instructional…

  20. Tutorial: Advanced fault tree applications using HARP

    NASA Technical Reports Server (NTRS)

    Dugan, Joanne Bechta; Bavuso, Salvatore J.; Boyd, Mark A.

    1993-01-01

    Reliability analysis of fault tolerant computer systems for critical applications is complicated by several factors. These modeling difficulties are discussed and dynamic fault tree modeling techniques for handling them are described and demonstrated. Several advanced fault tolerant computer systems are described, and fault tree models for their analysis are presented. HARP (Hybrid Automated Reliability Predictor) is a software package developed at Duke University and NASA Langley Research Center that is capable of solving the fault tree models presented.

  1. Quantitative evaluation of software packages for single-molecule localization microscopy.

    PubMed

    Sage, Daniel; Kirshner, Hagai; Pengo, Thomas; Stuurman, Nico; Min, Junhong; Manley, Suliana; Unser, Michael

    2015-08-01

    The quality of super-resolution images obtained by single-molecule localization microscopy (SMLM) depends largely on the software used to detect and accurately localize point sources. In this work, we focus on the computational aspects of super-resolution microscopy and present a comprehensive evaluation of localization software packages. Our philosophy is to evaluate each package as a whole, thus maintaining the integrity of the software. We prepared synthetic data that represent three-dimensional structures modeled after biological components, taking excitation parameters, noise sources, point-spread functions and pixelation into account. We then asked developers to run their software on our data; most responded favorably, allowing us to present a broad picture of the methods available. We evaluated their results using quantitative and user-interpretable criteria: detection rate, accuracy, quality of image reconstruction, resolution, software usability and computational resources. These metrics reflect the various tradeoffs of SMLM software packages and help users to choose the software that fits their needs.

  2. Proceedings of Conference on Variable-Resolution Modeling, Washington, DC, 5-6 May 1992

    DTIC Science & Technology

    1992-05-01

    Discusses powerful new computer architectures for supporting object-oriented computing, with objects as self-contained data-code packages. Includes examples of copying entity structures, e.g. (copy-entstr e:system 'new-system) creates an entity structure named e:new-system that has the same structure, and references such as Parry, S.-H. (1984): A Self-contained Hierarchical Model Construct. In: Systems Analysis and Modeling in Defense (R.K. Huber, Ed.), New York.

  3. pulver: an R package for parallel ultra-rapid p-value computation for linear regression interaction terms.

    PubMed

    Molnos, Sophie; Baumbach, Clemens; Wahl, Simone; Müller-Nurasyid, Martina; Strauch, Konstantin; Wang-Sattler, Rui; Waldenberger, Melanie; Meitinger, Thomas; Adamski, Jerzy; Kastenmüller, Gabi; Suhre, Karsten; Peters, Annette; Grallert, Harald; Theis, Fabian J; Gieger, Christian

    2017-09-29

    Genome-wide association studies allow us to understand the genetics of complex diseases. Human metabolism provides information about the disease-causing mechanisms, so it is usual to investigate the associations between genetic variants and metabolite levels. However, considering only genetic variants and their effects on one trait ignores the possible interplay between different "omics" layers. Existing tools only consider single-nucleotide polymorphism (SNP)-SNP interactions, and no practical tool is available for large-scale investigations of the interactions between pairs of arbitrary quantitative variables. We developed an R package called pulver to compute p-values for the interaction term in a very large number of linear regression models. Comparisons based on simulated data showed that pulver is much faster than existing tools. This is achieved by using the correlation coefficient to test the null hypothesis, which avoids the costly computation of matrix inversions. Additional tricks are a rearrangement of the iteration order through the different "omics" layers and an implementation in the fast programming language C++. Furthermore, we applied our algorithm to data from the German KORA study to investigate a real-world problem involving the interplay among DNA methylation, genetic variants, and metabolite levels. The pulver package is a convenient and rapid tool for screening huge numbers of linear regression models for significant interaction terms in arbitrary pairs of quantitative variables. pulver is written in R and C++, and can be downloaded freely from CRAN at https://cran.r-project.org/web/packages/pulver/.
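
    The speed trick can be sketched in vectorized form: after standardizing the variables, all correlations against an outcome come from a single matrix-vector product, and each correlation converts analytically to a t statistic and p-value with no matrix inversions. The NumPy sketch below shows this ingredient only; pulver embeds it in the full linear-regression interaction model and iterates over variable pairs.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        n, m = 500, 10000                  # samples, candidate predictors
        y = rng.standard_normal(n)         # outcome (e.g. a metabolite level)
        X = rng.standard_normal((n, m))    # e.g. SNP x methylation product terms

        # Standardize columns; all m correlations then come from one product.
        Xs = (X - X.mean(axis=0)) / X.std(axis=0)
        ys = (y - y.mean()) / y.std()
        r = Xs.T @ ys / n

        # Correlation -> t statistic -> two-sided p-value, all in closed form.
        t = r * np.sqrt((n - 2) / (1.0 - r**2))
        p = 2.0 * stats.t.sf(np.abs(t), df=n - 2)
        print(p.min())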

  4. Prediction of pressure drop in fluid tuned mounts using analytical and computational techniques

    NASA Technical Reports Server (NTRS)

    Lasher, William C.; Khalilollahi, Amir; Mischler, John; Uhric, Tom

    1993-01-01

    A simplified model for predicting pressure drop in fluid tuned isolator mounts was developed. The model is based on an exact solution to the Navier-Stokes equations and was made more general through the use of empirical coefficients. The values of these coefficients were determined by numerical simulation of the flow using the commercial computational fluid dynamics (CFD) package FIDAP.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bastian, Mark; Trigueros, Jose V.

    Phoenix is a Java Virtual Machine (JVM) based library for performing mathematical and astrodynamics calculations. It consists of two primary sub-modules, phoenix-math and phoenix-astrodynamics. The mathematics package has a variety of mathematical classes for performing 3D transformations, geometric reasoning, and numerical analysis. The astrodynamics package has various classes and methods for computing locations, attitudes, accesses, and other values useful for general satellite modeling and simulation. Methods for computing celestial locations, such as the location of the Sun and Moon, are also included. Phoenix is meant to be used as a library within the context of a larger application. For example, it could be used for a web service, desktop client, or to compute simple values in a scripting environment.

  6. Reverse logistics system planning for recycling computers hardware: A case study

    NASA Astrophysics Data System (ADS)

    Januri, Siti Sarah; Zulkipli, Faridah; Zahari, Siti Meriam; Shamsuri, Siti Hajar

    2014-09-01

    This paper describes the modeling and simulation of a reverse logistics network for the collection of used computers at a company in Selangor. The study focuses on the design of a reverse logistics network for a used-computer recycling operation. The simulation modeling presented in this work allows the user to analyze the future performance of the network and to understand the complex relationships between the parties involved. The findings suggest that the model calculates processing time and resource utilization in a predictable manner. In this study, the simulation model was developed using the Arena simulation package.

  7. APINetworks: A general API for the treatment of complex networks in arbitrary computational environments

    NASA Astrophysics Data System (ADS)

    Niño, Alfonso; Muñoz-Caro, Camelia; Reyes, Sebastián

    2015-11-01

    The last decade witnessed a great development of the structural and dynamic study of complex systems described as networks of elements. Such systems can be described as a set of possibly heterogeneous entities or agents (the network nodes) interacting in possibly different ways (defining the network edges). In this context, it is of practical interest to model and handle not only static, homogeneous networks but also dynamic, heterogeneous ones. Depending on the size and type of the problem, these networks may require different computational approaches involving sequential, parallel or distributed systems, with or without the use of disk-based data structures. In this work, we develop an Application Programming Interface (APINetworks) for the modeling and treatment of general networks in arbitrary computational environments. To minimize dependency between components, we decouple the network structure from its function, using different packages for grouping sets of related tasks. The structural package, the one in charge of building and handling the network structure, is the core element of the system. In this work we focus on this structural component of the API. We apply an object-oriented approach that makes use of inheritance and polymorphism. In this way, we can model static and dynamic networks with heterogeneous elements in the nodes and heterogeneous interactions in the edges. In addition, this approach permits a unified treatment of different computational environments. Tests performed on a C++11 version of the structural package show that, on current standard computers, the system can handle, in main memory, directed and undirected linear networks formed by tens of millions of nodes and edges. Our results compare favorably to those of existing tools.
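
    The decoupled, heterogeneous design described above can be pictured in a few lines of object-oriented code. Python is used here for brevity, and the class names are illustrative rather than the APINetworks API: structure lives in node, edge, and network classes, and heterogeneity enters through subclassing.

        # Structure decoupled from function; heterogeneity via inheritance.
        class Node:
            def __init__(self, node_id, **attrs):
                self.id, self.attrs = node_id, attrs

        class PersonNode(Node):   # heterogeneous node types
            pass

        class ServerNode(Node):
            pass

        class Edge:
            def __init__(self, src, dst, directed=False, **attrs):
                self.src, self.dst = src, dst
                self.directed, self.attrs = directed, attrs

        class Network:
            def __init__(self):
                self.nodes, self.adjacency = {}, {}

            def add_node(self, node):
                self.nodes[node.id] = node
                self.adjacency.setdefault(node.id, [])

            def add_edge(self, edge):
                self.adjacency[edge.src].append(edge)
                if not edge.directed:
                    self.adjacency[edge.dst].append(edge)

        net = Network()
        net.add_node(PersonNode("alice", age=30))
        net.add_node(ServerNode("db1", cores=16))
        net.add_edge(Edge("alice", "db1", directed=True, kind="uses"))
        print(len(net.adjacency["alice"]))   # 1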

  8. COLD-SAT dynamic model

    NASA Technical Reports Server (NTRS)

    Adams, Neil S.; Bollenbacher, Gary

    1992-01-01

    This report discusses the development and underlying mathematics of a rigid-body computer model of a proposed cryogenic on-orbit liquid depot storage, acquisition, and transfer spacecraft (COLD-SAT). This model, referred to in this report as the COLD-SAT dynamic model, consists of both a trajectory model and an attitudinal model. All disturbance forces and torques expected to be significant for the actual COLD-SAT spacecraft are modeled to the required degree of accuracy. Control and experimental thrusters are modeled, as well as fluid slosh. The model also computes microgravity disturbance accelerations at any specified point in the spacecraft. The model was developed by using the Boeing EASY5 dynamic analysis package and will run on Apollo, Cray, and other computing platforms.

  9. Automated analysis of biological oscillator models using mode decomposition.

    PubMed

    Konopka, Tomasz

    2011-04-01

    Oscillating signals produced by biological systems have shapes, described by their Fourier spectra, that can potentially reveal the mechanisms that generate them. Extracting this information from measured signals is interesting for the validation of theoretical models, discovery and classification of interaction types, and for optimal experiment design. An automated workflow is described for the analysis of oscillating signals. A software package is developed to match signal shapes to hundreds of a priori viable model structures defined by a class of first-order differential equations. The package computes parameter values for each model by exploiting the mode decomposition of oscillating signals and formulating the matching problem in terms of systems of simultaneous polynomial equations. On the basis of the computed parameter values, the software returns a list of models consistent with the data. In validation tests with synthetic datasets, it not only shortlists those model structures used to generate the data but also shows that excellent fits can sometimes be achieved with alternative equations. The listing of all consistent equations is indicative of how further invalidation might be achieved with additional information. When applied to data from a microarray experiment on mice, the procedure finds several candidate model structures to describe interactions related to the circadian rhythm. This shows that experimental data on oscillators is indeed rich in information about gene regulation mechanisms. The software package is available at http://babylone.ulb.ac.be/autoosc/.

  10. airGRteaching: an R-package designed for teaching hydrology with lumped hydrological models

    NASA Astrophysics Data System (ADS)

    Thirel, Guillaume; Delaigue, Olivier; Coron, Laurent; Andréassian, Vazken; Brigode, Pierre

    2017-04-01

    Lumped hydrological models are useful and convenient tools for research, engineering and educational purposes. They propose catchment-scale representations of the precipitation-discharge relationship. Thanks to their limited data requirements, they can be easily implemented and run. With such models, it is possible to simulate a number of key hydrological processes over the catchment with limited structural and parametric complexity, typically evapotranspiration, runoff, underground losses, etc. The Hydrology Group at Irstea (Antony) has been developing a suite of rainfall-runoff models over the past 30 years. This has resulted in a suite of models running at different time steps (from hourly to annual), applicable to various issues including water balance estimation, forecasting, simulation of impacts and scenario testing. Recently, Irstea has developed an easy-to-use R package (R Core Team, 2016), called airGR (Coron et al., 2016, 2017), to make these models widely available. Although its initial target public was hydrological modellers, the package is already used for educational purposes. Indeed, simple models make it possible to rapidly visualise the effects of parameterizations and model components on flow hydrographs. In order to avoid the difficulties that students may have when manipulating R and datasets, we developed (Delaigue and Coron, 2016): (i) three simplified functions to prepare data, calibrate a model and run a simulation; (ii) simplified and dynamic plot functions; and (iii) a shiny (Chang et al., 2016) interface that connects this R package to a browser-based visualisation tool. On this interface, students can use different hydrological models (including the possibility to use a snow-accounting model), manually modify the parameters, and automatically calibrate them with diverse objective functions. One of the visualisation tabs includes observed precipitation and temperature, simulated snowpack (if any), and observed and simulated discharges, all updated immediately (a calibration needs a couple of seconds or less; a simulation is almost immediate). In addition, time series of internal variables, live visualisation of the evolution of internal variables, and performance statistics are provided. The interface allows for hands-on exercises in which students can analyse, for instance: (i) the effects of each parameter and of the model components on the simulated discharge; (ii) the effects of objective functions based on high-flow- or low-flow-focused criteria on the simulated discharge; and (iii) the seasonality of the model components (see the sketch after the reference list). References: Winston Chang, Joe Cheng, JJ Allaire, Yihui Xie and Jonathan McPherson (2016). shiny: Web Application Framework for R. R package version 0.13.2. https://CRAN.R-project.org/package=shiny. Coron, L., Thirel, G., Perrin, C., Delaigue, O., Andréassian, V., airGR: a suite of lumped hydrological models in an R-package, Environmental Modelling and Software, 2017, submitted. Coron, L., Perrin, C. and Michel, C. (2016). airGR: Suite of GR hydrological models for precipitation-runoff modelling. R package version 1.0.3. https://webgr.irstea.fr/airGR/?lang=en. Delaigue, O. and Coron, L. (2016). airGRteaching: Tools to simplify the use of the airGR hydrological package by students. R package version 0.0.1. https://webgr.irstea.fr/airGR/?lang=en. R Core Team (2016). R: A language and environment for statistical computing. R Foundation for Statistical Computing, Vienna, Austria. URL https://www.R-project.org/.
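
    As a flavour of what students manipulate, the sketch below implements a toy single-bucket rainfall-runoff model (in Python, for illustration; the GR models in airGR are considerably more elaborate). Two hypothetical parameters control how precipitation becomes discharge, and changing them reshapes the simulated hydrograph in the way the hands-on exercises explore.

        # Toy one-bucket rainfall-runoff model (not a GR model).
        def bucket_model(precip, evap, k=0.2, capacity=100.0):
            """k: linear outflow coefficient; capacity: storage before spill."""
            store, discharge = 0.0, []
            for p, e in zip(precip, evap):
                store = max(store + p - e, 0.0)       # water balance
                spill = max(store - capacity, 0.0)    # saturation excess
                store -= spill
                q = k * store                         # linear-reservoir outflow
                store -= q
                discharge.append(q + spill)
            return discharge

        precip = [0, 10, 30, 5, 0, 0, 20, 0, 0, 0]    # mm/day, illustrative
        evap = [2] * 10
        print([round(q, 1) for q in bucket_model(precip, evap)])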

  11. scoringRules - A software package for probabilistic model evaluation

    NASA Astrophysics Data System (ADS)

    Lerch, Sebastian; Jordan, Alexander; Krüger, Fabian

    2016-04-01

    Models in the geosciences are generally surrounded by uncertainty, and being able to quantify this uncertainty is key to good decision making. Accordingly, probabilistic forecasts in the form of predictive distributions have become popular over the last decades. With the proliferation of probabilistic models arises the need for decision theoretically principled tools to evaluate the appropriateness of models and forecasts in a generalized way. Various scoring rules have been developed over the past decades to address this demand. Proper scoring rules are functions S(F,y) which evaluate the accuracy of a forecast distribution F , given that an outcome y was observed. As such, they allow to compare alternative models, a crucial ability given the variety of theories, data sources and statistical specifications that is available in many situations. This poster presents the software package scoringRules for the statistical programming language R, which contains functions to compute popular scoring rules such as the continuous ranked probability score for a variety of distributions F that come up in applied work. Two main classes are parametric distributions like normal, t, or gamma distributions, and distributions that are not known analytically, but are indirectly described through a sample of simulation draws. For example, Bayesian forecasts produced via Markov Chain Monte Carlo take this form. Thereby, the scoringRules package provides a framework for generalized model evaluation that both includes Bayesian as well as classical parametric models. The scoringRules package aims to be a convenient dictionary-like reference for computing scoring rules. We offer state of the art implementations of several known (but not routinely applied) formulas, and implement closed-form expressions that were previously unavailable. Whenever more than one implementation variant exists, we offer statistically principled default choices.
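
    For forecasts given as a sample (for example, MCMC predictive draws), the continuous ranked probability score has a convenient kernel form, CRPS(F, y) = E|X - y| - (1/2)E|X - X'| with X and X' independent draws from F. A minimal NumPy version of this sample-based computation, akin to what the package provides for simulation-based forecasts, is:

        import numpy as np

        def crps_sample(sample, y):
            """Sample-based CRPS via the kernel identity (lower is better)."""
            sample = np.asarray(sample, dtype=float)
            term1 = np.mean(np.abs(sample - y))
            term2 = 0.5 * np.mean(np.abs(sample[:, None] - sample[None, :]))
            return term1 - term2

        rng = np.random.default_rng(1)
        forecast_draws = rng.normal(0.0, 1.0, size=1000)   # predictive draws
        print(crps_sample(forecast_draws, y=0.3))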

  12. Function Package for Computing Quantum Resource Measures

    NASA Astrophysics Data System (ADS)

    Huang, Zhiming

    2018-05-01

    In this paper, we present a function package for calculating quantum resource measures and the dynamics of open systems. Our package includes common operators and operator lists, together with frequently used functions for computing quantum entanglement, quantum correlation, quantum coherence, quantum Fisher information, and dynamics in noisy environments. We briefly explain the functions of the package and illustrate how to use it with several typical examples. We expect this package to be a useful tool for future research and education.
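
    To make the notion of a resource measure concrete, the sketch below computes one standard coherence quantifier, the l1-norm of coherence, for a single-qubit state in plain numpy. This is an illustrative stand-in, not the package's own API.

```python
import numpy as np

# l1-norm of coherence: C_l1(rho) = sum of the magnitudes of the
# off-diagonal elements of the density matrix rho.
def l1_coherence(rho):
    return np.abs(rho).sum() - np.abs(np.diag(rho)).sum()

# |+><+| is maximally coherent for a qubit in the computational basis: C_l1 = 1.
plus = np.array([1.0, 1.0]) / np.sqrt(2)
rho = np.outer(plus, plus.conj())
print(l1_coherence(rho))  # -> 1.0
```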

  13. Computer-aided-engineering system for modeling and analysis of ECLSS integration testing

    NASA Technical Reports Server (NTRS)

    Sepahban, Sonbol

    1987-01-01

    The accurate modeling and analysis of two-phase fluid networks found in environmental control and life support systems are presently undertaken by computer-aided engineering (CAE) techniques whose generalized fluid dynamics package can solve arbitrary flow networks. The CAE system for integrated test bed modeling and analysis will also furnish interfaces and subsystem/test-article mathematical models. Three-dimensional diagrams of the test bed are generated by the system after performing the requisite simulation and analysis.

  14. Open source molecular modeling.

    PubMed

    Pirhadi, Somayeh; Sunseri, Jocelyn; Koes, David Ryan

    2016-09-01

    The success of molecular modeling and computational chemistry efforts is, by definition, dependent on quality software applications. Open source software development provides many advantages to users of modeling applications, not the least of which is that the software is free and completely extendable. In this review we categorize, enumerate, and describe available open source software packages for molecular modeling and computational chemistry. An updated online version of this catalog can be found at https://opensourcemolecularmodeling.github.io.

  15. Network Meta-Analysis Using R: A Review of Currently Available Automated Packages

    PubMed Central

    Neupane, Binod; Richer, Danielle; Bonner, Ashley Joel; Kibret, Taddele; Beyene, Joseph

    2014-01-01

    Network meta-analysis (NMA) – a statistical technique that allows comparison of multiple treatments in the same meta-analysis simultaneously – has become increasingly popular in the medical literature in recent years. The statistical methodology underpinning this technique and software tools for implementing the methods are evolving. Both commercial and freely available statistical software packages have been developed to facilitate the statistical computations using NMA with varying degrees of functionality and ease of use. This paper aims to introduce the reader to three R packages, namely, gemtc, pcnetmeta, and netmeta, which are freely available software tools implemented in R. Each automates the process of performing NMA so that users can perform the analysis with minimal computational effort. We present, compare and contrast the availability and functionality of different important features of NMA in these three packages so that clinical investigators and researchers can determine which R packages to implement depending on their analysis needs. Four summary tables detailing (i) data input and network plotting, (ii) modeling options, (iii) assumption checking and diagnostic testing, and (iv) inference and reporting tools, are provided, along with an analysis of a previously published dataset to illustrate the outputs available from each package. We demonstrate that each of the three packages provides a useful set of tools, and combined they provide users with nearly all functionality that might be desired when conducting an NMA. PMID:25541687

  17. Implementation of radiation shielding calculation methods. Volume 1: Synopsis of methods and summary of results

    NASA Technical Reports Server (NTRS)

    Capo, M. A.; Disney, R. K.

    1971-01-01

    The work performed in the following areas is summarized: (1) a realistic nuclear-propelled vehicle was analyzed using the Marshall Space Flight Center computer code package, which includes one- and two-dimensional discrete ordinate transport, point kernel, and single scatter techniques, as well as cross section preparation and data processing codes; (2) techniques were developed to improve the automated data transfer in the coupled computation method of the computer code package and to improve the utilization of this code package on the Univac-1108 computer system; and (3) the MSFC master data libraries were updated.

  18. batman: BAsic Transit Model cAlculatioN in Python

    NASA Astrophysics Data System (ADS)

    Kreidberg, Laura

    2015-11-01

    I introduce batman, a Python package for modeling exoplanet transit light curves. The batman package supports calculation of light curves for any radially symmetric stellar limb darkening law, using a new integration algorithm for models that cannot be quickly calculated analytically. The code uses C extension modules to speed up model calculation and is parallelized with OpenMP. For a typical light curve with 100 data points in transit, batman can calculate one million quadratic limb-darkened models in 30 seconds with a single 1.7 GHz Intel Core i5 processor. The same calculation takes seven minutes using the four-parameter nonlinear limb darkening model (computed to 1 ppm accuracy). Maximum truncation error for integrated models is an input parameter that can be set as low as 0.001 ppm, ensuring that the community is prepared for the precise transit light curves we anticipate measuring with upcoming facilities. The batman package is open source and publicly available at https://github.com/lkreidberg/batman .
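
    For readers unfamiliar with the package, the minimal sketch below follows batman's documented quickstart pattern (TransitParams, TransitModel, light_curve); the orbital and limb-darkening values are illustrative only.

```python
import numpy as np
import batman

params = batman.TransitParams()
params.t0 = 0.0                 # time of inferior conjunction [days]
params.per = 1.0                # orbital period [days]
params.rp = 0.1                 # planet radius [stellar radii]
params.a = 15.0                 # semi-major axis [stellar radii]
params.inc = 87.0               # orbital inclination [deg]
params.ecc = 0.0                # eccentricity
params.w = 90.0                 # longitude of periastron [deg]
params.limb_dark = "quadratic"  # limb-darkening law
params.u = [0.1, 0.3]           # limb-darkening coefficients

t = np.linspace(-0.025, 0.025, 100)   # times at which to compute the model
m = batman.TransitModel(params, t)    # initializes the model (slow step)
flux = m.light_curve(params)          # relative flux; fast to re-evaluate
```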

  19. QEDMOD: Fortran program for calculating the model Lamb-shift operator

    NASA Astrophysics Data System (ADS)

    Shabaev, V. M.; Tupitsyn, I. I.; Yerokhin, V. A.

    2018-02-01

    We present the Fortran package QEDMOD for computing the model QED operator hQED that can be used to account for the Lamb shift in accurate atomic-structure calculations. The package routines calculate the matrix elements of hQED with user-specified one-electron wave functions. The operator can be used to calculate the Lamb shift in many-electron atomic systems with a typical accuracy of a few percent, either by evaluating the matrix element of hQED with the many-electron wave function, or by adding hQED to the Dirac-Coulomb-Breit Hamiltonian.

  20. Implementation of interconnect simulation tools in spice

    NASA Technical Reports Server (NTRS)

    Satsangi, H.; Schutt-Aine, J. E.

    1993-01-01

    Accurate computer simulation of high speed digital computer circuits and communication circuits requires a multimode approach to simulate both the devices and the interconnects between devices. Classical (lumped parameter) circuit analysis algorithms are needed for circuit devices and the network formed by the interconnected devices. The interconnects, however, have to be modeled as transmission lines, which requires electromagnetic field analysis. One approach to writing a multimode simulator is to take an existing software package that performs either lumped parameter analysis or field analysis and add the missing type of analysis routines to the package. In this work a traditionally lumped parameter simulator, SPICE, is modified so that it will perform lossy transmission line analysis using a different model approach. Modifying SPICE3E2, or any other large software package, is not a trivial task: an understanding of the programming conventions used, the simulation software, and the simulation algorithms is required. This thesis was written to clarify the procedure for installing a device into SPICE3E2. The installation of three devices is documented; the installations of the first two provide a foundation for installation of the third device, the lossy line. The details of the discussion are specific to SPICE, but the concepts will be helpful when performing installations into other circuit analysis packages.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Salama, A.; Mikhail, M.

    Comprehensive software packages have been developed at the Western Research Centre as tools to help coal preparation engineers analyze, evaluate, and control coal cleaning processes. The COal Preparation Software package (COPS) performs three functions: (1) data handling and manipulation; (2) data analysis, including the generation of washability data, performance evaluation and prediction, density and size modeling, and evaluation of density and size partition characteristics and attrition curves; and (3) generation of graphics output. The Separation ChARacteristics Estimation software packages (SCARE) were developed to balance raw density or size separation data; the cases of density and size separation data are both considered. The generated balanced data can take the balanced or normalized forms. The scaled form is desirable for direct determination of the partition functions (curves). The raw and generated separation data are displayed in tabular and/or graphical forms. The computer software described in this paper comprises valuable tools for coal preparation plant engineers and operators for evaluating process performance, adjusting plant parameters, and balancing raw density or size separation data. These packages have been applied very successfully in many projects carried out by WRC for the Canadian coal preparation industry. The software packages are designed to run on a personal computer (PC).

  2. Sarment: Python modules for HMM analysis and partitioning of sequences.

    PubMed

    Guéguen, Laurent

    2005-08-15

    Sarment is a package of Python modules for easy building and manipulation of sequence segmentations. It provides efficient implementations of the usual algorithms for hidden Markov model computation, as well as for maximal predictive partitioning. Owing to its very large variety of criteria for computing segmentations, Sarment can handle many kinds of models. Because of its object-oriented programming, the results of the segmentation are very easy to manipulate.
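
    As a reminder of the kind of computation such a package implements, here is a compact Viterbi decoder for a toy two-state HMM in plain numpy. It illustrates the algorithm only; it is not Sarment's API, and the model parameters are invented.

```python
import numpy as np

def viterbi(obs, log_pi, log_A, log_B):
    """Most probable state path for an observation sequence.
    log_pi: initial log-probs; log_A: transition; log_B: emission."""
    T = len(obs)
    V = np.empty((T, len(log_pi)))
    back = np.zeros((T, len(log_pi)), dtype=int)
    V[0] = log_pi + log_B[:, obs[0]]
    for t in range(1, T):
        scores = V[t - 1][:, None] + log_A   # score of each transition i -> j
        back[t] = scores.argmax(axis=0)      # best predecessor for each state
        V[t] = scores.max(axis=0) + log_B[:, obs[t]]
    path = [int(V[-1].argmax())]
    for t in range(T - 1, 0, -1):            # follow the back-pointers
        path.append(int(back[t, path[-1]]))
    return path[::-1]

# Two hidden states emitting symbols {0, 1}; a run of 1s flips the decoded state.
log_pi = np.log([0.5, 0.5])
log_A = np.log([[0.9, 0.1], [0.1, 0.9]])
log_B = np.log([[0.8, 0.2], [0.2, 0.8]])
print(viterbi([0, 0, 1, 1, 1, 0], log_pi, log_A, log_B))
```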

  3. A System Dynamics Model of the Departmental Deployment of Instructional Resources.

    ERIC Educational Resources Information Center

    Beck, Bruce D.

    This paper reports on the development and testing of a system dynamics model of the departmental deployment of instructional resources at the University of Wisconsin-Madison. A model was developed using the Stella II computer software package. The model describes how departments keep student enrollments, number of course sections, and…

  4. Evaluating variability and uncertainty separately in microbial quantitative risk assessment using two R packages.

    PubMed

    Pouillot, Régis; Delignette-Muller, Marie Laure

    2010-09-01

    Quantitative risk assessment has emerged as a valuable tool to enhance the scientific basis of regulatory decisions in the food safety domain. This article introduces the use of two new computing resources (R packages) specifically developed to help risk assessors in their projects. The first package, "fitdistrplus", gathers tools for choosing and fitting a parametric univariate distribution to a given dataset. The data may be continuous or discrete. Continuous data may be right-, left- or interval-censored, as is frequently the case with analytical methods, with the possibility of various censoring thresholds within the dataset. Bootstrap procedures then allow the assessor to evaluate and model the uncertainty around the parameters and to transfer this information into a quantitative risk assessment model. The second package, "mc2d", helps to build and study two-dimensional (or second-order) Monte Carlo simulations in which the estimation of variability and uncertainty in the risk estimates is separated. This package easily allows the transfer of separated variability and uncertainty along a chain of conditional mathematical and probabilistic models. The usefulness of these packages is illustrated through a risk assessment of hemolytic and uremic syndrome in children linked to the presence of Escherichia coli O157:H7 in ground beef. These R packages are freely available at the Comprehensive R Archive Network (cran.r-project.org).
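
    The core idea of a two-dimensional Monte Carlo simulation is easy to sketch: an outer loop samples uncertain parameter estimates, an inner loop samples inter-individual variability for those fixed parameters, and only the outer dimension is summarized as uncertainty. The Python sketch below illustrates this structure with an invented dose-response model; it is not mc2d's API, and the numbers carry no real-world meaning.

```python
import numpy as np

rng = np.random.default_rng(1)
n_unc, n_var = 500, 1000   # uncertainty draws (outer) x variability draws (inner)

# Outer dimension: parameter estimates drawn from their uncertainty
# distributions (in practice, bootstrap or posterior samples).
mu_hat = rng.normal(2.0, 0.1, n_unc)            # uncertain mean of log10 dose
sd_hat = np.abs(rng.normal(0.5, 0.05, n_unc))   # uncertain spread

risk = np.empty(n_unc)
for i in range(n_unc):
    # Inner dimension: individual exposures, variability only (parameters fixed).
    log_dose = rng.normal(mu_hat[i], sd_hat[i], n_var)
    p_ill = 1 - np.exp(-10 ** (log_dose - 4))   # hypothetical dose-response
    risk[i] = p_ill.mean()                      # variability integrated out

# What remains across the outer dimension is parameter uncertainty.
print(np.percentile(risk, [2.5, 50, 97.5]))
```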

  5. Computer-Aided Training for Transport Planners: Experience with the Pluto Package.

    ERIC Educational Resources Information Center

    Bonsall, P. W.

    1995-01-01

    Describes the PLUTO model, an interactive computer program designed for use in education and training of city planners and engineers. Emphasizes four issues: (1) the balance between realism and simplification; (2) the design of the user interface; (3) comparative advantages of group and solo working; and (4) factors affecting the decision to…

  6. Evaluation of the discrete vortex wake cross flow model using vector computers. Part 2: User's manual for DIVORCE

    NASA Technical Reports Server (NTRS)

    Deffenbaugh, F. D.; Vitz, J. F.

    1979-01-01

    The user's manual for the Discrete Vortex Cross flow Evaluator (DIVORCE) computer program is presented. DIVORCE was developed in FORTRAN IV for the CDC 6600 and CDC 7600 machines. Optimal calls to a NASA vector subroutine package are provided for use with the CDC 7600.

  7. Transient Heat Conduction Simulation around Microprocessor Die

    NASA Astrophysics Data System (ADS)

    Nishi, Koji

    This paper presents the fundamental formulae for the power consumption of CMOS (Complementary Metal-Oxide-Semiconductor) devices and its voltage and temperature dependence, and then introduces an equation for estimating the power consumption of a notebook PC (Personal Computer) microprocessor. The equation is applied to a heat conduction simulation with a simplified thermal model and evaluated with sub-millisecond time steps. The microprocessor has two major heat conduction paths: one from the top of the silicon die via the thermal solution, and the other from the package substrate and pins via the PGA (Pin Grid Array) socket. Although the former path dominates, the latter path, from the package substrate and pins, plays an important role in transient heat conduction behavior. This paper therefore focuses on the path through the package substrate and pins and investigates a more accurate method of estimating the heat conduction paths of the microprocessor. The expression of the cooling performance of the heatsink fan is also a key point in assuring results of practical accuracy, but a finer expression requires more computational resources and hence longer computation time. The paper discusses an expression that minimizes the computational workload while keeping the result practically accurate.
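
    The switching term of the CMOS power formula referred to above is simple enough to state directly: P_dyn = alpha * C * V^2 * f. The sketch below evaluates it for illustrative values; the quadratic dependence on supply voltage is what drives the voltage dependency discussed in the paper (leakage, which grows with temperature, is omitted here).

```python
# Dynamic (switching) CMOS power: P_dyn = alpha * C * V^2 * f.
# Leakage and short-circuit terms are omitted; all values are illustrative.
def dynamic_power(alpha, c_load, vdd, freq):
    """alpha: activity factor; c_load: switched capacitance [F];
    vdd: supply voltage [V]; freq: clock frequency [Hz]. Returns watts."""
    return alpha * c_load * vdd ** 2 * freq

print(dynamic_power(alpha=0.2, c_load=30e-9, vdd=1.1, freq=2.4e9))  # ~17.4 W
```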

  8. PCE: web tools to compute protein continuum electrostatics

    PubMed Central

    Miteva, Maria A.; Tufféry, Pierre; Villoutreix, Bruno O.

    2005-01-01

    PCE (protein continuum electrostatics) is an online service for protein electrostatic computations presently based on the MEAD (macroscopic electrostatics with atomic detail) package initially developed by D. Bashford [(2004) Front Biosci., 9, 1082–1099]. This computer method uses a macroscopic electrostatic model for the calculation of protein electrostatic properties, such as pKa values of titratable groups and electrostatic potentials. The MEAD package generates electrostatic energies via a finite difference solution to the Poisson–Boltzmann equation. Users submit a PDB file and PCE returns potentials and pKa values as well as color (static or animated) figures displaying electrostatic potentials mapped on the molecular surface. This service is intended to facilitate electrostatics analyses of proteins and thereby broaden the accessibility of continuum electrostatics to the biological community. PCE can be accessed at . PMID:15980492
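
    For orientation, the linearized Poisson–Boltzmann equation has a closed-form solution for a point charge in an electrolyte, the Debye–Hückel screened potential; grid-based solvers such as the one described above handle the general case numerically. The sketch below evaluates that analytic special case for assumed, illustrative physical parameters.

```python
import numpy as np

# Debye-Hueckel potential of a point charge q in a dielectric with relative
# permittivity eps_r and Debye screening length kappa_inv:
# phi(r) = q * exp(-r / kappa_inv) / (4 * pi * eps0 * eps_r * r).
def debye_huckel_potential(r, q=1.602e-19, eps_r=78.5, kappa_inv=0.96e-9):
    eps0 = 8.854e-12                       # vacuum permittivity [F/m]
    return q * np.exp(-r / kappa_inv) / (4 * np.pi * eps0 * eps_r * r)

print(debye_huckel_potential(1e-9))        # ~6.5e-3 V at 1 nm (~150 mM salt)
```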

  9. Parallel Adaptive Mesh Refinement Library

    NASA Technical Reports Server (NTRS)

    Mac-Neice, Peter; Olson, Kevin

    2005-01-01

    Parallel Adaptive Mesh Refinement Library (PARAMESH) is a package of Fortran 90 subroutines designed to provide a computer programmer with an easy route to extension of (1) a previously written serial code that uses a logically Cartesian structured mesh into (2) a parallel code with adaptive mesh refinement (AMR). Alternatively, in its simplest use, and with minimal effort, PARAMESH can operate as a domain-decomposition tool for users who want to parallelize their serial codes but who do not wish to utilize adaptivity. The package builds a hierarchy of sub-grids to cover the computational domain of a given application program, with spatial resolution varying to satisfy the demands of the application. The sub-grid blocks form the nodes of a tree data structure (a quad-tree in two or an oct-tree in three dimensions). Each grid block has a logically Cartesian mesh. The package supports one-, two- and three-dimensional models.
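
    The tree-of-blocks idea is easy to demonstrate outside of Fortran. The toy Python quadtree below refines logically Cartesian blocks wherever a user-supplied criterion flags them, mirroring the hierarchy described above; it is a conceptual sketch, not PARAMESH's interface.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Block:
    """A logically Cartesian sub-grid block: one node of the refinement quadtree."""
    x0: float
    y0: float
    size: float
    level: int
    children: List["Block"] = field(default_factory=list)

    def refine(self):
        h = self.size / 2
        self.children = [Block(self.x0 + i * h, self.y0 + j * h, h, self.level + 1)
                         for j in (0, 1) for i in (0, 1)]

def refine_where(block, needs_refinement, max_level=4):
    """Recursively split blocks that the criterion flags, up to max_level."""
    if block.level < max_level and needs_refinement(block):
        block.refine()
        for child in block.children:
            refine_where(child, needs_refinement, max_level)

def count_leaves(block):
    return 1 if not block.children else sum(count_leaves(c) for c in block.children)

# Example: concentrate resolution near the domain corner at the origin.
root = Block(0.0, 0.0, 1.0, 0)
refine_where(root, lambda b: (b.x0 ** 2 + b.y0 ** 2) ** 0.5 < b.size)
print(count_leaves(root))   # 13 leaf blocks, clustered toward the corner
```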

  10. Modifications of the U.S. Geological Survey modular, finite-difference, ground-water flow model to read and write geographic information system files

    USGS Publications Warehouse

    Orzol, Leonard L.; McGrath, Timothy S.

    1992-01-01

    This report documents modifications to the U.S. Geological Survey modular, three-dimensional, finite-difference, ground-water flow model, commonly called MODFLOW, so that it can read and write files used by a geographic information system (GIS). The modified model program is called MODFLOWARC. Simulation programs such as MODFLOW generally require large amounts of input data and produce large amounts of output data. Viewing data graphically, generating head contours, and creating or editing model data arrays such as hydraulic conductivity are examples of tasks that currently are performed either by the use of independent software packages or by tedious manual editing, manipulating, and transferring of data. GIS programs are commonly used to facilitate preparation of the model input data and analysis of model output data; however, auxiliary programs are frequently required to translate data between programs whenever those programs use different data formats. Thus, the user might use GIS techniques to create model input data, run a translation program to convert input data into a format compatible with the ground-water flow model, run the model, run a translation program to convert the model output into the correct format for GIS, and use GIS to display and analyze this output. MODFLOWARC avoids the two translation steps and transfers data directly to and from the ground-water flow model. This report documents the design and use of MODFLOWARC and includes instructions for data input/output of the Basic, Block-centered flow, River, Recharge, Well, Drain, Evapotranspiration, General-head boundary, and Streamflow-routing packages. Modifications to MODFLOW and the Streamflow-Routing package were kept to a minimum. Flow charts and computer-program code describe the modifications to the original computer codes for each of these packages. Appendix A contains a discussion of the operation of MODFLOWARC using a sample problem.

  11. MOLNs: A CLOUD PLATFORM FOR INTERACTIVE, REPRODUCIBLE, AND SCALABLE SPATIAL STOCHASTIC COMPUTATIONAL EXPERIMENTS IN SYSTEMS BIOLOGY USING PyURDME.

    PubMed

    Drawert, Brian; Trogdon, Michael; Toor, Salman; Petzold, Linda; Hellander, Andreas

    2016-01-01

    Computational experiments using spatial stochastic simulations have led to important new biological insights, but they require specialized tools and a complex software stack, as well as large and scalable compute and data analysis resources due to the large computational cost associated with Monte Carlo computational workflows. The complexity of setting up and managing a large-scale distributed computation environment to support productive and reproducible modeling can be prohibitive for practitioners in systems biology. This results in a barrier to the adoption of spatial stochastic simulation tools, effectively limiting the type of biological questions addressed by quantitative modeling. In this paper, we present PyURDME, a new, user-friendly spatial modeling and simulation package, and MOLNs, a cloud computing appliance for distributed simulation of stochastic reaction-diffusion models. MOLNs is based on IPython and provides an interactive programming platform for development of sharable and reproducible distributed parallel computational experiments.

  12. The Dynamo package for tomography and subtomogram averaging: components for MATLAB, GPU computing and EC2 Amazon Web Services

    PubMed Central

    Castaño-Díez, Daniel

    2017-01-01

    Dynamo is a package for the processing of tomographic data. As a tool for subtomogram averaging, it includes different alignment and classification strategies. Furthermore, its data-management module allows experiments to be organized in groups of tomograms, while offering specialized three-dimensional tomographic browsers that facilitate visualization, location of regions of interest, modelling and particle extraction in complex geometries. Here, a technical description of the package is presented, focusing on its diverse strategies for optimizing computing performance. Dynamo is built upon mbtools (middle layer toolbox), a general-purpose MATLAB library for object-oriented scientific programming specifically developed to underpin Dynamo but usable as an independent tool. Its structure intertwines a flexible MATLAB codebase with precompiled C++ functions that carry the burden of numerically intensive operations. The package can be delivered as a precompiled standalone ready for execution without a MATLAB license. Multicore parallelization on a single node is directly inherited from the high-level parallelization engine provided for MATLAB, automatically imparting a balanced workload among the threads in computationally intense tasks such as alignment and classification, but also in logistic-oriented tasks such as tomogram binning and particle extraction. Dynamo supports the use of graphical processing units (GPUs), yielding considerable speedup factors both for native Dynamo procedures (such as the numerically intensive subtomogram alignment) and procedures defined by the user through its MATLAB-based GPU library for three-dimensional operations. Cloud-based virtual computing environments supplied with a pre-installed version of Dynamo can be publicly accessed through the Amazon Elastic Compute Cloud (EC2), enabling users to rent GPU computing time on a pay-as-you-go basis, thus avoiding upfront investments in hardware and long-term software maintenance. PMID:28580909

  14. Computational Modeling | Photovoltaic Research | NREL

    Science.gov Websites

    performance of single- and multijunction cells and modules. We anticipate the upcoming completion of our next software package for a simplified electronic design of single- and multicrystalline silicon solar cells

  15. Vehicle Sketch Pad: a Parametric Geometry Modeler for Conceptual Aircraft Design

    NASA Technical Reports Server (NTRS)

    Hahn, Andrew S.

    2010-01-01

    The conceptual aircraft designer is faced with a dilemma, how to strike the best balance between productivity and fidelity? Historically, handbook methods have required only the coarsest of geometric parameterizations in order to perform analysis. Increasingly, there has been a drive to upgrade analysis methods, but these require considerably more precise and detailed geometry. Attempts have been made to use computer-aided design packages to fill this void, but their cost and steep learning curve have made them unwieldy at best. Vehicle Sketch Pad (VSP) has been developed over several years to better fill this void. While no substitute for the full feature set of computer-aided design packages, VSP allows even novices to quickly become proficient in defining three-dimensional, watertight aircraft geometries that are adequate for producing multi-disciplinary meta-models for higher order analysis methods, wind tunnel and display models, as well as a starting point for animation models. This paper will give an overview of the development and future course of VSP.

  17. Fragment-based docking: development of the CHARMMing Web user interface as a platform for computer-aided drug design.

    PubMed

    Pevzner, Yuri; Frugier, Emilie; Schalk, Vinushka; Caflisch, Amedeo; Woodcock, H Lee

    2014-09-22

    Web-based user interfaces to scientific applications are important tools that allow researchers to utilize a broad range of software packages with just an Internet connection and a browser. One such interface, CHARMMing (CHARMM interface and graphics), facilitates access to the powerful and widely used molecular software package CHARMM. CHARMMing incorporates tasks such as molecular structure analysis, dynamics, multiscale modeling, and other techniques commonly used by computational life scientists. We have extended CHARMMing's capabilities to include a fragment-based docking protocol that allows users to perform molecular docking and virtual screening calculations either directly via the CHARMMing Web server or on computing resources using the self-contained job scripts generated via the Web interface. The docking protocol was evaluated by performing a series of "re-dockings" with direct comparison to top commercial docking software. Results of this evaluation showed that CHARMMing's docking implementation is comparable to many widely used software packages and validates the use of the new CHARMM generalized force field for docking and virtual screening.

  18. Introducing Computer Simulation into the High School: An Applied Mathematics Curriculum.

    ERIC Educational Resources Information Center

    Roberts, Nancy

    1981-01-01

    A programing language called DYNAMO, developed especially for writing simulation models, is promoted. Details of six self-teaching curriculum packages recently developed for simulation-oriented instruction are provided. (MP)

  19. Software for Brain Network Simulations: A Comparative Study

    PubMed Central

    Tikidji-Hamburyan, Ruben A.; Narayana, Vikram; Bozkus, Zeki; El-Ghazawi, Tarek A.

    2017-01-01

    Numerical simulations of brain networks are a critical part of our efforts in understanding brain functions under pathological and normal conditions. For several decades, the community has developed many software packages and simulators to accelerate research in computational neuroscience. In this article, we select the three most popular simulators, as determined by the number of models in the ModelDB database, namely NEURON, GENESIS, and BRIAN, and perform an independent evaluation of these simulators. In addition, we study NEST, one of the lead simulators of the Human Brain Project. First, we study them based on one of the most important characteristics, the range of supported models. Our investigation reveals that brain network simulators may be biased toward supporting a specific set of models. However, all simulators tend to expand the supported range of models by providing a universal environment for the computational study of individual neurons and brain networks. Next, our investigations of the characteristics of computational architecture and efficiency indicate that all simulators compile the most computationally intensive procedures into binary code, with the aim of maximizing their computational performance. However, not all simulators provide the simplest method for module development and/or guarantee efficient binary code. Third, a study of their amenability to high-performance computing reveals that NEST can almost transparently map an existing model on a cluster or multicore computer, while NEURON requires code modification if a model developed for a single computer has to be mapped on a computational cluster. Interestingly, parallelization is the weakest characteristic of BRIAN, which provides no support for cluster computations and limited support for multicore computers. Fourth, we identify the level of user support and frequency of usage for all simulators. Finally, we carry out an evaluation using two case studies: a large network with simplified neural and synaptic models and a small network with detailed models. These two case studies allow us to avoid any bias toward a particular software package. The results indicate that BRIAN provides the most concise language for both cases considered. Furthermore, as expected, NEST mostly favors large network models, while NEURON is better suited for detailed models. Overall, the case studies reinforce our general observation that simulators have a bias in their computational performance toward specific types of brain network models. PMID:28775687

  20. Software engineering the mixed model for genome-wide association studies on large samples.

    PubMed

    Zhang, Zhiwu; Buckler, Edward S; Casstevens, Terry M; Bradbury, Peter J

    2009-11-01

    Mixed models improve the ability to detect phenotype-genotype associations in the presence of population stratification and multiple levels of relatedness in genome-wide association studies (GWAS), but for large data sets the resource consumption becomes impractical. At the same time, the sample size and number of markers used for GWAS is increasing dramatically, resulting in greater statistical power to detect those associations. The use of mixed models with increasingly large data sets depends on the availability of software for analyzing those models. While multiple software packages implement the mixed model method, no single package provides the best combination of fast computation, ability to handle large samples, flexible modeling and ease of use. Key elements of association analysis with mixed models are reviewed, including modeling phenotype-genotype associations using mixed models, population stratification, kinship and its estimation, variance component estimation, use of best linear unbiased predictors or residuals in place of raw phenotype, improving efficiency and software-user interaction. The available software packages are evaluated, and suggestions made for future software development.

  1. The gputools package enables GPU computing in R.

    PubMed

    Buckner, Joshua; Wilson, Justin; Seligman, Mark; Athey, Brian; Watson, Stanley; Meng, Fan

    2010-01-01

    By default, the R statistical environment does not make use of parallelism. Researchers may resort to expensive solutions such as cluster hardware for large analysis tasks. Graphics processing units (GPUs) provide an inexpensive and computationally powerful alternative. Using R and the CUDA toolkit from Nvidia, we have implemented several functions commonly used in microarray gene expression analysis for GPU-equipped computers, so that R users can take advantage of the better performance provided by an Nvidia GPU. The package is available from CRAN, the R project's repository of packages, at http://cran.r-project.org/web/packages/gputools. More information about our gputools R package is available at http://brainarray.mbni.med.umich.edu/brainarray/Rgpgpu

  2. Practical Issues in Interactive Multimedia Design.

    ERIC Educational Resources Information Center

    James, Jeff

    This paper describes a range of computer assisted learning software models--linear, unstructured, and ideal--and discusses issues such as control, interactivity, and ease-of-programming. It also introduces a "compromise model" used for a package currently under development at the Hong Kong Polytechnic University, which is intended to…

  3. DESIGNING SULFATE-REDUCING BACTERIA FIELD BIOREACTORS USING THE BEST MODEL

    EPA Science Inventory

    BEST (bioreactor economics, size and time of operation) is a spreadsheet-based model that is used in conjunction with a public domain computer software package, PHREEQCI. BEST is intended to be used in the design process of sulfate-reducing bacteria (SRB)field bioreactors to pas...

  4. Three-Dimensional (3D) Nanometrology Based on Scanning Electron Microscope (SEM) Stereophotogrammetry.

    PubMed

    Tondare, Vipin N; Villarrubia, John S; Vladár, András E

    2017-10-01

    Three-dimensional (3D) reconstruction of a sample surface from scanning electron microscope (SEM) images taken at two perspectives has been known for decades. Nowadays, there exist several commercially available stereophotogrammetry software packages. For testing these software packages, in this study we used Monte Carlo simulated SEM images of virtual samples. A virtual sample is a model in a computer, and its true dimensions are known exactly, which is impossible for real SEM samples due to measurement uncertainty. The simulated SEM images can be used for algorithm testing, development, and validation. We tested two stereophotogrammetry software packages and compared their reconstructed 3D models with the known geometry of the virtual samples used to create the simulated SEM images. Both packages performed relatively well with simulated SEM images of a sample with a rough surface. However, in a sample containing nearly uniform and therefore low-contrast zones, the height reconstruction error was ≈46%. The present stereophotogrammetry software packages need further improvement before they can be used reliably with SEM images with uniform zones.

  5. GAPIT: genome association and prediction integrated tool.

    PubMed

    Lipka, Alexander E; Tian, Feng; Wang, Qishan; Peiffer, Jason; Li, Meng; Bradbury, Peter J; Gore, Michael A; Buckler, Edward S; Zhang, Zhiwu

    2012-09-15

    Software programs that conduct genome-wide association studies and genomic prediction and selection need to use methodologies that maximize statistical power, provide high prediction accuracy and run in a computationally efficient manner. We developed an R package called Genome Association and Prediction Integrated Tool (GAPIT) that implements advanced statistical methods including the compressed mixed linear model (CMLM) and CMLM-based genomic prediction and selection. The GAPIT package can handle large datasets in excess of 10 000 individuals and 1 million single-nucleotide polymorphisms with minimal computational time, while providing user-friendly access and concise tables and graphs to interpret results. http://www.maizegenetics.net/GAPIT. zhiwu.zhang@cornell.edu Supplementary data are available at Bioinformatics online.

  6. Computer aided system engineering and analysis (CASE/A) modeling package for ECLS systems - An overview

    NASA Technical Reports Server (NTRS)

    Dalee, Robert C.; Bacskay, Allen S.; Knox, James C.

    1990-01-01

    An overview of the CASE/A-ECLSS series modeling package is presented. CASE/A is an analytical tool that has yielded engineering productivity gains during ECLSS design activities. A component verification program was performed to confirm the validity of the component models against test data from the Phase II comparative test program completed at the Marshall Space Flight Center. An integrated plotting feature has been added to the program which allows the operator to analyze on-screen data trends or obtain hard copy plots from within the CASE/A operating environment. New command features in the areas of schematic, output, and model management, as well as component data editing, have been incorporated to enhance the engineer's productivity during a modeling program.

  7. Using Q-Chem on the Peregrine System | High-Performance Computing | NREL

    Science.gov Websites

    Q-Chem is an ab initio quantum chemistry package with special strengths in excited state methods, non-adiabatic coupling, solvation models, explicitly correlated wavefunction methods, and cutting-edge DFT. This page describes running Q-Chem on the Peregrine system.

  8. Evaluation of the fishery status for King Soldier Bream Argyrops spinifer in Pakistan using the software CEDA and ASPIC

    NASA Astrophysics Data System (ADS)

    Memon, Aamir Mahmood; Liu, Qun; Memon, Khadim Hussain; Baloch, Wazir Ali; Memon, Asfandyar; Baset, Abdul

    2015-07-01

    Catch and effort data were analyzed to estimate the maximum sustainable yield (MSY) of King Soldier Bream, Argyrops spinifer (Forsskål, 1775; Family: Sparidae), and to evaluate the present status of the fish stocks exploited in Pakistani waters. The catch and effort data for the 25-year period 1985-2009 were analyzed using two computer software packages, CEDA (catch and effort data analysis) and ASPIC (a surplus production model incorporating covariates). The maximum catch of 3 458 t was observed in 1988 and the minimum catch of 1 324 t in 2005, while the average annual catch of A. spinifer over the 25 years was 2 500 t. The CEDA package implements the surplus production models of Fox, Schaefer, and Pella-Tomlinson under three error assumptions (normal, log-normal, and gamma), and the ASPIC package implements the Fox and logistic surplus production models. In CEDA, the MSY was estimated with an initial proportion (IP) of 0.8, because the starting catch was approximately 80% of the maximum catch. The gamma error assumption showed maximization failures and was excluded. The MSY estimates from CEDA using the Fox model under the two remaining error assumptions were 1 692.08 t (R2=0.572) and 1 694.09 t (R2=0.606), respectively, while the estimates from the Schaefer and Pella-Tomlinson models under the two error assumptions were 2 390.95 t (R2=0.563) and 2 380.06 t (R2=0.605), respectively; the Schaefer and Pella-Tomlinson values were essentially the same. The MSY estimated by the Fox model was thus conservative compared to the Schaefer and Pella-Tomlinson models. The MSY values computed by ASPIC with the Fox and logistic surplus production models were 1 498 t (R2=0.917) and 2 488 t (R2=0.897), respectively. Overall, the MSY estimates from CEDA were about 1 700-2 400 t and those from ASPIC were 1 500-2 500 t. The outputs of the CEDA and ASPIC packages indicate that the stock is overfished and needs effective management to reduce the fishing effort on the species in Pakistani waters.
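
    For context, the surplus production models named above are one-line differential equations. The Python sketch below implements the Schaefer model, whose maximum sustainable yield is MSY = rK/4; the parameter values are invented for illustration (chosen so the MSY lands near the reported range) and are not the study's estimates.

```python
# Schaefer surplus production model: dB/dt = r*B*(1 - B/K) - C,
# with MSY = r*K/4 attained at B = K/2. Illustrative parameters only.
def schaefer_step(b, catch, r, k, dt=1.0):
    """One Euler step of the biomass dynamics; biomass floored at zero."""
    return max(b + dt * (r * b * (1.0 - b / k) - catch), 0.0)

r, k = 0.35, 28_000.0        # intrinsic growth rate [1/yr], carrying capacity [t]
msy = r * k / 4.0            # = 2450 t
b = k
for year in range(25):
    b = schaefer_step(b, 2500.0, r, k)   # constant catch slightly above MSY
print(f"MSY = {msy:.0f} t; biomass after 25 yr of 2500 t/yr catch = {b:.0f} t")
```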

  9. GIS-based channel flow and sediment transport simulation using CCHE1D coupled with AnnAGNPS

    USDA-ARS?s Scientific Manuscript database

    CCHE1D (Center for Computational Hydroscience and Engineering 1-Dimensional model) simulates unsteady free-surface flows with nonequilibrium, nonuniform sediment transport in dendritic channel networks. Since the early 1990s, the model and its software packages have been developed and continuously main...

  10. WHAEM: PROGRAM DOCUMENTATION FOR THE WELLHEAD ANALYTIC ELEMENT MODEL (EPA/600/SR-94/210)

    EPA Science Inventory

    A new computer program has been developed to determine time-of-travel capture zones in relatively simple geohydrological settings. The WhAEM package contains an analytic element model that uses superposition of (many) closed form analytical solutions to generate a groundwater flo...

  11. The importance of data curation on QSAR Modeling - PHYSPROP open data as a case study. (QSAR 2016)

    EPA Science Inventory

    During the last few decades many QSAR models and tools have been developed at the US EPA, including the widely used EPISuite. During this period the arsenal of computational capabilities supporting cheminformatics has broadened dramatically with multiple software packages. These ...

  12. Tight-binding model for materials at mesoscale

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tai, Yuan-Yen; Choi, Hongchul; Zhu, Wei

    2016-12-21

    TBM3 is an open source package for computational simulations of quantum materials at multiple scales in length and time. The project originated to investigate the multiferroic behavior of transition-metal oxide heterostructures. The framework has also been designed to study emergent phenomena in other quantum materials, such as 2-dimensional transition-metal dichalcogenides, graphene, topological insulators, and skyrmion-hosting materials. In the long term, we will extend the package to transport and time-resolved phenomena. TBM3 is currently a C++ based numerical tool package and framework for the design and construction of any kind of lattice structure with multi-orbital and spin degrees of freedom. A Fortran-based portion of the package will be added in the near future. TBM3 is designed as a highly flexible and reusable framework, and the tight-binding parameters can be modeled or informed by DFT calculations. It is currently GPU enabled; CPU-enabled MPI support will be added in the future.

  13. Evaluation of Rankine cycle air conditioning system hardware by computer simulation

    NASA Technical Reports Server (NTRS)

    Healey, H. M.; Clark, D.

    1978-01-01

    A computer program for simulating the performance of a variety of solar powered Rankine cycle air conditioning system (RCACS) components has been developed. The computer program models actual equipment by developing performance maps from manufacturers' data and is capable of simulating off-design operation of the RCACS components. The program, designed to be a subroutine of the Marshall Space Flight Center (MSFC) Solar Energy System Analysis Computer Program 'SOLRAD', is a complete package suitable for use by an occasional computer user in developing performance maps of heating, ventilation and air conditioning components.

  14. General Algebraic Modeling System Tutorial | High-Performance Computing |

    Science.gov Websites

    Here's a basic tutorial for modeling optimization problems with the General Algebraic Modeling System (GAMS). Overview: The GAMS (General Algebraic Modeling System) package is essentially a compiler for an algebraic modeling language. The tutorial example models power generation from two different fuels, where the goal is to minimize the cost for one of the fuels.

  15. Modeling parameterized geometry in GPU-based Monte Carlo particle transport simulation for radiotherapy.

    PubMed

    Chi, Yujie; Tian, Zhen; Jia, Xun

    2016-08-07

    Monte Carlo (MC) particle transport simulation on a graphics-processing unit (GPU) platform has been extensively studied recently due to the efficiency advantage achieved via massive parallelization. Almost all of the existing GPU-based MC packages were developed for voxelized geometry, which limits the application scope of these packages. The purpose of this paper is to develop a module to model parametric geometry and integrate it into GPU-based MC simulations. In our module, each continuous region is defined by its bounding surfaces, which are parameterized by quadratic functions. Particle navigation functions in this geometry were developed. The module was incorporated into two previously developed GPU-based MC packages and was tested in two example problems: (1) low energy photon transport simulation in a brachytherapy case with a shielded cylinder applicator and (2) MeV coupled photon/electron transport simulation in a phantom containing several inserts of different shapes. In both cases, the calculated dose distributions agreed well with those calculated in the corresponding voxelized geometry; the average dose differences were 1.03% and 0.29%, respectively. We also used the developed package to perform simulations of a Varian VS 2000 brachytherapy source and generated a phase-space file. The computation time under the parameterized geometry depended on the memory location storing the geometry data: the highest computational speed was achieved when the data were stored in the GPU's shared memory. Incorporating parameterized geometry yielded a computation time roughly 3 times that of the corresponding voxelized geometry. We also developed a strategy of using an auxiliary index array to reduce the frequency of geometry calculations and hence improve efficiency. With this strategy, the computation time ranged from 1.75 to 2.03 times that of the voxelized geometry for coupled photon/electron transport, depending on the voxel dimension of the auxiliary index array, and from 0.69 to 1.23 times for photon-only transport.
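
    The navigation primitive behind such a module is the distance from a particle's position to a quadric bounding surface along its direction of flight. The Python sketch below solves that ray-quadric problem for a general surface x^T A x + b·x + c = 0; it is a conceptual illustration, not the paper's GPU code.

```python
import numpy as np

def ray_quadric_distance(p, d, A, b, c):
    """Nearest positive distance from point p along unit direction d to the
    quadric surface x^T A x + b.x + c = 0, or None if it is never hit."""
    qa = d @ A @ d
    qb = 2.0 * (p @ A @ d) + b @ d
    qc = p @ A @ p + b @ p + c
    if abs(qa) < 1e-12:                    # equation is linear along this ray
        return -qc / qb if qb != 0 and -qc / qb > 1e-9 else None
    disc = qb * qb - 4.0 * qa * qc
    if disc < 0:                           # ray misses the surface entirely
        return None
    roots = sorted([(-qb - np.sqrt(disc)) / (2 * qa),
                    (-qb + np.sqrt(disc)) / (2 * qa)])
    return next((t for t in roots if t > 1e-9), None)

# Unit sphere x^2 + y^2 + z^2 - 1 = 0, ray from (0, 0, -2) along +z: hits at t = 1.
A, b, c = np.eye(3), np.zeros(3), -1.0
print(ray_quadric_distance(np.array([0.0, 0.0, -2.0]),
                           np.array([0.0, 0.0, 1.0]), A, b, c))
```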

  16. Data flow modeling techniques

    NASA Technical Reports Server (NTRS)

    Kavi, K. M.

    1984-01-01

    A number of simulation packages have been developed for the purpose of designing, testing and validating computer systems, digital systems and software systems. Complex analytical tools based on Markov and semi-Markov processes have been designed to estimate the reliability and performance of simulated systems. Petri nets have received wide acceptance for modeling complex and highly parallel computers. In this research, data flow models for computer systems are investigated. Data flow models can be used to simulate both software and hardware in a uniform manner. Data flow simulation techniques provide the computer systems designer with a CAD environment which enables highly parallel complex systems to be defined, evaluated at all levels and finally implemented in either hardware or software. Inherent in the data flow concept is the hierarchical handling of complex systems. In this paper we describe how data flow can be used to model computer systems.

  17. Evaluation of probabilistic forecasts with the scoringRules package

    NASA Astrophysics Data System (ADS)

    Jordan, Alexander; Krüger, Fabian; Lerch, Sebastian

    2017-04-01

    Over the last decades probabilistic forecasts in the form of predictive distributions have become popular in many scientific disciplines. With the proliferation of probabilistic models arises the need for decision-theoretically principled tools to evaluate the appropriateness of models and forecasts in a generalized way, in order to better understand sources of prediction errors and to improve the models. Proper scoring rules are functions S(F,y) which evaluate the accuracy of a forecast distribution F, given that an outcome y was observed. In line with decision-theoretic principles, they allow comparison of alternative models, a crucial ability given the variety of theories, data sources and statistical specifications available in many situations. This contribution presents the software package scoringRules for the statistical programming language R, which provides functions to compute popular scoring rules such as the continuous ranked probability score for a variety of distributions F that come up in applied work. For univariate variables, two main classes are parametric distributions like normal, t, or gamma distributions, and distributions that are not known analytically, but are indirectly described through a sample of simulation draws. For example, ensemble weather forecasts take this form. The scoringRules package aims to be a convenient dictionary-like reference for computing scoring rules. We offer state-of-the-art implementations of several known (but not routinely applied) formulas, and implement closed-form expressions that were previously unavailable. Whenever more than one implementation variant exists, we offer statistically principled default choices. Recent developments include the addition of scoring rules to evaluate multivariate forecast distributions. The use of the scoringRules package is illustrated in an example on post-processing ensemble forecasts of temperature.

  18. SAHARA: A package of PC computer programs for estimating both log-hyperbolic grain-size parameters and standard moments

    NASA Astrophysics Data System (ADS)

    Christiansen, Christian; Hartmann, Daniel

    This paper documents a package of menu-driven POLYPASCAL87 computer programs for handling grouped-observation data from both sieving (increment data) and settling tube procedures (cumulative data). The package is designed deliberately for use on IBM-compatible personal computers. Two of the programs solve the numerical problem of determining the estimates of the four (main) parameters of the log-hyperbolic distribution and their derivatives. The package also contains a program for determining the mean, sorting, skewness, and kurtosis according to the standard moments. Moreover, the package contains procedures for smoothing and grouping of settling tube data. A graphic part of the package plots the data in a log-log plot together with the estimated log-hyperbolic curve, accompanied by all estimated parameters. Another graphic option is a plot of the log-hyperbolic shape triangle with the (χ,ζ) position of the sample.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Driscoll, Frederick; Platt, Andrew; Sirnivas, Senu

    This project was performed under the Work for Others Funds-In Agreement FIA-14-1793 between Statoil and the Alliance for Sustainable Energy, manager and operator of the National Renewable Energy Laboratory (NREL). To support the development of a 6-MW spar-mounted offshore wind turbine, Statoil funded NREL to perform tasks in the following three categories: 1. Design and analysis; 2. Wake modeling; 3. Concept resource assessment. This document reports on the design and analysis work package, which built a FAST [Jonkman & Buhl, 2005] computer model of the Hywind 6-MW floating wind turbine system and used this tool to evaluate the performance of a set of design load cases (DLCs). The FAST model was also used in Work Package 2: Wake Modeling.

  20. A methodology for the design of experiments in computational intelligence with multiple regression models.

    PubMed

    Fernandez-Lozano, Carlos; Gestal, Marcos; Munteanu, Cristian R; Dorado, Julian; Pazos, Alejandro

    2016-01-01

    The design of experiments and the validation of the results achieved with them are vital in any research study. This paper focuses on the use of different Machine Learning approaches for regression tasks in the field of Computational Intelligence, and especially on a correct comparison between the results provided by different methods, as those techniques are complex systems that require further study to be fully understood. A methodology commonly accepted in Computational Intelligence is implemented in an R package called RRegrs. This package includes ten simple and complex regression models to carry out predictive modeling using Machine Learning and well-known regression algorithms. The framework for experimental design presented herein is evaluated and validated against RRegrs. Our results are different for three out of five state-of-the-art simple datasets, and it can be stated that the selection of the best model according to our proposal is statistically significant and relevant. It is of relevance to use a statistical approach to indicate whether the differences are statistically significant when using this kind of algorithm. Furthermore, our results with three real complex datasets report different best models than the previously published methodology. Our final goal is to provide a complete methodology for the use of different steps in order to compare the results obtained in Computational Intelligence problems, as well as in other fields, such as bioinformatics, cheminformatics, etc., given that our proposal is open and modifiable.

  1. A methodology for the design of experiments in computational intelligence with multiple regression models

    PubMed Central

    Fernandez-Lozano, Carlos; Gestal, Marcos; Munteanu, Cristian R.; Dorado, Julian; Pazos, Alejandro

    2016-01-01

    The design of experiments and the validation of the results achieved with them are vital in any research study. This paper focuses on the use of different machine learning approaches for regression tasks in the field of Computational Intelligence, and especially on the correct comparison of the results provided by different methods, as these techniques are complex systems that require further study to be fully understood. A methodology commonly accepted in Computational Intelligence is implemented in an R package called RRegrs. This package includes ten simple and complex regression models for predictive modeling using machine learning and well-known regression algorithms. The framework for experimental design presented herein is evaluated and validated against RRegrs. Our results differ for three out of five state-of-the-art simple datasets, and the selection of the best model according to our proposal is statistically significant and relevant. With these kinds of algorithms, it is important to use a statistical approach to establish whether the differences are statistically significant. Furthermore, our results with three real complex datasets report different best models than the previously published methodology. Our final goal is to provide a complete methodology for comparing results obtained in Computational Intelligence problems, as well as in other fields such as bioinformatics and cheminformatics, given that our proposal is open and modifiable. PMID:27920952

  2. Comparison of particle tracking algorithms in commercial CFD packages: sedimentation and diffusion.

    PubMed

    Robinson, Risa J; Snyder, Pam; Oldham, Michael J

    2007-05-01

    Computational fluid dynamic modeling software has enabled microdosimetry patterns of inhaled toxins and toxicants to be predicted and visualized, and is being used in inhalation toxicology and risk assessment. These predicted microdosimetry patterns in airway structures are derived from predicted airflow patterns within these airways and from the particle tracking algorithms used in computational fluid dynamics (CFD) software packages. Although these commercial CFD codes have been tested for accuracy under various conditions, they have not been well tested for respiratory flows in general, nor has the accuracy of their particle tracking algorithms been well studied. In this study, three software packages, Fluent Discrete Phase Model (DPM), Fluent Fine Particle Model (FPM), and ANSYS CFX, were evaluated. Sedimentation and diffusion were each isolated in a straight tube geometry and tested for accuracy. A range of flow rates corresponding to adult low activity (minute ventilation = 10 L/min) and to heavy exertion (minute ventilation = 60 L/min) was tested by varying the dimensionless diffusion and sedimentation parameters over the range found in the Weibel symmetric 23-generation lung morphology. Numerical results for fully developed parabolic and uniform (slip) profiles were compared, respectively, with the analytical sedimentation solutions of Pich (1972) and Yu (1977). The sedimentation equations of Schum and Yeh (1980) were also compared. Numerical results for diffusional deposition were compared to the analytical solutions of Ingham (1975) for parabolic and uniform profiles. Significant differences were found among the various CFD software packages and between numerical and analytical solutions. Therefore, it is prudent to validate CFD predictions against analytical solutions in idealized geometry before tackling the complex geometries of the respiratory tract.

  3. BCM: toolkit for Bayesian analysis of Computational Models using samplers.

    PubMed

    Thijssen, Bram; Dijkstra, Tjeerd M H; Heskes, Tom; Wessels, Lodewyk F A

    2016-10-21

    Computational models in biology are characterized by a large degree of uncertainty. This uncertainty can be analyzed with Bayesian statistics, however, the sampling algorithms that are frequently used for calculating Bayesian statistical estimates are computationally demanding, and each algorithm has unique advantages and disadvantages. It is typically unclear, before starting an analysis, which algorithm will perform well on a given computational model. We present BCM, a toolkit for the Bayesian analysis of Computational Models using samplers. It provides efficient, multithreaded implementations of eleven algorithms for sampling from posterior probability distributions and for calculating marginal likelihoods. BCM includes tools to simplify the process of model specification and scripts for visualizing the results. The flexible architecture allows it to be used on diverse types of biological computational models. In an example inference task using a model of the cell cycle based on ordinary differential equations, BCM is significantly more efficient than existing software packages, allowing more challenging inference problems to be solved. BCM represents an efficient one-stop-shop for computational modelers wishing to use sampler-based Bayesian statistics.
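    For readers unfamiliar with sampler-based Bayesian analysis, the sketch below shows the core idea in Python on a toy one-parameter model; BCM itself is a C++ toolkit offering eleven such algorithms, so this random-walk Metropolis loop is only an illustration, not its implementation.

        # Minimal random-walk Metropolis sampler for a toy posterior.
        import numpy as np

        def log_posterior(theta):
            # toy model: Gaussian likelihood around 2.0 with a flat prior
            return -0.5 * ((theta - 2.0) / 0.5) ** 2

        rng = np.random.default_rng(1)
        theta, samples = 0.0, []
        for _ in range(10000):
            prop = theta + rng.normal(scale=0.3)      # propose a move
            if np.log(rng.uniform()) < log_posterior(prop) - log_posterior(theta):
                theta = prop                           # accept
            samples.append(theta)
        print(np.mean(samples[2000:]))                 # posterior mean after burn-in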

  4. VoxelStats: A MATLAB Package for Multi-Modal Voxel-Wise Brain Image Analysis.

    PubMed

    Mathotaarachchi, Sulantha; Wang, Seqian; Shin, Monica; Pascoal, Tharick A; Benedet, Andrea L; Kang, Min Su; Beaudry, Thomas; Fonov, Vladimir S; Gauthier, Serge; Labbe, Aurélie; Rosa-Neto, Pedro

    2016-01-01

    In healthy individuals, behavioral outcomes are highly associated with variability in regional brain structure or neurochemical phenotypes. Similarly, in the context of neurodegenerative conditions, neuroimaging reveals that cognitive decline is linked to the magnitude of atrophy, neurochemical declines, or concentrations of abnormal protein aggregates across brain regions. However, modeling the effects of multiple regional abnormalities as determinants of cognitive decline at the voxel level remains largely unexplored by multimodal imaging research, given the high computational cost of estimating regression models for every single voxel from various imaging modalities. VoxelStats is a voxel-wise computational framework to overcome these computational limitations and to perform statistical operations on multiple scalar variables and imaging modalities at the voxel level. The VoxelStats package has been developed in MATLAB® and supports imaging formats such as NIfTI-1, ANALYZE, and MINC v2. Prebuilt functions in VoxelStats enable the user to perform voxel-wise general and generalized linear models and mixed effects models with multiple volumetric covariates. Importantly, VoxelStats can recognize scalar values or image volumes as response variables and can accommodate volumetric statistical covariates as well as their interaction effects with other variables. Furthermore, this package includes built-in functionality to perform voxel-wise receiver operating characteristic analysis and paired and unpaired group contrast analysis. Validation of VoxelStats was conducted by comparing the linear regression functionality with existing toolboxes such as glim_image and RMINC. The validation results were identical to existing methods, and the additional functionality was demonstrated by generating feature case assessments (t-statistics, odds ratio, and true positive rate maps). In summary, VoxelStats expands the current methods for multimodal imaging analysis by allowing the estimation of advanced regional association metrics at the voxel level.
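    The computational trick that makes voxel-wise regression tractable is fitting all voxels in one linear-algebra call rather than looping. The Python sketch below (VoxelStats itself is MATLAB; the sizes and data here are synthetic) fits the same design matrix to every voxel at once and forms a t-statistic map for one covariate.

        # Mass-univariate (voxel-wise) OLS with a shared design matrix.
        import numpy as np

        n_subj, n_vox = 50, 1000
        rng = np.random.default_rng(0)
        age = rng.uniform(60, 85, n_subj)
        X = np.column_stack([np.ones(n_subj), age])    # intercept + covariate
        Y = rng.normal(size=(n_subj, n_vox))           # one column per voxel

        beta, *_ = np.linalg.lstsq(X, Y, rcond=None)   # (2, n_vox) coefficients
        resid = Y - X @ beta
        sigma2 = (resid ** 2).sum(axis=0) / (n_subj - X.shape[1])
        se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[1, 1])
        t_map = beta[1] / se                           # voxel-wise t for age
        print(t_map.shape)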

  5. MOLNs: A CLOUD PLATFORM FOR INTERACTIVE, REPRODUCIBLE, AND SCALABLE SPATIAL STOCHASTIC COMPUTATIONAL EXPERIMENTS IN SYSTEMS BIOLOGY USING PyURDME

    PubMed Central

    Drawert, Brian; Trogdon, Michael; Toor, Salman; Petzold, Linda; Hellander, Andreas

    2017-01-01

    Computational experiments using spatial stochastic simulations have led to important new biological insights, but they require specialized tools and a complex software stack, as well as large and scalable compute and data analysis resources due to the large computational cost associated with Monte Carlo computational workflows. The complexity of setting up and managing a large-scale distributed computation environment to support productive and reproducible modeling can be prohibitive for practitioners in systems biology. This results in a barrier to the adoption of spatial stochastic simulation tools, effectively limiting the type of biological questions addressed by quantitative modeling. In this paper, we present PyURDME, a new, user-friendly spatial modeling and simulation package, and MOLNs, a cloud computing appliance for distributed simulation of stochastic reaction-diffusion models. MOLNs is based on IPython and provides an interactive programming platform for development of sharable and reproducible distributed parallel computational experiments. PMID:28190948
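    At the heart of such simulations is the stochastic simulation algorithm; the toy Python loop below (a single well-mixed reaction, not PyURDME's spatial reaction-diffusion machinery) shows the event-by-event sampling that makes these Monte Carlo workflows so computationally expensive at scale.

        # Gillespie SSA for the single reaction A -> B with rate k*A.
        import numpy as np

        rng = np.random.default_rng(0)
        k, A, t, t_end = 0.1, 100, 0.0, 50.0
        while A > 0 and t < t_end:
            a = k * A                         # total propensity
            t += rng.exponential(1.0 / a)     # waiting time to next event
            A -= 1                            # fire the reaction
        print(A, t)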

  6. NMRbox: A Resource for Biomolecular NMR Computation.

    PubMed

    Maciejewski, Mark W; Schuyler, Adam D; Gryk, Michael R; Moraru, Ion I; Romero, Pedro R; Ulrich, Eldon L; Eghbalnia, Hamid R; Livny, Miron; Delaglio, Frank; Hoch, Jeffrey C

    2017-04-25

    Advances in computation have been enabling many recent advances in biomolecular applications of NMR. Due to the wide diversity of applications of NMR, the number and variety of software packages for processing and analyzing NMR data is quite large, with labs relying on dozens, if not hundreds of software packages. Discovery, acquisition, installation, and maintenance of all these packages is a burdensome task. Because the majority of software packages originate in academic labs, persistence of the software is compromised when developers graduate, funding ceases, or investigators turn to other projects. To simplify access to and use of biomolecular NMR software, foster persistence, and enhance reproducibility of computational workflows, we have developed NMRbox, a shared resource for NMR software and computation. NMRbox employs virtualization to provide a comprehensive software environment preconfigured with hundreds of software packages, available as a downloadable virtual machine or as a Platform-as-a-Service supported by a dedicated compute cloud. Ongoing development includes a metadata harvester to regularize, annotate, and preserve workflows and facilitate and enhance data depositions to BioMagResBank, and tools for Bayesian inference to enhance the robustness and extensibility of computational analyses. In addition to facilitating use and preservation of the rich and dynamic software environment for biomolecular NMR, NMRbox fosters the development and deployment of a new class of metasoftware packages. NMRbox is freely available to not-for-profit users. Copyright © 2017 Biophysical Society. All rights reserved.

  7. A "User-Friendly" Program for Vapor-Liquid Equilibrium.

    ERIC Educational Resources Information Center

    Da Silva, Francisco A.; And Others

    1991-01-01

    Described is a computer software package suitable for teaching and research in the area of multicomponent vapor-liquid equilibrium. This program, which has a complete database, can accomplish phase-equilibrium calculations using various models and graph the results. (KR)

  8. Improved Foundry Castings Utilizing CAD/CAM (Computer Aided Design/ Computer Aided Manufacture). Volume 1. Overview

    DTIC Science & Technology

    1988-06-30

    (Figure-list residue omitted: "Line printer representation of roll solidification"; "Test casting model"; "Division of test casting".) …writing new casting analysis and design routines. The new routines would take advantage of advanced criteria for predicting casting soundness and cast properties, and of technical advances in computer hardware and software. UPCAST, a comprehensive software package, has been developed for…

  9. Computer program documentation: modified version of the JA70 aerodynamic heating computer program H800 (MINIVER with a DISSPLA plot package)

    NASA Technical Reports Server (NTRS)

    Olmedo, L.

    1980-01-01

    The changes, modifications, and inclusions which were adapted to the current version of the MINIVER program are discussed. Extensive modifications were made to various subroutines, and a new plot package was added. This plot package is the Johnson Space Center DISSPLA graphics system, currently driven under an 1110 EXEC 8 configuration. User instructions for executing the MINIVER program are provided, and the plot package is described.

  10. Comparative Effects of Two Modes of Computer-Assisted Instructional Package on Solid Geometry Achievement

    ERIC Educational Resources Information Center

    Gambari, Isiaka Amosa; Ezenwa, Victoria Ifeoma; Anyanwu, Romanus Chogozie

    2014-01-01

    The study examined the effects of two modes of computer-assisted instructional package on solid geometry achievement amongst senior secondary school students in Minna, Niger State, Nigeria. The influence of gender on the performance of students exposed to the CAI(AT) and CAI(AN) packages was also examined. This study adopted a pretest-posttest…

  11. Computers and Writing. Learning Package No. 33.

    ERIC Educational Resources Information Center

    Simic, Marge, Comp.; Smith, Carl, Ed.

    Originally developed as part of a project for the Department of Defense Schools (DoDDS) system, this learning package on computers and writing is designed for teachers who wish to upgrade or expand their teaching skills on their own. The package includes an overview of the project; a comprehensive search of the ERIC database; a lecture giving an…

  12. An Interactive Computer Aided Design and Analysis Package.

    DTIC Science & Technology

    1986-03-01

    (OCR-garbled catalog entry; recoverable information: thesis "An Interactive Computer Aided Design and Analysis Package" by Terrence L. Ewald, Naval Postgraduate School, Monterey, California, March 1986.)

  13. A randomized, controlled, single-blind trial of teaching provided by a computer-based multimedia package versus lecture.

    PubMed

    Williams, C; Aubin, S; Harkin, P; Cottrell, D

    2001-09-01

    Computer-based teaching may allow effective teaching of important psychiatric knowledge and skills. To investigate the effectiveness and acceptability of computer-based teaching. A single-blind, randomized, controlled study of 166 undergraduate medical students at the University of Leeds, involving an educational intervention of either a structured lecture or a computer-based teaching package (both of equal duration). There was no difference in knowledge between the groups at baseline or immediately after teaching. Both groups made significant gains in knowledge after teaching. Students who attended the lecture rated their subjective knowledge and skills at a statistically significantly higher level than students who had used the computers. Students who had used the computer package scored higher on an objective measure of assessment skills. Students did not perceive the computer package to be as useful as the traditional lecture format, despite finding it easy to use and recommending its use to other students. Medical students rate themselves subjectively as learning less from computer-based as compared with lecture-based teaching. Objective measures suggest equivalence in knowledge acquisition and significantly greater skills acquisition for computer-based teaching.

  14. Modeling Antimicrobial Activity of Clorox(R) Using an Agar-Diffusion Test: A New Twist On an Old Experiment.

    ERIC Educational Resources Information Center

    Mitchell, James K.; Carter, William E.

    2000-01-01

    Describes using a computer statistical software package called Minitab to model the sensitivity of several microbes to the disinfectant NaOCl (Clorox®) using the Kirby-Bauer technique. Each group of students collects data from one microbe, conducts regression analyses, then chooses the best-fit model based on the highest r-values obtained…

  15. Environmental Modeling Packages for the MSTDCL TDP: Review and Recommendations (Trousses de Modelisation Environnementale Pour le PDT DCLTCM: Revue et Recommendations)

    DTIC Science & Technology

    2009-09-01

    …frequency shallow-water scenarios, and DRDC has ready access to a well-established PE model (PECan). In those spectral areas below 1 kHz, where the PE… (Acronym-list residue omitted: PCs, Personal Computers; PE, Parabolic Equation; PECan, PE model developed by DRDC; SPADES/ICE, Sensor Performance and Acoustic Detection Evaluation.)

  16. Computer Aided Drafting Packages for Secondary Education. Edition 1. Apple II and Macintosh. A MicroSIFT Quarterly Report.

    ERIC Educational Resources Information Center

    Pollard, Jim

    This report reviews software packages for Apple Macintosh and Apple II computers available to secondary schools to teach computer-aided drafting (CAD). Products for the report were gathered through reviews of CAD periodicals, computers in education periodicals, advertisements, and teacher recommendations. The first section lists the primary…

  17. Extension of the MIRS computer package for the modeling of molecular spectra: From effective to full ab initio ro-vibrational Hamiltonians in irreducible tensor form

    NASA Astrophysics Data System (ADS)

    Nikitin, A. V.; Rey, M.; Champion, J. P.; Tyuterev, Vl. G.

    2012-07-01

    The MIRS software for the modeling of ro-vibrational spectra of polyatomic molecules was considerably extended and improved. The original version [Nikitin AV, Champion JP, Tyuterev VlG. The MIRS computer package for modeling the rovibrational spectra of polyatomic molecules. J Quant Spectrosc Radiat Transf 2003;82:239-49] was especially designed for separate or simultaneous treatments of complex band systems of polyatomic molecules. It was set up in the frame of effective polyad models, using algorithms based on advanced group theory algebra to take full account of symmetry properties. It has been successfully used for predictions and data fitting (positions and intensities) of numerous spectra of symmetric and spherical top molecules within the vibration extrapolation scheme. The new version offers more advanced possibilities for spectra calculations and modeling by removing several previous limitations, particularly on the size of polyads and the number of tensors involved. It allows dealing with overlapping polyads and includes more efficient and faster algorithms for the calculation of coefficients related to molecular symmetry properties (6C, 9C and 12C symbols for the C3v, Td, and Oh point groups), as well as for better convergence of least-squares-fit iterations. The new version is not limited to polyad effective models: it also allows direct predictions using full ab initio ro-vibrational normal mode Hamiltonians converted into the irreducible tensor form. Illustrative examples for CH3D, CH4, CH3Cl, CH3F and PH3 are reported, reflecting the present status of the available data. The package is written in C++ for standard PCs operating under Windows. The full package, including on-line documentation and recent data, is freely available at http://www.iao.ru/mirs/mirs.htm or http://xeon.univ-reims.fr/Mirs/ or http://icb.u-bourgogne.fr/OMR/SMA/SHTDS/MIRS.html and as supplementary data from the online version of the article.

  18. Complex optimization for big computational and experimental neutron datasets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bao, Feng; Archibald, Richard

    Here, we present a framework to use high performance computing to determine accurate solutions to the inverse optimization problem of big experimental data against computational models. We demonstrate how image processing, mathematical regularization, and hierarchical modeling can be used to solve complex optimization problems on big data. We also demonstrate how both model and data information can be used to further increase solution accuracy of optimization by providing confidence regions for the processing and regularization algorithms. Finally, we use the framework in conjunction with the software package SIMPHONIES to analyze results from neutron scattering experiments on silicon single crystals, and refine first principles calculations to better describe the experimental data.

  19. Complex optimization for big computational and experimental neutron datasets

    DOE PAGES

    Bao, Feng; Archibald, Richard; ...

    2016-11-07

    Here, we present a framework to use high performance computing to determine accurate solutions to the inverse optimization problem of big experimental data against computational models. We demonstrate how image processing, mathematical regularization, and hierarchical modeling can be used to solve complex optimization problems on big data. We also demonstrate how both model and data information can be used to further increase solution accuracy of optimization by providing confidence regions for the processing and regularization algorithms. Finally, we use the framework in conjunction with the software package SIMPHONIES to analyze results from neutron scattering experiments on silicon single crystals, and refine first principles calculations to better describe the experimental data.

  20. An Integrated Software Package to Enable Predictive Simulation Capabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yousu; Fitzhenry, Erin B.; Jin, Shuangshuang

    The power grid is increasing in complexity due to the deployment of smart grid technologies. Such technologies vastly increase the size and complexity of power grid systems for simulation and modeling. This increasing complexity necessitates not only the use of high-performance-computing (HPC) techniques, but a smooth, well-integrated interplay between HPC applications. This paper presents a new integrated software package that integrates HPC applications and a web-based visualization tool based on a middleware framework. This framework can support the data communication between different applications. Case studies with a large power system demonstrate the predictive capability brought by the integrated software package, as well as the better situational awareness provided by the web-based visualization tool in a live mode. Test results validate the effectiveness and usability of the integrated software package.

  1. Assessment of Dose to the Nursing Infant from Radionuclides in Breast Milk

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leggett, Richard Wayne; Eckerman, Keith F

    A computer software package was developed to predict tissue doses to an infant due to intake of radionuclides in breast milk, based on bioassay measurements and exposure data for the mother. The package is intended mainly to aid in decisions regarding the safety of breast feeding by a mother who has been acutely exposed to a radionuclide during lactation or pregnancy, but it may be applied to previous intakes during the mother's adult life. The package includes biokinetic and dosimetric information needed to address intake of Co-60, Sr-90, Cs-134, Cs-137, Ir-192, Pu-238, Pu-239, Am-241, or Cf-252 by the mother. It has been designed so that the library of biokinetic and dosimetric files can be expanded to address a more comprehensive set of radionuclides without modifying the basic computational module. The methods and models build on the approach used in Publication 95 of the International Commission on Radiological Protection (ICRP 2004), Doses to Infants from Ingestion of Radionuclides in Mothers' Milk. The software package allows input of case-specific information or judgments such as chemical form or particle size of an inhaled aerosol. The package is expected to be more suitable than ICRP Publication 95 for dose assessment for real events or realistic planning scenarios in which measurements of the mother's excretion or body burden are available.

  2. Extending R packages to support 64-bit compiled code: An illustration with spam64 and GIMMS NDVI3g data

    NASA Astrophysics Data System (ADS)

    Gerber, Florian; Mösinger, Kaspar; Furrer, Reinhard

    2017-07-01

    Software packages for spatial data often implement a hybrid approach of interpreted and compiled programming languages. The compiled parts are usually written in C, C++, or Fortran, and are efficient in terms of computational speed and memory usage. Conversely, the interpreted part serves as a convenient user interface and calls the compiled code for computationally demanding operations. The price paid for the user friendliness of the interpreted component is, besides performance, the limited access to low-level and optimized code. An example of such a restriction is the 64-bit vector support of the widely used statistical language R. On the R side, users do not need to change existing code and may not even notice the extension. On the other hand, interfacing 64-bit compiled code efficiently is challenging. Since many R packages for spatial data could benefit from 64-bit vectors, we investigate strategies to efficiently pass 64-bit vectors to compiled languages. More precisely, we show how to simply extend existing R packages using the foreign function interface to seamlessly support 64-bit vectors. This extension is shown with the sparse matrix algebra R package spam. The new capabilities are illustrated with an example of GIMMS NDVI3g data featuring a parametric modeling approach for a non-stationary covariance matrix.
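    A rough Python analogue of the same 32- versus 64-bit boundary (the record itself concerns R and its foreign function interface) is the index dtype of SciPy sparse matrices, sketched below with illustrative sizes: compiled back ends must be prepared to receive either width, which is exactly the interfacing problem spam64 solves on the R side.

        # Index width of sparse matrices: int32 until nnz outgrows 2**31 - 1.
        import numpy as np
        from scipy import sparse

        m = sparse.random(10000, 10000, density=1e-4, format="csr", random_state=0)
        print(m.indices.dtype, m.indptr.dtype)         # int32 while nnz is small
        # Upcasting mimics what a 64-bit-capable compiled back end must accept.
        m64 = m.copy()
        m64.indices = m64.indices.astype(np.int64)
        m64.indptr = m64.indptr.astype(np.int64)
        print(m64.indices.dtype)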

  3. Documentation of a computer program to simulate lake-aquifer interaction using the MODFLOW ground water flow model and the MOC3D solute-transport model

    USGS Publications Warehouse

    Merritt, Michael L.; Konikow, Leonard F.

    2000-01-01

    Heads and flow patterns in surficial aquifers can be strongly influenced by the presence of stationary surface-water bodies (lakes) that are in direct contact, vertically and laterally, with the aquifer. Conversely, lake stages can be significantly affected by the volume of water that seeps through the lakebed that separates the lake from the aquifer. For these reasons, a set of computer subroutines called the Lake Package (LAK3) was developed to represent lake/aquifer interaction in numerical simulations using the U.S. Geological Survey three-dimensional, finite-difference, modular ground-water flow model MODFLOW and the U.S. Geological Survey three-dimensional method-of-characteristics solute-transport model MOC3D. In the Lake Package described in this report, a lake is represented as a volume of space within the model grid which consists of inactive cells extending downward from the upper surface of the grid. Active model grid cells bordering this space, representing the adjacent aquifer, exchange water with the lake at a rate determined by the relative heads and by conductances that are based on grid cell dimensions, hydraulic conductivities of the aquifer material, and user-specified leakance distributions that represent the resistance to flow through the material of the lakebed. Parts of the lake may become "dry" as upper layers of the model are dewatered, with a concomitant reduction in lake surface area, and may subsequently rewet when aquifer heads rise. An empirical approximation has been encoded to simulate the rewetting of a lake that becomes completely dry. The variations of lake stages are determined by independent water budgets computed for each lake in the model grid. This lake budget process makes the package a simulator of the response of lake stage to hydraulic stresses applied to the aquifer. Implementation of a lake water budget requires input of parameters including those representing the rate of lake atmospheric recharge and evaporation, overland runoff, and the rate of any direct withdrawal from, or augmentation of, the lake volume. The lake/aquifer interaction may be simulated in both transient and steady-state flow conditions, and the user may specify that lake stages be computed explicitly, semi-implicitly, or fully implicitly in transient simulations. The lakes, and all sources of water entering the lakes, may have solute concentrations associated with them for use in solute-transport simulations using MOC3D. The Stream Package of MODFLOW-2000 and MOC3D represents stream connections to lakes, either as inflows or outflows. Because lakes with irregular bathymetry can exist as separate pools of water at lower stages that coalesce into a single body of water at higher stages, logic was added to the Lake Package to allow the representation of this process as a user option. If this option is selected, a system of linked pools (sublakes) is identified in each time step and stages are equalized based on current relative sublake surface areas.
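    The stage calculation described above reduces, in the explicit case, to a per-lake water budget. The Python sketch below is a schematic of that bookkeeping with invented fluxes and units, not code from the Lake Package itself.

        # Explicit lake stage update from a simple water budget.
        def update_stage(stage, area, dt, precip, evap, runoff, withdrawal, seepage):
            """precip/evap in m/s over the lake area; other fluxes in m3/s;
            seepage > 0 means flow from aquifer to lake."""
            inflow = precip * area + runoff + seepage
            outflow = evap * area + withdrawal
            return stage + dt * (inflow - outflow) / area

        stage = 10.0                                    # m
        stage = update_stage(stage, area=5.0e5, dt=86400.0,
                             precip=2.0e-8, evap=5.0e-8,
                             runoff=0.05, withdrawal=0.02, seepage=0.01)
        print(stage)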

  4. Advance Directives and Do Not Resuscitate Orders

    MedlinePlus

    ... a form. Call a lawyer. Use a computer software package for legal documents. Advance directives and living ... you write by yourself or with a computer software package should follow your state laws. You may ...

  5. The Computer as an Aid to Reading Instruction. Learning Package No. 27.

    ERIC Educational Resources Information Center

    Simic, Marge, Comp.; Smith, Carl, Ed.

    Originally developed for the Department of Defense Schools (DoDDS) system, this learning package on computer use in reading is designed for teachers who wish to upgrade or expand their teaching skills on their own. The package includes an overview of the project; a comprehensive search of the ERIC database; a lecture giving an overview on the…

  6. Development of a cross-section based stream package for MODFLOW

    NASA Astrophysics Data System (ADS)

    Ou, G.; Chen, X.; Irmak, A.

    2012-12-01

    Accurate simulation of stream-aquifer interactions for wide rivers using the streamflow routing package in MODFLOW is very challenging. To better represent a wide river spanning multiple model grid cells, a Cross-Section based streamflow Routing (CSR) package was developed and incorporated into MODFLOW to simulate the interaction between streams and aquifers. In the CSR package, a stream segment is represented as a four-point polygon instead of the polyline traditionally used in streamflow routing simulation. Each stream segment is bounded by an upstream and a downstream cross-section. A cross-section consists of a number of streambed points possessing coordinates, streambed thicknesses, and streambed hydraulic conductivities to describe the streambed geometry and hydraulic properties. The left and right end points are used to determine the locations of the stream segments. From the cross-section geometry and hydraulic properties, CSR calculates the new stream stage at the cross-section using Brent's method to solve Manning's equation. A module automatically computes the area of the stream segment polygon on each intersected MODFLOW grid cell as the upstream and downstream stages change. The stream stage and streambed hydraulic properties of model grids are interpolated from the streambed points. Streambed leakage is computed as a function of streambed conductance and the difference between the groundwater level and stream stage. The Muskingum-Cunge flow routing scheme with variable parameters is used to simulate the streamflow, with groundwater discharge or recharge contributing as lateral flow. An example illustrates the capabilities of the CSR package. The result shows that CSR can describe the spatial and temporal variation in the interaction between streams and aquifers. The input data remain simple because the program automatically interpolates the cross-section data to each model grid cell.
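    The stage solve described above (Brent's method applied to Manning's equation) can be sketched compactly; the Python example below uses a rectangular section with assumed roughness and slope in place of CSR's surveyed cross-sections.

        # Solve Manning's equation for depth h at a target discharge.
        from scipy.optimize import brentq

        def manning_q(h, b=20.0, n=0.03, s=1e-3):
            a, p = b * h, b + 2.0 * h                  # flow area, wetted perimeter
            return (1.0 / n) * a * (a / p) ** (2.0 / 3.0) * s ** 0.5

        q_target = 35.0                                # m3/s
        h = brentq(lambda h: manning_q(h) - q_target, 1e-6, 10.0)
        print(h)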

  7. A cross-validation package driving Netica with python

    USGS Publications Warehouse

    Fienen, Michael N.; Plant, Nathaniel G.

    2014-01-01

    Bayesian networks (BNs) are powerful tools for probabilistically simulating natural systems and emulating process models. Cross validation is a technique to avoid overfitting resulting from overly complex BNs. Overfitting reduces predictive skill. Cross-validation for BNs is known but rarely implemented, due in part to a lack of software tools designed to work with available BN packages. CVNetica is open-source, written in Python, and extends the Netica software package to perform cross-validation and to read, rebuild, and learn BNs from data. Insights gained from cross-validation, and implications for prediction versus description, are illustrated with a data-driven oceanographic application and a model-emulation application. These examples show that overfitting occurs when BNs become more complex than the supporting data allow, and that overfitting incurs computational costs as well as a reduction in prediction skill. CVNetica evaluates overfitting using several complexity metrics (we used level of discretization) and its impact on performance metrics (we used skill).

  8. BioNetFit: a fitting tool compatible with BioNetGen, NFsim and distributed computing environments

    DOE PAGES

    Thomas, Brandon R.; Chylek, Lily A.; Colvin, Joshua; ...

    2015-11-09

    Rule-based models are analyzed with specialized simulators, such as those provided by the BioNetGen and NFsim open-source software packages. In this paper, we present BioNetFit, a general-purpose fitting tool that is compatible with BioNetGen and NFsim. BioNetFit is designed to take advantage of distributed computing resources. This feature facilitates fitting (i.e., optimization of parameter values for consistency with data) when simulations are computationally expensive.

  9. Interactive Visualization of Complex Seismic Data and Models Using Bokeh

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chai, Chengping; Ammon, Charles J.; Maceira, Monica

    Visualizing multidimensional data and models becomes more challenging as the volume and resolution of seismic data and models increase. But thanks to the development of powerful and accessible computer systems, a modern web browser can be used to visualize complex scientific data and models dynamically. In this paper, we present four examples of seismic model visualization using the open-source Python package Bokeh. One example is a visualization of a surface-wave dispersion data set, another presents a view of three-component seismograms, and two illustrate methods to explore a 3D seismic-velocity model. Unlike other 3D visualization packages, our visualization approach places minimal requirements on users and is relatively easy to develop, provided you have reasonable programming skills. Finally, utilizing familiar web browsing interfaces, the dynamic tools provide an effective and efficient approach to explore large data sets and models.

  10. Interactive Visualization of Complex Seismic Data and Models Using Bokeh

    DOE PAGES

    Chai, Chengping; Ammon, Charles J.; Maceira, Monica; ...

    2018-02-14

    Visualizing multidimensional data and models becomes more challenging as the volume and resolution of seismic data and models increase. But thanks to the development of powerful and accessible computer systems, a modern web browser can be used to visualize complex scientific data and models dynamically. In this paper, we present four examples of seismic model visualization using the open-source Python package Bokeh. One example is a visualization of a surface-wave dispersion data set, another presents a view of three-component seismograms, and two illustrate methods to explore a 3D seismic-velocity model. Unlike other 3D visualization packages, our visualization approach places minimal requirements on users and is relatively easy to develop, provided you have reasonable programming skills. Finally, utilizing familiar web browsing interfaces, the dynamic tools provide an effective and efficient approach to explore large data sets and models.

  11. Integrating Commercial Off-The-Shelf (COTS) graphics and extended memory packages with CLIPS

    NASA Technical Reports Server (NTRS)

    Callegari, Andres C.

    1990-01-01

    This paper addresses the question of how to mix CLIPS with graphics and how to overcome the PC's memory limitations by using the extended memory available in the computer. By adding graphics and extended memory capabilities, CLIPS can be converted into a complete and powerful system development tool on the most economical and popular computer platform. New models of PCs have impressive processing capabilities and graphics resolutions that cannot be ignored and should be used to the fullest of their resources. CLIPS is a powerful expert system development tool, but it cannot be complete without the support of a graphics package needed to create user interfaces and general purpose graphics, or without enough memory to handle large knowledge bases. A well-known limitation on PCs is the use of real memory, which restricts CLIPS to only 640 KB, but that problem can now be solved by developing a version of CLIPS that uses extended memory. The user has access to up to 16 MB of memory on 80286-based computers and to practically all the available memory (4 GB) on computers that use the 80386 processor. So if we give CLIPS a self-configuring graphics package that automatically detects the graphics hardware and pointing device present in the computer, and add the availability of the extended memory that exists in the computer (with no special hardware needed), the user will be able to create more powerful systems at a fraction of the cost and on the most popular, portable, and economical platform available, the PC.

  12. Re-evaluation of model-based light-scattering spectroscopy for tissue spectroscopy

    PubMed Central

    Lau, Condon; Šćepanović, Obrad; Mirkovic, Jelena; McGee, Sasha; Yu, Chung-Chieh; Fulghum, Stephen; Wallace, Michael; Tunnell, James; Bechtel, Kate; Feld, Michael

    2009-01-01

    Model-based light scattering spectroscopy (LSS) seemed a promising technique for in vivo diagnosis of dysplasia in multiple organs. In those studies, the residual spectrum, the difference between the observed and modeled diffuse reflectance spectra, was attributed to single elastic light scattering from epithelial nuclei, and diagnostic information due to nuclear changes was extracted from it. We show that this picture is incorrect. The actual single scattering signal arising from epithelial nuclei is much smaller than the previously computed residual spectrum, and does not have the wavelength dependence characteristic of Mie scattering. Rather, the residual spectrum largely arises from assuming a uniform hemoglobin distribution. In fact, hemoglobin is packaged in blood vessels, which alters the reflectance. When we include vessel packaging, which accounts for an inhomogeneous hemoglobin distribution, in the diffuse reflectance model, the reflectance is modeled more accurately, greatly reducing the amplitude of the residual spectrum. These findings are verified via numerical estimates based on light propagation and Mie theory, tissue phantom experiments, and analysis of published data measured from Barrett's esophagus. In future studies, vessel packaging should be included in the model of diffuse reflectance, and use of model-based LSS should be discontinued. PMID:19405760
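    The correction the authors advocate can be written in one line. The Python sketch below uses the commonly cited packaging factor for blood confined to vessels of radius r (a standard form in the tissue-optics literature; the coefficients here are illustrative, not taken from the paper).

        # Vessel-packaging correction to the blood absorption coefficient.
        import numpy as np

        mu_a_blood = np.array([50.0, 20.0, 5.0])   # 1/mm at three wavelengths (assumed)
        r = 0.01                                    # vessel radius, mm (assumed)
        c_pack = (1.0 - np.exp(-2.0 * mu_a_blood * r)) / (2.0 * mu_a_blood * r)
        mu_a_eff = c_pack * 0.02 * mu_a_blood       # 2% blood volume fraction (assumed)
        print(c_pack, mu_a_eff)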

  13. Comparing Classical Water Models Using Molecular Dynamics to Find Bulk Properties

    ERIC Educational Resources Information Center

    Kinnaman, Laura J.; Roller, Rachel M.; Miller, Carrie S.

    2018-01-01

    A computational chemistry exercise for the undergraduate physical chemistry laboratory is described. In this exercise, students use the molecular dynamics package Amber to generate trajectories of bulk liquid water for 4 different water models (TIP3P, OPC, SPC/E, and TIP4Pew). Students then process the trajectory to calculate structural (radial…

  14. User's manual for generalized ILSGLD-ILS glide slope performance prediction : multipath scattering

    DOT National Transportation Integrated Search

    1976-11-01

    This manual presents the computer program package for the generalized ILSGLD scattering model. The text includes a complete description of the program itself as well as brief descriptions of the ILS system and antenna patterns. The program listin…

  15. Using 3D Geometric Models to Teach Spatial Geometry Concepts.

    ERIC Educational Resources Information Center

    Bertoline, Gary R.

    1991-01-01

    An explanation of 3-D Computer Aided Design (CAD) usage to teach spatial geometry concepts using nontraditional techniques is presented. The software packages CADKEY and AutoCAD are described as well as their usefulness in solving space geometry problems. (KR)

  16. Transfer of computer software technology through workshops: The case of fish bioenergetics modeling

    USGS Publications Warehouse

    Johnson, B.L.

    1992-01-01

    A three-part program is proposed to promote the availability and use of computer software packages to fishery managers and researchers. The approach consists of journal articles that announce new technologies, technical reports that serve as user's guides, and hands-on workshops that provide direct instruction to new users. Workshops, which allow experienced users to directly instruct novices in software operation and application are important, but often neglected. The author's experience with organizing and conducting bioenergetics modeling workshops suggests the optimal workshop would take 2 days, have 10-15 participants, one computer for every two users, and one instructor for every 5-6 people.

  17. Solvation Structure and Thermodynamic Mapping (SSTMap): An Open-Source, Flexible Package for the Analysis of Water in Molecular Dynamics Trajectories.

    PubMed

    Haider, Kamran; Cruz, Anthony; Ramsey, Steven; Gilson, Michael K; Kurtzman, Tom

    2018-01-09

    We have developed SSTMap, a software package for mapping structural and thermodynamic water properties in molecular dynamics trajectories. The package introduces automated analysis and mapping of local measures of frustration and enhancement of water structure. The thermodynamic calculations are based on Inhomogeneous Fluid Solvation Theory (IST), which is implemented using both site-based and grid-based approaches. The package also extends the applicability of solvation analysis calculations to multiple molecular dynamics (MD) simulation programs by using existing cross-platform tools for parsing MD parameter and trajectory files. SSTMap is implemented in Python and contains both command-line tools and a Python module to facilitate flexibility in setting up calculations and for automated generation of large data sets involving analysis of multiple solutes. Output is generated in formats compatible with popular Python data science packages. This tool will be used by the molecular modeling community for computational analysis of water in problems of biophysical interest such as ligand binding and protein function.

  18. Modeling and simulation in biomedicine.

    PubMed Central

    Aarts, J.; Möller, D.; van Wijk van Brievingh, R.

    1991-01-01

    A group of researchers and educators in The Netherlands, Germany and Czechoslovakia have developed and adapted mathematical computer models of phenomena in the field of physiology and biomedicine for use in higher education. The models are graphical and highly interactive, and are all written in TurboPascal or the mathematical simulation language PSI. An educational shell has been developed to launch the models. The shell allows students to interact with the models and teachers to edit the models, to add new models and to monitor the achievements of the students. The models and the shell have been implemented on a MS-DOS personal computer. This paper describes the features of the modeling package and presents the modeling and simulation of the heart muscle as an example. PMID:1807745

  19. A clustering package for nucleotide sequences using Laplacian Eigenmaps and Gaussian Mixture Model.

    PubMed

    Bruneau, Marine; Mottet, Thierry; Moulin, Serge; Kerbiriou, Maël; Chouly, Franz; Chretien, Stéphane; Guyeux, Christophe

    2018-02-01

    In this article, a new Python package for nucleotide sequences clustering is proposed. This package, freely available on-line, implements a Laplacian eigenmap embedding and a Gaussian Mixture Model for DNA clustering. It takes nucleotide sequences as input, and produces the optimal number of clusters along with a relevant visualization. Despite the fact that we did not optimise the computational speed, our method still performs reasonably well in practice. Our focus was mainly on data analytics and accuracy and as a result, our approach outperforms the state of the art, even in the case of divergent sequences. Furthermore, an a priori knowledge on the number of clusters is not required here. For the sake of illustration, this method is applied on a set of 100 DNA sequences taken from the mitochondrially encoded NADH dehydrogenase 3 (ND3) gene, extracted from a collection of Platyhelminthes and Nematoda species. The resulting clusters are tightly consistent with the phylogenetic tree computed using a maximum likelihood approach on gene alignment. They are coherent too with the NCBI taxonomy. Further test results based on synthesized data are then provided, showing that the proposed approach is better able to recover the clusters than the most widely used software, namely Cd-hit-est and BLASTClust. Copyright © 2017 Elsevier Ltd. All rights reserved.
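    The pipeline the abstract describes (embed, then cluster, then pick the cluster count) can be imitated in a few lines of Python; the sketch below uses scikit-learn's spectral embedding and Gaussian mixtures on toy k-mer counts, and is not the authors' package.

        # Laplacian-eigenmap-style embedding + GMM with BIC model selection.
        import numpy as np
        from itertools import product
        from sklearn.manifold import SpectralEmbedding
        from sklearn.mixture import GaussianMixture

        def kmer_counts(seq, k=3):
            # non-overlapping counts are fine for a sketch
            kmers = ["".join(p) for p in product("ACGT", repeat=k)]
            return np.array([seq.count(km) for km in kmers], dtype=float)

        seqs = ["ACGTACGTACGT", "ACGTACGAACGT", "ACGAACGTACGA",   # toy sequences
                "TTTTGGGGCCTT", "TTTTGGGACCTT", "TTTGGGGACCTT"]
        X = np.array([kmer_counts(s) for s in seqs])
        emb = SpectralEmbedding(n_components=2, affinity="rbf").fit_transform(X)
        best = min((GaussianMixture(n, random_state=0).fit(emb) for n in (1, 2, 3)),
                   key=lambda g: g.bic(emb))
        print(best.n_components, best.predict(emb))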

  20. ELF/VLF/LF Radio Propagation and Systems Aspects (La Propagation des Ondes Radio ELF/VLF/LF et les Aspects Systemes)

    DTIC Science & Technology

    1993-05-01

    …limitation of the software package would not allow the program to run over 2359 to 0001 UT. … the Long Wavelength Propagation Capability (LWPC), a software package developed at NOSC (Ferguson et al., 1989) and adapted by us to the Macintosh personal computer. We find that this software works very well. Our investigations are to evaluate and devise geophysical models to be used with LWPC in assessing VLF communications… (Tabular residue omitted: DATE/TIME and FREQUENCY (kHz) column headers with entries 18.1, 19.0, 21.4, 24.0.)

  1. Gas flow calculation method of a ramjet engine

    NASA Astrophysics Data System (ADS)

    Kostyushin, Kirill; Kagenov, Anuar; Eremin, Ivan; Zhiltsov, Konstantin; Shuvarikov, Vladimir

    2017-11-01

    This study presents a calculation methodology for the gas dynamics equations in a ramjet engine. The algorithm is based on Godunov's scheme. To implement it, a data-storage system is proposed that does not depend on mesh topology and allows computational meshes with an arbitrary number of cell faces. An algorithm for building a block-structured grid is given. The calculation algorithm is implemented in the software package "FlashFlow". The package is verified on calculations of simple air intake configurations and scramjet models.
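    The essence of a Godunov-type solver is a conservative per-cell flux balance with an upwind (Riemann) flux at each face. The Python sketch below applies that update to 1D linear advection; the full package solves the gas dynamics equations, so this is only a structural illustration.

        # Godunov/upwind finite-volume update for 1D linear advection.
        import numpy as np

        nx, a = 200, 1.0
        dx, dt = 1.0 / nx, 0.002                        # CFL = a*dt/dx = 0.4
        u = np.where(np.linspace(0, 1, nx) < 0.3, 1.0, 0.0)   # step initial data
        for _ in range(100):
            flux = a * u                                # upwind flux for a > 0
            u[1:] -= dt / dx * (flux[1:] - flux[:-1])   # conservative update
        print(u.max(), u.min())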

  2. Optimization and Control of Burning Plasmas Through High Performance Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pankin, Alexei

    This project has revived the FACETS code, which was developed under SciDAC funding in 2008-2012. The code had been dormant for a number of years after the SciDAC funding stopped. FACETS depends on external packages. The external packages and libraries included in FACETS, such as PETSc, FFTW, HDF5, and NETCDF, have evolved during these years. Some packages in FACETS are also parts of other codes such as PlasmaState, NUBEAM, GACODES, and UEDGE. These packages have also evolved together with their host codes, which include TRANSP, TGYRO, and XPTOR. Finally, there is a set of packages in FACETS that are developed and maintained by Tech-X. These packages include BILDER, SciMake, and FcioWrappers. Many of these packages evolved significantly during the last several years, and FACETS had to be updated to synchronize with the recent progress in the external packages. The PI has introduced changes to the BILDER package to support the updated interfaces to the external modules. During the last year of the project, the FACETS version of the UEDGE code was extracted from FACETS as a standalone package. The PI collaborates with scientists from LLNL on the updated UEDGE model in FACETS. Drs. T. Rognlien, M. Umansky, and A. Dimits from LLNL are contributing to this task.

  3. Simulation of the Two Stages Stretch-Blow Molding Process: Infrared Heating and Blowing Modeling

    NASA Astrophysics Data System (ADS)

    Bordival, M.; Schmidt, F. M.; Le Maoult, Y.; Velay, V.

    2007-05-01

    In the stretch-blow molding (SBM) process, the temperature distribution of the reheated preform drastically affects the blowing kinematics, the bottle thickness distribution, and the orientation induced by stretching. Consequently, the mechanical and optical properties of the final bottle are closely related to heating conditions. In order to predict the 3D temperature distribution of a rotating preform, numerical software using the control-volume method has been developed. Since PET behaves like a semi-transparent medium, the radiative flux absorption was computed using the Beer-Lambert law. In a second step, 2D axisymmetric simulations of the SBM process were developed using the finite element package ABAQUS®. Temperature profiles through the preform wall thickness and along its length were computed and applied as initial conditions. Air pressure inside the preform was not treated as an input variable but was computed automatically using a thermodynamic model. The heat transfer coefficient applied between the mold and the polymer was also measured. Finally, the G'sell law was used to model the PET behavior. For both the heating and blowing stage simulations, good agreement was observed with experimental measurements. This work is part of the European project "APT_PACK" (Advanced knowledge of Polymer deformation for Tomorrow's PACKaging).
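    The absorption step mentioned above is simple to state: under the Beer-Lambert law, radiative flux decays exponentially through the PET wall, and its negative depth gradient is the volumetric heat source. The Python sketch below uses assumed values for the absorption coefficient and incident flux.

        # Beer-Lambert absorption through a preform wall (assumed values).
        import numpy as np

        alpha = 1500.0                     # absorption coefficient, 1/m (assumed)
        L, n = 3.0e-3, 30                  # wall thickness (m) and slices
        z = np.linspace(0.0, L, n)
        i0 = 5.0e4                         # incident flux, W/m2 (assumed)
        intensity = i0 * np.exp(-alpha * z)
        source = -np.gradient(intensity, z)    # volumetric heat source, W/m3
        print(intensity[-1] / i0, source[0])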

  4. User Guide for HUFPrint, A Tabulation and Visualization Utility for the Hydrogeologic-Unit Flow (HUF) Package of MODFLOW

    USGS Publications Warehouse

    Banta, Edward R.; Provost, Alden M.

    2008-01-01

    This report documents HUFPrint, a computer program that extracts and displays information about model structure and hydraulic properties from the input data for a model built using the Hydrogeologic-Unit Flow (HUF) Package of the U.S. Geological Survey's MODFLOW program for modeling ground-water flow. HUFPrint reads the HUF Package and other MODFLOW input files, processes the data by hydrogeologic unit and by model layer, and generates text and graphics files useful for visualizing the data or for further processing. For hydrogeologic units, HUFPrint outputs such hydraulic properties as horizontal hydraulic conductivity along rows, horizontal hydraulic conductivity along columns, horizontal anisotropy, vertical hydraulic conductivity or anisotropy, specific storage, specific yield, and hydraulic-conductivity depth-dependence coefficient. For model layers, HUFPrint outputs such effective hydraulic properties as horizontal hydraulic conductivity along rows, horizontal hydraulic conductivity along columns, horizontal anisotropy, specific storage, primary direction of anisotropy, and vertical conductance. Text files tabulating hydraulic properties by hydrogeologic unit, by model layer, or in a specified vertical section may be generated. Graphics showing two-dimensional cross sections and one-dimensional vertical sections at specified locations also may be generated. HUFPrint reads input files designed for MODFLOW-2000 or MODFLOW-2005.

  5. Human Factors Model

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Jack is an advanced human factors software package that provides a three-dimensional model for predicting how a human will interact with a given system or environment. It can be used for a broad range of computer-aided design applications. Jack was developed by the Computer Graphics Research Laboratory of the University of Pennsylvania with assistance from NASA's Johnson Space Center, Ames Research Center, and the Army. It is the University's first commercial product. Jack is still used for academic purposes at the University of Pennsylvania. Commercial rights were given to Transom Technologies, Inc.

  6. Parallel Monte Carlo transport modeling in the context of a time-dependent, three-dimensional multi-physics code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Procassini, R.J.

    1997-12-31

    The fine-scale, multi-space resolution that is envisioned for accurate simulations of complex weapons systems in three spatial dimensions implies flop-rate and memory-storage requirements that will only be obtained in the near future through the use of parallel computational techniques. Since the Monte Carlo transport models in these simulations usually stress both of these computational resources, they are prime candidates for parallelization. The MONACO Monte Carlo transport package, which is currently under development at LLNL, will utilize two types of parallelism within the context of a multi-physics design code: decomposition of the spatial domain across processors (spatial parallelism) and distribution of particles in a given spatial subdomain across additional processors (particle parallelism). This implementation of the package will utilize explicit data communication between domains (message passing). Such a parallel implementation of a Monte Carlo transport model will result in non-deterministic communication patterns. The communication of particles between subdomains during a Monte Carlo time step may require a significant level of effort to achieve a high parallel efficiency.

  7. The R package "sperrorest" : Parallelized spatial error estimation and variable importance assessment for geospatial machine learning

    NASA Astrophysics Data System (ADS)

    Schratz, Patrick; Herrmann, Tobias; Brenning, Alexander

    2017-04-01

    Computational and statistical prediction methods such as the support vector machine have gained popularity in remote-sensing applications in recent years and are often compared to more traditional approaches like maximum-likelihood classification. However, the accuracy assessment of such predictive models in a spatial context needs to account for the presence of spatial autocorrelation in geospatial data by using spatial cross-validation and bootstrap strategies instead of their now more widely used non-spatial equivalents. The R package sperrorest by A. Brenning [IEEE International Geoscience and Remote Sensing Symposium, 1, 374 (2012)] provides a generic interface for performing (spatial) cross-validation of any statistical or machine-learning technique available in R. Since spatial statistical models as well as flexible machine-learning algorithms can be computationally expensive, parallel computing strategies are required to perform cross-validation efficiently. The most recent major release of sperrorest therefore comes with two new features (aside from improved documentation). The first is the parallelized version of sperrorest(), parsperrorest(). This function features two parallel modes to greatly speed up cross-validation runs. Both parallel modes are platform independent and provide progress information. par.mode = 1 relies on the pbapply package and, depending on the platform, calls parallel::mclapply() or parallel::parApply() in the background. While forking is used on Unix systems, Windows systems use a cluster approach for parallel execution. par.mode = 2 uses the foreach package, which parallelizes across a cluster in a different way than the parallel package does. In summary, the robustness of parsperrorest() is increased by the implementation of two independent parallel modes. A new way of partitioning the data in sperrorest is provided by partition.factor.cv(). This function gives the user the possibility to perform cross-validation at the level of some grouping structure. For example, in remote sensing of agricultural land uses, pixels from the same field contain nearly identical information and should thus be placed jointly in either the test set or the training set. Other spatial resampling strategies are already available and can be extended by the user.
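    The field-level partitioning idea (partition.factor.cv() above) has a direct Python counterpart; the sketch below uses scikit-learn's GroupKFold on synthetic pixels so that all pixels of one field land in the same fold, which is the point of grouped or spatial cross-validation.

        # Grouped cross-validation: whole fields stay in one fold.
        import numpy as np
        from sklearn.model_selection import GroupKFold, cross_val_score
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)
        X = rng.normal(size=(300, 5))                  # pixel features (synthetic)
        fields = rng.integers(0, 30, size=300)         # field ID per pixel
        y = fields % 2                                 # toy land-use label
        scores = cross_val_score(SVC(), X, y, groups=fields,
                                 cv=GroupKFold(n_splits=5))
        print(scores.mean())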

  8. Multi-scale fracture networks within layered shallow water tight carbonates

    NASA Astrophysics Data System (ADS)

    Panza, Elisa; Agosta, Fabrizio; Rustichelli, Andrea; Vinciguerra, Sergio; Zambrano, Miller; Prosser, Giacomo; Tondi, Emanuele

    2015-04-01

    This work aims to decipher the contribution of background deformation and persistent fracture zones to the fluid-flow properties of tight platform carbonates. Taking advantage of 3D exposures in the Murge area of southern Italy, the fracture networks crosscutting, at different scales, the layered Cretaceous limestone of the Altamura Fm. were analyzed. The rock multi-layer is characterized by tens-of-cm-thick, sub-horizontal, laterally continuous carbonate beds. Each bed commonly represents a shallowing-upward peritidal cycle made up of homogeneous micritic limestones grading upward to cm-thick stromatolitic and/or fenestral limestones. The bed interfaces are formed by sharp maximum flooding surfaces. Porosity measurements carried out on 40 limestone samples collected from a single carbonate bed show values ranging between 0.5% and 5.5%. Background deformation includes both stratabound and non-stratabound fractures. The former consist of bed-perpendicular joints and sheared joints, which are confined within a single bed and often displace small, bed-parallel stylolites. Non-stratabound fractures consist of incipient, cm-offset, sub-vertical strike-slip faults, which crosscut the bed interfaces. These elements are often confined within individual bed-packages, which are identified by the presence of pronounced surfaces locally marked by veneers of reddish clayey paleosols. Persistent fracture zones consist of tens-of-m-high, tens-of-cm-offset strike-slip faults that offset the bed-package interfaces and are confined within individual bed-package associations. Laterally discontinuous, cm- to few-m-thick paleokarstic breccia levels separate the different bed-package associations. Persistent fracture zones include asymmetric fractured damage zones and mm-thick veneers of discontinuous fault rocks. The fracture networks that pervasively crosscut the studied limestone multi-layer are investigated by means of scanline and scan-area methodologies. The dimensional, spatial and scaling properties of both stratabound and non-stratabound fractures are documented along single beds and bed-packages, respectively. Persistent fracture zones are studied from individual bed-package associations. By computing the intensity, height distribution, aspect ratio and aperture of each fracture/fault set, DFN (Discrete Fracture Network) models are built for these different scales of observation. DFN models of single beds and bed-packages include stratabound and non-stratabound fractures. In contrast, the DFN model of a bed-package association also includes persistent fracture zones and related damage zones. To check the results of our computations, we also built a smaller-scale, 1 m3 geocellular volume in which fractures are inserted into the model one at a time. None of the DFN models includes the matrix porosity. Porosity and 3D permeability (Kx, Ky, Kz) values are obtained as outputs of the DFN models. The results are consistent with the most prominent set of non-stratabound fractures exerting the major control on the petrophysical properties of both single beds and bed-packages. As expected, the persistent fracture zones strongly affect both the porosity and the permeability of the bed-package association. The results of ongoing laboratory analyses on representative limestone samples will not only provide a quantitative assessment of the physical properties of the matrix in terms of porosity and permeability, but also shed new light on the geometry, density and anisotropy of microfractures and their role in fluid-flow properties.

  9. GPU-powered model analysis with PySB/cupSODA.

    PubMed

    Harris, Leonard A; Nobile, Marco S; Pino, James C; Lubbock, Alexander L R; Besozzi, Daniela; Mauri, Giancarlo; Cazzaniga, Paolo; Lopez, Carlos F

    2017-11-01

    A major barrier to the practical utilization of large, complex models of biochemical systems is the lack of open-source computational tools to evaluate model behaviors over high-dimensional parameter spaces. This is due to the high computational expense of performing the thousands to millions of model simulations required for statistical analysis. To address this need, we have implemented a user-friendly interface between cupSODA, a GPU-powered kinetic simulator, and PySB, a Python-based modeling and simulation framework. For three example models of varying size, we show that for large numbers of simulations PySB/cupSODA achieves order-of-magnitude speedups relative to a CPU-based ordinary differential equation integrator. The PySB/cupSODA interface has been integrated into the PySB modeling framework (version 1.4.0), which can be installed from the Python Package Index (PyPI) using a Python package manager such as pip. cupSODA source code and precompiled binaries (Linux, Mac OS X, Windows) are available at github.com/aresio/cupSODA (requires an Nvidia GPU; developer.nvidia.com/cuda-gpus). Additional information about PySB is available at pysb.org. Contact: paolo.cazzaniga@unibg.it or c.lopez@vanderbilt.edu. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.
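
    As a hedged sketch of this workflow, the snippet below runs a batch of ODE simulations of a bundled PySB example model with the CPU-based ScipyOdeSimulator; on a GPU-equipped installation, PySB's cupSODA-backed simulator class exposes a similar run interface, but consult the PySB documentation for the exact class name and arguments in your version.

    ```python
    # Batch simulation of a PySB example model over sampled parameter sets.
    import numpy as np
    from pysb.examples.robertson import model
    from pysb.simulator import ScipyOdeSimulator

    tspan = np.linspace(0, 100, 101)
    sim = ScipyOdeSimulator(model, tspan=tspan)

    # Sample 1000 parameter sets around the nominal values (log-normal spread).
    nominal = np.array([p.value for p in model.parameters])
    param_sets = nominal * np.exp(0.1 * np.random.randn(1000, len(nominal)))

    results = sim.run(param_values=param_sets)  # one trajectory per parameter set
    print(results.observables[0])               # observables of the first run
    ```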

  10. Computer Applications in Social Studies.

    ERIC Educational Resources Information Center

    White, Charles S.

    1988-01-01

    Examines "Decisions, Decisions-Revolutionary Wars: Choosing Sides," an Apple II software package that emphasizes student decision-making about the nature of revolutions. Targeted at grades 5-12, the product covers a broad range of issues. Concludes that "Decisions, Decisions" models an effective decision-making process and has…

  11. The Impact of New Technology on Accounting Education.

    ERIC Educational Resources Information Center

    Shaoul, Jean

    The introduction of computers in the Department of Accounting and Finance at Manchester University is described. General background outlining the increasing need for microcomputers in the accounting curriculum (including financial modelling tools and decision support systems such as linear programming, statistical packages, and simulation) is…

  12. DEMONSTRATION OF THE ANALYTIC ELEMENT METHOD FOR WELLHEAD PROTECTION

    EPA Science Inventory

    A new computer program has been developed to determine time-of-travel capture zones in relatively simple geohydrological settings. The WhAEM package contains an analytic element model that uses superposition of (many) closed form analytical solutions to generate a ground-water fl...
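
    The WhAEM source is not reproduced in this record; as a minimal sketch of the superposition idea that underlies the analytic element method, the following toy computes heads from uniform background flow plus Thiem-type well terms. All values (transmissivity, well rates, reference radius) are assumed for illustration.

    ```python
    # Superposition of closed-form well solutions on a uniform background flow.
    import numpy as np

    T = 100.0                      # transmissivity [m^2/d] (assumed)
    wells = [(0.0, 0.0, 500.0),    # (x, y, pumping rate Q [m^3/d])
             (200.0, 50.0, 300.0)]

    def head(x, y, h0=50.0, qx=0.05):
        """Head from uniform flow plus summed Thiem-type well terms."""
        phi = -qx * x              # uniform background flow
        for xw, yw, Q in wells:
            r = np.hypot(x - xw, y - yw)
            phi += Q / (2 * np.pi) * np.log(r / 1000.0)  # reference radius 1 km
        return h0 + phi / T

    print(head(100.0, 100.0))
    ```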

  13. A summer blender camp: modeling, rendering, and animation for high school students.

    PubMed

    Bailey, Mike; Law, Cathy

    2014-01-01

    At Camp Blender, high-school students of varying backgrounds learned how to use the Blender software package to create computer graphics content. In a postclass survey, most of them indicated that the camp affected how they thought about their career path.

  14. AstroBlend: An astrophysical visualization package for Blender

    NASA Astrophysics Data System (ADS)

    Naiman, J. P.

    2016-04-01

    The rapid growth in scale and complexity of both computational and observational astrophysics over the past decade necessitates efficient and intuitive methods for examining and visualizing large datasets. Here, I present AstroBlend, an open-source Python library for use within the three-dimensional modeling software Blender. While Blender has long been popular among animators and visual effects artists, in recent years it has also become a tool for visualizing astrophysical datasets. AstroBlend combines the three-dimensional capabilities of Blender with the analysis tools of the widely used astrophysical toolset yt, to afford both computational and observational astrophysicists the ability to simultaneously analyze their data and create informative and appealing visualizations. The introduction of this package includes a description of its features, workflow, and various example visualizations. A website, www.astroblend.com, has been developed which includes tutorials and a gallery of example images and movies, along with links to downloadable data, three-dimensional artistic models, and various other resources.
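
    AstroBlend's own API is not shown in this record; as a sketch of the yt analysis side it builds on, the snippet below extracts an isodensity surface that a tool such as Blender could then render. It assumes yt is installed and the standard IsolatedGalaxy sample dataset has been downloaded locally.

    ```python
    # Extract an isodensity surface from a simulation and export it as OBJ,
    # a format Blender can import.
    import yt

    ds = yt.load("IsolatedGalaxy/galaxy0030/galaxy0030")  # yt sample dataset
    ad = ds.all_data()
    surface = ds.surface(ad, ("gas", "density"), 1e-27)   # isosurface value assumed
    surface.export_obj("galaxy_surface")
    ```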

  15. MODFLOW-2000 Ground-Water Model - User Guide to the Subsidence and Aquifer-System Compaction (SUB) Package

    USGS Publications Warehouse

    Hoffmann, Jörn; Leake, S.A.; Galloway, D.L.; Wilson, Alicia M.

    2003-01-01

    This report documents a computer program, the Subsidence and Aquifer-System Compaction (SUB) Package, to simulate aquifer-system compaction and land subsidence using the U.S. Geological Survey modular finite-difference ground-water flow model, MODFLOW-2000. The SUB Package simulates elastic (recoverable) compaction and expansion, and inelastic (permanent) compaction of compressible fine-grained beds (interbeds) within the aquifers. The deformation of the interbeds is caused by head or pore-pressure changes, and thus by changes in effective stress, within the interbeds. If the stress is less than the preconsolidation stress of the sediments, the deformation is elastic; if the stress is greater than the preconsolidation stress, the deformation is inelastic. The propagation of head changes within the interbeds is defined by a transient, one-dimensional (vertical) diffusion equation. This equation accounts for delayed release of water from storage or uptake of water into storage in the interbeds. Properties that control the timing of the storage changes are vertical hydraulic diffusivity and interbed thickness. The SUB Package supersedes the Interbed Storage Package (IBS1) for MODFLOW, which assumes that water is released from or taken into storage with changes in head in the aquifer within a single model time step and, therefore, can be reasonably used to simulate only thin interbeds. The SUB Package relaxes this assumption and can be used to simulate time-dependent drainage and compaction of thick interbeds and confining units. The time-dependent drainage can be turned off, in which case the SUB Package gives results identical to those from IBS1. Three sample problems illustrate the usefulness of the SUB Package. One sample problem verifies that the package works correctly. This sample problem simulates the drainage of a thick interbed in response to a step change in head in the adjacent aquifer and closely matches the analytical solution. A second sample problem illustrates the effects of seasonally varying discharge and recharge to an aquifer system with a thick interbed. A third sample problem simulates a multilayered regional ground-water basin. Model input files for the third sample problem are included in the appendix.
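
    As a conceptual sketch only (not SUB package code), the report's first sample problem can be mimicked with an explicit finite-difference solution of the one-dimensional vertical diffusion of head in an interbed after a step change in the adjacent aquifer. The diffusivity and thickness below are assumed values.

    ```python
    # 1-D vertical head diffusion in an interbed after a step drawdown.
    import numpy as np

    D = 0.01     # vertical hydraulic diffusivity [m^2/d] (assumed)
    b = 10.0     # interbed thickness [m] (assumed)
    nz, dt = 51, 0.05
    dz = b / (nz - 1)
    assert D * dt / dz**2 < 0.5      # explicit-scheme stability

    h = np.zeros(nz)                 # head change inside the interbed
    h_boundary = -10.0               # step drawdown in the adjacent aquifer

    for step in range(int(365 / dt)):    # one year of drainage
        h[0] = h[-1] = h_boundary        # aquifer head imposed at both faces
        h[1:-1] += D * dt / dz**2 * (h[2:] - 2 * h[1:-1] + h[:-2])

    print("mean head change after 1 year:", h.mean())
    ```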

  16. Computer Managed Instruction: An Application in Teaching Introductory Statistics.

    ERIC Educational Resources Information Center

    Hudson, Walter W.

    1985-01-01

    This paper describes a computer managed instruction package for teaching introductory or advanced statistics. The instructional package is described and anecdotal information concerning its performance and student responses to its use over two semesters are given. (Author/BL)

  17. Computational Approaches to Chemical Hazard Assessment

    PubMed Central

    Luechtefeld, Thomas; Hartung, Thomas

    2018-01-01

    Summary Computational prediction of toxicity has reached new heights as a result of decades of growth in the magnitude and diversity of biological data. Public packages for statistics and machine learning make model creation faster. New theory in machine learning and cheminformatics enables integration of chemical structure, toxicogenomics, simulated and physical data in the prediction of chemical health hazards, and other toxicological information. Our earlier publications have characterized a toxicological dataset of unprecedented scale resulting from the European REACH legislation (Registration Evaluation Authorisation and Restriction of Chemicals). These publications dove into potential use cases for regulatory data and some models for exploiting this data. This article analyzes the options for the identification and categorization of chemicals, moves on to the derivation of descriptive features for chemicals, discusses different kinds of targets modeled in computational toxicology, and ends with a high-level perspective of the algorithms used to create computational toxicology models. PMID:29101769

  18. Parallel Computation of the Regional Ocean Modeling System (ROMS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, P; Song, Y T; Chao, Y

    2005-04-05

    The Regional Ocean Modeling System (ROMS) is a regional ocean general circulation modeling system solving the free surface, hydrostatic, primitive equations over varying topography. It is free software distributed world-wide for studying both complex coastal ocean problems and the basin-to-global scale ocean circulation. The original ROMS code could only be run on shared-memory systems. With the increasing need to simulate larger model domains with finer resolutions and on a variety of computer platforms, there is a need in the ocean-modeling community to have a ROMS code that can be run on any parallel computer ranging from 10 to hundreds of processors. Recently, we have explored parallelization for ROMS using the MPI programming model. In this paper, an efficient parallelization strategy for such a large-scale scientific software package, based on an existing shared-memory computing model, is presented. In addition, scientific applications and data-performance issues on a couple of SGI systems, including Columbia, the world's third-fastest supercomputer, are discussed.
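
    For illustration only (not ROMS code), the core communication pattern of such a distributed-memory ocean model is a halo exchange: each MPI rank updates the interior of its grid tile and swaps one-row ghost boundaries with its neighbours every step, as in this mpi4py sketch.

    ```python
    # Halo exchange for a structured grid split across ranks in one dimension.
    from mpi4py import MPI
    import numpy as np

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    local = np.zeros((12, 10))   # local tile: one ghost row top and bottom
    local[1:-1, :] = rank        # interior initialised per rank

    up = rank - 1 if rank > 0 else MPI.PROC_NULL
    down = rank + 1 if rank < size - 1 else MPI.PROC_NULL

    for step in range(5):
        # Exchange ghost rows with neighbours (no-ops at the domain edges).
        comm.Sendrecv(local[1, :], dest=up, recvbuf=local[-1, :], source=down)
        comm.Sendrecv(local[-2, :], dest=down, recvbuf=local[0, :], source=up)
        # Diffusion-like interior update that reads the ghost rows.
        local[1:-1, :] = 0.25 * (local[:-2, :] + local[2:, :]
                                 + np.roll(local[1:-1, :], 1, axis=1)
                                 + np.roll(local[1:-1, :], -1, axis=1))
    ```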

  19. DREAMTools: a Python package for scoring collaborative challenges

    PubMed Central

    Cokelaer, Thomas; Bansal, Mukesh; Bare, Christopher; Bilal, Erhan; Bot, Brian M.; Chaibub Neto, Elias; Eduati, Federica; de la Fuente, Alberto; Gönen, Mehmet; Hill, Steven M.; Hoff, Bruce; Karr, Jonathan R.; Küffner, Robert; Menden, Michael P.; Meyer, Pablo; Norel, Raquel; Pratap, Abhishek; Prill, Robert J.; Weirauch, Matthew T.; Costello, James C.; Stolovitzky, Gustavo; Saez-Rodriguez, Julio

    2016-01-01

    DREAM challenges are community competitions designed to advance computational methods and address fundamental questions in systems biology and translational medicine. Each challenge asks participants to develop and apply computational methods to either predict unobserved outcomes or to identify unknown model parameters given a set of training data. Computational methods are evaluated using an automated scoring metric, scores are posted to a public leaderboard, and methods are published to facilitate community discussions on how to build improved methods. By engaging participants from a wide range of science and engineering backgrounds, DREAM challenges can comparatively evaluate a wide range of statistical, machine learning, and biophysical methods. Here, we describe DREAMTools, a Python package for evaluating DREAM challenge scoring metrics. DREAMTools provides a command line interface that enables researchers to test new methods on past challenges, as well as a framework for scoring new challenges. As of March 2016, DREAMTools includes more than 80% of completed DREAM challenges. DREAMTools complements the data, metadata, and software tools available at the DREAM website http://dreamchallenges.org and on the Synapse platform at https://www.synapse.org. Availability: DREAMTools is a Python package. Releases and documentation are available at http://pypi.python.org/pypi/dreamtools. The source code is available at http://github.com/dreamtools/dreamtools. PMID:27134723

  20. Computer package for the design and optimization of absorption air conditioning system operated by solar energy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sofrata, H.; Khoshaim, B.; Megahed, M.

    1980-12-01

    In this paper, a computer package for the design and optimization of a simple Li-Br absorption air conditioning system operated by solar energy is developed in order to study its performance. This was a necessary first step before carrying out any computations regarding the dual system (1-3). The computer package provides facilities for examining any parameter which may control the system, namely the generator, evaporator, condenser and absorber temperatures and the pumping factor. The output may be tabulated and also fed to a graph plotter. The flow chart of the programme is explained in a straightforward way, and a typical example is included.
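
    As a worked illustration (not taken from the paper), the reversible-cycle upper bound on the coefficient of performance of a single-effect absorption chiller can be written directly in terms of the four temperatures the package lets the user vary; real Li-Br systems fall well below this bound.

    ```python
    # Ideal (Carnot-like) COP bound for an absorption chiller.
    def cop_ideal(t_gen, t_abs, t_evap, t_cond):
        """All temperatures in kelvin; absorber and condenser reject heat."""
        return (1 - t_abs / t_gen) * t_evap / (t_cond - t_evap)

    # Example: 90 C generator, 35 C absorber and condenser, 5 C evaporator.
    print(cop_ideal(363.15, 308.15, 278.15, 308.15))  # ~1.4
    ```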

  1. DarkBit: a GAMBIT module for computing dark matter observables and likelihoods

    NASA Astrophysics Data System (ADS)

    Bringmann, Torsten; Conrad, Jan; Cornell, Jonathan M.; Dal, Lars A.; Edsjö, Joakim; Farmer, Ben; Kahlhoefer, Felix; Kvellestad, Anders; Putze, Antje; Savage, Christopher; Scott, Pat; Weniger, Christoph; White, Martin; Wild, Sebastian

    2017-12-01

    We introduce DarkBit, an advanced software code for computing dark matter constraints on various extensions to the Standard Model of particle physics, comprising both new native code and interfaces to external packages. This release includes a dedicated signal yield calculator for gamma-ray observations, which significantly extends current tools by implementing a cascade-decay Monte Carlo, as well as a dedicated likelihood calculator for current and future experiments (gamLike). This provides a general solution for studying complex particle physics models that predict dark matter annihilation to a multitude of final states. We also supply a direct detection package that models a large range of direct detection experiments (DDCalc), and that provides the corresponding likelihoods for arbitrary combinations of spin-independent and spin-dependent scattering processes. Finally, we provide custom relic density routines along with interfaces to DarkSUSY, micrOMEGAs, and the neutrino telescope likelihood package nulike. DarkBit is written in the framework of the Global And Modular Beyond the Standard Model Inference Tool (GAMBIT), providing seamless integration into a comprehensive statistical fitting framework that allows users to explore new models with both particle and astrophysics constraints, and a consistent treatment of systematic uncertainties. In this paper we describe its main functionality, provide a guide to getting started quickly, and show illustrative examples for results obtained with DarkBit (both as a stand-alone tool and as a GAMBIT module). This includes a quantitative comparison between two of the main dark matter codes (DarkSUSY and micrOMEGAs), and application of DarkBit's advanced direct and indirect detection routines to a simple effective dark matter model.

  2. Design Tool

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Developed under a Small Business Innovation Research (SBIR) contract, RAMPANT is a CFD software package for computing flow around complex shapes. The package is flexible, fast and easy to use. It has found a great number of applications, including computation of air flow around a Nordic ski jumper, prediction of flow over an airfoil and computation of the external aerodynamics of motor vehicles.

  3. Potential Flow Theory and Operation Guide for the Panel Code PMARC. Version 14

    NASA Technical Reports Server (NTRS)

    Ashby, Dale L.

    1999-01-01

    The theoretical basis for PMARC, a low-order panel code for modeling complex three-dimensional bodies in potential flow, is outlined. PMARC can be run on a wide variety of computer platforms, including desktop machines, workstations, and supercomputers. Execution times for PMARC vary tremendously depending on the computer resources used, but typically range from several minutes for simple or moderately complex cases to several hours for very large, complex cases. Several of the advanced features currently included in the code, such as internal flow modeling, boundary layer analysis, and time-dependent flow analysis, including problems involving relative motion, are discussed in some detail. The code is written in Fortran77, using adjustable-size arrays so that it can be easily redimensioned to match problem requirements and computer hardware constraints. An overview of the program input is presented. A detailed description of the input parameters is provided in the appendices. PMARC results for several test cases are presented along with analytic or experimental data, where available. The input files for these test cases are given in the appendices. PMARC currently supports plotfile output formats for several commercially available graphics packages: Plot3D, Tecplot, and PmarcViewer.

  4. Computer Simulation Performed for Columbia Project Cooling System

    NASA Technical Reports Server (NTRS)

    Ahmad, Jasim

    2005-01-01

    This demo shows a high-fidelity simulation of the air flow in the main computer room housing the Columbia system (10,240 Intel Itanium 2 processors). The simulation assessed the performance of the cooling system, identified deficiencies, and recommended modifications to eliminate them. It used two in-house software packages on NAS supercomputers: Chimera Grid Tools to generate a geometric model of the computer room, and the OVERFLOW-2 code for fluid and thermal simulation. This state-of-the-art technology can be easily extended to provide a general capability for air flow analyses in any modern computer room.

  5. Flowing Hot or Cold: User-Friendly Computational Models of Terrestrial and Planetary Lava Channels and Lakes

    NASA Astrophysics Data System (ADS)

    Sakimoto, S. E. H.

    2016-12-01

    Planetary volcanism has redefined what is considered volcanism. "Magma" may now be anything from the molten rock familiar at terrestrial volcanoes to cryovolcanic ammonia-water mixes erupted on an outer solar system moon. However, even with unfamiliar compositions and source mechanisms, we find familiar landforms such as volcanic channels, lakes, flows, and domes, and thus a multitude of possibilities for modeling. As on Earth, these landforms lend themselves to analysis for estimating storage, eruption and/or flow rates. This has potential pitfalls, as extension of the simplified analytic models we often use for terrestrial features into unfamiliar parameter space might yield misleading results. Our most commonly used tools for estimating flow and cooling have tended to lag significantly behind the state of the art; the easiest methods to use are neither realistic nor accurate, while the more realistic and accurate computational methods are not simple to use. Since the latter computational tools tend to be expensive and to require a significant learning curve, there is a need for a user-friendly approach that still takes advantage of their accuracy. One method is to use the computational package to generate a server-based tool that lets less computationally inclined users obtain accurate results over their range of input parameters for a given problem geometry. A second method is to use the computational package to generate a polynomial empirical solution for each class of flow geometry that can be solved fairly easily by anyone with a spreadsheet (the sketch below illustrates the idea). In this study, we demonstrate both approaches for several channel flow and lava lake geometries with terrestrial and extraterrestrial examples and compare their results. Specifically, we model cooling rectangular channel flow of a yield-strength material, with applications to Mauna Loa, Kilauea, Venus, and Mars. This approach also shows promise for model applications to lava lakes, magma flow through cracks, and volcanic dome formation.
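
    As a sketch of the second approach, under assumed data: a polynomial surrogate is fitted to a handful of expensive model runs, after which evaluation reduces to a spreadsheet-style formula. The numbers below are invented for illustration.

    ```python
    # Fit a polynomial surrogate to a few (pretend) full-model runs.
    import numpy as np

    effusion_rate = np.array([1.0, 2.0, 5.0, 10.0, 20.0])  # m^3/s (assumed)
    flow_depth = np.array([0.8, 1.1, 1.7, 2.3, 3.1])       # m (assumed)

    coeffs = np.polyfit(np.log(effusion_rate), flow_depth, deg=2)
    surrogate = np.poly1d(coeffs)

    # Cheap evaluation anywhere inside the fitted range:
    print(surrogate(np.log(7.5)))
    ```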

  6. A Freeware Path to Neutron Computed Tomography

    NASA Astrophysics Data System (ADS)

    Schillinger, Burkhard; Craft, Aaron E.

    Neutron computed tomography has become a routine method at many neutron sources due to the availability of digital detection systems, powerful computers and advanced software. The commercial packages Octopus by Inside Matters and VGStudio by Volume Graphics have been established as a quasi-standard for high-end computed tomography. However, these packages require a stiff investment and are available to users only on-site at the imaging facility. There is a demand from users to have image processing software at home for further data processing; in addition, neutron computed tomography is now being introduced even at smaller and older reactors, whose operators need to show a first working tomography setup before they can obtain a budget to build an advanced tomography system. Several packages are available on the web for free; however, these were developed for X-rays or synchrotron radiation and are not immediately usable for neutron computed tomography. Three reconstruction packages and three 3D viewers have been identified and used successfully, even for gigabyte-sized datasets. This paper is not a scientific publication in the classic sense, but is intended as a review to provide searchable help to make the described packages usable for the tomography community. It presents the necessary additional preprocessing in ImageJ, some workarounds for bugs in the software, and undocumented or badly documented parameters that need to be adapted for neutron computed tomography. The result is a slightly complicated but surprisingly high-quality path to neutron computed tomography images in 3D, though not a replacement for the even more powerful commercial software mentioned above.

  7. Computer assisted learning (CAL) of oral manifestations of HIV disease.

    PubMed

    Porter, S R; Telford, A; Chandler, K; Furber, S; Williams, J; Price, S; Scully, C; Triantos, D; Bain, L

    1996-09-07

    General dental practitioners (GDPs) in the UK may wish for additional education on relevant aspects of human immunodeficiency virus (HIV) disease. The aim of the present study was to develop and assess a computer assisted learning package on the oral manifestations of HIV disease of relevance to GDPs. A package was developed using a commercially available software development tool and assessed by a group of 75 GDPs interested in education and computers. Fifty-four (72%) of the GDPs completed a self-administered questionnaire on their opinions of the package. The majority reported that the package was easy to load and run, that it provided clear instructions and displays, and that it was a more effective educational tool than videotapes, audiotapes, professional journals and textbooks, and of similar benefit to post-graduate courses. The GDPs often commented favourably on the effectiveness of the clinical images and the use of questions and answers, although some criticized these and other aspects of the package. As a consequence of this investigation the package has been modified and distributed to GDPs in England and Wales.

  8. The Combined Use of Video Modeling and Social Stories in Teaching Social Skills for Individuals with Intellectual Disability

    ERIC Educational Resources Information Center

    Gül, Seray Olçay

    2016-01-01

    There are many studies in the literature in which individuals with intellectual disabilities exhibit social skills deficits and which show the need for teaching these skills systematically. This study aims to investigate the effects of an intervention package of consisting computer-presented video modeling and Social Stories on individuals with…

  9. Development of seismic tomography software for hybrid supercomputers

    NASA Astrophysics Data System (ADS)

    Nikitin, Alexandr; Serdyukov, Alexandr; Duchkov, Anton

    2015-04-01

    Seismic tomography is a technique for computing the velocity model of a geologic structure from first-arrival travel times of seismic waves. The technique is used in processing regional and global seismic data, in seismic exploration for prospecting mineral and hydrocarbon deposits, and in seismic engineering for monitoring the condition of engineering structures and the surrounding host medium. As a consequence of the development of seismic monitoring systems and the increasing volume of seismic data, there is a growing need for new, more effective computational algorithms for seismic tomography applications, with improved performance, accuracy and resolution. To achieve this goal, it is necessary to use modern high performance computing systems, such as supercomputers with hybrid architecture that use not only CPUs but also accelerators and co-processors. The goal of this research is the development of parallel seismic tomography algorithms and a software package for such systems, to be used in processing large volumes of seismic data (hundreds of gigabytes and more). These algorithms and the software package will be optimized for the computing devices most common in modern hybrid supercomputers, such as Intel Xeon CPUs, NVIDIA Tesla accelerators and Intel Xeon Phi co-processors. In this work, the following general scheme of seismic tomography is utilized. Using an eikonal equation solver, arrival times of seismic waves are computed based on an assumed velocity model of the geologic structure being analyzed. To solve the linearized inverse problem, a tomographic matrix is computed that connects model adjustments with travel-time residuals, and the resulting system of linear equations is regularized and solved to adjust the model (see the sketch below). The effectiveness of parallel implementations of existing algorithms on the target architectures is considered. During the first stage of this work, algorithms were developed for execution on supercomputers using multicore CPUs only, with preliminary performance tests showing good parallel efficiency on large numerical grids. Porting of the algorithms to hybrid supercomputers is currently ongoing.
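
    Schematically (not the authors' code), the inversion step above solves a damped linear system; in Python this could look like the following, with a random sparse stand-in for the ray-path matrix.

    ```python
    # Regularized solution of the linearized tomographic system G dm = dt.
    import numpy as np
    from scipy.sparse import random as sprandom
    from scipy.sparse.linalg import lsqr

    rng = np.random.default_rng(1)
    n_rays, n_cells = 2000, 500
    G = sprandom(n_rays, n_cells, density=0.02, random_state=1)  # ray paths
    dt = rng.normal(size=n_rays)                                 # residuals

    # damp adds Tikhonov regularization:
    # minimizes |G dm - dt|^2 + damp^2 |dm|^2.
    dm = lsqr(G, dt, damp=0.1)[0]   # slowness-model update
    print(dm.shape)
    ```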

  10. 'spup' - An R Package for Analysis of Spatial Uncertainty Propagation and Application to Trace Gas Emission Simulations

    NASA Astrophysics Data System (ADS)

    Sawicka, K.; Breuer, L.; Houska, T.; Santabarbara Ruiz, I.; Heuvelink, G. B. M.

    2016-12-01

    Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Advances in uncertainty propagation analysis and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition for universal applicability, including case studies with spatial models and spatial model inputs. Given the growing popularity and applicability of the open source R programming language, we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining uncertainty propagation from input data and model parameters, via the environmental model, onto model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo techniques, as well as several uncertainty visualization functions. Here we demonstrate that the 'spup' package is an effective and easy-to-use tool that can be applied even in a very complex study case, and that it can be used in multi-disciplinary research and model-based decision support. As an example, we use the ecological LandscapeDNDC model to analyse the propagation of uncertainties associated with the spatial variability of the model driving forces, such as rainfall, nitrogen deposition and fertilizer inputs. The uncertainty propagation is analysed for the prediction of N2O and CO2 emissions for a German low-mountainous, agriculturally developed catchment. The study tests the effect of spatial correlations on spatially aggregated model outputs, and could serve as advice for developing best management practices and model improvement strategies.
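
    spup itself is an R package; purely to illustrate the Monte Carlo propagation idea in runnable form, the Python toy below samples a spatially correlated uncertain input, pushes each draw through a stand-in emission model, and summarises the output uncertainty. All numbers are invented.

    ```python
    # Monte Carlo uncertainty propagation through a toy emission model.
    import numpy as np

    rng = np.random.default_rng(42)
    n_cells, n_mc = 100, 1000

    mean_rain = np.full(n_cells, 800.0)   # mm/yr (assumed)
    # Exponentially decaying spatial correlation between cells.
    dist = np.abs(np.subtract.outer(np.arange(n_cells), np.arange(n_cells)))
    cov = 50.0**2 * np.exp(-dist / 10.0)

    def emission_model(rain):
        return 0.002 * rain ** 1.3        # stand-in for the real model

    draws = rng.multivariate_normal(mean_rain, cov, size=n_mc)
    outputs = np.array([emission_model(d).sum() for d in draws])  # aggregated
    print(outputs.mean(), outputs.std())  # prediction and its uncertainty
    ```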

  11. Accounting utility for determining individual usage of production level software systems

    NASA Technical Reports Server (NTRS)

    Garber, S. C.

    1984-01-01

    An accounting package was developed which determines the computer resources utilized by a user during the execution of a particular program and updates a file containing accumulated resource totals. The accounting package is divided into two separate programs. The first program determines the total amount of computer resources utilized by a user during the execution of a particular program. The second program uses these totals to update a file containing accumulated totals of computer resources utilized by a user for a particular program. This package is useful to those persons who have several other users continually accessing and running programs from their accounts. The package provides the ability to determine which users are accessing and running specified programs along with their total level of usage.

  12. SPOTting Model Parameters Using a Ready-Made Python Package

    NASA Astrophysics Data System (ADS)

    Houska, Tobias; Kraft, Philipp; Chamorro-Chavez, Alejandro; Breuer, Lutz

    2017-04-01

    The choice of a specific parameter estimation method is often more dependent on its availability than on its performance. We developed SPOTPY (Statistical Parameter Optimization Tool), an open source Python package containing a comprehensive set of methods typically used to calibrate, analyze and optimize parameters for a wide range of ecological models. SPOTPY currently contains eight widely used algorithms and 11 objective functions, and can sample from eight parameter distributions. SPOTPY has a model-independent structure and can be run in parallel, from the workstation to large computation clusters, using the Message Passing Interface (MPI). We tested SPOTPY in five different case studies: to parameterize the Rosenbrock, Griewank and Ackley functions; on a one-dimensional physically based soil moisture routine, where we searched for parameters of the van Genuchten-Mualem function; and in a calibration of a biogeochemistry model with different objective functions. The case studies reveal that the implemented SPOTPY methods can be used for any model with just a minimal amount of code for maximal power of parameter optimization. They further show the benefit of having at hand one package that includes a number of well-performing parameter search methods, since not every case study can be solved sufficiently with every algorithm or every objective function.
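
    A hedged sketch of SPOTPY's documented setup-class pattern, here wrapping the Rosenbrock function from the paper's first case study; verify method names against the SPOTPY version you install.

    ```python
    # SPOTPY setup class: parameters, simulation, evaluation, objective.
    import spotpy

    class spot_setup(object):
        def __init__(self):
            self.params = [spotpy.parameter.Uniform('x', -5, 5),
                           spotpy.parameter.Uniform('y', -5, 5)]

        def parameters(self):
            return spotpy.parameter.generate(self.params)

        def simulation(self, vector):
            x, y = vector
            return [(1.0 - x) ** 2 + 100.0 * (y - x ** 2) ** 2]  # Rosenbrock

        def evaluation(self):
            return [0.0]  # the known global minimum

        def objectivefunction(self, simulation, evaluation):
            return -abs(simulation[0] - evaluation[0])  # maximise closeness

    sampler = spotpy.algorithms.mc(spot_setup(), dbname='rosen_mc',
                                   dbformat='ram')
    sampler.sample(5000)
    results = sampler.getdata()
    ```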

  13. SPOTting Model Parameters Using a Ready-Made Python Package.

    PubMed

    Houska, Tobias; Kraft, Philipp; Chamorro-Chavez, Alejandro; Breuer, Lutz

    2015-01-01

    The choice of a specific parameter estimation method is often more dependent on its availability than on its performance. We developed SPOTPY (Statistical Parameter Optimization Tool), an open source Python package containing a comprehensive set of methods typically used to calibrate, analyze and optimize parameters for a wide range of ecological models. SPOTPY currently contains eight widely used algorithms and 11 objective functions, and can sample from eight parameter distributions. SPOTPY has a model-independent structure and can be run in parallel, from the workstation to large computation clusters, using the Message Passing Interface (MPI). We tested SPOTPY in five different case studies: to parameterize the Rosenbrock, Griewank and Ackley functions; on a one-dimensional physically based soil moisture routine, where we searched for parameters of the van Genuchten-Mualem function; and in a calibration of a biogeochemistry model with different objective functions. The case studies reveal that the implemented SPOTPY methods can be used for any model with just a minimal amount of code for maximal power of parameter optimization. They further show the benefit of having at hand one package that includes a number of well-performing parameter search methods, since not every case study can be solved sufficiently with every algorithm or every objective function.

  14. SPOTting Model Parameters Using a Ready-Made Python Package

    PubMed Central

    Houska, Tobias; Kraft, Philipp; Chamorro-Chavez, Alejandro; Breuer, Lutz

    2015-01-01

    The choice of a specific parameter estimation method is often more dependent on its availability than on its performance. We developed SPOTPY (Statistical Parameter Optimization Tool), an open source Python package containing a comprehensive set of methods typically used to calibrate, analyze and optimize parameters for a wide range of ecological models. SPOTPY currently contains eight widely used algorithms and 11 objective functions, and can sample from eight parameter distributions. SPOTPY has a model-independent structure and can be run in parallel, from the workstation to large computation clusters, using the Message Passing Interface (MPI). We tested SPOTPY in five different case studies: to parameterize the Rosenbrock, Griewank and Ackley functions; on a one-dimensional physically based soil moisture routine, where we searched for parameters of the van Genuchten-Mualem function; and in a calibration of a biogeochemistry model with different objective functions. The case studies reveal that the implemented SPOTPY methods can be used for any model with just a minimal amount of code for maximal power of parameter optimization. They further show the benefit of having at hand one package that includes a number of well-performing parameter search methods, since not every case study can be solved sufficiently with every algorithm or every objective function. PMID:26680783

  15. DEMONSTRATION OF THE ANALYTIC ELEMENT METHOD FOR WELLHEAD PROTECTION - PROJECT SUMMARY

    EPA Science Inventory

    A new computer program has been developed to determine time-of-travel capture zones in relatively simple geohydrological settings. The WhAEM package contains an analytic element model that uses superposition of (many) closed form analytical solutions to generate a ground-water fl...

  16. SIMWEST: A simulation model for wind and photovoltaic energy storage systems (CDC user's manual), volume 1

    NASA Technical Reports Server (NTRS)

    Warren, A. W.; Esinger, A. W.

    1979-01-01

    Procedures are given for using the SIMWEST program on CDC 6000 series computers. This expanded software package includes wind and/or photovoltaic systems utilizing any combination of five types of storage (pumped hydro, battery, thermal, flywheel, and pneumatic).

  17. Hypertext-based computer vision teaching packages

    NASA Astrophysics Data System (ADS)

    Marshall, A. David

    1994-10-01

    The World Wide Web initiative has provided a means of delivering hypertext and multimedia based information across the whole Internet. Many applications have been developed on such http servers. At Cardiff we have developed an http hypertext based multimedia server, the Cardiff Information Server, using the widely available Mosaic system. The server provides a variety of information, ranging from teaching modules, on-line documentation and timetables for departmental activities to more light-hearted hobby interests. One important and novel development of the server has been the provision of courseware facilities, ranging from on-line lecture notes, exercises and their solutions to more interactive teaching packages. A variety of disciplines have benefited, notably Computer Vision and Image Processing, but also C programming, X Windows, Computer Graphics and Parallel Computing. This paper addresses the issues of the implementation of the Computer Vision and Image Processing packages and the advantages gained from using a hypertext based system, and relates practical experiences of using the packages in a class environment. The paper addresses how best to provide information in such a hypertext based system and how interactive image processing packages can be developed and integrated into courseware. The suite of tools developed facilitates a flexible and powerful courseware package that has proved popular in the classroom and over the Internet. The paper also details many possible future developments. One of the key points raised is that Mosaic's hypertext language (html) is extremely powerful and yet relatively straightforward to use. It is also possible to link in Unix calls so that programs and shells can be executed, which provides a powerful suite of utilities that can be exploited to develop many packages.

  18. Chromatin Computation

    PubMed Central

    Bryant, Barbara

    2012-01-01

    In living cells, DNA is packaged along with protein and RNA into chromatin. Chemical modifications to nucleotides and histone proteins are added, removed and recognized by multi-functional molecular complexes. Here I define a new computational model, in which chromatin modifications are information units that can be written onto a one-dimensional string of nucleosomes, analogous to the symbols written onto cells of a Turing machine tape, and chromatin-modifying complexes are modeled as read-write rules that operate on a finite set of adjacent nucleosomes. I illustrate the use of this “chromatin computer” to solve an instance of the Hamiltonian path problem. I prove that chromatin computers are computationally universal – and therefore more powerful than the logic circuits often used to model transcription factor control of gene expression. Features of biological chromatin provide a rich instruction set for efficient computation of nontrivial algorithms in biological time scales. Modeling chromatin as a computer shifts how we think about chromatin function, suggests new approaches to medical intervention, and lays the groundwork for the engineering of a new class of biological computing machines. PMID:22567109
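
    As a toy illustration of the idea (my construction, not the paper's formal model): nucleosome marks serve as tape symbols, and a chromatin-modifying complex is a local read-write rule acting on adjacent nucleosomes.

    ```python
    # A one-dimensional string of nucleosome marks and a spreading rule.
    marks = list("UUUMUUUU")   # U = unmodified, M = methylated

    def spread_rule(tape):
        """If a nucleosome is methylated, methylate its right neighbour."""
        out = tape[:]
        for i, m in enumerate(tape[:-1]):
            if m == "M":
                out[i + 1] = "M"
        return out

    for step in range(4):
        marks = spread_rule(marks)
        print("".join(marks))
    # The mark propagates along the fibre: a simple "write" computation.
    ```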

  19. Computational statistics using the Bayesian Inference Engine

    NASA Astrophysics Data System (ADS)

    Weinberg, Martin D.

    2013-09-01

    This paper introduces the Bayesian Inference Engine (BIE), a general parallel, optimized software package for parameter inference and model selection. This package is motivated by the analysis needs of modern astronomical surveys and the need to organize and reuse expensive derived data. The BIE is the first platform for computational statistics designed explicitly to enable Bayesian update and model comparison for astronomical problems. Bayesian update is based on the representation of high-dimensional posterior distributions using metric-ball-tree based kernel density estimation. Among its algorithmic offerings, the BIE emphasizes hybrid tempered Markov chain Monte Carlo schemes that robustly sample multimodal posterior distributions in high-dimensional parameter spaces. Moreover, the BIE implements a full persistence or serialization system that stores the full byte-level image of the running inference and previously characterized posterior distributions for later use. Two new algorithms to compute the marginal likelihood from the posterior distribution, developed for and implemented in the BIE, enable model comparison for complex models and data sets. Finally, the BIE was designed to be a collaborative platform for applying Bayesian methodology to astronomy. It includes an extensible, object-oriented framework that implements every aspect of Bayesian inference. By providing a variety of statistical algorithms for all phases of the inference problem, a scientist may explore a variety of approaches with a single model and data implementation. Additional technical details and download instructions are available from http://www.astro.umass.edu/bie. The BIE is distributed under the GNU General Public License.

  20. Conjugate Compressible Fluid Flow and Heat Transfer in Ducts

    NASA Technical Reports Server (NTRS)

    Cross, M. F.

    2011-01-01

    A computational approach to modeling transient, compressible fluid flow with heat transfer in long, narrow ducts is presented. The primary application of the model is for analyzing fluid flow and heat transfer in solid propellant rocket motor nozzle joints during motor start-up, but the approach is relevant to a wide range of analyses involving rapid pressurization and filling of ducts. Fluid flow is modeled through solution of the spatially one-dimensional, transient Euler equations. Source terms are included in the governing equations to account for the effects of wall friction and heat transfer. The equation solver is fully-implicit, thus providing greater flexibility than an explicit solver. This approach allows for resolution of pressure wave effects on the flow as well as for fast calculation of the steady-state solution when a quasi-steady approach is sufficient. Solution of the one-dimensional Euler equations with source terms significantly reduces computational run times compared to general purpose computational fluid dynamics packages solving the Navier-Stokes equations with resolved boundary layers. In addition, conjugate heat transfer is more readily implemented using the approach described in this paper than with most general purpose computational fluid dynamics packages. The compressible flow code has been integrated with a transient heat transfer solver to analyze heat transfer between the fluid and surrounding structure. Conjugate fluid flow and heat transfer solutions are presented. The author is unaware of any previous work available in the open literature which uses the same approach described in this paper.
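
    The abstract does not spell out the source terms; a plausible reconstruction of the governing system, with an assumed Darcy-type friction term and wall-heat-flux term, is:

    ```latex
    \frac{\partial}{\partial t}
    \begin{pmatrix} \rho \\ \rho u \\ \rho E \end{pmatrix}
    + \frac{\partial}{\partial x}
    \begin{pmatrix} \rho u \\ \rho u^{2} + p \\ (\rho E + p)\,u \end{pmatrix}
    =
    \begin{pmatrix} 0 \\ -\dfrac{f}{2 D_h}\,\rho u\,|u| \\ \dfrac{4 q_w}{D_h} \end{pmatrix}
    ```

    Here f is the Darcy friction factor, D_h the hydraulic diameter, and q_w the wall heat flux; the exact forms used in the paper may differ.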

  1. Analysis of reference transactions using packaged computer programs.

    PubMed

    Calabretta, N; Ross, R

    1984-01-01

    Motivated by a continuing education class attended by the authors on the measurement of reference desk activities, the reference department at Scott Memorial Library initiated a project to gather data on reference desk transactions and to analyze the data by using packaged computer programs. The programs utilized for the project were SPSS (Statistical Package for the Social Sciences) and SAS (Statistical Analysis System). The planning, implementation and development of the project are described.

  2. Integrated Electronic Warfare System Advanced Development Model (ADM); Appendix 1 - Functional Requirement Specification.

    DTIC Science & Technology

    1977-10-01

    [Scanned-document OCR fragment; little prose survives. Recoverable content: a revision sheet (writer J. Kolanek, 2/6/76) and a table of contents listing the Computer Data Base Design Document (CDBDD), Computer Program Package (CPP), Computer Program Operator's Manual (CPOM) and Computer Program Test Plan (CPTPL), plus the figures "IEWS Simplified Block Diagram" and "System Controller Architecture".]

  3. CFD RANS Simulations on a Generic Conventional Scale Model Submarine: Comparison between Fluent and OpenFOAM

    DTIC Science & Technology

    2015-09-01

    ...lift and drag forces on two model car geometries (designated as the VRAK model and the S80 model). For the VRAK model the OpenFOAM drag coefficient was... lift coefficient was 16.5% higher than the Fluent value. Both model car geometries were meshed using Harpoon, which is a commercial software package...

  4. Virginia Transit Performance Evaluation Package (VATPEP).

    DOT National Transportation Integrated Search

    1987-01-01

    The Virginia Transit Performance Evaluation Package (VATPEP), a computer software package, is documented. This is the computerized version of the methodology used by the Virginia Department of Transportation to evaluate the performance of public tran...

  5. 21 CFR 1314.110 - Reports for mail-order sales.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Administration, submit the report in electronic form, either via computer disk or direct electronic data... units (e.g., 100 doses per package); (11) Package type (blister pack, etc.); (12) Number of packages...

  6. 21 CFR 1314.110 - Reports for mail-order sales.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Administration, submit the report in electronic form, either via computer disk or direct electronic data... units (e.g., 100 doses per package); (11) Package type (blister pack, etc.); (12) Number of packages...

  7. 21 CFR 1314.110 - Reports for mail-order sales.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... Administration, submit the report in electronic form, either via computer disk or direct electronic data... units (e.g., 100 doses per package); (11) Package type (blister pack, etc.); (12) Number of packages...

  8. 21 CFR 1314.110 - Reports for mail-order sales.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... Administration, submit the report in electronic form, either via computer disk or direct electronic data... units (e.g., 100 doses per package); (11) Package type (blister pack, etc.); (12) Number of packages...

  9. 21 CFR 1314.110 - Reports for mail-order sales.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... Administration, submit the report in electronic form, either via computer disk or direct electronic data... units (e.g., 100 doses per package); (11) Package type (blister pack, etc.); (12) Number of packages...

  10. Structural Modeling Using "Scanning and Mapping" Technique

    NASA Technical Reports Server (NTRS)

    Amos, Courtney L.; Dash, Gerald S.; Shen, J. Y.; Ferguson, Frederick; Noga, Donald F. (Technical Monitor)

    2000-01-01

    Supported by NASA Glenn Center, we are in the process of developing a structural damage diagnostic and monitoring system for rocket engines, which consists of five modules: Structural Modeling, Measurement Data Pre-Processor, Structural System Identification, Damage Detection Criterion, and Computer Visualization. The function of the system is to detect damage as it is incurred by the engine structures. The scientific principle used to identify damage is to utilize the changes in vibrational properties between the pre-damaged and post-damaged structures. The vibrational properties of the pre-damaged structure can be obtained from an analytic computer model of the structure. Thus, as the first stage of the whole research plan, we currently focus on the first module - Structural Modeling. Three computer software packages have been selected and will be integrated for this purpose: PhotoModeler-Pro, AutoCAD-R14, and MSC/NASTRAN. AutoCAD is the most popular PC-CAD system currently available on the market. For our purpose, it serves as an interface to generate structural models of any particular engine part or assembly, which are then passed to MSC/NASTRAN for extracting structural dynamic properties. Although AutoCAD is a powerful structural modeling tool, the complexity of engine components requires a further improvement in structural modeling techniques. We are working on a so-called "scanning and mapping" technique, which is relatively new. The basic idea is to produce a full and accurate 3D structural model by tracing over multiple overlapping photographs taken from different angles. There is no need to input point positions, angles, distances or axes, and photographs can be taken by any type of camera with different lenses. With the integration of such a modeling technique, the capability of structural modeling will be enhanced. The prototypes of any complex structural components will first be produced by PhotoModeler based on existing similar components, then passed to AutoCAD for modification and correction of any discrepancies seen in the PhotoModeler version of the 3D model. These three software packages are fully compatible; the DXF file format can be used to transfer drawings among them. To begin this entire process, we are using a small replica of an actual engine blade as a test object. This paper introduces the accomplishments of our recent work.

  11. Study of the TRAC Airfoil Table Computational System

    NASA Technical Reports Server (NTRS)

    Hu, Hong

    1999-01-01

    This report documents the study of the application of the TRAC airfoil table computational package (TRACFOIL) to the prediction of 2D airfoil force and moment data over a wide range of angles of attack and Mach numbers. TRACFOIL generates the standard C-81 airfoil table for input into rotorcraft comprehensive codes such as CAMRAD. The existing TRACFOIL computer package was successfully modified to run on Digital Alpha workstations and on Cray C90 supercomputers. Step-by-step instructions for using the package on both computer platforms are provided. The newer version of TRACFOIL is applied to two airfoil sections. The C-81 data obtained with TRACFOIL are compared with wind-tunnel data, and results are presented.

  12. Naima: a Python package for inference of particle distribution properties from nonthermal spectra

    NASA Astrophysics Data System (ADS)

    Zabalza, V.

    2015-07-01

    The ultimate goal of observing nonthermal emission from astrophysical sources is to understand the underlying particle acceleration and evolution processes, and few tools are publicly available to infer particle distribution properties from the observed photon spectra from X-rays to VHE gamma rays. Here I present naima, an open source Python package that provides models for nonthermal radiative emission from homogeneous distributions of relativistic electrons and protons. Contributions from synchrotron, inverse Compton, nonthermal bremsstrahlung, and neutral-pion decay can be computed for a series of functional shapes of the particle energy distributions, with the possibility of using user-defined particle distribution functions. In addition, naima provides a set of functions that allow these models to be used to fit observed nonthermal spectra through an MCMC procedure, obtaining probability distribution functions for the particle distribution parameters. Here I present the models and methods available in naima and an example of their application to the understanding of a galactic nonthermal source. naima's documentation, including how to install the package, is available at http://naima.readthedocs.org.
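
    A hedged sketch following naima's documented model interface (exact signatures may differ between versions; see the documentation linked above): an electron population radiating inverse-Compton emission off the CMB.

    ```python
    # Inverse-Compton SED from an exponential-cutoff power-law electron spectrum.
    import numpy as np
    import astropy.units as u
    from naima.models import ExponentialCutoffPowerLaw, InverseCompton

    electrons = ExponentialCutoffPowerLaw(
        amplitude=1e36 / u.eV, e_0=1 * u.TeV, alpha=2.5, e_cutoff=50 * u.TeV)
    ic = InverseCompton(electrons, seed_photon_fields=["CMB"])

    photon_energies = np.logspace(-1, 2, 30) * u.TeV
    sed = ic.sed(photon_energies, distance=1.5 * u.kpc)  # assumed distance
    print(sed[:3])
    ```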

  13. Computer-Based Mathematics Instructions for Engineering Students

    NASA Technical Reports Server (NTRS)

    Khan, Mustaq A.; Wall, Curtiss E.

    1996-01-01

    Almost every engineering course involves mathematics in one form or another. The analytical process of developing mathematical models is very important for engineering students. However, the computational process involved in the solution of some mathematical problems may be very tedious and time consuming. There is a significant amount of mathematical software such as Mathematica, Mathcad, and Maple designed to aid in the solution of these instructional problems. The use of these packages in classroom teaching can greatly enhance understanding, and save time. Integration of computer technology in mathematics classes, without de-emphasizing the traditional analytical aspects of teaching, has proven very successful and is becoming almost essential. Sample computer laboratory modules are developed for presentation in the classroom setting. This is accomplished through the use of overhead projectors linked to graphing calculators and computers. Model problems are carefully selected from different areas.

  14. The Hidden Cost of Buying a Computer.

    ERIC Educational Resources Information Center

    Johnson, Michael

    1983-01-01

    In order to process data in a computer, application software must be either developed or purchased. Costs for modifications of the software package and maintenance are often hidden. The decision to buy or develop software packages should be based upon factors of time and maintenance. (MLF)

  15. RRegrs: an R package for computer-aided model selection with multiple regression models.

    PubMed

    Tsiliki, Georgia; Munteanu, Cristian R; Seoane, Jose A; Fernandez-Lozano, Carlos; Sarimveis, Haralambos; Willighagen, Egon L

    2015-01-01

    Predictive regression models can be created with many different modelling approaches. Choices must be made about data set splitting, cross-validation methods, specific regression parameters, and best-model criteria, as they all affect the accuracy and efficiency of the resulting predictive models and therefore raise issues of model reproducibility and comparison. Cheminformatics and bioinformatics make extensive use of predictive modelling and need standardized methodologies to assist model selection and speed up the development of predictive models. A tool accessible to all users, irrespective of their statistical knowledge, would be valuable if it tested several simple and complex regression models and validation schemes, produced unified reports, and offered the option of integration into more extensive studies. Additionally, such a methodology should be implemented as a free programming package so that it can be continuously adapted and redistributed by others. We propose an integrated framework for creating multiple regression models, called RRegrs. The tool offers a choice of ten simple and complex regression methods combined with repeated 10-fold and leave-one-out cross-validation. Methods include Multiple Linear regression, Generalized Linear Model with Stepwise Feature Selection, Partial Least Squares regression, Lasso regression, and Support Vector Machines with Recursive Feature Elimination. The new framework is an automated, fully validated procedure that produces standardized reports for quickly assessing the impact of choices in modelling algorithms and reviewing the model and cross-validation results. The methodology was implemented as an open-source R package, available at https://www.github.com/enanomapper/RRegrs, by reusing and extending the caret package. The universality of the new methodology is demonstrated using five standard data sets from different scientific fields. Its efficiency in cheminformatics and QSAR modelling is shown with three use cases: proteomics data for surface-modified gold nanoparticles, nano-metal oxide descriptor data, and molecular descriptors for acute aquatic toxicity data. The results show that for all data sets RRegrs reports models with equal or better performance, for both training and test sets, than those reported in the original publications. Its good performance, as well as its adaptability in terms of parameter optimization, could make RRegrs a popular framework for assisting the initial exploration of predictive models and, with that, the design of more comprehensive in silico screening applications. Graphical abstract: RRegrs is a computer-aided model selection framework for R multiple regression models; it is a fully validated procedure with application to QSAR modelling.
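
    RRegrs itself is an R package; purely as a conceptual parallel, the Python sketch below (scikit-learn, with a built-in illustrative data set) shows the kind of repeated 10-fold cross-validated comparison of several regression methods that the framework automates and reports. The model choices and settings here are assumptions for the example, not RRegrs defaults.

        from sklearn.datasets import load_diabetes
        from sklearn.linear_model import LinearRegression, Lasso
        from sklearn.svm import SVR
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import RepeatedKFold, cross_val_score

        X, y = load_diabetes(return_X_y=True)
        cv = RepeatedKFold(n_splits=10, n_repeats=5, random_state=0)
        models = {"MLR": LinearRegression(), "Lasso": Lasso(alpha=0.1),
                  "PLS": PLSRegression(n_components=5), "SVM": SVR()}
        for name, model in models.items():
            # Repeated 10-fold cross-validation, scored by R^2
            scores = cross_val_score(model, X, y, cv=cv, scoring="r2")
            print(f"{name:6s} mean R^2 = {scores.mean():.3f} +/- {scores.std():.3f}")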

  16. Thermal Analysis on the Shipment of Russian Plutonium Fuel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Or, Chuen T; Skrabek, Emanuel A; Carpenter, Robert T

    Paper presented at the 12th Symposium on Space Nuclear Power and Propulsion in Albuquerque, NM, in January 1995. The Mound 9516 shipping package was designed for the shipment of plutonium-238 fuel. One of the shipping configurations is the Russian Pu-238 powder can. Computer models using SINDA were created to predict the temperatures of the package under normal conditions of transport (NCT: 38°C ambient temperature), under hypothetical accident conditions (HAC: engulfed in fire for 30 minutes), and inside a standard cargo container. Pressure increases inside the package due to the expansion of the trapped gases and helium gas generation from isotope decay were also analyzed. There is a duplicate copy and also a copy in the ESD Files.

  17. Towards a C2 Poly-Visualization Tool: Leveraging the Power of Social-Network Analysis and GIS

    DTIC Science & Technology

    2011-06-01

    from Magsino.14 AutoMap, a product of CASOS at Carnegie Mellon University, is a text-mining tool that enables the extraction of network data from...enables community leaders to prepare for biological attacks using computational models. BioWar is a CASOS package that combines many factors into a...models, demographically accurate agent models, wind dispersion models, and an error-diagnostic model. Construct, also developed by CASOS, is a

  18. Verification of component mode techniques for flexible multibody systems

    NASA Technical Reports Server (NTRS)

    Wiens, Gloria J.

    1990-01-01

    Investigations were conducted into the modeling of flexible multibody systems undergoing large angular displacements. Models were to be generated and analyzed through application of computer simulation packages employing 'component mode synthesis' techniques. The Multibody Modeling, Verification and Control (MMVC) Laboratory plan was implemented, which includes running experimental tests on flexible multibody test articles. From these tests, data were to be collected for later correlation and verification of the theoretical results predicted by the modeling and simulation process.

  19. PLATSIM: A Simulation and Analysis Package for Large-Order Flexible Systems. Version 2.0

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.; Kenny, Sean P.; Giesy, Daniel P.

    1997-01-01

    The software package PLATSIM provides efficient time and frequency domain analysis of large-order generic space platforms. PLATSIM can perform open-loop analysis or closed-loop analysis with linear or nonlinear control system models. PLATSIM exploits the particular form of sparsity of the plant matrices for very efficient linear and nonlinear time domain analysis, as well as frequency domain analysis. A new algorithm for the efficient computation of open-loop and closed-loop frequency response functions for large-order systems has been developed and implemented within the package. Furthermore, a jitter analysis routine that efficiently determines jitter and stability values from time simulations has been developed and incorporated in the PLATSIM package. In the time domain analysis, PLATSIM simulates the response of the space platform to disturbances and calculates the jitter and stability values from the response time histories. In the frequency domain analysis, PLATSIM calculates frequency response function matrices and provides the corresponding Bode plots. The PLATSIM software package is written in the MATLAB script language. A graphical user interface provides convenient access to its various features.
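
    PLATSIM's frequency-response algorithm is its own; purely to illustrate the general idea of exploiting plant sparsity, the hedged Python sketch below evaluates H(jw) = C (jwI - A)^-1 B for a large, sparse state-space model with one sparse LU solve per frequency point. The system matrices and sizes are invented for the example.

        import numpy as np
        import scipy.sparse as sp
        import scipy.sparse.linalg as spla

        n = 2000                                                # state dimension (illustrative)
        A = sp.diags(-np.linspace(0.1, 10.0, n), format="csc")  # sparse, stable plant matrix
        B = np.zeros((n, 1), dtype=complex); B[0, 0] = 1.0      # input matrix
        C = np.zeros((1, n)); C[0, -1] = 1.0                    # output matrix
        I = sp.identity(n, format="csc")

        freqs = np.logspace(-2, 2, 200)                         # rad/s
        H = np.empty(freqs.size, dtype=complex)
        for k, w in enumerate(freqs):
            lu = spla.splu(1j * w * I - A)                      # sparse LU at this frequency
            H[k] = (C @ lu.solve(B)).item()                     # frequency response sample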

  20. Implementation of building information modeling in Malaysian construction industry

    NASA Astrophysics Data System (ADS)

    Memon, Aftab Hameed; Rahman, Ismail Abdul; Harman, Nur Melly Edora

    2014-10-01

    This study assessed the implementation level of Building Information Modeling (BIM) in the construction industry of Malaysia. It also investigated several computer software packages facilitating BIM and the challenges affecting its implementation. Data were collected through a questionnaire survey among construction practitioners: of 150 questionnaire sets distributed to consultant, contractor, and client organizations, 95 completed forms were received and analyzed statistically. The findings indicate that the level of BIM implementation in the construction industry of Malaysia is very low. The average index method, employed to assess the effectiveness of various BIM software packages, highlighted Bentley construction, AutoCAD, and ArchiCAD as the three most popular and effective packages. The major challenges to BIM implementation are that it requires enhanced collaboration, adds work for designers, and raises interoperability issues. To improve the level of BIM implementation in the Malaysian industry, it is recommended that a flexible BIM training program be created for all practitioners.

  1. Interactive graphical system for small-angle scattering analysis of polydisperse systems

    NASA Astrophysics Data System (ADS)

    Konarev, P. V.; Volkov, V. V.; Svergun, D. I.

    2016-09-01

    A program suite for one-dimensional small-angle scattering analysis of polydisperse systems and multiple data sets is presented. The main program, POLYSAS, has a menu-driven graphical user interface that calls computational modules from the ATSAS package to perform data treatment and analysis. The graphical menu interface allows one to process multiple (time-, concentration- or temperature-dependent) data sets and interactively change the parameters for the data modelling using sliders. The graphical representation of the data is done via the Winteracter-based program SASPLOT. The package is designed for the analysis of polydisperse systems and mixtures, and permits one to obtain size distributions and evaluate the volume fractions of the components using linear and non-linear fitting algorithms as well as model-independent singular value decomposition. The use of the POLYSAS package is illustrated by recent examples of its application to the study of concentration-dependent oligomeric states of proteins and the time kinetics of polymer micelles for anticancer drug delivery.

  2. SARA - SURE/ASSIST RELIABILITY ANALYSIS WORKSTATION (VAX VMS VERSION)

    NASA Technical Reports Server (NTRS)

    Butler, R. W.

    1994-01-01

    SARA, the SURE/ASSIST Reliability Analysis Workstation, is a bundle of programs used to solve reliability problems. The mathematical approach chosen to solve a reliability problem may vary with the size and nature of the problem. The Systems Validation Methods group at NASA Langley Research Center has created a set of four software packages that form the basis for a reliability analysis workstation, including three for use in analyzing reconfigurable, fault-tolerant systems and one for analyzing non-reconfigurable systems. The SARA bundle includes the three for reconfigurable, fault-tolerant systems: SURE reliability analysis program (COSMIC program LAR-13789, LAR-14921); the ASSIST specification interface program (LAR-14193, LAR-14923), and PAWS/STEM reliability analysis programs (LAR-14165, LAR-14920). As indicated by the program numbers in parentheses, each of these three packages is also available separately in two machine versions. The fourth package, which is only available separately, is FTC, the Fault Tree Compiler (LAR-14586, LAR-14922). FTC is used to calculate the top-event probability for a fault tree which describes a non-reconfigurable system. PAWS/STEM and SURE are analysis programs which utilize different solution methods, but have a common input language, the SURE language. ASSIST is a preprocessor that generates SURE language from a more abstract definition. ASSIST, SURE, and PAWS/STEM are described briefly in the following paragraphs. For additional details about the individual packages, including pricing, please refer to their respective abstracts. ASSIST, the Abstract Semi-Markov Specification Interface to the SURE Tool program, allows a reliability engineer to describe the failure behavior of a fault-tolerant computer system in an abstract, high-level language. The ASSIST program then automatically generates a corresponding semi-Markov model. A one-page ASSIST-language description may result in a semi-Markov model with thousands of states and transitions. The ASSIST program also includes model-reduction techniques to facilitate efficient modeling of large systems. The semi-Markov model generated by ASSIST is in the format needed for input to SURE and PAWS/STEM. The Semi-Markov Unreliability Range Evaluator, SURE, is an analysis tool for reconfigurable, fault-tolerant systems. SURE provides an efficient means for calculating accurate upper and lower bounds for the death state probabilities for a large class of semi-Markov models, not just those which can be reduced to critical-pair architectures. The calculated bounds are close enough (usually within 5 percent of each other) for use in reliability studies of ultra-reliable computer systems. The SURE bounding theorems have algebraic solutions and are consequently computationally efficient even for large and complex systems. SURE can optionally regard a specified parameter as a variable over a range of values, enabling an automatic sensitivity analysis. SURE output is tabular. The PAWS/STEM package includes two programs for the creation and evaluation of pure Markov models describing the behavior of fault-tolerant reconfigurable computer systems: the Pade Approximation with Scaling (PAWS) and Scaled Taylor Exponential Matrix (STEM) programs. PAWS and STEM produce exact solutions for the probability of system failure and provide a conservative estimate of the number of significant digits in the solution. Markov models of fault-tolerant architectures inevitably lead to numerically stiff differential equations. 
Both PAWS and STEM have the capability to solve numerically stiff models. These complementary programs use separate methods to determine the matrix exponential in the solution of the model's system of differential equations. In general, PAWS is better suited to evaluate small and dense models. STEM operates at lower precision, but works faster than PAWS for larger models. The programs that comprise the SARA package were originally developed for use on DEC VAX series computers running VMS and were later ported for use on Sun series computers running SunOS. They are written in C-language, Pascal, and FORTRAN 77. An ANSI compliant C compiler is required in order to compile the C portion of the Sun version source code. The Pascal and FORTRAN code can be compiled on Sun computers using Sun Pascal and Sun Fortran. For the VMS version, VAX C, VAX PASCAL, and VAX FORTRAN can be used to recompile the source code. The standard distribution medium for the VMS version of SARA (COS-10041) is a 9-track 1600 BPI magnetic tape in VMSINSTAL format. It is also available on a TK50 tape cartridge in VMSINSTAL format. Executables are included. The standard distribution medium for the Sun version of SARA (COS-10039) is a .25 inch streaming magnetic tape cartridge in UNIX tar format. Both Sun3 and Sun4 executables are included. Electronic copies of the ASSIST user's manual in TeX and PostScript formats are provided on the distribution medium. DEC, VAX, VMS, and TK50 are registered trademarks of Digital Equipment Corporation. Sun, Sun3, Sun4, and SunOS are trademarks of Sun Microsystems, Inc. TeX is a trademark of the American Mathematical Society. PostScript is a registered trademark of Adobe Systems Incorporated.
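
    For readers unfamiliar with the underlying mathematics, the toy sketch below (generic Python/SciPy, not code from SURE or PAWS/STEM) evaluates a pure Markov reliability model of the kind these tools target: the state probabilities at mission time T follow p(T) = p(0) expm(QT) for generator matrix Q. The three-state system and all rates are invented for illustration: state 0 has both units up, state 1 has one unit awaiting reconfiguration, and state 2 is the absorbing failure state.

        import numpy as np
        from scipy.linalg import expm

        lam, mu = 1e-4, 1e-1          # failure and reconfiguration rates, per hour (illustrative)
        Q = np.array([[-2 * lam, 2 * lam, 0.0],
                      [mu, -(mu + lam), lam],
                      [0.0, 0.0, 0.0]])   # rows sum to zero; state 2 is absorbing
        p0 = np.array([1.0, 0.0, 0.0])    # start with both units up
        T = 10.0                          # mission time, hours
        pT = p0 @ expm(Q * T)
        print(f"probability of system failure: {pT[2]:.3e}")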

  3. SARA - SURE/ASSIST RELIABILITY ANALYSIS WORKSTATION (UNIX VERSION)

    NASA Technical Reports Server (NTRS)

    Butler, R. W.

    1994-01-01

    SARA, the SURE/ASSIST Reliability Analysis Workstation, is a bundle of programs used to solve reliability problems. The mathematical approach chosen to solve a reliability problem may vary with the size and nature of the problem. The Systems Validation Methods group at NASA Langley Research Center has created a set of four software packages that form the basis for a reliability analysis workstation, including three for use in analyzing reconfigurable, fault-tolerant systems and one for analyzing non-reconfigurable systems. The SARA bundle includes the three for reconfigurable, fault-tolerant systems: SURE reliability analysis program (COSMIC program LAR-13789, LAR-14921); the ASSIST specification interface program (LAR-14193, LAR-14923), and PAWS/STEM reliability analysis programs (LAR-14165, LAR-14920). As indicated by the program numbers in parentheses, each of these three packages is also available separately in two machine versions. The fourth package, which is only available separately, is FTC, the Fault Tree Compiler (LAR-14586, LAR-14922). FTC is used to calculate the top-event probability for a fault tree which describes a non-reconfigurable system. PAWS/STEM and SURE are analysis programs which utilize different solution methods, but have a common input language, the SURE language. ASSIST is a preprocessor that generates SURE language from a more abstract definition. ASSIST, SURE, and PAWS/STEM are described briefly in the following paragraphs. For additional details about the individual packages, including pricing, please refer to their respective abstracts. ASSIST, the Abstract Semi-Markov Specification Interface to the SURE Tool program, allows a reliability engineer to describe the failure behavior of a fault-tolerant computer system in an abstract, high-level language. The ASSIST program then automatically generates a corresponding semi-Markov model. A one-page ASSIST-language description may result in a semi-Markov model with thousands of states and transitions. The ASSIST program also includes model-reduction techniques to facilitate efficient modeling of large systems. The semi-Markov model generated by ASSIST is in the format needed for input to SURE and PAWS/STEM. The Semi-Markov Unreliability Range Evaluator, SURE, is an analysis tool for reconfigurable, fault-tolerant systems. SURE provides an efficient means for calculating accurate upper and lower bounds for the death state probabilities for a large class of semi-Markov models, not just those which can be reduced to critical-pair architectures. The calculated bounds are close enough (usually within 5 percent of each other) for use in reliability studies of ultra-reliable computer systems. The SURE bounding theorems have algebraic solutions and are consequently computationally efficient even for large and complex systems. SURE can optionally regard a specified parameter as a variable over a range of values, enabling an automatic sensitivity analysis. SURE output is tabular. The PAWS/STEM package includes two programs for the creation and evaluation of pure Markov models describing the behavior of fault-tolerant reconfigurable computer systems: the Pade Approximation with Scaling (PAWS) and Scaled Taylor Exponential Matrix (STEM) programs. PAWS and STEM produce exact solutions for the probability of system failure and provide a conservative estimate of the number of significant digits in the solution. Markov models of fault-tolerant architectures inevitably lead to numerically stiff differential equations. 
Both PAWS and STEM have the capability to solve numerically stiff models. These complementary programs use separate methods to determine the matrix exponential in the solution of the model's system of differential equations. In general, PAWS is better suited to evaluate small and dense models. STEM operates at lower precision, but works faster than PAWS for larger models. The programs that comprise the SARA package were originally developed for use on DEC VAX series computers running VMS and were later ported for use on Sun series computers running SunOS. They are written in C-language, Pascal, and FORTRAN 77. An ANSI compliant C compiler is required in order to compile the C portion of the Sun version source code. The Pascal and FORTRAN code can be compiled on Sun computers using Sun Pascal and Sun Fortran. For the VMS version, VAX C, VAX PASCAL, and VAX FORTRAN can be used to recompile the source code. The standard distribution medium for the VMS version of SARA (COS-10041) is a 9-track 1600 BPI magnetic tape in VMSINSTAL format. It is also available on a TK50 tape cartridge in VMSINSTAL format. Executables are included. The standard distribution medium for the Sun version of SARA (COS-10039) is a .25 inch streaming magnetic tape cartridge in UNIX tar format. Both Sun3 and Sun4 executables are included. Electronic copies of the ASSIST user's manual in TeX and PostScript formats are provided on the distribution medium. DEC, VAX, VMS, and TK50 are registered trademarks of Digital Equipment Corporation. Sun, Sun3, Sun4, and SunOS are trademarks of Sun Microsystems, Inc. TeX is a trademark of the American Mathematical Society. PostScript is a registered trademark of Adobe Systems Incorporated.

  4. Runway exit designs for capacity improvement demonstrations. Phase 2: Computer model development

    NASA Technical Reports Server (NTRS)

    Trani, A. A.; Hobeika, A. G.; Kim, B. J.; Nunna, V.; Zhong, C.

    1992-01-01

    The development of a computer simulation/optimization model is described to: (1) estimate the optimal locations of existing and proposed runway turnoffs; and (2) estimate the geometric design requirements associated with newly developed high-speed turnoffs. The model, named REDIM 2.0, is a stand-alone application to be used by airport planners, designers, and researchers alike to estimate optimal turnoff locations. The main procedures implemented in the software package are described in detail, and possible applications are illustrated using six major runway scenarios. The main output of the computer program is the estimated weighted-average runway occupancy time for a user-defined aircraft population. The location and geometric characteristics of each turnoff are also provided to the user.

  5. High Performance Computing Modeling Advances Accelerator Science for High-Energy Physics

    DOE PAGES

    Amundson, James; Macridin, Alexandru; Spentzouris, Panagiotis

    2014-07-28

    The development and optimization of particle accelerators are essential for advancing our understanding of the properties of matter, energy, space, and time. Particle accelerators are complex devices whose behavior involves many physical effects on multiple scales. Therefore, advanced computational tools utilizing high-performance computing are essential for accurately modeling them. In the past decade, the US Department of Energy's SciDAC program has produced accelerator-modeling tools that have been employed to tackle some of the most difficult accelerator science problems. The authors discuss the Synergia framework and its applications to high-intensity particle accelerator physics. Synergia is an accelerator simulation package capable of handling the entire spectrum of beam dynamics simulations. The authors present Synergia's design principles and its performance on HPC platforms.

  6. Computer-aided light sheet flow visualization using photogrammetry

    NASA Technical Reports Server (NTRS)

    Stacy, Kathryn; Severance, Kurt; Childers, Brooks A.

    1994-01-01

    A computer-aided flow visualization process has been developed to analyze video images acquired from rotating and translating light sheet visualization systems. The computer process integrates a mathematical model for image reconstruction, advanced computer graphics concepts, and digital image processing to provide a quantitative and a visual analysis capability. The image reconstruction model, based on photogrammetry, uses knowledge of the camera and light sheet locations and orientations to project two-dimensional light sheet video images into three-dimensional space. A sophisticated computer visualization package, commonly used to analyze computational fluid dynamics (CFD) results, was chosen to interactively display the reconstructed light sheet images with the numerical surface geometry for the model or aircraft under study. The photogrammetric reconstruction technique and the image processing and computer graphics techniques and equipment are described. Results of the computer-aided process applied to both a wind tunnel translating light sheet experiment and an in-flight rotating light sheet experiment are presented. The capability to compare reconstructed experimental light sheet images with CFD solutions in the same graphics environment is also demonstrated.
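
    The heart of the reconstruction step described above (and in the two related records that follow) is the intersection of each pixel's viewing ray with the known light-sheet plane. The sketch below is a hedged, generic Python illustration of that geometry; the camera intrinsics, pose, and plane values are invented for the example and are not NASA's calibration.

        import numpy as np

        def pixel_to_3d(pixel_uv, K, R, t, plane_point, plane_normal):
            """Back-project pixel (u, v) and intersect with the light-sheet plane.

            K: 3x3 intrinsics; R, t: world-to-camera rotation and translation.
            """
            uv1 = np.array([pixel_uv[0], pixel_uv[1], 1.0])
            ray_cam = np.linalg.inv(K) @ uv1             # ray direction in camera frame
            ray_world = R.T @ ray_cam                    # rotate into world frame
            cam_center = -R.T @ t                        # camera center in world frame
            s = plane_normal @ (plane_point - cam_center) / (plane_normal @ ray_world)
            return cam_center + s * ray_world            # 3D point on the light sheet

        # Illustrative usage: camera at the origin, light sheet at z = 2 m
        K = np.array([[1000.0, 0.0, 320.0], [0.0, 1000.0, 240.0], [0.0, 0.0, 1.0]])
        p = pixel_to_3d((400, 260), K, np.eye(3), np.zeros(3),
                        plane_point=np.array([0.0, 0.0, 2.0]),
                        plane_normal=np.array([0.0, 0.0, 1.0]))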

  7. Computer-Aided Light Sheet Flow Visualization

    NASA Technical Reports Server (NTRS)

    Stacy, Kathryn; Severance, Kurt; Childers, Brooks A.

    1993-01-01

    A computer-aided flow visualization process has been developed to analyze video images acquired from rotating and translating light sheet visualization systems. The computer process integrates a mathematical model for image reconstruction, advanced computer graphics concepts, and digital image processing to provide a quantitative and visual analysis capability. The image reconstruction model, based on photogrammetry, uses knowledge of the camera and light sheet locations and orientations to project two-dimensional light sheet video images into three-dimensional space. A sophisticated computer visualization package, commonly used to analyze computational fluid dynamics (CFD) data sets, was chosen to interactively display the reconstructed light sheet images, along with the numerical surface geometry for the model or aircraft under study. A description is provided of the photogrammetric reconstruction technique, and the image processing and computer graphics techniques and equipment. Results of the computer aided process applied to both a wind tunnel translating light sheet experiment and an in-flight rotating light sheet experiment are presented. The capability to compare reconstructed experimental light sheet images and CFD solutions in the same graphics environment is also demonstrated.

  8. Computer-aided light sheet flow visualization

    NASA Technical Reports Server (NTRS)

    Stacy, Kathryn; Severance, Kurt; Childers, Brooks A.

    1993-01-01

    A computer-aided flow visualization process has been developed to analyze video images acquired from rotating and translating light sheet visualization systems. The computer process integrates a mathematical model for image reconstruction, advanced computer graphics concepts, and digital image processing to provide a quantitative and visual analysis capability. The image reconstruction model, based on photogrammetry, uses knowledge of the camera and light sheet locations and orientations to project two-dimensional light sheet video images into three-dimensional space. A sophisticated computer visualization package, commonly used to analyze computational fluid dynamics (CFD) data sets, was chosen to interactively display the reconstructed light sheet images, along with the numerical surface geometry for the model or aircraft under study. A description is provided of the photogrammetric reconstruction technique, and the image processing and computer graphics techniques and equipment. Results of the computer aided process applied to both a wind tunnel translating light sheet experiment and an in-flight rotating light sheet experiment are presented. The capability to compare reconstructed experimental light sheet images and CFD solutions in the same graphics environment is also demonstrated.

  9. Recent developments of NASTRAN pre- and post-processors: Response spectrum analysis (RESPAN) and interactive graphics (GIFTS)

    NASA Technical Reports Server (NTRS)

    Hirt, E. F.; Fox, G. L.

    1982-01-01

    Two specific NASTRAN preprocessors and postprocessors are examined: a postprocessor for dynamic analysis and a graphical interactive package for model generation and review of results. A computer program that provides response spectrum analysis capability based on data from a NASTRAN finite element model is described, and the GIFTS system, a graphics processor that augments NASTRAN, is introduced.

  10. Development of a computer-assisted learning software package on dental traumatology.

    PubMed

    Tolidis, K; Crawford, P; Stephens, C; Papadogiannis, Y; Plakias, C

    1998-10-01

    The development of computer-assisted learning software packages is a relatively new field of computer application. The progress made in personal computer technology toward more user-friendly operating systems has stimulated the academic community to develop computer-assisted learning for pre- and postgraduate students. The ability of computers to combine audio and visual data in an interactive form provides a powerful educational tool. The purpose of this study was to develop and evaluate a computer-assisted learning package on dental traumatology. This program contains background information on the diagnosis, classification, and management of dental injuries in both the permanent and the deciduous dentitions. It is structured into chapters according to the nature of the injury and whether injury has occurred in the primary or permanent dentition. At the end of each chapter there is a self-assessment questionnaire as well as references to relevant literature. Extensive use of pictures and video provides a comprehensive overview of the subject.

  11. COMSOL in the Academic Environment at USNA

    DTIC Science & Technology

    2009-10-01

    figure shows the electric field calculated and the right shows the electron density at one point in time. 3.3 Acoustic Detection of Landmines – 3...industry's heavy investment in computer graphics and modeling. Packages such as Maya, ZBrush, Mudbox and others excel at this type of modeling. A...like SketchUp, Maya or AutoCAD. An extensive library of pre-built models would include all of the Platonic solids, combinations of Platonic

  12. Quantiprot - a Python package for quantitative analysis of protein sequences.

    PubMed

    Konopka, Bogumił M; Marciniak, Marta; Dyrka, Witold

    2017-07-17

    The field of protein sequence analysis is dominated by tools rooted in substitution matrices and alignments. A complementary approach is provided by methods of quantitative characterization. A major advantage of this approach is that quantitative properties define a multidimensional solution space in which sequences can be related to each other and differences can be meaningfully interpreted. Quantiprot is a software package in Python which provides a simple and consistent interface to multiple methods for quantitative characterization of protein sequences. The package can be used to calculate dozens of characteristics directly from sequences or using physico-chemical properties of amino acids. Besides basic measures, Quantiprot performs quantitative analysis of recurrence and determinism in the sequence, calculates the distribution of n-grams, and computes the Zipf's law coefficient. We propose three main fields of application for the Quantiprot package. First, quantitative characteristics can be used in alignment-free similarity searches and in clustering of large and/or divergent sequence sets. Second, a feature space defined by quantitative properties can be used in comparative studies of protein families and organisms. Third, the feature space can be used for evaluating generative models, where large numbers of sequences generated by a model can be compared to actually observed sequences.
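
    As a minimal illustration of the alignment-free, quantitative idea (plain Python; the dictionary and helper functions below are invented stand-ins, not Quantiprot's API), a sequence can be mapped through a physico-chemical scale and summarized into features that live in a common feature space:

        # Kyte-Doolittle hydropathy scale
        KD_HYDROPATHY = {
            'A': 1.8, 'R': -4.5, 'N': -3.5, 'D': -3.5, 'C': 2.5, 'E': -3.5,
            'Q': -3.5, 'G': -0.4, 'H': -3.2, 'I': 4.5, 'L': 3.8, 'K': -3.9,
            'M': 1.9, 'F': 2.8, 'P': -1.6, 'S': -0.8, 'T': -0.7, 'W': -0.9,
            'Y': -1.3, 'V': 4.2}

        def hydropathy_profile(seq):
            """Map a protein sequence to per-residue hydropathy values."""
            return [KD_HYDROPATHY[aa] for aa in seq]

        def mean_hydropathy(seq):
            """One quantitative feature usable in a multidimensional feature space."""
            profile = hydropathy_profile(seq)
            return sum(profile) / len(profile)

        print(mean_hydropathy("MKTAYIAKQR"))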

  13. airGR: a suite of lumped hydrological models in an R-package

    NASA Astrophysics Data System (ADS)

    Coron, Laurent; Perrin, Charles; Delaigue, Olivier; Andréassian, Vazken; Thirel, Guillaume

    2016-04-01

    Lumped hydrological models are useful and convenient tools for research, engineering and educational purposes. They propose catchment-scale representations of the precipitation-discharge relationship. Thanks to their limited data requirements, they can be easily implemented and run. With such models, it is possible to simulate a number of key hydrological processes over the catchment with limited structural and parametric complexity, typically evapotranspiration, runoff, underground losses, etc. The Hydrology Group at Irstea (Antony) has been developing a suite of rainfall-runoff models over the past 30 years with the main objectives of designing models as efficient as possible in terms of streamflow simulation, applicable to a wide range of catchments and having low data requirements. This resulted in a suite of models running at different time steps (from hourly to annual), applicable to various issues including water balance estimation, forecasting, simulation of impacts and scenario testing. Recently, Irstea has developed an easy-to-use R-package (R Core Team, 2015), called airGR, to make these models widely available. It includes: - the annual water balance model GR1A (Mouelhi et al., 2006), - the monthly GR2M model (Mouelhi, 2003), - three versions of the daily model, namely GR4J (Perrin et al., 2003), GR5J (Le Moine, 2008) and GR6J (Pushpalatha et al., 2011), - the hourly GR4H model (Mathevet, 2005), - a degree-day snow module CemaNeige (Valéry et al., 2014). The airGR package has been designed to facilitate use by non-expert users and to allow the addition of evaluation criteria, models or calibration algorithms selected by the end user. Each model core is coded in FORTRAN to ensure low computational time. The other package functions (i.e. mainly the calibration algorithm and the efficiency criteria) are coded in R. The package is already used for educational purposes. The presentation will detail the main functionalities of the package and present a case study application. References: - Le Moine, N. (2008), Le bassin versant de surface vu par le souterrain : une voie d'amélioration des performances et du réalisme des modèles pluie-débit ?, PhD thesis (in French), UPMC, Paris, France. - Mathevet, T. (2005), Quels modèles pluie-débit globaux pour le pas de temps horaire ? Développement empirique et comparaison de modèles sur un large échantillon de bassins versants, PhD thesis (in French), ENGREF - Cemagref (Antony), Paris, France. - Mouelhi, S. (2003), Vers une chaîne cohérente de modèles pluie-débit conceptuels globaux aux pas de temps pluriannuel, annuel, mensuel et journalier, PhD thesis (in French), ENGREF - Cemagref Antony, Paris, France. - Mouelhi, S., C. Michel, C. Perrin and V. Andréassian (2006), Stepwise development of a two-parameter monthly water balance model, Journal of Hydrology, 318(1-4), 200-214, doi:10.1016/j.jhydrol.2005.06.014. - Perrin, C., C. Michel and V. Andréassian (2003), Improvement of a parsimonious model for streamflow simulation, Journal of Hydrology, 279(1-4), 275-289, doi:10.1016/S0022-1694(03)00225-7. - Pushpalatha, R., C. Perrin, N. Le Moine, T. Mathevet and V. Andréassian (2011), A downward structural sensitivity analysis of hydrological models to improve low-flow simulation, Journal of Hydrology, 411(1-2), 66-76, doi:10.1016/j.jhydrol.2011.09.034. - R Core Team (2015), R: A language and environment for statistical computing, R Foundation for Statistical Computing, Vienna, Austria, URL https://www.R-project.org/. - Valéry, A., V. Andréassian and C. Perrin (2014), "As simple as possible but not simpler": What is useful in a temperature-based snow-accounting routine? Part 2 - Sensitivity analysis of the Cemaneige snow accounting routine on 380 catchments, Journal of Hydrology, doi:10.1016/j.jhydrol.2014.04.058.

  14. Literacity: A multimedia adult literacy package combining NASA technology, recursive ID theory, and authentic instruction theory

    NASA Technical Reports Server (NTRS)

    Willis, Jerry; Willis, Dee Anna; Walsh, Clare; Stephens, Elizabeth; Murphy, Timothy; Price, Jerry; Stevens, William; Jackson, Kevin; Villareal, James A.; Way, Bob

    1994-01-01

    An important part of NASA's mission involves the secondary application of its technologies in the public and private sectors. One current application under development is LiteraCity, a simulation-based instructional package for adults who do not have functional reading skills. Using fuzzy logic routines and other technologies developed by NASA's Information Systems Directorate, together with hypermedia sound, graphics, and animation technologies, the project attempts to overcome the limited impact of adult literacy assessment and instruction by involving the adult in an interactive simulation of real-life literacy activities. The project uses a recursive instructional development model and authentic instruction theory. This paper describes one component of a project to design, develop, and produce a series of computer-based, multimedia instructional packages. The packages are being developed for use in adult literacy programs, particularly in correctional education centers. They use the concepts of authentic instruction and authentic assessment to guide development. All the packages to be developed are instructional simulations. The first is a simulation of 'finding a friend a job.'

  15. An Introduction to Research and the Computer: A Self-Instructional Package.

    ERIC Educational Resources Information Center

    Vasu, Ellen Storey; Palmer, Richard I.

    This self-instructional package includes learning objectives, definitions, exercises, and feedback for learning some basic concepts and skills involved in using computers for analyzing data and understanding basic research terminology. Learning activities are divided into four sections: research and research hypotheses; variables, cases, and…

  16. Approximate Bayesian computation for spatial SEIR(S) epidemic models.

    PubMed

    Brown, Grant D; Porter, Aaron T; Oleson, Jacob J; Hinman, Jessica A

    2018-02-01

    Approximate Bayesian Computation (ABC) provides an attractive approach to estimation in complex Bayesian inferential problems for which evaluation of the kernel of the posterior distribution is impossible or computationally expensive. These highly parallelizable techniques have been successfully applied to many fields, particularly in cases where more traditional approaches such as Markov chain Monte Carlo (MCMC) are impractical. In this work, we demonstrate the application of approximate Bayesian inference to spatially heterogeneous Susceptible-Exposed-Infectious-Removed (SEIR) stochastic epidemic models. Although these models have a tractable posterior distribution, MCMC techniques become computationally infeasible for moderately sized problems. We discuss the practical implementation of these techniques via the open-source ABSEIR package for R. The performance of ABC relative to traditional MCMC methods is explored under simulation in a small problem, as well as in the spatially heterogeneous context of the 2014 epidemic of Chikungunya in the Americas.
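
    A minimal rejection-ABC sketch in generic Python (not the ABSEIR package's API, and a plain SIR model rather than spatial SEIR(S), purely for brevity) conveys the core idea: draw parameters from the prior, simulate, and keep draws whose simulated epidemic lies within a tolerance of the observed data. All population sizes, rates, and the tolerance are illustrative.

        import numpy as np

        rng = np.random.default_rng(0)

        def simulate_sir(beta, gamma, n=1000, i0=10, days=60):
            """Stochastic discrete-time SIR; returns daily new-infection counts."""
            s, i, new = n - i0, i0, []
            for _ in range(days):
                inf = rng.binomial(s, 1 - np.exp(-beta * i / n))
                rec = rng.binomial(i, 1 - np.exp(-gamma))
                s, i = s - inf, i + inf - rec
                new.append(inf)
            return np.array(new)

        observed = simulate_sir(0.4, 0.15)           # stand-in for field data
        accepted = []
        for _ in range(20000):
            beta, gamma = rng.uniform(0.05, 1.0), rng.uniform(0.05, 0.5)
            sim = simulate_sir(beta, gamma)
            if np.linalg.norm(sim - observed) < 50:  # tolerance (illustrative)
                accepted.append((beta, gamma))
        print(len(accepted), "accepted draws approximate the posterior")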

  17. Cartographic Modeling: Computer-assisted Analysis of Spatially Defined Neighborhoods

    NASA Technical Reports Server (NTRS)

    Berry, J. K.; Tomlin, C. D.

    1982-01-01

    Cartographic models addressing a wide variety of applications are composed of fundamental map processing operations. These primitive operations are neither data base nor application-specific. By organizing the set of operations into a mathematical-like structure, the basis for a generalized cartographic modeling framework can be developed. Among the major classes of primitive operations are those associated with reclassifying map categories, overlaying maps, determining distance and connectivity, and characterizing cartographic neighborhoods. The conceptual framework of cartographic modeling is established and techniques for characterizing neighborhoods are used as a means of demonstrating some of the more sophisticated procedures of computer-assisted map analysis. A cartographic model for assessing effective roundwood supply is briefly described as an example of a computer analysis. Most of the techniques described have been implemented as part of the map analysis package developed at the Yale School of Forestry and Environmental Studies.
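
    As a small concrete instance of the neighborhood primitives composed in cartographic modeling (generic Python/SciPy, not the Yale package itself; the raster and derived layers are invented for the example), a focal mean over a raster layer can be written as:

        import numpy as np
        from scipy.ndimage import uniform_filter

        elevation = np.random.default_rng(1).random((100, 100))  # toy raster layer
        focal_mean = uniform_filter(elevation, size=3)            # 3x3 neighborhood operation
        relief = elevation - focal_mean   # deviation from the local neighborhood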

  18. A study of the compatibility of an existing CFD package with a broader class of material constitutions

    NASA Technical Reports Server (NTRS)

    French, K. W., Jr.

    1985-01-01

    The flexibility of the PHOENICS computational fluid dynamics package was assessed along two general avenues: parallel modeling and analog modeling. In parallel modeling the dependent and independent variables retain their identity within some scaling factors, even though the boundary conditions and especially the constitutive relations do not correspond to any realistic fluid dynamic situation. PHOENICS was used to generate a CFD model that should exhibit the physical anomalies of a granular medium and permit reasonable similarity with boundary conditions typical of membrane or porous-piston loading. A considerable portion of the study was spent prying into the existing code, with a prejudice toward rate-type constitutive relations, and disarming any inherent fluid behavior. The final stages of the study were directed at the more specific problem of multiaxis loading of a cylindrical geometry, with a concern for the appearance of bulging and cross-slab shear failure modes.

  19. Development of high performance scientific components for interoperability of computing packages

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gulabani, Teena Pratap

    2008-01-01

    Three major high-performance quantum chemistry computational packages, NWChem, GAMESS and MPQC, have been developed by different research efforts following different design patterns. The goal is to achieve interoperability among these packages by overcoming the challenges caused by the different communication patterns and software designs of each package. A chemistry algorithm is hard and time-consuming to develop; integration of large quantum chemistry packages will allow resource sharing and thus avoid reinvention of the wheel. Creating connections between these incompatible packages is the major motivation of the proposed work. This interoperability is achieved by bringing the benefits of Component-Based Software Engineering through a plug-and-play component framework called the Common Component Architecture (CCA). In this thesis, I present a strategy and process used for interfacing two widely used and important computational chemistry methodologies: Quantum Mechanics and Molecular Mechanics. To show the feasibility of the proposed approach, the Tuning and Analysis Utility (TAU) has been coupled with the NWChem code and its CCA components. Results show that the overhead is negligible when compared to the ease and potential of organizing and coping with large-scale software applications.

  20. Review and analysis of dense linear system solver package for distributed memory machines

    NASA Technical Reports Server (NTRS)

    Narang, H. N.

    1993-01-01

    A dense linear system solver package recently developed at the University of Texas at Austin for distributed memory machines (e.g., the Intel Paragon) has been reviewed and analyzed. The package contains about 45 software routines, some written in FORTRAN and some in C, and forms the basis for parallel/distributed solutions of the systems of linear equations encountered in many problems of a scientific and engineering nature. The package, being studied by the Computer Applications Branch of the Analysis and Computation Division, may provide a significant computational resource for NASA scientists and engineers in parallel/distributed computing. Since the package is new and not well tested or documented, many of its underlying concepts and implementations were unclear; our task was to review, analyze, and critique the package as a step in the process that will enable scientists and engineers to apply it to the solution of their problems. All routines in the package were reviewed and analyzed. Underlying theory and concepts that exist in the form of published papers, technical reports, or memos were obtained either from the author or from the scientific literature, and general algorithms, explanations, examples, and critiques have been provided to explain the workings of these programs. Wherever things remained unclear, the developer (author) was contacted, by telephone or by electronic mail, to clarify the workings of the routines. Whenever possible, tests were made to verify the concepts and logic employed in the implementations. A detailed report explaining the workings of these routines is being separately documented.

  1. Development of FWIGPR, an open-source package for full-waveform inversion of common-offset GPR data

    NASA Astrophysics Data System (ADS)

    Jazayeri, S.; Kruse, S.

    2017-12-01

    We introduce a package for full-waveform inversion (FWI) of Ground Penetrating Radar (GPR) data based on a combination of open-source programs. The FWI requires a good starting model, based on direct knowledge of field conditions or on traditional ray-based inversion methods. With a good starting model, the FWI can improve the resolution of selected subsurface features. The package will be made available for general use in educational and research activities. The FWIGPR package consists of four main components: 3D-to-2D data conversion, source wavelet estimation, forward modeling, and inversion. (These four components additionally require the development, by the user, of a good starting model.) A major challenge with GPR data is the unknown form of the waveform emitted by the transmitter held close to the ground surface. We apply a blind deconvolution method to estimate the source wavelet, based on a sparsity assumption about the reflectivity series of the subsurface model (Gholami and Sacchi 2012). The estimated wavelet is deconvolved from the data to recover the sparsest reflectivity series with the fewest reflectors. The gprMax code (www.gprmax.com) is used as the forward modeling tool and the PEST parameter estimation package (www.pesthomepage.com) for the inversion. To reduce computation time, the field data are converted to an effective 2D equivalent, so that the gprMax code can be run in 2D mode. In the first step, the user must create a good starting model of the data, presumably using ray-based methods; this estimated model is introduced to the FWI process as an initial model. Next, the 3D data are converted to 2D, and the user estimates the source wavelet that best fits the observed data under the sparsity assumption about the earth's response. Last, PEST runs gprMax with the initial model, calculates the misfit between the synthetic and observed data, and, using an iterative algorithm that calls gprMax several times in each iteration, finds successive models that better fit the data. To gauge whether the iterative process has arrived at a local or global minimum, the process can be repeated with a range of starting models. Tests have shown that this package can successfully improve estimates of selected subsurface model parameters for simple synthetic and real data. Ongoing research will focus on FWI of more complex scenarios.

  2. Optimization of a centrifugal compressor impeller using CFD: the choice of simulation model parameters

    NASA Astrophysics Data System (ADS)

    Neverov, V. V.; Kozhukhov, Y. V.; Yablokov, A. M.; Lebedev, A. A.

    2017-08-01

    Nowadays, optimization using computational fluid dynamics (CFD) plays an important role in the design process of turbomachines. However, for successful and productive optimization it is necessary to define the simulation model correctly and rationally. This article deals with the choice of grid and computational domain parameters for the optimization of centrifugal compressor impellers using CFD. Searching for and applying optimal parameters of the grid model, the computational domain and the solver settings allows engineers to carry out high-accuracy modelling and to use computational capability effectively. The presented research was conducted using the Numeca Fine/Turbo package with the Spalart-Allmaras and Shear Stress Transport turbulence models. Two radial impellers were investigated: a high-pressure impeller at ψT=0.71 and a low-pressure impeller at ψT=0.43. The following parameters of the computational model were considered: the location of the inlet and outlet boundaries, the type of mesh topology, the mesh size, and the mesh parameter y+. The results of the investigation demonstrate that the choice of optimal parameters leads to a significant reduction in computational time. Optimal parameters, in comparison with non-optimal but visually similar parameters, can reduce the calculation time by up to a factor of four. It is also established that some parameters have a major impact on the results of the modelling.
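
    One of the grid parameters discussed, y+, is commonly targeted before meshing by estimating the first-cell height from a flat-plate skin-friction correlation. The helper below is a hedged, illustrative pre-processing estimate in Python, not part of the Numeca Fine/Turbo workflow; the correlation Cf ≈ 0.026 Re^(-1/7) and the sample flow values are assumptions.

        def first_cell_height(u_inf, length, nu, rho, y_plus_target=1.0):
            """Estimate the wall distance of the first cell for a target y+."""
            re_x = u_inf * length / nu                 # Reynolds number
            cf = 0.026 * re_x ** (-1.0 / 7.0)          # empirical skin friction
            tau_w = 0.5 * cf * rho * u_inf ** 2        # wall shear stress
            u_tau = (tau_w / rho) ** 0.5               # friction velocity
            return y_plus_target * nu / u_tau          # wall distance for target y+

        # e.g. air at 50 m/s over a 0.1 m chord (illustrative values)
        print(first_cell_height(50.0, 0.1, 1.5e-5, 1.2))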

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, K.; Tsai, H.; Decision and Information Sciences

    The technical basis for extending the Model 9977 shipping package periodic maintenance beyond the one-year interval, to a maximum of five years, rests on the performance of the O-ring seals and the environmental conditions. The DOE Packaging Certification Program (PCP) has tasked Argonne National Laboratory with developing a Radio-Frequency Identification (RFID) temperature monitoring system for use by facility personnel at DAF/NTS. The RFID temperature monitoring system consists of the Mk-1 RFID tags, a reader, and a control computer mounted on a mobile platform; it can operate as a stand-alone system or be connected to the local IT network. As part of the Conditions of Approval of the CoC, the user must complete the prescribed training to become qualified and certified for operation of the RFID temperature monitoring system. The training course will be administered by Argonne National Laboratory on behalf of the Headquarters Certifying Official. This is a complete documentation package for the RFID temperature monitoring system of the Model 9977 packagings at NTS; it will be used for training and certification. The table of contents is: Acceptance Testing Procedure of MK-1 RFID Tags for DOE/EM Nuclear Materials Management Applications; Acceptance Testing Result of MK-1 RFID Tags for DOE/EM Nuclear Materials Management Applications; Performance Test of the Single Bolt Seal Sensor for the Model 9977 Packaging; Calibration of Built-in Thermistors in RFID Tags for Nevada Test Site; Results of Calibration of Built-in Thermistors in RFID Tags; Results of Thermal Calibration of Second Batch of MK-I RFID Tags; Procedure for Installing and Removing MK-1 RFID Tag on Model 9977 Drum; User Guide for RFID Reader and Software for Temperature Monitoring of Model 9977 Drums at NTS; Software Quality Assurance Plan (SQAP) for the ARG-US System; Quality Category for the RFID Temperature Monitoring System; The Documentation Package for the RFID Temperature Monitoring System; Software Test Plan and Results for ARG-US OnSite; Configuration Management Plan (CMP) for the ARG-US System; Requirements Management Plan for the ARG-US System; and Design Management Plan for ARG-US.

  4. Integrated software environment based on COMKAT for analyzing tracer pharmacokinetics with molecular imaging.

    PubMed

    Fang, Yu-Hua Dean; Asthana, Pravesh; Salinas, Cristian; Huang, Hsuan-Ming; Muzic, Raymond F

    2010-01-01

    An integrated software package, Compartment Model Kinetic Analysis Tool (COMKAT), is presented in this report. COMKAT is an open-source software package with many functions for incorporating pharmacokinetic analysis in molecular imaging research and has both command-line and graphical user interfaces. With COMKAT, users may load and display images, draw regions of interest, load input functions, select kinetic models from a predefined list, or create a novel model and perform parameter estimation, all without having to write any computer code. For image analysis, COMKAT image tool supports multiple image file formats, including the Digital Imaging and Communications in Medicine (DICOM) standard. Image contrast, zoom, reslicing, display color table, and frame summation can be adjusted in COMKAT image tool. It also displays and automatically registers images from 2 modalities. Parametric imaging capability is provided and can be combined with the distributed computing support to enhance computation speeds. For users without MATLAB licenses, a compiled, executable version of COMKAT is available, although it currently has only a subset of the full COMKAT capability. Both the compiled and the noncompiled versions of COMKAT are free for academic research use. Extensive documentation, examples, and COMKAT itself are available on its wiki-based Web site, http://comkat.case.edu. Users are encouraged to contribute, sharing their experience, examples, and extensions of COMKAT. With integrated functionality specifically designed for imaging and kinetic modeling analysis, COMKAT can be used as a software environment for molecular imaging and pharmacokinetic analysis.
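
    To make the modelling idea concrete: compartment models of the kind COMKAT fits reduce to small ODE systems driven by an input function. The sketch below is generic Python/SciPy, not COMKAT's API; the rate constants and the exponential plasma input are assumptions for a toy one-tissue compartment model dCt/dt = K1*Cp(t) - k2*Ct.

        import numpy as np
        from scipy.integrate import solve_ivp

        K1, k2 = 0.1, 0.05                 # rate constants, 1/min (illustrative)
        Cp = lambda t: np.exp(-0.3 * t)    # plasma input function (assumed form)

        def rhs(t, Ct):
            # One-tissue compartment model: uptake from plasma, washout from tissue
            return K1 * Cp(t) - k2 * Ct

        sol = solve_ivp(rhs, (0.0, 60.0), [0.0], dense_output=True)
        Ct_30 = sol.sol(30.0)[0]           # tissue concentration at t = 30 min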

  5. Can I Trust This Software Package? An Exercise in Validation of Computational Results

    ERIC Educational Resources Information Center

    Shacham, Mordechai; Brauner, Neima; Ashurst, W. Robert; Cutlip, Michael B.

    2008-01-01

    Mathematical software packages such as Polymath, MATLAB, and Mathcad are currently widely used for engineering problem solving. Applications of several of these packages to typical chemical engineering problems have been demonstrated by Cutlip, et al. The main characteristic of these packages is that they provide a "problem-solving environment…

  6. An Interactive Computer Aided Electrical Engineering Education Package.

    ERIC Educational Resources Information Center

    Cavati, Cicero Romao

    This paper describes an educational package designed to support the learning process. A case study is presented of an energy distribution course in the Electrical Engineering Department at the Federal University of Espirito Santo (UFES). The advantages of the developed package are shown by comparing it with a traditional academic textbook. This package presents…

  7. Eddylicious: A Python package for turbulent inflow generation

    NASA Astrophysics Data System (ADS)

    Mukha, Timofey; Liefvendahl, Mattias

    2018-01-01

    A Python package for generating inflow for scale-resolving computer simulations of turbulent flow is presented. The purpose of the package is to unite existing inflow generation methods in a single code base and make them accessible to users of various Computational Fluid Dynamics (CFD) solvers. The current functionality consists of an accurate inflow generation method suitable for flows with a turbulent-boundary-layer inflow, and input/output routines for coupling with the open-source CFD solver OpenFOAM.

  8. A Maple package for computing Gröbner bases for linear recurrence relations

    NASA Astrophysics Data System (ADS)

    Gerdt, Vladimir P.; Robertz, Daniel

    2006-04-01

    A Maple package for computing Gröbner bases of linear difference ideals is described. The underlying algorithm is based on Janet and Janet-like monomial divisions associated with finite difference operators. The package can be used, for example, for automatic generation of difference schemes for linear partial differential equations and for reduction of multiloop Feynman integrals. These two possible applications are illustrated by simple examples of the Laplace equation and a one-loop scalar integral of propagator type.
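
    The package itself is Maple code for linear difference ideals; for readers who want a quick feel for Gröbner-basis computation in general, an analogous calculation for an ordinary polynomial ideal can be run in Python with SymPy (a swapped-in illustration, not the Maple package's functionality):

        from sympy import groebner, symbols

        x, y = symbols("x y")
        # Lexicographic Groebner basis of the ideal <x*y - 1, x**2 - y>
        G = groebner([x * y - 1, x ** 2 - y], x, y, order="lex")
        print(G)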

  9. Computerised data reduction.

    PubMed

    Datson, D J; Carter, N G

    1988-10-01

    The use of personal computers in accountancy and business generally has been stimulated by the availability of flexible software packages. We describe the implementation of a commercial software package designed for interfacing with laboratory instruments and highlight the ease with which it can be set up, without the need for specialist computer programming staff.

  10. WiLE: A Mathematica package for weak coupling expansion of Wilson loops in ABJ(M) theory

    NASA Astrophysics Data System (ADS)

    Preti, M.

    2018-06-01

    We present WiLE, a Mathematica® package designed to perform the weak-coupling expansion of any Wilson loop in ABJ(M) theory at arbitrary perturbative order. For a given set of fields on the loop and internal vertices, the package displays all the possible Feynman diagrams and their integral representations. The user can also choose to exclude non-planar diagrams, tadpoles and self-energies. Through the use of interactive input windows, the package should be easily accessible to users with little or no previous experience. The package manual provides some pedagogical examples as well as the computation of all ladder diagrams at three loops relevant to the cusp anomalous dimension in ABJ(M). The latter application also supports some recent results computed in different contexts.

  11. [Development of analysis software package for the two kinds of Japanese fluoro-d-glucose-positron emission tomography guideline].

    PubMed

    Matsumoto, Keiichi; Endo, Keigo

    2013-06-01

    Two kinds of Japanese guidelines for the data acquisition protocol of oncology fluoro-D-glucose-positron emission tomography (FDG-PET)/computed tomography (CT) scans were created by the joint task force of the Japanese Society of Nuclear Medicine Technology (JSNMT) and the Japanese Society of Nuclear Medicine (JSNM), and published in Kakuigaku-Gijutsu 27(5): 425-456, 2007 and 29(2): 195-235, 2009. These guidelines aim to standardize PET image quality among facilities and different PET/CT scanner models. The objective of this study was to develop a personal computer-based performance measurement and image quality processor for the two kinds of Japanese guidelines for oncology (18)F-FDG PET/CT scans. We call this software package the "PET quality control tool" (PETquact). Microsoft Corporation's Windows(™) is used as the operating system for PETquact, which requires 1070×720 display resolution and includes 12 different applications. The accuracy of numerous PETquact applications was examined. For example, in the sensitivity application, the system sensitivity measurement results were equivalent when comparing two PET sinograms obtained from PETquact and from the report. PETquact is suited to analysis under the two kinds of Japanese guidelines, and it performs well in performance measurement and image quality analysis. PETquact can be used at any facility if the software package is installed on a laptop computer.

  12. Common Ada (tradename) Missile Package (CAMP) Project. Missile Software Parts. Volume 8. Detail Design Document

    DTIC Science & Technology

    1988-03-01

    PACKAGE BODY) TLCSC P661 (CATALOG #P106-0) This package contains the CAMP parts required to do the waypoint steering portion of navigation. The… 3.3.4.1.6 PROCESSING The following describes the processing performed by this part: package body Waypoint_Steering is … package body Steering_Vector_Operations is separate; package body Steering_Vector_Operations_with_Arcsin is separate; procedure Compute_Turn_Angle_and_Direction (Unit_Normal C

  13. Packaging printed circuit boards: A production application of interactive graphics

    NASA Technical Reports Server (NTRS)

    Perrill, W. A.

    1975-01-01

    The structure and use of an Interactive Graphics Packaging Program (IGPP), conceived to apply computer graphics to the design of packaging electronic circuits onto printed circuit boards (PCBs), are described. The intent was to combine the data storage and manipulative power of the computer with the imaginative, intuitive power of a human designer. The hardware includes a CDC 6400 computer and two CDC 777 terminals with CRT screens, light pens, and keyboards. The program is written in FORTRAN IV Extended, with the exception of a few functions coded in COMPASS (assembly language). The IGPP performs four major functions for the designer: (1) data input and display, (2) component placement (automatic or manual), (3) conductor path routing (automatic or manual), and (4) data output. The most complex PCB packaged to date measured 16.5 cm by 19 cm and contained 380 components, two layers of ground planes, and four layers of conductors mixed with ground planes.

  14. A model for the distributed storage and processing of large arrays

    NASA Technical Reports Server (NTRS)

    Mehrotra, P.; Pratt, T. W.

    1983-01-01

    A conceptual model for parallel computations on large arrays is developed. The model provides a set of language concepts appropriate for processing arrays which are generally too large to fit in the primary memories of a multiprocessor system. The semantic model is used to represent arrays on a concurrent architecture in such a way that the performance realities inherent in the distributed storage and processing can be adequately represented. An implementation of the large array concept as an Ada package is also described.

  15. Enhancement of DFT-calculations at petascale: Nuclear Magnetic Resonance, Hybrid Density Functional Theory and Car-Parrinello calculations

    NASA Astrophysics Data System (ADS)

    Varini, Nicola; Ceresoli, Davide; Martin-Samos, Layla; Girotto, Ivan; Cavazzoni, Carlo

    2013-08-01

    One of the most promising techniques used for studying the electronic properties of materials is based on the Density Functional Theory (DFT) approach and its extensions. DFT has been widely applied in traditional solid state physics problems, where periodicity and symmetry play a crucial role in reducing the computational workload. With growing compute power capability and the development of improved DFT methods, the range of potential applications now includes other scientific areas such as Chemistry and Biology. However, cross-disciplinary combinations of traditional Solid-State Physics, Chemistry and Biology drastically increase the system complexity while reducing the degree of periodicity and symmetry. Large simulation cells containing hundreds or even thousands of atoms are needed to model these kinds of physical systems, and their treatment remains a computational challenge even with modern supercomputers. In this paper we describe our work to improve the scalability of Quantum ESPRESSO (Giannozzi et al., 2009 [3]) for treating very large cells and huge numbers of electrons. To this end we have introduced an extra level of parallelism, over electronic bands, in three kernels for solving computationally expensive problems: the Sternheimer equation solver (Nuclear Magnetic Resonance, package QE-GIPAW), the Fock operator builder (electronic ground state, package PWscf) and most of the Car-Parrinello routines (Car-Parrinello dynamics, package CP). Final benchmarks show our success in computing the Nuclear Magnetic Resonance (NMR) chemical shift of a large biological assembly, the electronic structure of defected amorphous silica with hybrid exchange-correlation functionals, and the equilibrium atomic structure of eight porphyrins anchored to a carbon nanotube, on many thousands of CPU cores.

  16. Evaluation of the CMI Instructor Role Training Program in the Navy and Air Force. Final Report.

    ERIC Educational Resources Information Center

    McCombs, Barbara L.; And Others

    A computer managed instruction (CMI) instructor role definition and training package was designed to help CMI teachers acquire the skills necessary to perform seven theoretically-based instructor roles: planner, implementer/monitor, evaluator/provider, diagnostician, remediator, counselor/advisor, and tutor/modeler. Data for the evaluation of the…

  17. A Computer-Based Simulation for Teaching Heat Transfer across a Woody Stem

    ERIC Educational Resources Information Center

    Maixner, Michael R.; Noyd, Robert K.; Krueger, Jerome A.

    2010-01-01

    To assist student understanding of heat transfer through woody stems, we developed an instructional package that included an Excel-based, one-dimensional simulation model and a companion instructional worksheet. Guiding undergraduate botany students to apply principles of thermodynamics to plants in nature is fraught with two main obstacles:…

  18. Verification of a VRF Heat Pump Computer Model in EnergyPlus

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nigusse, Bereket; Raustad, Richard

    2013-06-15

    This paper provides verification results for the EnergyPlus variable refrigerant flow (VRF) heat pump computer model using manufacturer's performance data. The paper provides an overview of the VRF model, presents the verification methodology, and discusses the results. The verification provides a quantitative comparison of full- and part-load performance to manufacturer's data in cooling-only and heating-only modes of operation. The VRF heat pump computer model uses dual-range bi-quadratic performance curves to represent capacity and Energy Input Ratio (EIR) as a function of indoor and outdoor air temperatures, and dual-range quadratic performance curves as a function of part-load ratio for modeling part-load performance. These performance curves are generated directly from the manufacturer's published performance data. The verification compared the simulation output directly to the manufacturer's performance data, and found that the dual-range equation-fit VRF heat pump computer model predicts the manufacturer's performance data very well over a wide range of indoor and outdoor temperatures and part-load conditions. The predicted capacity and electric power deviations are comparable to equation-fit HVAC computer models commonly used for packaged and split unitary HVAC equipment.
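
    The curve-fit form described above can be sketched as follows; the coefficients are illustrative placeholders, not values from the paper or from EnergyPlus:

      # Bi-quadratic curve: capacity ratio as f(indoor temp, outdoor temp);
      # quadratic curve: correction as f(part-load ratio). "Dual range"
      # means separate coefficient sets are fit for low and high
      # temperature ranges; one set is shown here.
      def biquadratic(c, t_in, t_out):
          a, b, d, e, f, g = c
          return a + b*t_in + d*t_in**2 + e*t_out + f*t_out**2 + g*t_in*t_out

      def quadratic(c, plr):
          a, b, d = c
          return a + b*plr + d*plr**2

      cap_coeffs = (0.60, 0.030, -2.0e-4, -0.01, 1.0e-4, 3.0e-4)  # made up
      rated_capacity_kw = 10.0
      capacity = (rated_capacity_kw
                  * biquadratic(cap_coeffs, 19.0, 35.0)   # 19 C in, 35 C out
                  * quadratic((0.85, 0.15, 0.0), 0.7))    # PLR = 0.7
      print(capacity)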

  19. Computational thermochemistry: Automated generation of scale factors for vibrational frequencies calculated by electronic structure model chemistries

    NASA Astrophysics Data System (ADS)

    Yu, Haoyu S.; Fiedler, Lucas J.; Alecu, I. M.; Truhlar, Donald G.

    2017-01-01

    We present a Python program, FREQ, for calculating the optimal scale factors for harmonic vibrational frequencies, fundamental vibrational frequencies, and zero-point vibrational energies obtained from electronic structure calculations. The program utilizes a previously published scale factor optimization model (Alecu et al., 2010) to efficiently obtain all three scale factors from a set of computed harmonic vibrational frequencies. In order to obtain the three scale factors, the user only needs to provide zero-point energies of 15 or 6 selected molecules. If the user has access to the Gaussian 09 or Gaussian 03 program, we provide the option to run the program by entering the keywords for a certain method and basis set in Gaussian 09 or Gaussian 03. Four other Python programs, input.py, input6, pbs.py, and pbs6.py, are also provided for generating Gaussian 09 or Gaussian 03 input and PBS files. The program can also be used with data from any other electronic structure package. A manual describing how to use the program is included in the code package.
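
    The least-squares logic behind such scale factors can be sketched in a few lines; this is the generic closed form for minimizing the summed squared error between scaled and reference frequencies, not necessarily the exact procedure FREQ implements:

      # lam minimizing sum_i (lam*w_i - ref_i)^2  =>
      # lam = sum(w*ref) / sum(w*w)
      def optimal_scale_factor(calc, ref):
          num = sum(w * r for w, r in zip(calc, ref))
          den = sum(w * w for w in calc)
          return num / den

      calc = [3050.0, 1680.0, 1210.0]   # hypothetical harmonic freqs, cm^-1
      ref  = [2917.0, 1534.0, 1306.0]   # hypothetical reference fundamentals
      print(optimal_scale_factor(calc, ref))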

  20. Simulating Responses of Gravitational-Wave Instrumentation

    NASA Technical Reports Server (NTRS)

    Armstrong, John; Edlund, Jeffrey; Vallisneri, Michele

    2006-01-01

    Synthetic LISA is a computer program for simulating the responses of the instrumentation of the NASA/ESA Laser Interferometer Space Antenna (LISA) mission, the purpose of which is to detect and study gravitational waves. Synthetic LISA generates synthetic time series of the LISA fundamental noises, as filtered through all the time-delay interferometry (TDI) observables. (TDI is a method of canceling phase noise in temporally varying unequal-arm interferometers.) Synthetic LISA provides a streamlined module to compute the TDI responses to gravitational waves, according to a full model of TDI (including the motion of the LISA array and the temporal and directional dependence of the arm lengths). Synthetic LISA is written in the C++ programming language as a modular package that accommodates the addition of code for specific gravitational wave sources or for new noise models. In addition, time series for waves and noises can be easily loaded from disk storage or electronic memory. The package includes a Python-language interface for easy, interactive steering and scripting. Through Python, Synthetic LISA can read and write data files in the Flexible Image Transport System (FITS), a commonly used astronomical data format.

  1. MPPhys—A many-particle simulation package for computational physics education

    NASA Astrophysics Data System (ADS)

    Müller, Thomas

    2014-03-01

    In a first course on classical mechanics, elementary physical processes such as elastic two-body collisions, the mass-spring model, or the gravitational two-body problem are discussed in detail. The continuation to many-body systems, however, is deferred to graduate courses, although the underlying equations of motion are essentially the same and although there is strong motivation, for high-school students in particular, in the use of particle systems in computer games. The missing link between the simple and the more complex problem is a basic introduction to solving the equations of motion numerically, which can be illustrated by means of the Euler method. The many-particle physics simulation package MPPhys offers a platform to experiment with simple particle simulations. The aim is to give a basic idea of how to implement many-particle simulations and how simulation and visualization can be combined for interactive visual exploration.
    Catalogue identifier: AERR_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AERR_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 111327
    No. of bytes in distributed program, including test data, etc.: 608411
    Distribution format: tar.gz
    Programming language: C++, OpenGL, GLSL, OpenCL.
    Computer: Linux and Windows platforms with OpenGL support.
    Operating system: Linux and Windows.
    RAM: Source code 4.5 MB; complete package 242 MB
    Classification: 14, 16.9.
    External routines: OpenGL, OpenCL
    Nature of problem: Integrate N-body simulations, mass-spring models
    Solution method: Numerical integration of N-body simulations, 3D rendering via OpenGL.
    Running time: Problem dependent
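
    The "missing link" mentioned above fits in a few lines of Python; this is an explicit-Euler sketch of the gravitational two-body problem with arbitrary illustrative constants, not code from MPPhys (which is written in C++/OpenCL):

      # Explicit Euler integration of the gravitational two-body problem.
      G, m1, m2, dt = 1.0, 1.0, 1.0, 1.0e-3
      r1, r2 = [1.0, 0.0], [-1.0, 0.0]      # positions
      v1, v2 = [0.0, 0.35], [0.0, -0.35]    # velocities

      for _ in range(10_000):
          dx, dy = r2[0] - r1[0], r2[1] - r1[1]
          d3 = (dx*dx + dy*dy) ** 1.5       # |r2 - r1|^3
          a1 = (G*m2*dx/d3, G*m2*dy/d3)     # acceleration of body 1
          a2 = (-G*m1*dx/d3, -G*m1*dy/d3)   # acceleration of body 2
          v1 = [v1[0] + a1[0]*dt, v1[1] + a1[1]*dt]
          v2 = [v2[0] + a2[0]*dt, v2[1] + a2[1]*dt]
          r1 = [r1[0] + v1[0]*dt, r1[1] + v1[1]*dt]
          r2 = [r2[0] + v2[0]*dt, r2[1] + v2[1]*dt]
      print(r1, r2)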

  2. Next-generation acceleration and code optimization for light transport in turbid media using GPUs

    PubMed Central

    Alerstam, Erik; Lo, William Chun Yip; Han, Tianyi David; Rose, Jonathan; Andersson-Engels, Stefan; Lilge, Lothar

    2010-01-01

    A highly optimized Monte Carlo (MC) code package for simulating light transport is developed on the latest graphics processing unit (GPU) built for general-purpose computing from NVIDIA, the Fermi GPU. In biomedical optics, the MC method is the gold standard approach for simulating light transport in biological tissue, both due to its accuracy and its flexibility in modelling realistic, heterogeneous tissue geometry in 3-D. However, the widespread use of MC simulations in inverse problems, such as treatment planning for photodynamic therapy (PDT), is limited by their long computation time. Despite its parallel nature, optimizing MC code on the GPU has been shown to be a challenge, particularly when the sharing of simulation result matrices among many parallel threads demands the frequent use of atomic instructions to access the slow GPU global memory. This paper proposes an optimization scheme that utilizes the fast shared memory to resolve the performance bottleneck caused by atomic access, and discusses numerous other optimization techniques needed to harness the full potential of the GPU. Using these techniques, a widely accepted MC code package in biophotonics, called MCML, was successfully accelerated on a Fermi GPU by approximately 600x compared to a state-of-the-art Intel Core i7 CPU. A skin model consisting of 7 layers was used as the standard simulation geometry. To demonstrate the possibility of GPU cluster computing, the same GPU code was executed on four GPUs, showing a linear improvement in performance with an increasing number of GPUs. The GPU-based MCML code package, named GPU-MCML, is compatible with a wide range of graphics cards and is released as open-source software in two versions: an optimized version tuned for high performance and a simplified version for beginners (http://code.google.com/p/gpumcml). PMID:21258498

  3. DMRfinder: efficiently identifying differentially methylated regions from MethylC-seq data.

    PubMed

    Gaspar, John M; Hart, Ronald P

    2017-11-29

    DNA methylation is an epigenetic modification that is studied at single-base resolution with bisulfite treatment followed by high-throughput sequencing. After alignment of the sequence reads to a reference genome, methylation counts are analyzed to determine genomic regions that are differentially methylated between two or more biological conditions. Even though a variety of software packages is available for different aspects of the bioinformatics analysis, they often produce biased results or have excessive computational requirements. DMRfinder is a novel computational pipeline that identifies differentially methylated regions efficiently. Following alignment, DMRfinder extracts methylation counts and performs a modified single-linkage clustering of methylation sites into genomic regions. It then compares methylation levels using beta-binomial hierarchical modeling and Wald tests. Among its innovative attributes are the analyses of novel methylation sites and methylation linkage, as well as the simultaneous statistical analysis of multiple sample groups. To demonstrate its efficiency, DMRfinder is benchmarked against other computational approaches using a large published dataset. Contrasting two replicates of the same sample yielded minimal genomic regions with DMRfinder, whereas two alternative software packages reported a substantial number of false positives. Further analyses of biological samples revealed fundamental differences between DMRfinder and another software package, despite the fact that they utilize the same underlying statistical basis. For each step, DMRfinder completed the analysis in a fraction of the time required by other software. Among the computational approaches for identifying differentially methylated regions from high-throughput bisulfite sequencing datasets, DMRfinder is the first that integrates all the post-alignment steps in a single package. Compared to other software, DMRfinder is extremely efficient and unbiased in this process. DMRfinder is free and open-source software, available on GitHub ( github.com/jsh58/DMRfinder ); it is written in Python and R, and is supported on Linux.
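
    The clustering step can be pictured with a simplified stand-in: merge neighbouring methylation sites into a candidate region while consecutive sites are within a maximum gap. DMRfinder's actual criteria are more elaborate; max_gap and min_sites below are hypothetical parameters:

      # Group sorted CpG positions (bp) into candidate regions whenever
      # the gap to the previous site is at most max_gap; discard regions
      # with fewer than min_sites sites.
      def cluster_sites(positions, max_gap=100, min_sites=3):
          positions = sorted(positions)
          regions, current = [], [positions[0]]
          for pos in positions[1:]:
              if pos - current[-1] <= max_gap:
                  current.append(pos)
              else:
                  if len(current) >= min_sites:
                      regions.append((current[0], current[-1]))
                  current = [pos]
          if len(current) >= min_sites:
              regions.append((current[0], current[-1]))
          return regions

      print(cluster_sites([100, 150, 180, 900, 950, 980, 5000]))
      # [(100, 180), (900, 980)]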

  4. Quantifying High Frequency Skill and Relative Cost of Two Physics Packages for an Operational Spectral Wave Model

    NASA Astrophysics Data System (ADS)

    Edwards, K. L.; Siqueira, S. A.; Rogers, W. E.

    2017-12-01

    Simulating WAves Nearshore (SWAN) is one of the two phase-averaged spectral models used by the U.S. Navy for forecasting waves. The source term package available during SWAN's first validation for operational use in 2002 is referred to here as "SWAN-ST1". Since then, an improved source term package has been implemented (SWAN-ST6). One outcome of the new package is better skill in the higher frequencies of the spectrum, where SWAN-ST1 responds non-physically to changes in wind speed and to the presence of swell in the lower frequencies. However, SWAN-ST6 is typically 25% more expensive computationally than SWAN-ST1, so it is worthwhile to estimate its benefit for operational cases, where computation time is a primary concern. To validate SWAN-ST6 and quantify its improvement over SWAN-ST1, we apply the model to two geographic regions: the Northern Gulf of Mexico (NGoM) and Southern California (SoCal) coasts. The NGoM case is typified by smaller fetch and a smaller swell contribution. We analyze the NGoM case for a year (2010 July-2011 Aug.) and the SoCal application for summer (2016 July-Aug.) and winter (2017 Jan.-Feb.) seasons. We compare results to available buoy data. The wave parameter mean square slope (MSS) is included in this study, since it is sensitive to accuracy (or lack thereof) at higher frequencies, being weighted by frequency to the fourth power. Model-data comparisons for the NGoM case show little benefit from using SWAN-ST6; both model versions correlate highly with observations, a result attributable to (a) the lack of swell and (b) the fact that the shorter dominant periods in this basin imply that available buoys typically measure very little of the spectral tail (e.g. 2 to 4 times the peak frequency). Model-data comparisons for the SoCal test case show that SWAN-ST6 performs at least as well as, and most often better than, SWAN-ST1. The improvement is almost negligible for the lower-order moments, significant wave height and period, which are less sensitive to the spectral tail. However, SWAN-ST6 in the SoCal case dramatically improves the comparisons of MSS relative to SWAN-ST1. Thus, results show that, in cases where the wave climate and frequency range of buoy data permit validation of the spectral tail, the skill of MSS prediction is dramatically improved with the new source term package.
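
    The fourth-power weighting follows from the deep-water dispersion relation k = (2*pi*f)^2/g, which gives mss = (2*pi)^4/g^2 * integral of f^4 S(f) df. A numerical sketch, with a Pierson-Moskowitz-type spectrum standing in for model output:

      # Mean square slope from a frequency spectrum, deep-water dispersion.
      # The spectrum below is an illustrative Pierson-Moskowitz form,
      # not SWAN output.
      import numpy as np

      g = 9.81
      f = np.linspace(0.05, 0.5, 500)          # frequency grid, Hz
      fp, alpha = 0.1, 8.1e-3                  # peak frequency, PM constant
      S = alpha * g**2 / (2*np.pi)**4 * f**-5 * np.exp(-1.25 * (fp/f)**4)
      mss = (2*np.pi)**4 / g**2 * np.trapz(f**4 * S, f)
      print(mss)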

  5. Climate Ocean Modeling on a Beowulf Class System

    NASA Technical Reports Server (NTRS)

    Cheng, B. N.; Chao, Y.; Wang, P.; Bondarenko, M.

    2000-01-01

    With the growing power and shrinking cost of personal computers, the availability of fast Ethernet interconnections, and public domain software packages, it is now possible to combine them to build desktop parallel computers (named Beowulf or PC clusters) at a fraction of what it would cost to buy systems of comparable power from supercomputer companies. This led us to build and assemble our own system, specifically for climate ocean modeling. In this article, we present our experience with such a system, discuss its network performance, and provide some performance comparison data with both the HP SPP2000 and the Cray T3E for an ocean model used in present-day oceanographic research.

  6. Reliability modeling of fault-tolerant computer based systems

    NASA Technical Reports Server (NTRS)

    Bavuso, Salvatore J.

    1987-01-01

    Digital fault-tolerant computer-based systems have become commonplace in military and commercial avionics. These systems hold the promise of increased availability, reliability, and maintainability over conventional analog-based systems through the application of replicated digital computers arranged in fault-tolerant configurations. Three tightly coupled factors of paramount importance, ultimately determining the viability of these systems, are reliability, safety, and profitability. Reliability, the major driver, affects virtually every aspect of design, packaging, and field operations, and eventually produces profit for commercial applications or increased national security. However, the utilization of digital computer systems makes the task of producing a credible reliability assessment a formidable one for the reliability engineer. The root of the problem lies in the digital computer's unique adaptability to changing requirements, its computational power, and its ability to test itself efficiently. Addressed here are the nuances of modeling the reliability of systems with large state sizes, in the Markov sense, which result from systems based on replicated redundant hardware, and the modeling of factors which can reduce reliability without concomitant depletion of hardware. Advanced fault-handling models are described, and methods of acquiring and measuring parameters for these models are delineated.
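
    A minimal sketch of a Markov reliability model of the kind discussed: a hypothetical duplex system with two identical units, a constant (illustrative) failure rate, and no repair:

      # States: 0 = both units up, 1 = one unit up, 2 = system failed.
      import numpy as np
      from scipy.linalg import expm

      lam = 1e-4                       # per-hour failure rate (hypothetical)
      Q = np.array([[-2*lam, 2*lam, 0.0],
                    [0.0,   -lam,   lam],
                    [0.0,    0.0,   0.0]])   # Markov generator matrix

      t = 1000.0                       # mission time, hours
      P = expm(Q * t)                  # state transition probabilities
      reliability = P[0, 0] + P[0, 1]  # probability the system has not failed
      print(reliability)               # ~0.991 for these rates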

  7. ExGUtils: A Python Package for Statistical Analysis With the ex-Gaussian Probability Density.

    PubMed

    Moret-Tatay, Carmen; Gamermann, Daniel; Navarro-Pardo, Esperanza; Fernández de Córdoba Castellá, Pedro

    2018-01-01

    The study of reaction times and their underlying cognitive processes is an important field in Psychology. Reaction times are often modeled with the ex-Gaussian distribution because it provides a good fit to many empirical datasets. The complexity of this distribution makes computational tools an essential element of research in this area, so there is a strong need for efficient and versatile software. In this manuscript we discuss some mathematical details of the ex-Gaussian distribution and apply the ExGUtils package, a set of functions and numerical tools programmed in Python and developed for the numerical analysis of data involving the ex-Gaussian probability density. In order to validate the package, we present an extensive analysis of fits obtained with it, discuss advantages of and differences between the least squares and maximum likelihood methods, and quantitatively evaluate the goodness of the obtained fits (a point usually overlooked in the literature). The analysis allows one to identify outliers in the empirical datasets and to determine judiciously whether there is a need for data trimming and, if so, at which points it should be done.
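
    For orientation, the ex-Gaussian is a Gaussian (mu, sigma) convolved with an exponential of mean tau; scipy exposes the same distribution as exponnorm with shape K = tau/sigma, so a quick sanity check is easy (parameter values are hypothetical reaction-time figures in ms):

      # Ex-Gaussian via scipy: Normal(mu, sigma) + Exponential(mean tau).
      from scipy.stats import exponnorm

      mu, sigma, tau = 400.0, 50.0, 100.0
      rt = exponnorm(K=tau/sigma, loc=mu, scale=sigma)
      print(rt.pdf(500.0), rt.mean())   # mean = mu + tau = 500 ms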


  9. Aspects on Transfer of Aided - Design Files

    NASA Astrophysics Data System (ADS)

    Goanta, A. M.; Anghelache, D. G.

    2016-08-01

    At this stage in the development of hardware and software, each company that makes design software packages has created, and customized over time, its own file type to distinguish itself from its competitors. Thus the DWG files of AutoCAD, the IPT/IAM files of Inventor, the PAR/ASM files of Solid Edge, the PRT files of NX, and so on, are widely known today. Behind every file type there is a mathematical model, which may be common to several file types. A specific aspect of computer-aided design is that all packages work with both individual parts and assemblies, but their approaches differ: some use the same file type for each part and for the assembly (PRT), while others use different file types (IPT/IAM, PAR/ASM, etc.). Another aspect of computer-aided design is the transfer of files between companies that use different software packages, or even the same package in different versions. Each of these situations generates distinct issues. Thus, to allow a file to be read, at least partially, by software other than the one that produced it, transfer files of the STEP and IGES type are used.

  10. Physics Computing '92: Proceedings of the 4th International Conference

    NASA Astrophysics Data System (ADS)

    de Groot, Robert A.; Nadrchal, Jaroslav

    1993-04-01

    The Table of Contents for the book is as follows: * Preface * INVITED PAPERS * Ab Initio Theoretical Approaches to the Structural, Electronic and Vibrational Properties of Small Clusters and Fullerenes: The State of the Art * Neural Multigrid Methods for Gauge Theories and Other Disordered Systems * Multicanonical Monte Carlo Simulations * On the Use of the Symbolic Language Maple in Physics and Chemistry: Several Examples * Nonequilibrium Phase Transitions in Catalysis and Population Models * Computer Algebra, Symmetry Analysis and Integrability of Nonlinear Evolution Equations * The Path-Integral Quantum Simulation of Hydrogen in Metals * Digital Optical Computing: A New Approach of Systolic Arrays Based on Coherence Modulation of Light and Integrated Optics Technology * Molecular Dynamics Simulations of Granular Materials * Numerical Implementation of a K.A.M. Algorithm * Quasi-Monte Carlo, Quasi-Random Numbers and Quasi-Error Estimates * What Can We Learn from QMC Simulations * Physics of Fluctuating Membranes * Plato, Apollonius, and Klein: Playing with Spheres * Steady States in Nonequilibrium Lattice Systems * CONVODE: A REDUCE Package for Differential Equations * Chaos in Coupled Rotators * Symplectic Numerical Methods for Hamiltonian Problems * Computer Simulations of Surfactant Self Assembly * High-dimensional and Very Large Cellular Automata for Immunological Shape Space * A Review of the Lattice Boltzmann Method * Electronic Structure of Solids in the Self-interaction Corrected Local-spin-density Approximation * Dedicated Computers for Lattice Gauge Theory Simulations * Physics Education: A Survey of Problems and Possible Solutions * Parallel Computing and Electronic-Structure Theory * High Precision Simulation Techniques for Lattice Field Theory * CONTRIBUTED PAPERS * Case Study of Microscale Hydrodynamics Using Molecular Dynamics and Lattice Gas Methods * Computer Modelling of the Structural and Electronic Properties of the Supported Metal Catalysis * Ordered Particle Simulations for Serial and MIMD Parallel Computers * "NOLP" -- Program Package for Laser Plasma Nonlinear Optics * Algorithms to Solve Nonlinear Least Square Problems * Distribution of Hydrogen Atoms in Pd-H Computed by Molecular Dynamics * A Ray Tracing of Optical System for Protein Crystallography Beamline at Storage Ring-SIBERIA-2 * Vibrational Properties of a Pseudobinary Linear Chain with Correlated Substitutional Disorder * Application of the Software Package Mathematica in Generalized Master Equation Method * Linelist: An Interactive Program for Analysing Beam-foil Spectra * GROMACS: A Parallel Computer for Molecular Dynamics Simulations * GROMACS Method of Virial Calculation Using a Single Sum * The Interactive Program for the Solution of the Laplace Equation with the Elimination of Singularities for Boundary Functions * Random-Number Generators: Testing Procedures and Comparison of RNG Algorithms * Micro-TOPIC: A Tokamak Plasma Impurities Code * Rotational Molecular Scattering Calculations * Orthonormal Polynomial Method for Calibrating of Cryogenic Temperature Sensors * Frame-based System Representing Basis of Physics * The Role of Massively Data-parallel Computers in Large Scale Molecular Dynamics Simulations * Short-range Molecular Dynamics on a Network of Processors and Workstations * An Algorithm for Higher-order Perturbation Theory in Radiative Transfer Computations * Hydrostochastics: The Master Equation Formulation of Fluid Dynamics * HPP Lattice Gas on Transputers and Networked Workstations 
* Study on the Hysteresis Cycle Simulation Using Modeling with Different Functions on Intervals * Refined Pruning Techniques for Feed-forward Neural Networks * Random Walk Simulation of the Motion of Transient Charges in Photoconductors * The Optical Hysteresis in Hydrogenated Amorphous Silicon * Diffusion Monte Carlo Analysis of Modern Interatomic Potentials for He * A Parallel Strategy for Molecular Dynamics Simulations of Polar Liquids on Transputer Arrays * Distribution of Ions Reflected on Rough Surfaces * The Study of Step Density Distribution During Molecular Beam Epitaxy Growth: Monte Carlo Computer Simulation * Towards a Formal Approach to the Construction of Large-scale Scientific Applications Software * Correlated Random Walk and Discrete Modelling of Propagation through Inhomogeneous Media * Teaching Plasma Physics Simulation * A Theoretical Determination of the Au-Ni Phase Diagram * Boson and Fermion Kinetics in One-dimensional Lattices * Computational Physics Course on the Technical University * Symbolic Computations in Simulation Code Development and Femtosecond-pulse Laser-plasma Interaction Studies * Computer Algebra and Integrated Computing Systems in Education of Physical Sciences * Coordinated System of Programs for Undergraduate Physics Instruction * Program Package MIRIAM and Atomic Physics of Extreme Systems * High Energy Physics Simulation on the T_Node * The Chapman-Kolmogorov Equation as Representation of Huygens' Principle and the Monolithic Self-consistent Numerical Modelling of Lasers * Authoring System for Simulation Developments * Molecular Dynamics Study of Ion Charge Effects in the Structure of Ionic Crystals * A Computational Physics Introductory Course * Computer Calculation of Substrate Temperature Field in MBE System * Multimagnetical Simulation of the Ising Model in Two and Three Dimensions * Failure of the CTRW Treatment of the Quasicoherent Excitation Transfer * Implementation of a Parallel Conjugate Gradient Method for Simulation of Elastic Light Scattering * Algorithms for Study of Thin Film Growth * Algorithms and Programs for Physics Teaching in Romanian Technical Universities * Multicanonical Simulation of 1st order Transitions: Interface Tension of the 2D 7-State Potts Model * Two Numerical Methods for the Calculation of Periodic Orbits in Hamiltonian Systems * Chaotic Behavior in a Probabilistic Cellular Automata? 
* Wave Optics Computing by a Networked-based Vector Wave Automaton * Tensor Manipulation Package in REDUCE * Propagation of Electromagnetic Pulses in Stratified Media * The Simple Molecular Dynamics Model for the Study of Thermalization of the Hot Nucleon Gas * Electron Spin Polarization in PdCo Alloys Calculated by KKR-CPA-LSD Method * Simulation Studies of Microscopic Droplet Spreading * A Vectorizable Algorithm for the Multicolor Successive Overrelaxation Method * Tetragonality of the CuAu I Lattice and Its Relation to Electronic Specific Heat and Spin Susceptibility * Computer Simulation of the Formation of Metallic Aggregates Produced by Chemical Reactions in Aqueous Solution * Scaling in Growth Models with Diffusion: A Monte Carlo Study * The Nucleus as the Mesoscopic System * Neural Network Computation as Dynamic System Simulation * First-principles Theory of Surface Segregation in Binary Alloys * Data Smooth Approximation Algorithm for Estimating the Temperature Dependence of the Ice Nucleation Rate * Genetic Algorithms in Optical Design * Application of 2D-FFT in the Study of Molecular Exchange Processes by NMR * Advanced Mobility Model for Electron Transport in P-Si Inversion Layers * Computer Simulation for Film Surfaces and its Fractal Dimension * Parallel Computation Techniques and the Structure of Catalyst Surfaces * Educational SW to Teach Digital Electronics and the Corresponding Text Book * Primitive Trinomials (Mod 2) Whose Degree is a Mersenne Exponent * Stochastic Modelisation and Parallel Computing * Remarks on the Hybrid Monte Carlo Algorithm for the φ⁴ Model * An Experimental Computer Assisted Workbench for Physics Teaching * A Fully Implicit Code to Model Tokamak Plasma Edge Transport * EXPFIT: An Interactive Program for Automatic Beam-foil Decay Curve Analysis * Mapping Technique for Solving General, 1-D Hamiltonian Systems * Freeway Traffic, Cellular Automata, and Some (Self-Organizing) Criticality * Photonuclear Yield Analysis by Dynamic Programming * Incremental Representation of the Simply Connected Planar Curves * Self-convergence in Monte Carlo Methods * Adaptive Mesh Technique for Shock Wave Propagation * Simulation of Supersonic Coronal Streams and Their Interaction with the Solar Wind * The Nature of Chaos in Two Systems of Ordinary Nonlinear Differential Equations * Considerations of a Window-shopper * Interpretation of Data Obtained by RTP 4-Channel Pulsed Radar Reflectometer Using a Multi Layer Perceptron * Statistics of Lattice Bosons for Finite Systems * Fractal Based Image Compression with Affine Transformations * Algorithmic Studies on Simulation Codes for Heavy-ion Reactions * An Energy-Wise Computer Simulation of DNA-Ion-Water Interactions Explains the Abnormal Structure of Poly[d(A)]:Poly[d(T)] * Computer Simulation Study of Kosterlitz-Thouless-Like Transitions * Problem-oriented Software Package GUN-EBT for Computer Simulation of Beam Formation and Transport in Technological Electron-Optical Systems * Parallelization of a Boundary Value Solver and its Application in Nonlinear Dynamics * The Symbolic Classification of Real Four-dimensional Lie Algebras * Short, Singular Pulses Generation by a Dye Laser at Two Wavelengths Simultaneously * Quantum Monte Carlo Simulations of the Apex-Oxygen-Model * Approximation Procedures for the Axial Symmetric Static Einstein-Maxwell-Higgs Theory * Crystallization on a Sphere: Parallel Simulation on a Transputer Network * FAMULUS: A Software Product (also) for Physics Education * MathCAD vs.
FAMULUS -- A Brief Comparison * First-principles Dynamics Used to Study Dissociative Chemisorption * A Computer Controlled System for Crystal Growth from Melt * A Time Resolved Spectroscopic Method for Short Pulsed Particle Emission * Green's Function Computation in Radiative Transfer Theory * Random Search Optimization Technique for One-criteria and Multi-criteria Problems * Hartley Transform Applications to Thermal Drift Elimination in Scanning Tunneling Microscopy * Algorithms of Measuring, Processing and Interpretation of Experimental Data Obtained with Scanning Tunneling Microscope * Time-dependent Atom-surface Interactions * Local and Global Minima on Molecular Potential Energy Surfaces: An Example of N3 Radical * Computation of Bifurcation Surfaces * Symbolic Computations in Quantum Mechanics: Energies in Next-to-solvable Systems * A Tool for RTP Reactor and Lamp Field Design * Modelling of Particle Spectra for the Analysis of Solid State Surface * List of Participants

  11. A simulation model for wind energy storage systems. Volume 1: Technical report

    NASA Technical Reports Server (NTRS)

    Warren, A. W.; Edsinger, R. W.; Chan, Y. K.

    1977-01-01

    A comprehensive computer program for the modeling of wind energy and storage systems utilizing any combination of five types of storage (pumped hydro, battery, thermal, flywheel and pneumatic) was developed. The level of detail of the Simulation Model for Wind Energy Storage (SIMWEST) is consistent with its role of evaluating the economic feasibility as well as the general performance of wind energy systems. The software package consists of two basic programs and a library of system, environmental, and load components. The first program is a precompiler which generates computer models (in FORTRAN) of complex wind source/storage application systems from user specifications, using the respective library components. The second program provides the techno-economic system analysis with the respective I/O, the integration of system dynamics, and the iteration for conveyance of variables. The SIMWEST program, as described, runs on UNIVAC 1100 series computers.

  12. METLIN-PC: An applications-program package for problems of mathematical programming

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pshenichnyi, B.N.; Sobolenko, L.A.; Sosnovskii, A.A.

    1994-05-01

    The METLIN-PC applications-program package (APP) was developed at the V.M. Glushkov Institute of Cybernetics of the Academy of Sciences of Ukraine on IBM PC XT and AT computers. The present version of the package was written in Turbo Pascal and Fortran-77. METLIN-PC is chiefly designed for the solution of smooth problems of mathematical programming and is a further development of the METLIN prototype, which was created earlier on a BESM-6 computer. The principal property of the previous package is retained: the applications modules employ a single approach based on the linearization method of B.N. Pshenichnyi. Hence the name "METLIN."

  13. Language Analysis Package (L.A.P.) Version I System Design.

    ERIC Educational Resources Information Center

    Porch, Ann

    To permit researchers to use the speed and versatility of the computer to process natural language text as well as numerical data without undergoing special training in programming or computer operations, a language analysis package has been developed, partially based on several existing programs. An overview of the design is provided and system…

  14. Macintosh Computer Classroom and Laboratory Security: Preventing Unwanted Changes to the System.

    ERIC Educational Resources Information Center

    Senn, Gary J.; Smyth, Thomas J. C.

    Because of the graphical interface and "openness" of the operating system, Macintosh computers are susceptible to undesirable changes by the user. This presentation discusses the advantages and disadvantages of software packages that offer protection for the Macintosh system. The two basic forms of software security packages include a…

  15. Sensitivity of a cloud parameterization package in the National Center for Atmospheric Research Community Climate Model

    NASA Astrophysics Data System (ADS)

    Kao, C.-Y. J.; Smith, W. S.

    1999-05-01

    A physically based cloud parameterization package, which includes the Arakawa-Schubert (AS) scheme for subgrid-scale convective clouds and the Sundqvist (SUN) scheme for nonconvective grid-scale layered clouds (hereafter referred to as the SUNAS cloud package), is incorporated into the National Center for Atmospheric Research (NCAR) Community Climate Model, Version 2 (CCM2). The AS scheme is used for a more reasonable heating distribution due to convective clouds and their associated precipitation. The SUN scheme allows for the prognostic computation of cloud water so that the cloud optical properties are more physically determined for shortwave and longwave radiation calculations. In addition, the formation of anvil-like clouds from deep convective systems can be simulated with the SUNAS package. A 10-year simulation spanning the period from 1980 to 1989 is conducted, and the effect of the cloud package on the January climate is assessed by comparing it with various available data sets and the National Centers for Environmental Prediction (NCEP)/NCAR reanalysis. Strengths and deficiencies of both the SUN and AS methods are identified and discussed. The AS scheme improves some aspects of the model dynamics and precipitation, especially with respect to the Pacific North America (PNA) pattern. CCM2's tendency to produce a westward bias of the 500 mbar stationary wave (time-averaged zonal anomalies) in the PNA sector is remedied, apparently because of a less "locked-in" heating pattern in the tropics. The additional degree of freedom added by the prognostic calculation of cloud water in the SUN scheme produces interesting results in the modeled cloud and radiation fields compared with data. In general, too little cloud water forms in the tropics, while excessive cloud cover and cloud liquid water are simulated in midlatitudes. This results in a somewhat degraded simulation of the radiation budget. The overall simulated precipitation by the SUNAS package is, however, substantially improved over the original CCM2.

  16. simBio: a Java package for the development of detailed cell models.

    PubMed

    Sarai, Nobuaki; Matsuoka, Satoshi; Noma, Akinori

    2006-01-01

    Quantitative dynamic computer models, which integrate a variety of molecular functions into a cell model, provide a powerful tool to create and test working hypotheses. We have developed a new modeling tool, the simBio package (freely available from ), which can be used for constructing cell models, such as cardiac cells (the Kyoto model from Matsuoka et al., 2003, 2004a, b, the LRd model from Faber and Rudy, 2000, and the Noble 98 model from Noble et al., 1998), epithelial cells (Strieter et al., 1990) and pancreatic beta cells (Magnus and Keizer, 1998). The simBio package is written in Java, uses XML, and can solve ordinary differential equations. In an attempt to mimic biological functional structures, a cell model in simBio is composed of independent functional modules called Reactors, such as ion channels and the sarcoplasmic reticulum, and dynamic variables called Nodes, such as ion concentrations. The interactions between Reactors and Nodes are described by graph theory, and the resulting graph represents a blueprint of an intricate cellular system. Reactors are prepared in a hierarchical order, in analogy to biological classification. Each Reactor can be composed or improved independently and can easily be reused in different models. This way of building models, through the combination of various modules, is enabled through the use of object-oriented programming concepts. Thus, simBio is a straightforward system for the creation of a variety of cell models on a common database of functional modules.
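
    The Reactor/Node decomposition can be sketched conceptually as follows; simBio itself is a Java package, and every class and parameter name here is invented for illustration:

      # Nodes hold dynamic variables; Reactors read Nodes and contribute
      # rates, so the model is assembled by composing independent modules.
      class Node:                       # a dynamic variable, e.g. [Ca2+]
          def __init__(self, value):
              self.value = value

      class LeakChannel:                # a Reactor contributing d[ion]/dt
          def __init__(self, ion, g):
              self.ion, self.g = ion, g
          def rate(self):
              return -self.g * self.ion.value

      ca = Node(1.0e-4)                 # hypothetical concentration, mM
      reactors = [LeakChannel(ca, 0.05)]

      dt = 0.01
      for _ in range(1000):             # explicit Euler over the graph
          ca.value += sum(r.rate() for r in reactors) * dt
      print(ca.value)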

  17. Geometric modeling for computer aided design

    NASA Technical Reports Server (NTRS)

    Schwing, James L.

    1993-01-01

    Over the past several years, it has been the primary goal of this grant to design and implement software to be used in the conceptual design of aerospace vehicles. The work carried out under this grant was performed jointly with members of the Vehicle Analysis Branch (VAB) of NASA LaRC, Computer Sciences Corp., and Vigyan Corp. This has resulted in the development of several packages and design studies. Primary among these are the interactive geometric modeling tool, the Solid Modeling Aerospace Research Tool (SMART), and the integration and execution tools provided by the Environment for Application Software Integration and Execution (EASIE). In addition, the personnel of this grant provide consultation in the areas of structural design, algorithm development, and software development and implementation, particularly in the areas of computer-aided design, geometric surface representation, and parallel algorithms.

  18. ATK-ForceField: a new generation molecular dynamics software package

    NASA Astrophysics Data System (ADS)

    Schneider, Julian; Hamaekers, Jan; Chill, Samuel T.; Smidstrup, Søren; Bulin, Johannes; Thesen, Ralph; Blom, Anders; Stokbro, Kurt

    2017-12-01

    ATK-ForceField is a software package for atomistic simulations using classical interatomic potentials. It is implemented as a part of the Atomistix ToolKit (ATK), which is a Python programming environment that makes it easy to create and analyze both standard and highly customized simulations. This paper focuses on the atomic interaction potentials, molecular dynamics, and geometry optimization features of the software; however, many more advanced modeling features are available. The implementation details of these algorithms and their computational performance are shown. We present three illustrative examples of the types of calculations that are possible with ATK-ForceField: modeling thermal transport properties in a silicon germanium crystal, vapor deposition of selenium molecules on a selenium surface, and a simulation of creep in a copper polycrystal.
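
    At the core of any such MD engine is a time-stepping loop. The sketch below is a generic velocity-Verlet step for one harmonic degree of freedom, not ATK's API; all names and values are illustrative:

      # Velocity-Verlet integration of a 1-D harmonic bond.
      def force(x, k=1.0):              # hypothetical interatomic force
          return -k * x

      x, v, m, dt = 1.0, 0.0, 1.0, 0.01
      f = force(x)
      for _ in range(1000):
          v += 0.5 * f / m * dt         # half-kick
          x += v * dt                   # drift
          f = force(x)                  # recompute force at new position
          v += 0.5 * f / m * dt         # half-kick
      print(x, v)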

  19. Modeling ICF With RAGE, BHR, And The New Laser Package

    NASA Astrophysics Data System (ADS)

    Cliche, Dylan; Welser-Sherrill, Leslie; Haines, Brian; Mancini, Roberto

    2017-10-01

    Inertial Confinement Fusion (ICF) is one method used to obtain thermonuclear burn through either direct or indirect ablation of a millimeter-scale capsule with several lasers. Although progress has been made in theory, experiment, and diagnostics, the community has yet to reach ignition. One way of investigating this is through high-performance computer simulations of the implosion. RAGE is an advanced 1D, 2D, and 3D radiation adaptive grid Eulerian code used to simulate the hydrodynamics of a system. Because the interface between two fluids of unequal density accelerating into one another is unstable, it is important to include a turbulence model. BHR is a turbulence model which uses Reynolds-averaged Navier-Stokes (RANS) equations to model the mixing that occurs between the shell and the fusion fuel material. Until recently, it was still difficult to model direct-drive experiments because there was no laser energy deposition model in RAGE. A new laser energy deposition model has now been implemented using the same ray-tracing method as the Mazinisin laser package used at the OMEGA laser facility at the Laboratory for Laser Energetics (LLE) in Rochester, New York. Using the new laser package along with BHR for mixing allows us to more accurately simulate ICF implosions and obtain spatially and temporally resolved information (e.g. position, temperature, density, and mix concentrations) to give insight into what is happening inside the implosion.

  20. lme4qtl: linear mixed models with flexible covariance structure for genetic studies of related individuals.

    PubMed

    Ziyatdinov, Andrey; Vázquez-Santiago, Miquel; Brunel, Helena; Martinez-Perez, Angel; Aschard, Hugues; Soria, Jose Manuel

    2018-02-27

    Quantitative trait locus (QTL) mapping in genetic data often involves the analysis of correlated observations, which need to be accounted for to avoid false association signals. This is commonly performed by modeling such correlations as random effects in linear mixed models (LMMs). The R package lme4 is a well-established tool that implements major LMM features using sparse matrix methods; however, it is not fully adapted for QTL mapping in association and linkage studies. In particular, two LMM features are lacking in the base version of lme4: the definition of random effects by custom covariance matrices, and parameter constraints, which are essential in advanced QTL models. Apart from applications in linkage studies of related individuals, such functionality is of high interest for association studies in situations where multiple covariance matrices need to be modeled, a scenario not covered by most genome-wide association study (GWAS) software. To address the aforementioned limitations, we developed a new R package, lme4qtl, as an extension of lme4. First, lme4qtl contributes new models for genetic studies within a single tool integrated with lme4 and its companion packages. Second, lme4qtl offers a flexible framework for scenarios with multiple levels of relatedness and becomes efficient when covariance matrices are sparse. We showed the value of our package using real family-based data in the Genetic Analysis of Idiopathic Thrombophilia 2 (GAIT2) project. Our software lme4qtl enables QTL mapping models with a versatile structure of random effects and efficient computation for sparse covariances. lme4qtl is available at https://github.com/variani/lme4qtl .

  1. ImagePy: an open-source, Python-based and platform-independent software package for bioimage analysis.

    PubMed

    Wang, Anliang; Yan, Xiaolong; Wei, Zhijun

    2018-04-27

    This note presents the design of a scalable software package named ImagePy for analysing biological images. Our contribution concentrates on facilitating the extensibility and interoperability of the software by decoupling the data model from the user interface. Especially with assistance from the Python ecosystem, this software framework makes modern computer algorithms easier to apply in bioimage analysis. ImagePy is free and open source software, with documentation and code available at https://github.com/Image-Py/imagepy under the BSD license. It has been tested on the Windows, Mac and Linux operating systems. wzjdlut@dlut.edu.cn or yxdragon@imagepy.org.

  2. SPARSKIT: A basic tool kit for sparse matrix computations

    NASA Technical Reports Server (NTRS)

    Saad, Youcef

    1990-01-01

    Presented here are the main features of a tool package for manipulating and working with sparse matrices. One of the goals of the package is to provide basic tools to facilitate the exchange of software and data between researchers in sparse matrix computations. The starting point is the Harwell/Boeing collection of matrices for which the authors provide a number of tools. Among other things, the package provides programs for converting data structures, printing simple statistics on a matrix, plotting a matrix profile, and performing linear algebra operations with sparse matrices.
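
    A conceptual analogue in Python of the kinds of operations the kit provides (SPARSKIT itself is Fortran; scipy.sparse stands in here):

      # Build a sparse matrix, convert between storage formats, print
      # simple statistics, and apply a linear algebra operation.
      import numpy as np
      from scipy.sparse import coo_matrix

      A = coo_matrix(([1.0, 2.0, 3.0], ([0, 1, 2], [0, 2, 1])), shape=(3, 3))
      B = A.tocsr()                 # data-structure conversion (COO -> CSR)
      print(B.nnz, B.shape)         # simple statistics on the matrix
      y = B @ np.ones(3)            # sparse matrix-vector product
      print(y)                      # [1. 2. 3.]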

  3. COPASI and its applications in biotechnology.

    PubMed

    Bergmann, Frank T; Hoops, Stefan; Klahn, Brian; Kummer, Ursula; Mendes, Pedro; Pahle, Jürgen; Sahle, Sven

    2017-11-10

    COPASI is software used for the creation, modification, simulation and computational analysis of kinetic models in various fields. It is open-source, available for all major platforms and provides a user-friendly graphical user interface, but is also controllable via the command line and scripting languages. These are likely reasons for its wide acceptance. We begin this review with a short introduction describing the general approaches and techniques used in computational modeling in the biosciences. Next we introduce the COPASI package, and its capabilities, before looking at typical applications of COPASI in biotechnology. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  4. Computer investigations of the turbulent flow around a NACA2415 airfoil wind turbine

    NASA Astrophysics Data System (ADS)

    Driss, Zied; Chelbi, Tarek; Abid, Mohamed Salah

    2015-12-01

    In this work, computer investigations are carried out to study the flow field developing around a NACA2415 airfoil wind turbine. The Navier-Stokes equations in conjunction with the standard k-ɛ turbulence model are considered. These equations are solved numerically to determine the local characteristics of the flow. The models tested are implemented in the software "SolidWorks Flow Simulation", which uses a finite volume scheme. The numerical results are compared with experiments conducted in an open wind tunnel to validate them. This will help improve the aerodynamic efficiency in the design of packaged installations of the NACA2415 airfoil type wind turbine.
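
    For reference, the standard k-ɛ model closes the RANS equations with an eddy viscosity built from the turbulent kinetic energy k and its dissipation rate ɛ (the usual textbook closure, written in LaTeX):

      % Eddy-viscosity closure of the standard k-epsilon model
      \nu_t = C_\mu \frac{k^2}{\varepsilon}, \qquad C_\mu = 0.09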

  5. Performance and Reliability of Bonded Interfaces for High-Temperature Packaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paret, Paul P

    2017-08-02

    Sintered silver has proven to be a promising candidate for use as a die-attach and substrate-attach material in automotive power electronics components. It holds promise of greater reliability than lead-based and lead-free solders, especially at higher temperatures (>200 degrees C). Accurate predictive lifetime models of sintered silver need to be developed and its failure mechanisms thoroughly characterized before it can be deployed as a die-attach or substrate-attach material in wide-bandgap device-based packages. Mechanical characterization tests that result in stress-strain curves and accelerated tests that produce cycles-to-failure results will be conducted. Also, we present a finite element method (FEM) modeling methodology that can offer greater accuracy in predicting the failure of sintered silver under accelerated thermal cycling. A fracture mechanics-based approach is adopted in the FEM model, and J-integral/thermal cycle values are computed.

  6. Novel Ruggedized Packaging Technology for VCSELs

    DTIC Science & Technology

    2017-03-01

    Novel Ruggedized Packaging Technology for VCSELs Charlie Kuznia ckuznia@ultracomm-inc.com Ultra Communications, Inc. Vista, CA, USA, 92081… can achieve low-power, EMI-immune links within high-performance military computing and sensor systems. Figure 1. Chip-scale packaging of

  7. Introduction to Software Packages. [Final Report.

    ERIC Educational Resources Information Center

    Frankel, Sheila, Ed.; And Others

    This document provides an introduction to applications computer software packages that support functional managers in government and encourages the use of such packages as an alternative to in-house development. A review of current application areas includes budget/project management, financial management/accounting, payroll, personnel,…

  8. Program of research in severe storms

    NASA Technical Reports Server (NTRS)

    1979-01-01

    Two modeling areas are examined: the development of a mesoscale chemistry-meteorology interaction model, and the development of a combined urban chemical kinetics-transport model. The problems associated with developing a three-dimensional combined meteorological-chemical kinetics computer program package are defined. A similar three-dimensional hydrostatic real-time model, which solves the fundamental Navier-Stokes equations for nonviscous flow, is described. An urban air quality simulation model, developed to predict the temporal and spatial distribution of reactive and nonreactive gases in and around an urban area and to support a remote sensor evaluation program, is reported.

  9. Development of Boundary Condition Independent Reduced Order Thermal Models using Proper Orthogonal Decomposition

    NASA Astrophysics Data System (ADS)

    Raghupathy, Arun; Ghia, Karman; Ghia, Urmila

    2008-11-01

    Compact Thermal Models (CTMs) to represent IC packages have traditionally been developed using the DELPHI-based (DEvelopment of Libraries of PHysical models for an Integrated design) methodology. The drawbacks of this method are presented, and an alternative method is proposed. A reduced-order model that provides complete thermal information accurately and with fewer computational resources can be used effectively in system-level simulations. Proper Orthogonal Decomposition (POD), a statistical method, can be used to reduce the number of degrees of freedom, or variables, in such a computation. POD, along with the Galerkin projection, allows us to create reduced-order models that reproduce the characteristics of the system with a considerable reduction in computational resources while maintaining a high level of accuracy. The goal of this work is to show that this method can be applied to obtain a boundary-condition-independent reduced-order thermal model for complex components. The methodology is applied to the 1D transient heat equation.
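
    The core of such a POD reduction can be sketched in a few lines: the singular vectors of a snapshot matrix provide the reduced basis, and Galerkin projection shrinks the full-order operators. The sketch below is a generic illustration with random placeholder data and matrices, not the authors' code.

    ```python
    # Minimal snapshot-POD sketch; the snapshot data and full-order matrices
    # are random placeholders standing in for thermal simulation output.
    import numpy as np

    rng = np.random.default_rng(0)
    n_nodes, n_snapshots = 500, 40
    snapshots = rng.standard_normal((n_nodes, n_snapshots))

    # POD modes are the left singular vectors of the mean-centered snapshots.
    mean_field = snapshots.mean(axis=1, keepdims=True)
    U, s, _ = np.linalg.svd(snapshots - mean_field, full_matrices=False)

    # Keep enough modes to capture 99% of the snapshot "energy".
    energy = np.cumsum(s**2) / np.sum(s**2)
    k = int(np.searchsorted(energy, 0.99)) + 1
    Phi = U[:, :k]                       # reduced basis, n_nodes x k

    # Galerkin projection of a full-order linear system C dT/dt = K T + q:
    # the reduced operators are k x k instead of n_nodes x n_nodes.
    C = np.eye(n_nodes)                  # placeholder capacitance matrix
    K = -np.eye(n_nodes)                 # placeholder conductance matrix
    C_r, K_r = Phi.T @ C @ Phi, Phi.T @ K @ Phi
    ```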

  10. MORTICIA, a statistical analysis software package for determining optical surveillance system effectiveness.

    NASA Astrophysics Data System (ADS)

    Ramkilowan, A.; Griffith, D. J.

    2017-10-01

    Surveillance modelling in terms of the standard Detect, Recognise and Identify (DRI) thresholds remains a key requirement for determining the effectiveness of surveillance sensors. With readily available computational resources, it has become feasible to perform statistically representative evaluations of the effectiveness of these sensors. A new capability for performing this Monte-Carlo type of analysis is demonstrated in the MORTICIA (Monte-Carlo Optical Rendering for Theatre Investigations of Capability under the Influence of the Atmosphere) software package developed at the Council for Scientific and Industrial Research (CSIR). This first-generation, Python-based, open-source integrated software package, currently in the alpha stage of development, aims to provide all the functionality required to perform statistical investigations of the effectiveness of optical surveillance systems in specific or generic deployment theatres. This includes modelling of the mathematical and physical processes that govern, among other components of a surveillance system, a sensor's detector and optical components, a target and its background, and the intervening atmospheric influences. In this paper we discuss integral aspects of the bespoke framework that are critical to the longevity of all subsequent modelling efforts. Additionally, some preliminary results are presented.
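
    As a hedged illustration of this Monte-Carlo style of analysis (not the MORTICIA API), one can sample an atmospheric parameter and propagate it to a detection statistic; every number below is hypothetical.

    ```python
    # Toy Monte-Carlo detection estimate: sample atmospheric extinction,
    # attenuate an assumed inherent target contrast via Beer-Lambert, and
    # count trials that clear an assumed sensor contrast threshold.
    import numpy as np

    rng = np.random.default_rng(1)
    n_trials = 100_000
    range_km = 5.0

    extinction = rng.lognormal(mean=np.log(0.2), sigma=0.4, size=n_trials)
    apparent = 0.5 * np.exp(-extinction * range_km)   # attenuated contrast
    p_detect = np.mean(apparent >= 0.15)              # assumed threshold
    print(f"P(Detect) at {range_km} km ~ {p_detect:.3f}")
    ```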

  11. Improving finite element results in modeling heart valve mechanics.

    PubMed

    Earl, Emily; Mohammadi, Hadi

    2018-06-01

    Finite element analysis is a well-established computational tool which can be used for the analysis of soft tissue mechanics. Due to the structural complexity of the leaflet tissue of the heart valve, the currently available finite element models do not adequately represent the leaflet tissue. A method of addressing this issue is to implement computationally expensive finite element models, characterized by precise constitutive models including high-order and high-density mesh techniques. In this study, we introduce a novel numerical technique that enhances the results obtained from coarse mesh finite element models to provide accuracy comparable to that of fine mesh finite element models while maintaining a relatively low computational cost. Introduced in this study is a method by which the computational expense required to solve linear and nonlinear constitutive models, commonly used in heart valve mechanics simulations, is reduced while continuing to account for large and infinitesimal deformations. This continuum model is developed based on a least-squares procedure coupled with the finite difference method, adhering to the assumption that the components of the strain tensor are available at all nodes of the finite element mesh model. The suggested numerical technique is easy to implement, practically efficient, and requires less computational time compared to currently available commercial finite element packages such as ANSYS and ABAQUS.
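
    A one-dimensional analogue of the least-squares-plus-finite-difference enhancement may help fix ideas; the paper's method operates on strain tensors over a 3D mesh, so the sketch below is schematic only, with made-up nodal data.

    ```python
    # Schematic 1D analogue: enhance a coarse-mesh nodal field with local
    # least-squares quadratic fits, then finite-difference the result.
    import numpy as np

    x_coarse = np.linspace(0.0, 1.0, 11)           # coarse "mesh" nodes
    strain_coarse = np.sin(2 * np.pi * x_coarse)   # placeholder nodal strains

    def enhanced_strain(x, nodes, values, half_width=0.25):
        """Fit a quadratic to nearby nodal values in a least-squares sense."""
        mask = np.abs(nodes - x) <= half_width
        A = np.vander(nodes[mask], 3)              # columns: x^2, x, 1
        coeffs, *_ = np.linalg.lstsq(A, values[mask], rcond=None)
        return np.polyval(coeffs, x)

    x_fine = np.linspace(0.0, 1.0, 101)
    strain_fine = np.array(
        [enhanced_strain(x, x_coarse, strain_coarse) for x in x_fine])

    # Finite-difference derivative of the enhanced field.
    dstrain = np.gradient(strain_fine, x_fine)
    ```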

  12. Effects of Computer Animation Instructional Package on Students' Achievement in Practical Biology

    ERIC Educational Resources Information Center

    Hamzat, Abdulrasaq; Bello, Ganiyu; Abimbola, Isaac Olakanmi

    2017-01-01

    This study examined the effects of computer animation instructional package on secondary school students' achievement in practical biology in Ilorin, Nigeria. The study adopted a pre-test, post-test, control group, non-randomised and nonequivalent quasi-experimental design, with a 2x2x3 factorial design. Two intact classes from two secondary…

  13. Sigma 2 Graphic Display Software Program Description

    NASA Technical Reports Server (NTRS)

    Johnson, B. T.

    1973-01-01

    A general-purpose, user-oriented graphic support package was implemented. A comprehensive description of the two software components comprising this package is given: the Display Librarian and the Display Controller. These programs have been implemented in FORTRAN on the XDS Sigma 2 Computer Facility. This facility consists of an XDS Sigma 2 general-purpose computer coupled to a Computek Display Terminal.

  14. Technical Review Report for the Model 9978-96 Package Safety Analysis Report for Packaging (S-SARP-G-00002, Revision 1, March 2009)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    West, M

    2009-03-06

    This Technical Review Report (TRR) documents the review, performed by Lawrence Livermore National Laboratory (LLNL) staff at the request of the Department of Energy (DOE), of the 'Safety Analysis Report for Packaging (SARP), Model 9978 B(M)F-96', Revision 1, March 2009 (S-SARP-G-00002). The Model 9978 Package complies with 10 CFR 71, and with 'Regulations for the Safe Transport of Radioactive Material-1996 Edition (As Amended, 2000)-Safety Requirements', International Atomic Energy Agency (IAEA) Safety Standards Series No. TS-R-1. The Model 9978 Packaging is designed, analyzed, fabricated, and tested in accordance with Section III of the American Society of Mechanical Engineers Boiler and Pressure Vessel Code (ASME B&PVC). The review presented in this TRR was performed using the methods outlined in Revision 3 of the DOE's 'Packaging Review Guide (PRG) for Reviewing Safety Analysis Reports for Packages'. The format of the SARP follows that specified in Revision 2 of the Nuclear Regulatory Commission's Regulatory Guide 7.9, i.e., 'Standard Format and Content of Part 71 Applications for Approval of Packages for Radioactive Material'. Although the two documents are similar in their content, they are not identical; formatting differences have been noted in this TRR, where appropriate. The Model 9978 Packaging is a single containment package, using a 5-inch containment vessel (5CV). It uses a nominal 35-gallon drum package design. In comparison, the Model 9977 Packaging uses a 6-inch containment vessel (6CV). The Model 9977 and Model 9978 Packagings were developed concurrently, and they were referred to as the General Purpose Fissile Material Package, Version 1 (GPFP). Both packagings use General Plastics FR-3716 polyurethane foam as insulation and as impact limiters. The 5CV is used as the Primary Containment Vessel (PCV) in the Model 9975-96 Packaging. The Model 9975-96 Packaging also has the 6CV as its Secondary Containment Vessel (SCV). In comparison, the Model 9975 Packagings use Celotex™ for insulation and as impact limiters. To provide a historical perspective, it is noted that the Model 9975-96 Packaging is a 35-gallon drum package design that has evolved from a family of packages designed by DOE contractors at the Savannah River Site. Earlier package designs, i.e., the Model 9965, Model 9966, Model 9967, and Model 9968 Packagings, were originally designed and certified in the early 1980s. In the 1990s, updated package designs that incorporated design features consistent with the then-newer safety requirements were proposed: the Model 9972, Model 9973, Model 9974, and Model 9975 Packagings, respectively. The Model 9975 Package was certified by the Packaging Certification Program, under the Office of Safety Management and Operations. The Model 9978 Package has six Content Envelopes: C.1 (²³⁸Pu Heat Sources), C.2 (Pu/U Metals), C.3 (Pu/U Oxides, Reserved), C.4 (U Metal or Alloy), C.5 (U Compounds), and C.6 (Samples and Sources). Per 10 CFR 71.59 (Code of Federal Regulations), the value of N is 50 for the Model 9978 Package, leading to a Criticality Safety Index (CSI) of 1.0. The Transport Index (TI), based on dose rate, is calculated to be a maximum of 4.1.

  15. Prompt gamma neutron activation analysis of toxic elements in radioactive waste packages.

    PubMed

    Ma, J-L; Carasco, C; Perot, B; Mauerhofer, E; Kettler, J; Havenith, A

    2012-07-01

    The French Alternative Energies and Atomic Energy Commission (CEA) and National Radioactive Waste Management Agency (ANDRA) are conducting an R&D program to improve the characterization of long-lived, medium-activity (LL-MA) radioactive waste packages. In particular, the amount of toxic elements present in radioactive waste packages must be assessed before they can be accepted in repository facilities, in order to avoid pollution of underground water reserves. To this end, the Nuclear Measurement Laboratory of CEA-Cadarache has started to study the performance of Prompt Gamma Neutron Activation Analysis (PGNAA) for elements showing large capture cross sections, such as mercury, cadmium, boron, and chromium. This paper reports a comparison between Monte Carlo calculations, performed with the MCNPX computer code using the ENDF/B-VII.0 library, and experimental gamma rays measured in the REGAIN PGNAA cell with small samples of nickel, lead, cadmium, arsenic, antimony, chromium, magnesium, zinc, boron, and lithium, to verify the validity of a numerical model and gamma-ray production data. The measurement of a ~20 kg test sample of concrete containing toxic elements has also been performed, in collaboration with Forschungszentrum Jülich, to validate the model in view of future performance studies for dense and large LL-MA waste packages.

  16. A Re-Engineered Software Interface and Workflow for the Open-Source SimVascular Cardiovascular Modeling Package.

    PubMed

    Lan, Hongzhi; Updegrove, Adam; Wilson, Nathan M; Maher, Gabriel D; Shadden, Shawn C; Marsden, Alison L

    2018-02-01

    Patient-specific simulation plays an important role in cardiovascular disease research, diagnosis, surgical planning and medical device design, as well as education in cardiovascular biomechanics. SimVascular is an open-source software package encompassing an entire cardiovascular modeling and simulation pipeline, from image segmentation, three-dimensional (3D) solid modeling, and mesh generation to patient-specific simulation and analysis. SimVascular is widely used for cardiovascular basic science and clinical research as well as education, following increased adoption by users and development of a GATEWAY web portal to facilitate educational access. Initial efforts of the project focused on replacing commercial packages with open-source alternatives and adding increased functionality for multiscale modeling, fluid-structure interaction (FSI), and solid modeling operations. In this paper, we introduce a major SimVascular (SV) release that includes a new graphical user interface (GUI) designed to improve user experience. Additional improvements include enhanced data/project management, interactive tools to facilitate user interaction, new boundary condition (BC) functionality, a plug-in mechanism to increase modularity, a new 3D segmentation tool, and new computer-aided design (CAD)-based solid modeling capabilities. Here, we focus on major changes to the software platform and outline features added in this new release. We also briefly describe our recent experiences using SimVascular in the classroom for bioengineering education.

  17. EPA/Navy CERCLA Remedial Action Technology Guide

    DTIC Science & Technology

    1993-11-01

    [Fragmentary extraction: interleaved reference-list entries (Nirmalakhandan, N. N., and R. E. Speece, 'QSAR Model for Predicting Henry's ...'; Bergstrom, Wayne R., and Gray, Donald H., 'Fly Ash Utilization'; 'Handbook - Remedial Action at Waste Disposal Sites'; Las Vegas, Nevada, May 1988) and body text noting that the soil piles should be designed as a package, that studies are needed to confirm that the contaminants of concern can be treated, and that computer models are available.]

  18. Urban and regional land use analysis: CARETS and census cities experiment package

    NASA Technical Reports Server (NTRS)

    Alexander, R. (Principal Investigator); Lins, H. F., Jr.; Gallagher, D. B.

    1975-01-01

    The author has identified the following significant results. Temperatures in degrees Celsius were derived from PCM counts using Pease's modified gray window technique. The Outcalt simulator was set up on the USGS computer. The input data to the model are basically meteorological and geographical in nature. The output data are presented in three matrices.

  19. Implications of Integrated Computational Materials Engineering with Respect to Export Control

    DTIC Science & Technology

    2013-09-01

    ... domain. The university also advises its staff to ask for any ECCN that may be associated with a procured software package in order to understand the ... industry? • Models can transform input data, which can be of various export control levels, and provide new, transformed data. If EAR ECCN 9E991 data is ...

  20. VENVAL : a plywood mill cost accounting program

    Treesearch

    Henry Spelter

    1991-01-01

    This report documents a package of computer programs called VENVAL. These programs prepare plywood mill data for a linear programming (LP) model that, in turn, calculates the optimum mix of products to make, given a set of technologies and market prices. (The software to solve a linear program is not provided and must be obtained separately.) Linear programming finds...

  1. Paperless Work Package Application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kilgore, Jr., William R.; Morrell, Jr., Otto K.; Morrison, Dan

    2014-07-31

    The Paperless Work Package (PWP) System is a computer program process that takes information from Asset Suite, provides a platform for other electronic inputs, processes the inputs into an electronic package that can be downloaded onto an electronic work tablet or laptop computer, provides a platform for electronic inputs into the work tablet, and then transposes those inputs back into Asset Suite and to permanent SRS records. The PWP System will essentially eliminate paper requirements from the maintenance work control system. The program electronically relays the instructions given by the planner to work on a piece of equipment, which are currently relayed via a printed work package. The program does not control or approve what is done. The planner will continue to plan the work package, and the package will continue to be routed, approved, and scheduled. The supervisor reviews and approves the work to be performed and assigns work to individuals or to a work group. (The supervisor conducts pre-job briefings with the workers involved in the job.) The Operations Manager (Work Controlling Entity) approves the work package electronically for the work that will be done in his facility prior to work starting. The PWP System will provide the package in an electronic form. All the reviews, approvals, and safety measures taken by people outside the electronic package do not change from the paper-driven work packages.

  2. Combined Uncertainty and A-Posteriori Error Bound Estimates for General CFD Calculations: Theory and Software Implementation

    NASA Technical Reports Server (NTRS)

    Barth, Timothy J.

    2014-01-01

    This workshop presentation discusses the design and implementation of numerical methods for the quantification of statistical uncertainty, including a-posteriori error bounds, for output quantities computed using CFD methods. Hydrodynamic realizations often contain numerical error arising from finite-dimensional approximation (e.g. numerical methods using grids, basis functions, particles) and statistical uncertainty arising from incomplete information and/or statistical characterization of model parameters and random fields. The first task at hand is to derive formal error bounds for statistics given realizations containing finite-dimensional numerical error [1]. The error in computed output statistics contains contributions from both realization error and the error resulting from the calculation of statistics integrals using a numerical method. A second task is to devise computable a-posteriori error bounds by numerically approximating all terms arising in the error bound estimates. For the same reason that CFD calculations including error bounds but omitting uncertainty modeling are only of limited value, CFD calculations including uncertainty modeling but omitting error bounds are only of limited value. To gain maximum value from CFD calculations, a general software package for uncertainty quantification with quantified error bounds has been developed at NASA. The package provides implementations for a suite of numerical methods used in uncertainty quantification: dense tensorization basis methods [3] and a subscale recovery variant [1] for non-smooth data; sparse tensorization methods [2] utilizing node-nested hierarchies; and sampling methods [4] for high-dimensional random variable spaces.
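
    A toy version of such a combined bound is easy to write down: given per-realization output values and assumed a-posteriori bounds on each realization's numerical error, the uncertainty in the mean splits into a sampling half-width and a bias term. The sketch below uses synthetic numbers and is not the NASA package.

    ```python
    # Hedged sketch: Monte Carlo mean with a CLT sampling half-width plus a
    # worst-case numerical-error bias bound; all inputs are synthetic.
    import numpy as np

    rng = np.random.default_rng(2)
    n = 2000
    q = rng.normal(1.0, 0.3, size=n)   # output quantity per "realization"
    e = np.full(n, 0.01)               # assumed bound on each |numerical error|

    mean_q = q.mean()
    sampling_hw = 1.96 * q.std(ddof=1) / np.sqrt(n)   # 95% sampling half-width
    bias_bound = e.mean()                             # discretization bias bound
    print(f"E[q] = {mean_q:.4f} +/- {sampling_hw:.4f} (sampling) "
          f"+/- {bias_bound:.4f} (numerical)")
    ```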

  3. Packaging strategies for printed circuit board components. Volume I, materials & thermal stresses.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neilsen, Michael K.; Austin, Kevin N.; Adolf, Douglas Brian

    2011-09-01

    Decisions on material selection for electronics packaging can be complicated by the need to balance the criteria of withstanding severe impacts while surviving deep thermal cycles intact. Many times, material choices are based on historical precedent, perhaps without knowing whether those initial choices were carefully investigated or whether the requirements of the new component match those of previous units. The goal of this program is to develop both increased intuition for generic packaging guidelines and computational methodologies for optimizing packaging in specific components. Initial efforts centered on the characterization of classes of materials common to packaging strategies and computational analyses of stresses generated during thermal cycling to identify strengths and weaknesses of various material choices. Future studies will analyze the same example problems, incorporating the effects of curing stresses as needed and analyzing dynamic loadings to compare trends with the quasi-static conclusions.

  4. SMOG 2: A Versatile Software Package for Generating Structure-Based Models.

    PubMed

    Noel, Jeffrey K; Levi, Mariana; Raghunathan, Mohit; Lammert, Heiko; Hayes, Ryan L; Onuchic, José N; Whitford, Paul C

    2016-03-01

    Molecular dynamics simulations with coarse-grained or simplified Hamiltonians have proven to be an effective means of capturing the functionally important long-time and large-length scale motions of proteins and RNAs. Originally developed in the context of protein folding, structure-based models (SBMs) have since been extended to probe a diverse range of biomolecular processes, spanning from protein and RNA folding to functional transitions in molecular machines. The hallmark feature of a structure-based model is that part, or all, of the potential energy function is defined by a known structure. Within this general class of models, there exist many possible variations in resolution and energetic composition. SMOG 2 is a downloadable software package that reads user-designated structural information and user-defined energy definitions in order to produce the files necessary to use SBMs with high-performance molecular dynamics packages (GROMACS and NAMD). SMOG 2 is bundled with XML-formatted template files that define commonly used SBMs, and it can process template files that are altered according to the needs of each user. This computational infrastructure also allows for experimental or bioinformatics-derived restraints or novel structural features to be included, e.g., novel ligands, prosthetic groups, and post-translational/transcriptional modifications. The code and user guide can be downloaded at http://smog-server.org/smog2.

  5. ParallelStructure: A R Package to Distribute Parallel Runs of the Population Genetics Program STRUCTURE on Multi-Core Computers

    PubMed Central

    Besnier, Francois; Glover, Kevin A.

    2013-01-01

    This software package provides an R-based framework to make use of multi-core computers when running analyses in the population genetics program STRUCTURE. It is especially aimed at those users of STRUCTURE dealing with numerous and repeated data analyses who could take advantage of an efficient script to automatically distribute STRUCTURE jobs among multiple processors. It also provides additional functions to divide analyses among combinations of populations within a single data set without the need to manually produce multiple projects, as is currently the case in STRUCTURE. The package consists of two main functions, MPI_structure() and parallel_structure(), as well as an example data file. We compared computing-time performance for this example data set on two computer architectures and showed that use of these functions can yield several-fold improvements in computation time. ParallelStructure is freely available at https://r-forge.r-project.org/projects/parallstructure/. PMID:23923012
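
    For consistency with the other examples in this document, the sketch below is in Python rather than R, and only illustrates the general idea of farming independent runs out to a worker pool; run_job is a hypothetical stand-in for launching a STRUCTURE analysis.

    ```python
    # Conceptual sketch of distributing independent analysis jobs across
    # cores; the actual package does this for STRUCTURE runs from R.
    from multiprocessing import Pool

    def run_job(job):
        k, replicate = job
        # ... here one would launch an external analysis with K = k ...
        return (k, replicate, f"done K={k} rep={replicate}")

    if __name__ == "__main__":
        jobs = [(k, rep) for k in range(1, 6) for rep in range(3)]
        with Pool(processes=4) as pool:
            for result in pool.imap_unordered(run_job, jobs):
                print(result)
    ```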

  6. Statistical Models for Predicting Automobile Driving Postures for Men and Women Including Effects of Age.

    PubMed

    Park, Jangwoon; Ebert, Sheila M; Reed, Matthew P; Hallman, Jason J

    2016-03-01

    Previously published statistical models of driving posture have been effective for vehicle design but have not taken into account the effects of age. The present study developed new statistical models for predicting driving posture. Driving postures of 90 U.S. drivers with a wide range of age and body size were measured in a laboratory mockup in nine package conditions. Posture-prediction models for female and male drivers were separately developed by employing a stepwise regression technique using age, body dimensions, vehicle package conditions, and two-way interactions, among other variables. Driving posture was significantly associated with age, and the effects of other variables depended on age. A set of posture-prediction models is presented for women and men. The results are compared with a previously developed model. The present study is the first study of driver posture to include a large cohort of older drivers and the first to report a significant effect of age. The posture-prediction models can be used to position computational human models or crash-test dummies for vehicle design and assessment.
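
    A hedged sketch of a forward-stepwise posture model of this kind follows, on synthetic data; the variable names, units, and coefficients are invented for illustration and are not the study's.

    ```python
    # Forward stepwise feature selection for a posture regression,
    # illustrated with synthetic age/stature/package data.
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.feature_selection import SequentialFeatureSelector

    rng = np.random.default_rng(3)
    n = 90
    age = rng.uniform(20, 80, n)                 # years
    stature = rng.normal(1700, 90, n)            # mm
    seat_height = rng.uniform(180, 360, n)       # mm (package condition)
    X = np.column_stack([age, stature, seat_height, age * stature / 1e3])
    hip_x = (0.2 * stature - 0.5 * seat_height + 0.8 * age
             + rng.normal(0, 10, n))             # synthetic response

    selector = SequentialFeatureSelector(
        LinearRegression(), n_features_to_select=3, direction="forward", cv=5)
    selector.fit(X, hip_x)
    Xs = X[:, selector.get_support()]
    model = LinearRegression().fit(Xs, hip_x)
    print("selected:", selector.get_support(), "R^2:", model.score(Xs, hip_x))
    ```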

  7. Implementation of Solute Transport in the Vadose Zone into the `HYDRUS Package for MODFLOW'

    NASA Astrophysics Data System (ADS)

    Simunek, J.; Beegum, S.; Szymkiewicz, A.; Sudheer, K. P.

    2017-12-01

    The 'HYDRUS package for MODFLOW' was developed by Seo et al. (2007) and Twarakavi et al. (2008) to simultaneously evaluate transient water flow in both unsaturated and saturated zones. The package, which is based on the HYDRUS-1D model (Šimůnek et al., 2016) for simulating unsaturated water flow in the vadose zone, was incorporated into MODFLOW (Harbaugh et al., 2000), which simulates saturated groundwater flow. The HYDRUS package in the coupled model can be used to represent the effects of various unsaturated zone processes, including infiltration, evaporation, root water uptake, capillary rise, and recharge in homogeneous or layered soil profiles. The coupled model is effective in addressing spatially variable saturated-unsaturated hydrological processes at the regional scale, allowing for complex layering in the unsaturated zone, spatially and temporally variable water fluxes at the soil surface and in the root zone, and alternating recharge and discharge fluxes (Twarakavi et al., 2008). One of the major limitations of the coupled model was that it could not simultaneously simulate solute transport. However, solute transport is highly dependent on water table fluctuations due to temporal and spatial variations in groundwater recharge. This is an important concern when the coupled model is used for analyzing groundwater contamination due to transport through the unsaturated zone. The objective of this study is to integrate the solute transport model (the solute transport part of HYDRUS-1D for the unsaturated zone and MT3DMS (Zheng and Wang, 1999; Zheng, 2009) for the saturated zone) into the existing coupled water flow model. The unsaturated zone component of the coupled model can consider solute transport involving many biogeochemical processes and reactions, including first-order degradation, volatilization, linear or nonlinear sorption, one-site kinetic sorption, two-site sorption, and sorption with two kinetic sites (Šimůnek and van Genuchten, 2008). Due to complex interactions at the groundwater table, certain modifications of the pressure head (compared to the original coupling) and solute concentration profiles were incorporated into the HYDRUS package. The developed integrated model is verified using HYDRUS-2D and analyzed for its computational time requirements.
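
    For orientation, the unsaturated-zone transport being added is governed by the advection-dispersion equation; a minimal explicit 1D step is sketched below with illustrative grid and parameter values (this is not HYDRUS or MT3DMS code).

    ```python
    # Schematic explicit upwind step for 1D advection-dispersion,
    # dc/dt = -v dc/dz + D d2c/dz2, with illustrative parameters.
    import numpy as np

    nz, dz, dt = 100, 0.01, 10.0    # cells, cell size (m), time step (s)
    v, D = 1e-6, 1e-8               # pore velocity (m/s), dispersion (m^2/s)
    c = np.zeros(nz)
    c[0] = 1.0                      # unit concentration applied at the surface

    for _ in range(5000):           # stable for these values (Courant ~ 1e-3)
        adv = -v * np.diff(c, prepend=c[0]) / dz
        disp = D * (np.roll(c, -1) - 2 * c + np.roll(c, 1)) / dz**2
        disp[0] = disp[-1] = 0.0    # crude boundary handling
        c = c + dt * (adv + disp)
    ```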

  8. SEDA: A software package for the Statistical Earthquake Data Analysis

    NASA Astrophysics Data System (ADS)

    Lombardi, A. M.

    2017-03-01

    In this paper, the first version of the software SEDA (SEDAv1.0), designed to help seismologists statistically analyze earthquake data, is presented. The package consists of a user-friendly Matlab-based interface, which allows the user to easily interact with the application, and a computational core of Fortran codes, to guarantee maximum speed. The primary factor driving the development of SEDA is to guarantee research reproducibility, which is a growing movement among scientists and highly recommended by the most important scientific journals. SEDAv1.0 is mainly devoted to producing accurate and fast outputs; less care has been taken over graphic appeal, which will be improved in the future. The main part of SEDAv1.0 is devoted to ETAS modeling. SEDAv1.0 contains a set of consistent tools on ETAS, allowing the estimation of parameters, the testing of models against data, the simulation of catalogs, the identification of sequences, and the calculation of forecasts. The peculiarities of the routines inside SEDAv1.0 are discussed in this paper. More specific details on the software are presented in the manual accompanying the program package.
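
    For readers unfamiliar with ETAS, the temporal conditional intensity at the heart of such tools is λ(t) = μ + Σ_{t_i<t} K e^{α(m_i−m₀)} (t−t_i+c)^{−p}. The sketch below evaluates it for a toy catalog with illustrative parameter values; SEDA itself is a Matlab/Fortran package, and this is only a schematic.

    ```python
    # Temporal ETAS conditional intensity for a toy catalog;
    # all parameter values are illustrative, not fitted.
    import numpy as np

    def etas_intensity(t, times, mags,
                       mu=0.1, K=0.02, alpha=1.2, c=0.01, p=1.1, m0=3.0):
        """lambda(t) = mu + sum_i K*exp(alpha*(m_i-m0))*(t-t_i+c)^(-p)."""
        past = times < t
        return mu + np.sum(K * np.exp(alpha * (mags[past] - m0))
                           * (t - times[past] + c) ** (-p))

    times = np.array([0.0, 1.5, 2.0])   # event times (days)
    mags = np.array([4.0, 3.5, 5.0])    # magnitudes
    print(etas_intensity(3.0, times, mags))
    ```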

  9. SEDA: A software package for the Statistical Earthquake Data Analysis

    PubMed Central

    Lombardi, A. M.

    2017-01-01

    In this paper, the first version of the software SEDA (SEDAv1.0), designed to help seismologists statistically analyze earthquake data, is presented. The package consists of a user-friendly Matlab-based interface, which allows the user to easily interact with the application, and a computational core of Fortran codes, to guarantee maximum speed. The primary factor driving the development of SEDA is to guarantee research reproducibility, which is a growing movement among scientists and highly recommended by the most important scientific journals. SEDAv1.0 is mainly devoted to producing accurate and fast outputs; less care has been taken over graphic appeal, which will be improved in the future. The main part of SEDAv1.0 is devoted to ETAS modeling. SEDAv1.0 contains a set of consistent tools on ETAS, allowing the estimation of parameters, the testing of models against data, the simulation of catalogs, the identification of sequences, and the calculation of forecasts. The peculiarities of the routines inside SEDAv1.0 are discussed in this paper. More specific details on the software are presented in the manual accompanying the program package. PMID:28290482

  10. Evidence for an electrostatic mechanism of force generation by the bacteriophage T4 DNA packaging motor.

    PubMed

    Migliori, Amy D; Keller, Nicholas; Alam, Tanfis I; Mahalingam, Marthandan; Rao, Venigalla B; Arya, Gaurav; Smith, Douglas E

    2014-06-17

    How viral packaging motors generate enormous forces to translocate DNA into viral capsids remains unknown. Recent structural studies of the bacteriophage T4 packaging motor have led to a proposed mechanism wherein the gp17 motor protein translocates DNA by transitioning between extended and compact states, orchestrated by electrostatic interactions between complementarily charged residues across the interface between the N- and C-terminal subdomains. Here we show that site-directed alterations in these residues cause force-dependent impairments of motor function, including lower translocation velocity, lower stall force, and a higher frequency of pauses and slips. We further show that the measured impairments correlate with computed changes in free-energy differences between the two states. These findings support the proposed structural mechanism and further suggest an energy landscape model of motor activity that couples the free-energy profile of motor conformational states with that of the ATP hydrolysis cycle.

  11. Evidence for an electrostatic mechanism of force generation by the bacteriophage T4 DNA packaging motor

    NASA Astrophysics Data System (ADS)

    Migliori, Amy D.; Keller, Nicholas; Alam, Tanfis I.; Mahalingam, Marthandan; Rao, Venigalla B.; Arya, Gaurav; Smith, Douglas E.

    2014-06-01

    How viral packaging motors generate enormous forces to translocate DNA into viral capsids remains unknown. Recent structural studies of the bacteriophage T4 packaging motor have led to a proposed mechanism wherein the gp17 motor protein translocates DNA by transitioning between extended and compact states, orchestrated by electrostatic interactions between complementarily charged residues across the interface between the N- and C-terminal subdomains. Here we show that site-directed alterations in these residues cause force-dependent impairments of motor function, including lower translocation velocity, lower stall force, and a higher frequency of pauses and slips. We further show that the measured impairments correlate with computed changes in free-energy differences between the two states. These findings support the proposed structural mechanism and further suggest an energy landscape model of motor activity that couples the free-energy profile of motor conformational states with that of the ATP hydrolysis cycle.

  12. CentiServer: A Comprehensive Resource, Web-Based Application and R Package for Centrality Analysis.

    PubMed

    Jalili, Mahdi; Salehzadeh-Yazdi, Ali; Asgari, Yazdan; Arab, Seyed Shahriar; Yaghmaie, Marjan; Ghavamzadeh, Ardeshir; Alimoghaddam, Kamran

    2015-01-01

    Various disciplines are trying to address one of the most noteworthy questions and broadly used concepts in biology: essentiality. Centrality is a primary index and a promising method for identifying essential nodes, particularly in biological networks. The newly created CentiServer is a comprehensive online resource that provides over 110 definitions of different centrality indices, their computational methods, and algorithms in the form of an encyclopedia. In addition, CentiServer allows users to calculate 55 centralities with the help of an interactive web-based application tool and provides a numerical result as a comma-separated value (CSV) file or a mapped graphical format as a graph modeling language (GML) file. The standalone version of this application has been developed in the form of an R package. The web-based application (CentiServer) and R package (centiserve) are freely available at http://www.centiserver.org/.
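
    As a quick taste of centrality computation, here with the networkx Python library rather than CentiServer's own tools, on a standard small test graph:

    ```python
    # Compute a few common centrality indices with networkx and rank nodes
    # by betweenness as a crude "essentiality" screen.
    import networkx as nx

    G = nx.karate_club_graph()              # classic small test network
    degree = nx.degree_centrality(G)
    betweenness = nx.betweenness_centrality(G)
    closeness = nx.closeness_centrality(G)

    top = sorted(betweenness, key=betweenness.get, reverse=True)[:5]
    print("top-5 nodes by betweenness:", top)
    ```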

  13. CentiServer: A Comprehensive Resource, Web-Based Application and R Package for Centrality Analysis

    PubMed Central

    Jalili, Mahdi; Salehzadeh-Yazdi, Ali; Asgari, Yazdan; Arab, Seyed Shahriar; Yaghmaie, Marjan; Ghavamzadeh, Ardeshir; Alimoghaddam, Kamran

    2015-01-01

    Various disciplines are trying to address one of the most noteworthy questions and broadly used concepts in biology: essentiality. Centrality is a primary index and a promising method for identifying essential nodes, particularly in biological networks. The newly created CentiServer is a comprehensive online resource that provides over 110 definitions of different centrality indices, their computational methods, and algorithms in the form of an encyclopedia. In addition, CentiServer allows users to calculate 55 centralities with the help of an interactive web-based application tool and provides a numerical result as a comma-separated value (CSV) file or a mapped graphical format as a graph modeling language (GML) file. The standalone version of this application has been developed in the form of an R package. The web-based application (CentiServer) and R package (centiserve) are freely available at http://www.centiserver.org/. PMID:26571275

  14. QMCPACK: an open source ab initio quantum Monte Carlo package for the electronic structure of atoms, molecules and solids

    NASA Astrophysics Data System (ADS)

    Kim, Jeongnim; Baczewski, Andrew D.; Beaudet, Todd D.; Benali, Anouar; Chandler Bennett, M.; Berrill, Mark A.; Blunt, Nick S.; Josué Landinez Borda, Edgar; Casula, Michele; Ceperley, David M.; Chiesa, Simone; Clark, Bryan K.; Clay, Raymond C., III; Delaney, Kris T.; Dewing, Mark; Esler, Kenneth P.; Hao, Hongxia; Heinonen, Olle; Kent, Paul R. C.; Krogel, Jaron T.; Kylänpää, Ilkka; Li, Ying Wai; Lopez, M. Graham; Luo, Ye; Malone, Fionn D.; Martin, Richard M.; Mathuriya, Amrita; McMinis, Jeremy; Melton, Cody A.; Mitas, Lubos; Morales, Miguel A.; Neuscamman, Eric; Parker, William D.; Pineda Flores, Sergio D.; Romero, Nichols A.; Rubenstein, Brenda M.; Shea, Jacqueline A. R.; Shin, Hyeondeok; Shulenburger, Luke; Tillack, Andreas F.; Townsend, Joshua P.; Tubman, Norm M.; Van Der Goetz, Brett; Vincent, Jordan E.; ChangMo Yang, D.; Yang, Yubo; Zhang, Shuai; Zhao, Luning

    2018-05-01

    QMCPACK is an open source quantum Monte Carlo package for ab initio electronic structure calculations. It supports calculations of metallic and insulating solids, molecules, atoms, and some model Hamiltonians. Implemented real space quantum Monte Carlo algorithms include variational, diffusion, and reptation Monte Carlo. QMCPACK uses Slater–Jastrow type trial wavefunctions in conjunction with a sophisticated optimizer capable of optimizing tens of thousands of parameters. The orbital space auxiliary-field quantum Monte Carlo method is also implemented, enabling cross validation between different highly accurate methods. The code is specifically optimized for calculations with large numbers of electrons on the latest high performance computing architectures, including multicore central processing unit and graphical processing unit systems. We detail the program’s capabilities, outline its structure, and give examples of its use in current research calculations. The package is available at http://qmcpack.org.
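
    As a schematic of the variational Monte Carlo idea (QMCPACK itself is far more sophisticated), the toy below samples |ψ|² by Metropolis for a hydrogen atom with trial wavefunction ψ = e^{−αr}, whose local energy is E_L = −α²/2 + (α−1)/r and which is exact at α = 1.

    ```python
    # Toy variational Monte Carlo for the hydrogen atom; not QMCPACK code.
    import numpy as np

    rng = np.random.default_rng(4)
    alpha, step = 0.9, 0.5
    n_steps, n_burn = 50_000, 5_000
    r = np.array([1.0, 0.0, 0.0])
    energies = []

    for i in range(n_steps):
        trial = r + step * rng.uniform(-1, 1, 3)
        # Metropolis acceptance on |psi|^2 = exp(-2*alpha*|r|)
        d = np.linalg.norm(trial) - np.linalg.norm(r)
        if rng.random() < np.exp(-2 * alpha * d):
            r = trial
        if i > n_burn:                       # discard equilibration
            rr = np.linalg.norm(r)
            energies.append(-0.5 * alpha**2 + (alpha - 1.0) / rr)

    print(f"E(alpha={alpha}) ~ {np.mean(energies):.4f} Ha (minimum: -0.5)")
    ```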

  15. Evidence for an electrostatic mechanism of force generation by the bacteriophage T4 DNA packaging motor

    PubMed Central

    Migliori, Amy D.; Keller, Nicholas; Alam, Tanfis I.; Mahalingam, Marthandan; Rao, Venigalla B.; Arya, Gaurav; Smith, Douglas E

    2014-01-01

    How viral packaging motors generate enormous forces to translocate DNA into viral capsids remains unknown. Recent structural studies of the bacteriophage T4 packaging motor have led to a proposed mechanism wherein the gp17 motor protein translocates DNA by transitioning between extended and compact states, orchestrated by electrostatic interactions between complementarily charged residues across the interface between the N- and C-terminal subdomains. Here, we show that site-directed alterations in these residues cause force-dependent impairments of motor function, including lower translocation velocity, lower stall force, and a higher frequency of pauses and slips. We further show that the measured impairments correlate with computed changes in free energy differences between the two states. These findings support the proposed structural mechanism and further suggest an energy landscape model of motor activity that couples the free energy profile of motor conformational states with that of the ATP hydrolysis cycle. PMID:24937091

  16. QMCPACK: an open source ab initio quantum Monte Carlo package for the electronic structure of atoms, molecules and solids.

    PubMed

    Kim, Jeongnim; Baczewski, Andrew T; Beaudet, Todd D; Benali, Anouar; Bennett, M Chandler; Berrill, Mark A; Blunt, Nick S; Borda, Edgar Josué Landinez; Casula, Michele; Ceperley, David M; Chiesa, Simone; Clark, Bryan K; Clay, Raymond C; Delaney, Kris T; Dewing, Mark; Esler, Kenneth P; Hao, Hongxia; Heinonen, Olle; Kent, Paul R C; Krogel, Jaron T; Kylänpää, Ilkka; Li, Ying Wai; Lopez, M Graham; Luo, Ye; Malone, Fionn D; Martin, Richard M; Mathuriya, Amrita; McMinis, Jeremy; Melton, Cody A; Mitas, Lubos; Morales, Miguel A; Neuscamman, Eric; Parker, William D; Pineda Flores, Sergio D; Romero, Nichols A; Rubenstein, Brenda M; Shea, Jacqueline A R; Shin, Hyeondeok; Shulenburger, Luke; Tillack, Andreas F; Townsend, Joshua P; Tubman, Norm M; Van Der Goetz, Brett; Vincent, Jordan E; Yang, D ChangMo; Yang, Yubo; Zhang, Shuai; Zhao, Luning

    2018-05-16

    QMCPACK is an open source quantum Monte Carlo package for ab initio electronic structure calculations. It supports calculations of metallic and insulating solids, molecules, atoms, and some model Hamiltonians. Implemented real space quantum Monte Carlo algorithms include variational, diffusion, and reptation Monte Carlo. QMCPACK uses Slater-Jastrow type trial wavefunctions in conjunction with a sophisticated optimizer capable of optimizing tens of thousands of parameters. The orbital space auxiliary-field quantum Monte Carlo method is also implemented, enabling cross validation between different highly accurate methods. The code is specifically optimized for calculations with large numbers of electrons on the latest high performance computing architectures, including multicore central processing unit and graphical processing unit systems. We detail the program's capabilities, outline its structure, and give examples of its use in current research calculations. The package is available at http://qmcpack.org.

  17. Modularized seismic full waveform inversion based on waveform sensitivity kernels - The software package ASKI

    NASA Astrophysics Data System (ADS)

    Schumacher, Florian; Friederich, Wolfgang; Lamara, Samir; Gutt, Phillip; Paffrath, Marcel

    2015-04-01

    We present a seismic full waveform inversion concept for applications ranging from seismological to engineering contexts, based on sensitivity kernels for full waveforms. The kernels are derived from Born scattering theory as the Fréchet derivatives of linearized frequency-domain full waveform data functionals, quantifying the influence of elastic earth model parameters and density on the data values. For a specific source-receiver combination, the kernel is computed from the displacement and strain field spectrum originating from the source evaluated throughout the inversion domain, as well as the Green function spectrum and its strains originating from the receiver. By storing the wavefield spectra of specific sources/receivers, they can be re-used for kernel computation for different specific source-receiver combinations, optimizing the total number of required forward simulations. In the iterative inversion procedure, the solution of the forward problem, the computation of sensitivity kernels, and the derivation of a model update are held completely separate. In particular, the model description for the forward problem and the description of the inverted model update are kept independent. Hence, the resolution of the inverted model as well as the complexity of solving the forward problem can be iteratively increased (with increasing frequency content of the inverted data subset). This may regularize the overall inverse problem and optimize the computational effort of both solving the forward problem and computing the model update. The required interconnection of arbitrary unstructured volume and point grids is realized by generalized high-order integration rules and 3D-unstructured interpolation methods. The model update is inferred by solving a minimization problem in a least-squares sense, resulting in Gauss-Newton convergence of the overall inversion process. The inversion method was implemented in the modularized software package ASKI (Analysis of Sensitivity and Kernel Inversion), which provides a generalized interface to arbitrary external forward modelling codes. So far, the 3D spectral-element code SPECFEM3D (Tromp, Komatitsch and Liu, 2008) and the 1D semi-analytical code GEMINI (Friederich and Dalkolmo, 1995), in both Cartesian and spherical frameworks, are supported. The creation of interfaces to further forward codes is planned in the near future. ASKI is freely available under the terms of the GPL at www.rub.de/aski. Since the independent modules of ASKI must communicate via file output/input, large storage capacities need to be accessible conveniently. Storing the complete sensitivity matrix to file, however, permits the scientist full manual control over each step in a customized procedure of sensitivity/resolution analysis and full waveform inversion. In the presentation, we will show some aspects of the theory behind the full waveform inversion method and its practical realization by the software package ASKI, as well as synthetic and real-data applications from different scales and geometries.
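
    The Gauss-Newton update at the center of such an iteration can be written compactly: with a sensitivity-kernel matrix J and residual vector r, solve (JᵀJ + λI)δm = Jᵀr. The sketch below uses random placeholder matrices, not ASKI's file-based workflow.

    ```python
    # One damped Gauss-Newton model update; J and r are random stand-ins
    # for sensitivity kernels and observed-minus-synthetic residuals.
    import numpy as np

    rng = np.random.default_rng(5)
    n_data, n_model = 200, 50
    J = rng.standard_normal((n_data, n_model))   # one row per datum
    r = rng.standard_normal(n_data)
    lam = 1e-2                                   # damping / regularization

    m_old = np.zeros(n_model)
    dm = np.linalg.solve(J.T @ J + lam * np.eye(n_model), J.T @ r)
    m_new = m_old + dm   # iterate with refreshed J and r in practice
    ```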

  18. Bethe-Salpeter Eigenvalue Solver Package (BSEPACK) v0.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shao, Meiyeu; Yang, Chao

    2017-04-25

    BSEPACK contains a set of subroutines for solving the Bethe-Salpeter Eigenvalue (BSE) problem. This type of problem arises in the study of optical excitation of nanoscale materials. The BSE problem is a structured non-Hermitian eigenvalue problem. The BSEPACK software can be used to compute all or a subset of the eigenpairs of a BSE Hamiltonian. It can also be used to compute the optical absorption spectrum without computing BSE eigenvalues and eigenvectors explicitly. The package makes use of ScaLAPACK, LAPACK, and BLAS.
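
    For a tiny dense instance, the structured BSE Hamiltonian and its eigenvalues can be formed directly with SciPy; BSEPACK exists precisely because this brute-force approach does not scale, and the blocks below are random stand-ins.

    ```python
    # Dense toy BSE Hamiltonian H = [[A, B], [-conj(B), -conj(A)]] with
    # A Hermitian and B complex-symmetric; solved by a general eigensolver.
    import numpy as np
    from scipy.linalg import eig

    rng = np.random.default_rng(6)
    n = 4
    M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    A = M + M.conj().T + 4 * n * np.eye(n)   # Hermitian, shifted positive
    S = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    B = (S + S.T) / 2                        # complex-symmetric coupling

    H = np.block([[A, B], [-B.conj(), -A.conj()]])
    w, _ = eig(H)
    print(np.sort(w.real))   # eigenvalues come in (lambda, -conj(lambda)) pairs
    ```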

  19. Limiting Surface Density Method Adapted to Large Arrays of Heterogeneous Shipping Packages with Nonlinear Responses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stover, Tracy E.; Baker, James S.; Ratliff, Michael D.

    The classic Limiting Surface Density (LSD) method is an empirical calculation technique for analyzing and setting mass limits for fissile items in storage arrays. LSD is a desirable method because it can reduce or eliminate the need for lengthy detailed Monte Carlo models of storage arrays. The original (or classic) method was developed based on idealized arrays of bare spherical metal items in air-spaced cubic units in a water-reflected cubic array. In this case, the geometric and material-based surface densities were acceptably correlated by linear functions. Later updates to the method were made to allow for concrete reflection rather than water, cylindrical masses rather than spheres, different material forms, and noncubic arrays. However, in the intervening four decades since those updates, little work has been done to update the method, especially for use with contemporary highly heterogeneous shipping packages that are noncubic and stored in noncubic arrays. In this work, the LSD method is reevaluated for application to highly heterogeneous shipping packages for fissile material. The package modeled is the 9975 shipping package, currently the primary package used to store fissile material at Savannah River Site's K-Area Complex. The package is neither cubic nor rectangular but resembles nested cylinders of stainless steel, lead, aluminum, and Celotex. The fissile content is assumed to be a cylinder of plutonium metal. The packages may be arranged in arrays with both an equal number of packages per side (package cubic) and an unequal number of packages per side (noncubic). The cubic arrangements are used to derive the 9975-specific material and geometry constants for the classic linear form LSD method. The linear form of the LSD, with noncubic array adjustment, is applied and evaluated against computational models for these packages to determine the critical unit fissile mass. Sensitivity equations are derived from the classic method, and these are also used to make projections of the critical unit fissile mass. It was discovered that the heterogeneous packages have a nonlinear surface density versus critical mass relationship, compared to the acceptably linear response of bare spherical fissile masses. Methodology is developed to address the nonlinear response. In so doing, the solution to the nonlinear LSD method becomes decoupled from the critical mass of a single unit, adding to its flexibility. The ability of the method to predict changes in neutron multiplication due to perturbations in a parameter is examined to provide a basis for analyzing upset conditions. In conclusion, a full rederivation of the classic LSD method from diffusion theory is also included, as this was found to be lacking in the available literature.
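
    As a hedged illustration of fitting a nonlinear surface-density versus critical-mass relation of the kind the adapted method requires (the data points and functional form below are invented, not the 9975/9978 results):

    ```python
    # Fit an assumed nonlinear relation between areal density and critical
    # unit mass; both the data and the power-law form are illustrative.
    import numpy as np
    from scipy.optimize import curve_fit

    sigma = np.array([2.0, 4.0, 6.0, 8.0, 10.0])   # areal density (kg/m^2)
    m_crit = np.array([5.2, 3.9, 3.1, 2.6, 2.3])   # critical unit mass (kg)

    def power_law(s, a, b):
        return a * s**(-b)

    (a, b), _ = curve_fit(power_law, sigma, m_crit, p0=(8.0, 0.5))
    print(f"m_crit ~ {a:.2f} * sigma^(-{b:.2f})")
    ```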

  20. Limiting Surface Density Method Adapted to Large Arrays of Heterogeneous Shipping Packages with Nonlinear Responses

    DOE PAGES

    Stover, Tracy E.; Baker, James S.; Ratliff, Michael D.; ...

    2018-03-02

    The classic Limiting Surface Density (LSD) method is an empirical calculation technique for analyzing and setting mass limits for fissile items in storage arrays. LSD is a desirable method because it can reduce or eliminate the need for lengthy detailed Monte Carlo models of storage arrays. The original (or classic) method was developed based on idealized arrays of bare spherical metal items in air-spaced cubic units in a water-reflected cubic array. In this case, the geometric and material-based surface densities were acceptably correlated by linear functions. Later updates to the method were made to allow for concrete reflection rather than water, cylindrical masses rather than spheres, different material forms, and noncubic arrays. However, in the intervening four decades since those updates, little work has been done to update the method, especially for use with contemporary highly heterogeneous shipping packages that are noncubic and stored in noncubic arrays. In this work, the LSD method is reevaluated for application to highly heterogeneous shipping packages for fissile material. The package modeled is the 9975 shipping package, currently the primary package used to store fissile material at Savannah River Site's K-Area Complex. The package is neither cubic nor rectangular but resembles nested cylinders of stainless steel, lead, aluminum, and Celotex. The fissile content is assumed to be a cylinder of plutonium metal. The packages may be arranged in arrays with both an equal number of packages per side (package cubic) and an unequal number of packages per side (noncubic). The cubic arrangements are used to derive the 9975-specific material and geometry constants for the classic linear form LSD method. The linear form of the LSD, with noncubic array adjustment, is applied and evaluated against computational models for these packages to determine the critical unit fissile mass. Sensitivity equations are derived from the classic method, and these are also used to make projections of the critical unit fissile mass. It was discovered that the heterogeneous packages have a nonlinear surface density versus critical mass relationship, compared to the acceptably linear response of bare spherical fissile masses. Methodology is developed to address the nonlinear response. In so doing, the solution to the nonlinear LSD method becomes decoupled from the critical mass of a single unit, adding to its flexibility. The ability of the method to predict changes in neutron multiplication due to perturbations in a parameter is examined to provide a basis for analyzing upset conditions. In conclusion, a full rederivation of the classic LSD method from diffusion theory is also included, as this was found to be lacking in the available literature.

  1. Thermodynamic model effects on the design and optimization of natural gas plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Diaz, S.; Zabaloy, M.; Brignole, E.A.

    1999-07-01

    The design and optimization of natural gas plants is carried out on the basis of process simulators. The physical property package is generally based on cubic equations of state. Phase equilibrium conditions, thermodynamic functions, equilibrium phase separations, work, and heat are computed by rigorous thermodynamics. The aim of this work is to analyze the NGL turboexpansion process and identify the process computations that are most sensitive to the accuracy of model predictions. Three equations of state, PR, SRK, and the Peneloux modification, are used to study the effect of property predictions on process calculations and plant optimization. It is shown that turboexpander plants have moderate sensitivity with respect to phase equilibrium computations, but higher accuracy is required for the prediction of enthalpy and turboexpansion work. The effect of modeling CO₂ solubility is also critical in mixtures with high CO₂ content in the feed.
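
    The kind of property calculation at issue can be illustrated with the Peng-Robinson compressibility cubic, Z³ − (1−B)Z² + (A−3B²−2B)Z − (AB−B²−B³) = 0. The sketch below evaluates it for methane at one state point; it is a textbook exercise, not the paper's simulator.

    ```python
    # Peng-Robinson compressibility roots for methane at one state point.
    import numpy as np

    R = 8.314                                # J/(mol K)
    Tc, Pc, omega = 190.6, 4.599e6, 0.011    # methane critical constants
    T, P = 250.0, 5.0e6                      # state point (K, Pa)

    kappa = 0.37464 + 1.54226 * omega - 0.26992 * omega**2
    alpha = (1 + kappa * (1 - np.sqrt(T / Tc)))**2
    a = 0.45724 * R**2 * Tc**2 / Pc * alpha
    b = 0.07780 * R * Tc / Pc
    A = a * P / (R * T)**2
    B = b * P / (R * T)

    # Cubic in Z: Z^3 - (1-B)Z^2 + (A-3B^2-2B)Z - (AB-B^2-B^3) = 0
    roots = np.roots([1.0, -(1 - B), A - 3 * B**2 - 2 * B,
                      -(A * B - B**2 - B**3)])
    Z = roots[np.abs(roots.imag) < 1e-9].real
    print("compressibility roots:", np.sort(Z))   # max = vapor, min = liquid
    ```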

  2. Preliminary design data package, appendices C1 and C2

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The HYBRID2 program, which computes the fuel and energy consumption of a hybrid vehicle with a bi-modal control strategy over specified component driving cycles, is described. Fuel and energy consumption are computed separately for the two modes of operation. The program also computes yearly average fuel and energy consumption using a composite driving cycle which varies as a function of daily travel. The modelling techniques are described, and subroutines and their functions are given. The composition of modern automobiles is discussed, along with the energy required to manufacture an American automobile. The energy required to scrap and recycle automobiles is also discussed.

  3. Parallel and Preemptable Dynamically Dimensioned Search Algorithms for Single and Multi-objective Optimization in Water Resources

    NASA Astrophysics Data System (ADS)

    Tolson, B.; Matott, L. S.; Gaffoor, T. A.; Asadzadeh, M.; Shafii, M.; Pomorski, P.; Xu, X.; Jahanpour, M.; Razavi, S.; Haghnegahdar, A.; Craig, J. R.

    2015-12-01

    We introduce asynchronous parallel implementations of the Dynamically Dimensioned Search (DDS) family of algorithms, including DDS, discrete DDS, PA-DDS, and DDS-AU. These parallel algorithms are distinct from most existing parallel optimization algorithms in the water resources field in that parallel DDS is asynchronous and does not require an entire population (set of candidate solutions) to be evaluated before generating and then sending a new candidate solution for evaluation. One key advance in this study is developing the first parallel PA-DDS multi-objective optimization algorithm. The other key advance is enhancing the computational efficiency of solving optimization problems (such as model calibration) by combining a parallel optimization algorithm with the deterministic model pre-emption concept. These two efficiency techniques can only be combined because of the asynchronous nature of parallel DDS. Model pre-emption functions to terminate simulation model runs early, prior to completely simulating the model calibration period for example, when intermediate results indicate the candidate solution is so poor that it will definitely have no influence on the generation of further candidate solutions. The computational savings of deterministic model pre-emption available in serial implementations of population-based algorithms (e.g., PSO) disappear in synchronous parallel implementations of these algorithms. In addition to the key advances above, we implement the algorithms across a range of computation platforms (Windows and Unix-based operating systems from multi-core desktops to a supercomputer system) and package them for future modellers within a model-independent calibration software package called Ostrich, as well as in MATLAB versions. Results across multiple platforms and multiple case studies (from 4 to 64 processors) demonstrate the vast improvement over serial DDS-based algorithms and highlight the important role model pre-emption plays in the performance of parallel, pre-emptable DDS algorithms. Case studies include single- and multiple-objective optimization problems in water resources model calibration, and in many cases linear or near-linear speedups are observed.
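
    The serial DDS candidate-generation step that these parallel variants build on (after Tolson and Shoemaker, 2007) is short enough to sketch; the asynchronous master-worker and pre-emption machinery itself is not shown, and the bounds and parameters below are illustrative.

    ```python
    # DDS candidate generation: perturb a random subset of dimensions whose
    # expected size shrinks as the iteration count grows.
    import numpy as np

    rng = np.random.default_rng(7)

    def dds_candidate(x_best, lo, hi, i, max_iter, r=0.2):
        n = len(x_best)
        p_select = 1.0 - np.log(i) / np.log(max_iter)  # per-dimension prob.
        mask = rng.random(n) < p_select
        if not mask.any():                  # always perturb at least one dim
            mask[rng.integers(n)] = True
        x_new = x_best.copy()
        sigma = r * (hi - lo)
        x_new[mask] += rng.normal(0.0, sigma[mask])
        # Reflect candidates that step outside the bounds.
        x_new = np.where(x_new < lo, 2 * lo - x_new, x_new)
        x_new = np.where(x_new > hi, 2 * hi - x_new, x_new)
        return np.clip(x_new, lo, hi)

    lo, hi = np.zeros(5), np.ones(5)
    x = rng.uniform(lo, hi)
    cand = dds_candidate(x, lo, hi, i=10, max_iter=1000)
    ```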

  4. Attitudes and Achievement in Introductory Psychological Statistics Classes: Traditional versus Computer-Supported Instruction.

    ERIC Educational Resources Information Center

    Gratz, Zandra S.; And Others

    A study was conducted at a large, state-supported college in the Northeast to establish a mechanism by which a popular software package, the Statistical Package for the Social Sciences (SPSS), could be used in psychology program statistics courses in such a way that no prior computer expertise would be needed on the part of the faculty or the…

  5. An initial investigation into methods of computing transonic aerodynamic sensitivity coefficients

    NASA Technical Reports Server (NTRS)

    Carlson, Leland A.

    1991-01-01

    The three-dimensional quasi-analytical sensitivity analysis and the ancillary driver programs needed to carry out the studies and perform comparisons are developed. The code is essentially contained in one unified package which includes the following: (1) a three-dimensional transonic wing analysis program (ZEBRA); (2) a quasi-analytical portion which determines the matrix elements in the quasi-analytical equations; (3) a method for computing the sensitivity coefficients from the resulting quasi-analytical equations; (4) a package to determine, for comparison purposes, sensitivity coefficients via the finite-difference approach; and (5) a graphics package.

  6. The FluxCompensator: Making Radiative Transfer Models of Hydrodynamical Simulations Directly Comparable to Real Observations

    NASA Astrophysics Data System (ADS)

    Koepferl, Christine M.; Robitaille, Thomas P.

    2017-11-01

    When modeling astronomical objects throughout the universe, it is important to correctly treat the limitations of the data, for instance, finite resolution and sensitivity. In order to simulate these effects, and to make radiative transfer models directly comparable to real observations, we have developed an open-source Python package called the FluxCompensator that enables the post-processing of the output of 3D Monte Carlo radiative transfer codes, such as Hyperion. With the FluxCompensator, realistic synthetic observations can be generated by modeling the effects of convolution with arbitrary point-spread functions, transmission curves, finite pixel resolution, noise, and reddening. Pipelines can be applied to compute synthetic observations that simulate observatories, such as the Spitzer Space Telescope or the Herschel Space Observatory. Additionally, this tool can read in existing observations (e.g., FITS format) and use the same settings for the synthetic observations. In this paper, we describe the package as well as present examples of such synthetic observations.
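
    The post-processing chain the package automates (PSF convolution, pixelation, noise) can be mimicked generically with SciPy; the sketch below is not the FluxCompensator API, and the PSF width and noise level are assumed.

    ```python
    # Generic synthetic-observation steps: convolve with a Gaussian PSF,
    # rebin to detector pixels, and add read noise.
    import numpy as np
    from scipy.ndimage import gaussian_filter

    rng = np.random.default_rng(8)
    model = np.zeros((256, 256))
    model[128, 128] = 1.0                 # idealized point source from RT output

    psf_sigma_pix = 3.0                   # assumed PSF width in model pixels
    image = gaussian_filter(model, psf_sigma_pix)

    # Rebin 4x4 model pixels into one detector pixel (finite pixel resolution).
    binned = image.reshape(64, 4, 64, 4).sum(axis=(1, 3))

    # Add Gaussian read noise at an assumed level.
    observed = binned + rng.normal(0.0, 1e-4, binned.shape)
    ```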

  7. The FluxCompensator: Making Radiative Transfer Models of Hydrodynamical Simulations Directly Comparable to Real Observations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koepferl, Christine M.; Robitaille, Thomas P., E-mail: koepferl@usm.lmu.de

    When modeling astronomical objects throughout the universe, it is important to correctly treat the limitations of the data, for instance finite resolution and sensitivity. In order to simulate these effects, and to make radiative transfer models directly comparable to real observations, we have developed an open-source Python package called the FluxCompensator that enables the post-processing of the output of 3D Monte Carlo radiative transfer codes, such as Hyperion. With the FluxCompensator, realistic synthetic observations can be generated by modeling the effects of convolution with arbitrary point-spread functions, transmission curves, finite pixel resolution, noise, and reddening. Pipelines can be applied to compute synthetic observations that simulate observatories, such as the Spitzer Space Telescope or the Herschel Space Observatory. Additionally, this tool can read in existing observations (e.g., FITS format) and use the same settings for the synthetic observations. In this paper, we describe the package as well as present examples of such synthetic observations.

  8. COBRApy: COnstraints-Based Reconstruction and Analysis for Python.

    PubMed

    Ebrahim, Ali; Lerman, Joshua A; Palsson, Bernhard O; Hyduke, Daniel R

    2013-08-08

    COnstraint-Based Reconstruction and Analysis (COBRA) methods are widely used for genome-scale modeling of metabolic networks in both prokaryotes and eukaryotes. Due to the successes with metabolism, there is an increasing effort to apply COBRA methods to reconstruct and analyze integrated models of cellular processes. The COBRA Toolbox for MATLAB is a leading software package for genome-scale analysis of metabolism; however, it was not designed to elegantly capture the complexity inherent in integrated biological networks and lacks an integration framework for the multiomics data used in systems biology. The openCOBRA Project is a community effort to promote constraints-based research through the distribution of freely available software. Here, we describe COBRA for Python (COBRApy), a Python package that provides support for basic COBRA methods. COBRApy is designed in an object-oriented fashion that facilitates the representation of the complex biological processes of metabolism and gene expression. COBRApy does not require MATLAB to function; however, it includes an interface to the COBRA Toolbox for MATLAB to facilitate use of legacy codes. For improved performance, COBRApy includes parallel processing support for computationally intensive processes. COBRApy is an object-oriented framework designed to meet the computational challenges associated with the next generation of stoichiometric constraint-based models and high-density omics data sets. http://opencobra.sourceforge.net/
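
    A brief usage sketch of COBRApy's object-oriented interface follows; this is a hedged illustration based on recent releases (load_model, objective_value, and the context-manager behaviour may differ in the version described above):

      from cobra.io import load_model

      model = load_model("textbook")            # bundled E. coli core model
      solution = model.optimize()               # flux balance analysis
      print(solution.objective_value)           # optimal growth rate

      # Reactions, metabolites and genes are first-class Python objects:
      pfk = model.reactions.get_by_id("PFK")
      print(pfk.reaction)                       # human-readable stoichiometry

      with model:                               # changes revert when block exits
          pfk.knock_out()
          print(model.optimize().objective_value)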

  9. Dual Arm Work Package performance estimates and telerobot task network simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Draper, J.V.; Blair, L.M.

    1997-02-01

    This paper describes the methodology and results of a network simulation study of the Dual Arm Work Package (DAWP), to be employed for dismantling the Argonne National Laboratory CP-5 reactor. The development of the simulation model was based upon the results of a task analysis for the same system. This study was performed by the Oak Ridge National Laboratory (ORNL), in the Robotics and Process Systems Division. Funding was provided by the US Department of Energy's Office of Technology Development, Robotics Technology Development Program (RTDP). The RTDP is developing methods of computer simulation to estimate telerobotic system performance. Data were collected to provide point estimates to be used in a task network simulation model. Three skilled operators performed six repetitions of a pipe cutting task representative of typical teleoperation cutting operations.

  10. Solar heating and cooling technical data and systems analysis

    NASA Technical Reports Server (NTRS)

    Christensen, D. L.

    1976-01-01

    The accomplishments of a project to study solar heating and air conditioning are outlined. Presentation materials (data packages, slides, charts, and visual aids) were developed. Bibliographies and source materials were compiled on materials and coatings, solar water heaters, systems-analysis computer models, solar collectors, and solar projects. Detailed MIRADS computer formats for primary data parameters were developed and updated. The data covered climate, architecture, topography, heating and cooling equipment, thermal loads, and economics. Data sources in each of these areas were identified, as well as solar radiation data stations and instruments.

  11. NASA general aviation crashworthiness seat development

    NASA Technical Reports Server (NTRS)

    Fasanella, E. L.; Alfaro-Bou, E.

    1979-01-01

    Three load limiting seat concepts for general aviation aircraft designed to lower the deceleration of the occupant in the event of a crash were sled tested and evaluated with reference to a standard seat. Dummy pelvis accelerations were reduced up to 50 percent with one of the concepts. Computer program MSOMLA (Modified Seat Occupant Model for Light Aircraft) was used to simulate the behavior of a dummy passenger in a NASA full-scale crash test of a twin engine light aircraft. A computer graphics package MANPLOT was developed to pictorially represent the occupant and seat motion.

  12. Design Considerations of a Virtual Laboratory for Advanced X-ray Sources

    NASA Astrophysics Data System (ADS)

    Luginsland, J. W.; Frese, M. H.; Frese, S. D.; Watrous, J. J.; Heileman, G. L.

    2004-11-01

    The field of scientific computation has greatly advanced in the last few years, resulting in the ability to perform complex computer simulations that can predict the performance of real-world experiments in a number of fields of study. Among the forces driving this new computational capability is the advent of parallel algorithms, allowing calculations in three-dimensional space with realistic time scales. Electromagnetic radiation sources driven by high-voltage, high-current electron beams offer an area to further push the state of the art in high-fidelity, first-principles simulation tools. The physics of these x-ray sources combines kinetic plasma physics (electron beams) with dense fluid-like plasma physics (anode plasmas) and x-ray generation (bremsstrahlung). There are a number of mature techniques and software packages for dealing with the individual aspects of these sources, such as Particle-In-Cell (PIC), Magneto-Hydrodynamics (MHD), and radiation transport codes. The current effort is focused on developing an object-oriented software environment, using the Rational Unified Process and the Unified Modeling Language (UML), to provide a framework where multiple 3D parallel physics packages, such as a PIC code (ICEPIC), an MHD code (MACH), and an x-ray transport code (ITS), can co-exist in a system-of-systems approach to modeling advanced x-ray sources. Initial software design and assessments of the fidelity of the various physics algorithms will be presented.

  13. Key Technology Research on Open Architecture for The Sharing of Heterogeneous Geographic Analysis Models

    NASA Astrophysics Data System (ADS)

    Yue, S. S.; Wen, Y. N.; Lv, G. N.; Hu, D.

    2013-10-01

    In recent years, the development of cloud computing technologies has laid a critical foundation for efficiently solving complicated geographic issues. However, it is still difficult to realize the cooperative operation of many heterogeneous geographical models. Traditional cloud architecture tends to provide centralized solutions to end users, while all the required resources are offered by large enterprises or special agencies; it is thus a closed framework from the perspective of resource utilization. Solving comprehensive geographic issues requires integrating multifarious heterogeneous geographical models and data. In this case, an open computing platform is needed, with which model owners can package and deploy their models into the cloud conveniently, while model users can search, access, and utilize those models with cloud facilities. Based on this concept, open cloud service strategies for the sharing of heterogeneous geographic analysis models are studied in this article. Three key technologies are discussed in detail: a unified cloud interface strategy, a sharing platform based on cloud services, and a computing platform based on cloud services. Related experiments are conducted for further verification.

  14. Symbolic computation of equivalence transformations and parameter reduction for nonlinear physical models

    NASA Astrophysics Data System (ADS)

    Cheviakov, Alexei F.

    2017-11-01

    An efficient systematic procedure is provided for symbolic computation of Lie groups of equivalence transformations and generalized equivalence transformations of systems of differential equations that contain arbitrary elements (arbitrary functions and/or arbitrary constant parameters), using the software package GeM for Maple. Application of equivalence transformations to the reduction of the number of arbitrary elements in a given system of equations is discussed, and several examples are considered. The first computational example of generalized equivalence transformations where the transformation of the dependent variable involves an arbitrary constitutive function is presented. As a detailed physical example, a three-parameter family of nonlinear wave equations describing finite anti-plane shear displacements of an incompressible hyperelastic fiber-reinforced medium is considered. Equivalence transformations are computed and employed to radically simplify the model for an arbitrary fiber direction, invertibly reducing the model to a simple form that corresponds to a special fiber direction, and involves no arbitrary elements. The presented computation algorithm is applicable to wide classes of systems of differential equations containing arbitrary elements.

  15. The Application of COMSOL Multiphysics Package on the Modelling of Complex 3-D Lithospheric Electrical Resistivity Structures - A Case Study from the Proterozoic Orogenic belt within the North China Craton

    NASA Astrophysics Data System (ADS)

    Guo, L.; Yin, Y.; Deng, M.; Guo, L.; Yan, J.

    2017-12-01

    At present, most magnetotelluric (MT) forward modelling and inversion codes are based on the finite difference method, but its structured mesh gridding cannot be well adapted to conditions with arbitrary topography or complex tectonic structures. By contrast, the finite element method is more accurate in calculating complex and irregular 3-D regions and has lower requirements on function smoothness. However, the complexity of mesh gridding and the limitation of computer capacity have limited its application. COMSOL Multiphysics is a cross-platform finite element analysis, solver, and multiphysics full-coupling simulation package. It achieves highly accurate numerical simulations with high computational performance and outstanding multi-field bi-directional coupling analysis capability. In addition, its AC/DC and RF modules can be used to easily calculate the electromagnetic responses of complex geological structures. Using an adaptive unstructured grid, the calculation is much faster. In order to improve the discretization of the computing area, we combine Matlab and COMSOL Multiphysics to establish a general procedure for calculating the MT responses of arbitrary resistivity models. The calculated responses include the surface electric and magnetic field components, impedance components, magnetic transfer functions, and phase tensors. The reliability of this procedure is then verified by 1-D, 2-D, 3-D, and anisotropic forward modeling tests. Finally, we establish a 3-D lithospheric resistivity model for the Proterozoic Wutai-Hengshan Mts. within the North China Craton by fitting the real MT data collected there. The reliability of the model is also verified by induction vectors and phase tensors. Our model shows more detail and better resolution than the previously published 3-D model based on the finite difference method. In conclusion, the COMSOL Multiphysics package is suitable for modeling 3-D lithospheric resistivity structures under complex tectonic deformation backgrounds and could be a good complement to existing finite-difference inversion algorithms.

  16. Drought: A comprehensive R package for drought monitoring, prediction and analysis

    NASA Astrophysics Data System (ADS)

    Hao, Zengchao; Hao, Fanghua; Singh, Vijay P.; Cheng, Hongguang

    2015-04-01

    Drought may impose serious challenges on human societies and ecosystems. Due to its complicated causes and wide-ranging impacts, a universally accepted definition of drought does not exist. Drought indicators are commonly used to characterize drought properties such as duration or severity. Various drought indicators have been developed in the past few decades to monitor particular aspects of drought conditions, along with multivariate drought indices that characterize drought from multiple sources or hydro-climatic variables. Reliable drought prediction with suitable drought indicators is critical to drought preparedness plans that reduce potential drought impacts. In addition, analysis quantifying the risk of drought properties provides useful information for operational drought management. Drought monitoring, prediction, and risk analysis are thus important components of drought modeling and assessment. In this study, a comprehensive R package, "drought", is developed to aid drought monitoring, prediction, and risk analysis (available from R-Forge and CRAN soon). The monitoring component computes a suite of univariate and multivariate drought indices that integrate drought information from various sources such as precipitation, temperature, soil moisture, and runoff. The prediction/forecasting component provides statistical drought predictions to enhance drought early warning for decision making. Analysis of drought properties such as duration and severity is also provided for drought risk assessment. Based on this package, a drought monitoring and prediction/forecasting system is under development as a decision support tool. The package will be provided freely to the public to aid drought modeling and assessment by researchers and practitioners.
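
    As a flavour of the univariate indices such a package computes, here is a minimal sketch (in Python rather than R, for consistency with the other examples in this collection) of the widely used Standardized Precipitation Index, which aggregates precipitation, fits a gamma distribution, and maps the result to standard normal quantiles:

      import numpy as np
      from scipy import stats

      def spi(precip, window=3):
          """Standardized Precipitation Index: aggregate precipitation over
          `window` time steps, fit a gamma distribution to the aggregates,
          then transform the fitted CDF values to standard normal quantiles."""
          agg = np.convolve(precip, np.ones(window), mode="valid")
          shape, loc, scale = stats.gamma.fit(agg, floc=0)   # loc pinned at 0
          cdf = stats.gamma.cdf(agg, shape, loc=loc, scale=scale)
          return stats.norm.ppf(np.clip(cdf, 1e-6, 1 - 1e-6))

      rng = np.random.default_rng(0)
      monthly_precip = rng.gamma(2.0, 30.0, size=240)   # 20 years of stand-in data
      print(spi(monthly_precip, window=3)[:6])          # negative = drier than usual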

  17. PynPoint code for exoplanet imaging

    NASA Astrophysics Data System (ADS)

    Amara, A.; Quanz, S. P.; Akeret, J.

    2015-04-01

    We announce the public release of PynPoint, a Python package that we have developed for analysing exoplanet data taken with the angular differential imaging observing technique. In particular, PynPoint is designed to model the point spread function of the central star and to subtract its flux contribution to reveal nearby faint companion planets. The current version of the package performs this correction by using a principal component analysis method to build a basis set for modelling the point spread function of the observations. We demonstrate the performance of the package by reanalysing publicly available data on the exoplanet β Pictoris b, which consist of close to 24,000 individual image frames. We show that PynPoint is able to analyse these typical data in roughly 1.5 min on a Mac Pro when the number of images is reduced by co-adding in sets of 5. The main computational work, the calculation of the singular value decomposition, parallelises well as a result of its reliance on the SciPy and NumPy packages. For this calculation the peak memory load is 6 GB, which can be handled comfortably on most workstations. A simpler calculation, co-adding in sets of 50, takes 3 s with a peak memory usage of 600 MB; this can be performed easily on a laptop. In developing the package we have modularised the code so that we will be able to extend functionality in future releases through the inclusion of more modules, without affecting the user's application programming interface. We distribute the PynPoint package under the GPLv3 licence through the central PyPI server, and the documentation is available online (http://pynpoint.ethz.ch).
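
    The core idea, sketched here independently of PynPoint's actual interface, takes a few lines of NumPy: build a PCA basis from the image stack via a singular value decomposition, project each frame onto the leading components to model the stellar point spread function, and keep the residuals:

      import numpy as np

      def psf_subtract(frames, n_components):
          """frames: (n_frames, ny, nx) stack; returns residuals after
          projecting each frame onto the first n_components PCA modes."""
          n, ny, nx = frames.shape
          X = frames.reshape(n, ny * nx)
          X = X - X.mean(axis=0)                 # centre the stack
          # SVD of the stack; rows of Vt are the PCA basis images
          _, _, Vt = np.linalg.svd(X, full_matrices=False)
          basis = Vt[:n_components]
          model = (X @ basis.T) @ basis          # projection = PSF model
          return (X - model).reshape(n, ny, nx)  # residuals reveal companions

      rng = np.random.default_rng(0)
      stack = rng.normal(size=(100, 64, 64))     # stand-in image data
      residuals = psf_subtract(stack, n_components=5)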

  18. Physical modeling and high-performance GPU computing for characterization, interception, and disruption of hazardous near-Earth objects

    NASA Astrophysics Data System (ADS)

    Kaplinger, Brian Douglas

    For the past few decades, both the scientific community and the general public have become increasingly aware that the Earth lives in a shooting gallery of small objects. We classify all of these asteroids and comets, known or unknown, that cross Earth's orbit as near-Earth objects (NEOs). A look at our geologic history tells us that NEOs have collided with Earth in the past, and we expect that they will continue to do so. With thousands of known NEOs crossing the orbit of Earth, there has been significant scientific interest in developing the capability to deflect an NEO from an impacting trajectory. This thesis applies the ideas of Smoothed Particle Hydrodynamics (SPH) theory to the NEO disruption problem. A simulation package was designed that allows efficacy simulation to be integrated into the mission planning and design process. This is done by applying ideas in high-performance computing (HPC) on the computer graphics processing unit (GPU). Rather than proving a concept through large standalone simulations on a supercomputer, the highly parallel structure allows flexible, target-dependent questions to be resolved. Built around nonclassified data and analysis, this computer package will allow academic institutions to better tackle the issue of NEO mitigation effectiveness.

  19. Application of the Linux cluster for exhaustive window haplotype analysis using the FBAT and Unphased programs.

    PubMed

    Mishima, Hiroyuki; Lidral, Andrew C; Ni, Jun

    2008-05-28

    Genetic association studies have been used to map disease-causing genes. A newly introduced statistical method, called exhaustive haplotype association study, analyzes genetic information consisting of different numbers and combinations of DNA sequence variations along a chromosome. Such studies involve a large number of statistical calculations and subsequently high computing power. It is possible to develop parallel algorithms and codes to perform the calculations on a high performance computing (HPC) system. However, most existing, commonly used statistical packages for genetic studies are non-parallel versions. Alternatively, one may use grid computing and its packages to run non-parallel genetic statistical packages on a centralized HPC system or on distributed computing systems. In this paper, we report the utilization of a queuing scheduler built on the Grid Engine and run on a Rocks Linux cluster for our genetic statistical studies. Analysis of both consecutive and combinational window haplotypes was conducted by the FBAT (Laird et al., 2000) and Unphased (Dudbridge, 2003) programs. The dataset consisted of 26 loci from 277 extended families (1484 persons). Using the Rocks Linux cluster with 22 compute nodes, FBAT jobs ran about 14.4-15.9 times faster, and Unphased jobs 1.1-18.6 times faster, than the accumulated serial computation time. Execution of exhaustive haplotype analysis using non-parallel software packages on a Linux-based system is an effective and efficient approach in terms of cost and performance.

  20. Application of the Linux cluster for exhaustive window haplotype analysis using the FBAT and Unphased programs

    PubMed Central

    Mishima, Hiroyuki; Lidral, Andrew C; Ni, Jun

    2008-01-01

    Background Genetic association studies have been used to map disease-causing genes. A newly introduced statistical method, called exhaustive haplotype association study, analyzes genetic information consisting of different numbers and combinations of DNA sequence variations along a chromosome. Such studies involve a large number of statistical calculations and subsequently high computing power. It is possible to develop parallel algorithms and codes to perform the calculations on a high performance computing (HPC) system. However, most existing, commonly used statistical packages for genetic studies are non-parallel versions. Alternatively, one may use grid computing and its packages to run non-parallel genetic statistical packages on a centralized HPC system or on distributed computing systems. In this paper, we report the utilization of a queuing scheduler built on the Grid Engine and run on a Rocks Linux cluster for our genetic statistical studies. Results Analysis of both consecutive and combinational window haplotypes was conducted by the FBAT (Laird et al., 2000) and Unphased (Dudbridge, 2003) programs. The dataset consisted of 26 loci from 277 extended families (1484 persons). Using the Rocks Linux cluster with 22 compute nodes, FBAT jobs ran about 14.4–15.9 times faster, and Unphased jobs 1.1–18.6 times faster, than the accumulated serial computation time. Conclusion Execution of exhaustive haplotype analysis using non-parallel software packages on a Linux-based system is an effective and efficient approach in terms of cost and performance. PMID:18541045

  1. Nonparametric Statistics Test Software Package.

    DTIC Science & Technology

    1983-09-01

    statistics because of their acceptance in the academic world, the availability of computer support, and flexibility in model building. Nonparametric…

  2. Development and Implementation of Methods and Means for Achieving a Uniform Functional Coating Thickness

    NASA Astrophysics Data System (ADS)

    Shishlov, A. V.; Sagatelyan, G. R.; Shashurin, V. D.

    2017-12-01

    A mathematical model is proposed to calculate the growth rate of the thin-film coating thickness at various points on a flat substrate surface during planetary motion of the substrate, which makes it possible to predict the expected coating thickness distribution. A corresponding software package has been developed. The coefficients used in the computer simulation are determined experimentally.

  3. Diagnostic Testing Package DX v 2.0 Technical Specification. Methodology Project.

    ERIC Educational Resources Information Center

    McArthur, David

    This paper contains the technical specifications, schematic diagrams, and program printout for a computer software package for the development and administration of diagnostic tests. The second version of the Diagnostic Testing Package DX consists of a PASCAL-based set of modules located in two main programs: (1) EDITTEST creates, modifies, and…

  4. R package PRIMsrc: Bump Hunting by Patient Rule Induction Method for Survival, Regression and Classification

    PubMed Central

    Dazard, Jean-Eudes; Choe, Michael; LeBlanc, Michael; Rao, J. Sunil

    2015-01-01

    PRIMsrc is a novel implementation of a non-parametric bump hunting procedure, based on the Patient Rule Induction Method (PRIM), offering a unified treatment of outcome variables, including censored time-to-event (Survival), continuous (Regression) and discrete (Classification) responses. To fit the model, it uses a recursive peeling procedure with specific peeling criteria and stopping rules depending on the response. To validate the model, it provides an objective function based on prediction-error or other specific statistic, as well as two alternative cross-validation techniques, adapted to the task of decision-rule making and estimation in the three types of settings. PRIMsrc comes as an open source R package, including at this point: (i) a main function for fitting a Survival Bump Hunting model with various options allowing cross-validated model selection to control model size (#covariates) and model complexity (#peeling steps) and generation of cross-validated end-point estimates; (ii) parallel computing; (iii) various S3-generic and specific plotting functions for data visualization, diagnostic, prediction, summary and display of results. It is available on CRAN and GitHub. PMID:26798326
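
    The top-down peeling at the heart of PRIM is easy to illustrate. The following sketch (in Python rather than R, and greatly simplified relative to PRIMsrc's cross-validated procedure) peels away, at each step, the candidate slice whose removal most increases the mean response inside the box:

      import numpy as np

      def prim_peel(X, y, alpha=0.05, min_support=0.1):
          """Greedy top-down peeling: repeatedly drop the alpha-fraction face
          (low or high end of one variable) whose removal most increases the
          mean of y inside the box; stop at minimum support or no improvement."""
          inside = np.ones(len(y), dtype=bool)
          while inside.mean() > min_support:
              current_mean, best = y[inside].mean(), None
              for j in range(X.shape[1]):
                  lo = np.quantile(X[inside, j], alpha)
                  hi = np.quantile(X[inside, j], 1.0 - alpha)
                  for keep in (X[:, j] >= lo, X[:, j] <= hi):  # peel low / high
                      trial = inside & keep
                      if trial.any() and y[trial].mean() > current_mean:
                          if best is None or y[trial].mean() > y[best].mean():
                              best = trial
              if best is None:
                  break                      # no peel improves the box mean
              inside = best
          return inside

      rng = np.random.default_rng(1)
      X = rng.uniform(size=(500, 3))
      y = (X[:, 0] > 0.7).astype(float) + rng.normal(0.0, 0.1, 500)
      box = prim_peel(X, y)
      print("box mean:", y[box].mean(), "support:", box.mean())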

  5. Scoria: a Python module for manipulating 3D molecular data.

    PubMed

    Ropp, Patrick; Friedman, Aaron; Durrant, Jacob D

    2017-09-18

    Third-party packages have transformed the Python programming language into a powerful computational-biology tool. Package installation is easy for experienced users, but novices sometimes struggle with dependencies and compilers. This presents a barrier that can hinder the otherwise broad adoption of new tools. We present Scoria, a Python package for manipulating three-dimensional molecular data. Unlike similar packages, Scoria requires no dependencies, compilation, or system-wide installation. Users can incorporate the Scoria source code directly into their own programs. But Scoria is not designed to compete with other similar packages; rather, it complements them. Our package leverages others (e.g. NumPy, SciPy), if present, to speed and extend its own functionality. To show its utility, we use Scoria to analyze a molecular dynamics trajectory. Our FootPrint script colors the atoms of one chain by the frequency of their contacts with a second chain. We are hopeful that Scoria will be a useful tool for the computational-biology community. A copy is available for download free of charge (Apache License 2.0) at http://durrantlab.com/scoria/.
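
    The contact-footprint idea is straightforward to express in plain NumPy. This is a generic sketch, not Scoria's own API; the array shapes and the 4 Å cutoff are illustrative assumptions:

      import numpy as np

      def contact_frequency(traj_a, traj_b, cutoff=4.0):
          """Fraction of frames in which each atom of chain A lies within
          `cutoff` (angstroms) of any atom of chain B.
          traj_a: (n_frames, n_a, 3); traj_b: (n_frames, n_b, 3)."""
          hits = np.zeros(traj_a.shape[1])
          for a, b in zip(traj_a, traj_b):
              d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
              hits += d.min(axis=1) < cutoff     # nearest chain-B atom per atom
          return hits / traj_a.shape[0]

      rng = np.random.default_rng(0)
      chain_a = rng.normal(0.0, 5.0, (50, 120, 3))   # 50 frames, 120 atoms
      chain_b = rng.normal(2.0, 5.0, (50, 80, 3))    # 50 frames, 80 atoms
      freq = contact_frequency(chain_a, chain_b)     # one value per chain-A atom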

  6. cisPath: an R/Bioconductor package for cloud users for visualization and management of functional protein interaction networks.

    PubMed

    Wang, Likun; Yang, Luhe; Peng, Zuohan; Lu, Dan; Jin, Yan; McNutt, Michael; Yin, Yuxin

    2015-01-01

    With the burgeoning development of cloud technology and services, there are an increasing number of users who prefer the cloud to run their applications. All software and associated data are hosted on the cloud, allowing users to access them via a web browser from any computer, anywhere. This paper presents cisPath, an R/Bioconductor package deployed on cloud servers for client users to visualize, manage, and share functional protein interaction networks. With this R package, users can easily integrate protein-protein interaction information downloaded from different online databases with private data to construct new and personalized interaction networks. Additional functions allow users to generate specific networks based on private databases. Since the results produced with this package are in the form of web pages, cloud users can easily view and edit the network graphs via the browser, using a mouse or touch screen, without the need to download them to a local computer. The package can also be installed and run on a local desktop computer. Depending on user preference, results can be publicized or shared by uploading to a web server or cloud drive, allowing other users to directly access results via a web browser. The package can be installed and run on a variety of platforms. Since all network views are shown in web pages, such a package is particularly useful for cloud users. The easy installation and operation is an attractive quality for R beginners and users with no previous experience with cloud services.

  7. cisPath: an R/Bioconductor package for cloud users for visualization and management of functional protein interaction networks

    PubMed Central

    2015-01-01

    Background With the burgeoning development of cloud technology and services, there are an increasing number of users who prefer the cloud to run their applications. All software and associated data are hosted on the cloud, allowing users to access them via a web browser from any computer, anywhere. This paper presents cisPath, an R/Bioconductor package deployed on cloud servers for client users to visualize, manage, and share functional protein interaction networks. Results With this R package, users can easily integrate protein-protein interaction information downloaded from different online databases with private data to construct new and personalized interaction networks. Additional functions allow users to generate specific networks based on private databases. Since the results produced with this package are in the form of web pages, cloud users can easily view and edit the network graphs via the browser, using a mouse or touch screen, without the need to download them to a local computer. The package can also be installed and run on a local desktop computer. Depending on user preference, results can be publicized or shared by uploading to a web server or cloud drive, allowing other users to directly access results via a web browser. Conclusions The package can be installed and run on a variety of platforms. Since all network views are shown in web pages, such a package is particularly useful for cloud users. The easy installation and operation is an attractive quality for R beginners and users with no previous experience with cloud services. PMID:25708840

  8. Efficient analysis of large-scale genome-wide data with two R packages: bigstatsr and bigsnpr.

    PubMed

    Privé, Florian; Aschard, Hugues; Ziyatdinov, Andrey; Blum, Michael G B

    2017-03-30

    Genome-wide datasets produced for association studies have dramatically increased in size over the past few years, with modern datasets commonly including millions of variants measured in tens of thousands of individuals. This increase in data size is a major challenge severely slowing down genomic analyses, leading to some software becoming obsolete and researchers having limited access to diverse analysis tools. Here we present two R packages, bigstatsr and bigsnpr, allowing the analysis of large-scale genomic data to be performed within R. To address large data size, the packages use memory-mapping for accessing data matrices stored on disk instead of in RAM. To perform data pre-processing and data analysis, the packages integrate most of the tools that are commonly used, either through transparent system calls to existing software or through updated or improved implementations of existing methods. In particular, the packages implement fast and accurate computations of principal component analysis and association studies, functions to remove SNPs in linkage disequilibrium, and algorithms to learn polygenic risk scores on millions of SNPs. We illustrate applications of the two R packages by analyzing a case-control genomic dataset for celiac disease, performing an association study and computing polygenic risk scores. Finally, we demonstrate the scalability of the R packages by analyzing a simulated genome-wide dataset including 500,000 individuals and 1 million markers on a single desktop computer. https://privefl.github.io/bigstatsr/ & https://privefl.github.io/bigsnpr/. florian.prive@univ-grenoble-alpes.fr & michael.blum@univ-grenoble-alpes.fr. Supplementary materials are available at Bioinformatics online.
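
    The memory-mapping idea generalizes beyond R. A minimal Python sketch (the file name and block size are illustrative) shows how a disk-backed matrix can be processed block-wise so that only one block ever resides in RAM:

      import numpy as np

      n_rows, n_cols, block = 10_000, 1_000, 1_000

      # Disk-backed matrix, conceptually like a file-backed genotype matrix;
      # "genotypes.dat" is an illustrative scratch file.
      X = np.memmap("genotypes.dat", dtype=np.float32, mode="w+",
                    shape=(n_rows, n_cols))
      rng = np.random.default_rng(0)
      for start in range(0, n_rows, block):      # write block-wise so the
          X[start:start + block] = rng.integers( # full matrix never sits in RAM
              0, 3, (block, n_cols)).astype(np.float32)

      # Block-wise column means, a building block of PCA and GWAS scans:
      col_sums = np.zeros(n_cols)
      for start in range(0, n_rows, block):
          col_sums += X[start:start + block].sum(axis=0)
      col_means = col_sums / n_rows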

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hagberg, Aric; Swart, Pieter; S Chult, Daniel

    NetworkX is a Python language package for exploration and analysis of networks and network algorithms. The core package provides data structures for representing many types of networks, or graphs, including simple graphs, directed graphs, and graphs with parallel edges and self loops. The nodes in NetworkX graphs can be any (hashable) Python object and edges can contain arbitrary data; this flexibility makes NetworkX ideal for representing networks found in many different scientific fields. In addition to the basic data structures, many graph algorithms are implemented for calculating network properties and structure measures: shortest paths, betweenness centrality, clustering, degree distribution, and many more. NetworkX can read and write various graph formats for easy exchange with existing data, and generators for many classic graphs and popular graph models, such as the Erdos-Renyi, small-world, and Barabasi-Albert models, are included. The ease of use and flexibility of the Python programming language, together with the connection to the SciPy tools, make NetworkX a powerful tool for scientific computations. We discuss some of our recent work studying synchronization of coupled oscillators to demonstrate how NetworkX enables research in the field of computational networks.
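
    A short usage sketch of the interface described above (all calls are part of NetworkX's long-standing public API):

      import networkx as nx

      # A popular graph model: Barabasi-Albert preferential attachment,
      # 1000 nodes, each new node attaching to 3 existing nodes.
      G = nx.barabasi_albert_graph(1000, 3)

      # Network properties and structure measures from the abstract:
      centrality = nx.betweenness_centrality(G)
      clustering = nx.average_clustering(G)
      path = nx.shortest_path(G, source=0, target=10)
      degrees = [d for _, d in G.degree()]

      # Nodes may be any hashable object; edges can carry arbitrary data.
      H = nx.Graph()
      H.add_edge(("station", 1), ("station", 2), weight=4.2, line="blue")
      print(H[("station", 1)][("station", 2)]["weight"])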

  10. Py4CAtS - Python tools for line-by-line modelling of infrared atmospheric radiative transfer

    NASA Astrophysics Data System (ADS)

    Schreier, Franz; García, Sebastián Gimeno

    2013-05-01

    Py4CAtS — Python scripts for Computational ATmospheric Spectroscopy is a Python re-implementation of the Fortran infrared radiative transfer code GARLIC, in which compute-intensive code sections utilize the Numeric/Scientific Python modules for highly optimized array processing. The individual steps of an infrared or microwave radiative transfer computation are implemented in separate scripts: extracting lines of relevant molecules in the spectral range of interest, computing line-by-line cross sections for given pressure(s) and temperature(s), combining cross sections into absorption coefficients and optical depths, and integrating along the line of sight to obtain transmission and radiance/intensity. The basic design of the package, numerical and computational aspects relevant for optimization, and a sketch of the typical workflow are presented.
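
    In the simplest homogeneous-path case, the last steps of that workflow reduce to a few array operations. The toy sketch below uses a single Lorentzian line; the line parameters and number density are illustrative, not HITRAN values:

      import numpy as np

      nu = np.linspace(2000.0, 2100.0, 2001)      # wavenumber grid [1/cm]
      nu0, gamma, S = 2050.0, 0.5, 1.0e-20        # line centre, half-width, strength

      # Line-by-line cross section: one pressure-broadened (Lorentzian) line
      xs = S * (gamma / np.pi) / ((nu - nu0) ** 2 + gamma ** 2)  # [cm^2/molecule]

      n_density = 2.5e19                          # molecules/cm^3 (~sea level)
      path_cm = 1.0e5                             # 1 km homogeneous path

      tau = xs * n_density * path_cm              # optical depth
      transmission = np.exp(-tau)                 # Beer-Lambert law
      print(transmission.min())                   # deepest absorption at nu0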

  11. Pharmacophore modeling, docking, and principal component analysis based clustering: combined computer-assisted approaches to identify new inhibitors of the human rhinovirus coat protein.

    PubMed

    Steindl, Theodora M; Crump, Carolyn E; Hayden, Frederick G; Langer, Thierry

    2005-10-06

    The development and application of a sophisticated virtual screening and selection protocol to identify potential, novel inhibitors of the human rhinovirus coat protein employing various computer-assisted strategies are described. A large commercially available database of compounds was screened using a highly selective, structure-based pharmacophore model generated with the program Catalyst. A docking study and a principal component analysis were carried out within the software package Cerius and served to validate and further refine the obtained results. These combined efforts led to the selection of six candidate structures, for which in vitro anti-rhinoviral activity could be shown in a biological assay.

  12. Benchmarking of Computational Models for NDE and SHM of Composites

    NASA Technical Reports Server (NTRS)

    Wheeler, Kevin; Leckey, Cara; Hafiychuk, Vasyl; Juarez, Peter; Timucin, Dogan; Schuet, Stefan; Hafiychuk, Halyna

    2016-01-01

    Ultrasonic wave phenomena constitute the leading physical mechanism for nondestructive evaluation (NDE) and structural health monitoring (SHM) of solid composite materials such as carbon-fiber-reinforced polymer (CFRP) laminates. Computational models of ultrasonic guided-wave excitation, propagation, scattering, and detection in quasi-isotropic laminates can be extremely valuable in designing practically realizable NDE and SHM hardware and software with desired accuracy, reliability, efficiency, and coverage. This paper presents comparisons of guided-wave simulations for CFRP composites implemented using three different simulation codes: two commercial finite-element analysis packages, COMSOL and ABAQUS, and a custom code implementing the Elastodynamic Finite Integration Technique (EFIT). Comparisons are also made to experimental laser Doppler vibrometry data and theoretical dispersion curves.

  13. Investigation of water droplet trajectories within the NASA icing research tunnel

    NASA Technical Reports Server (NTRS)

    Reehorst, Andrew; Ibrahim, Mounir

    1995-01-01

    Water droplet trajectories within the NASA Lewis Research Center's Icing Research Tunnel (IRT) were studied through computer analysis. Of interest was the influence of the wind tunnel contraction and wind tunnel model blockage on the water droplet trajectories. The computer analysis was carried out with a program package consisting of a three-dimensional potential panel code and a three-dimensional droplet trajectory code. The wind tunnel contraction was found to influence the droplet size distribution and liquid water content distribution across the test section from that at the inlet. The wind tunnel walls were found to have negligible influence upon the impingement of water droplets upon a wing model.

  14. Geometric modeling for computer aided design

    NASA Technical Reports Server (NTRS)

    Schwing, James L.

    1992-01-01

    The goal was the design and implementation of software to be used in the conceptual design of aerospace vehicles. Several packages and design studies were completed, including two software tools currently used in the conceptual level design of aerospace vehicles. These tools are the Solid Modeling Aerospace Research Tool (SMART) and the Environment for Software Integration and Execution (EASIE). SMART provides conceptual designers with a rapid prototyping capability and additionally provides initial mass property analysis. EASIE provides a set of interactive utilities that simplify the task of building and executing computer aided design systems consisting of diverse, stand alone analysis codes that result in the streamlining of the exchange of data between programs, reducing errors and improving efficiency.

  15. The Effectiveness of a Computer-Assisted Instruction Package in Supplementing Teaching of Selected Concepts in High School Chemistry: Writing Formulas and Balancing Chemical Equations.

    ERIC Educational Resources Information Center

    Wainwright, Camille L.

    Four classes of high school chemistry students (N=108) were randomly assigned to experimental and control groups to investigate the effectiveness of a computer assisted instruction (CAI) package during a unit on writing/naming of chemical formulas and balancing equations. Students in the experimental group received drill, review, and reinforcement…

  16. Computers in medical education 1: evaluation of a problem-orientated learning package.

    PubMed

    Devitt, P; Palmer, E

    1998-04-01

    A computer-based learning package has been developed, aimed at expanding students' knowledge base as well as improving data-handling abilities and clinical problem-solving skills. The program was evaluated by monitoring its use by students, canvassing users' opinions, and measuring its effectiveness as a learning tool compared to tutorials on the same material. Evaluation was undertaken using three methods: initially, by a questionnaire on computers as a learning tool and the applicability of the content; second, through monitoring by the computer of student use, decisions, and performance; finally, through pre- and post-test assessment of fifth-year students who either used a computer package or attended a tutorial on equivalent material. Most students provided positive comments on the learning material and expressed a willingness to see computer-aided learning (CAL) introduced into the curriculum. Over a 3-month period, 26 modules in the program were used on 1246 occasions. Objective measurement showed a significant gain in knowledge, data handling, and problem-solving skills. Computer-aided learning is a valuable learning resource that deserves better attention in medical education. When used appropriately, the computer can be an effective learning resource, not only for the delivery of knowledge but also to help students develop their problem-solving skills.

  17. Cyberdyn supercomputer - a tool for imaging geodynamic processes

    NASA Astrophysics Data System (ADS)

    Pomeran, Mihai; Manea, Vlad; Besutiu, Lucian; Zlagnean, Luminita

    2014-05-01

    More and more physical processes that develop within the deep interior of our planet, but have significant impact on the Earth's shape and structure, are becoming subject to numerical modelling using high performance computing facilities. Worldwide, an increasing number of research centers make use of such powerful and fast computers to simulate complex phenomena involving fluid dynamics and to gain deeper insight into intricate problems of Earth's evolution. With the CYBERDYN cybernetic infrastructure (CCI), the Solid Earth Dynamics Department in the Institute of Geodynamics of the Romanian Academy boldly steps into the 21st century by entering the research area of computational geodynamics. The project that made this advancement possible was jointly supported by the EU and the Romanian Government through the Structural and Cohesion Funds. It lasted about three years, ending in October 2013. The CCI is a modern high performance Beowulf-type supercomputer (HPCC), combined with a high performance visualization cluster (HPVC) and a GeoWall. The infrastructure is structured around 1344 cores and 3 TB of RAM. The high speed interconnect is provided by a QLogic InfiniBand switch able to transfer up to 40 Gbps. The CCI storage component is a 40 TB Panasas NAS. The operating system is Linux (CentOS). For control and maintenance, the Bright Cluster Manager package is used, and the SGE job scheduler manages the job queues. The CCI has been designed for a theoretical peak performance of up to 11.2 TFlops. Speed tests showed that a high resolution numerical model (256 × 256 × 128 FEM elements) could be resolved at a mean computational speed of one time step per 30 seconds while employing only a fraction (20%) of the computing power. After passing the mandatory tests, the CCI has been involved in numerical modelling of various scenarios related to the tectonic and geodynamic evolution of the East Carpathians, including the Neogene magmatic activity and the intriguing intermediate-depth seismicity within the so-called Vrancea zone. The CFD code used for numerical modelling is CitcomS, a widely employed open source package specifically developed for earth sciences. Several preliminary 3D geodynamic models simulating an assumed subduction or the effect of a mantle plume will be presented and discussed.

  18. PICSiP: new system-in-package technology using a high bandwidth photonic interconnection layer for converged microsystems

    NASA Astrophysics Data System (ADS)

    Tekin, Tolga; Töpper, Michael; Reichl, Herbert

    2009-05-01

    Technological frontiers between semiconductor technology, packaging, and system design are disappearing. Scaling down geometries [1] alone no longer provides improved performance, lower power, smaller size, and lower cost. Achieving these will require "More than Moore" [2] through the tighter integration of system-level components at the package level. System-in-Package (SiP) will deliver the efficient use of three dimensions (3D) through innovation in packaging and interconnect technology. A key bottleneck to the implementation of high-performance microelectronic systems, including SiP, is the lack of low-latency, high-bandwidth, and high-density off-chip interconnects. The challenges in achieving high-bandwidth chip-to-chip communication using electrical interconnects include the high losses in the substrate dielectric, reflections and impedance discontinuities, and susceptibility to crosstalk [3]. The use of photonics to overcome these challenges and provide low-latency, high-bandwidth communication will enable the vision of optical computing within next-generation architectures. Supercomputers of today offer sustained performance of more than a petaflop, which can be increased further by utilizing optical interconnects. Next-generation computing architectures with ultra-low power consumption and ultra-high performance require novel interconnection technologies. In this paper we discuss a CMOS-compatible underlying technology to enable next-generation optical computing architectures: by introducing a new optical layer within the 3D SiP, the development of converged microsystems for next-generation optical computing architectures will be leveraged.

  19. An integrated computational tool for precipitation simulation

    NASA Astrophysics Data System (ADS)

    Cao, W.; Zhang, F.; Chen, S.-L.; Zhang, C.; Chang, Y. A.

    2011-07-01

    Computer-aided materials design is of increasing interest because the conventional approach, relying solely on experimentation, is no longer viable within the constraints of available resources. Modeling of microstructure and mechanical properties during precipitation plays a critical role in understanding the behavior of materials and thus in accelerating their development. Nevertheless, an integrated computational tool coupling reliable thermodynamic calculation, kinetic simulation, and property prediction of multi-component systems for industrial applications is rarely available. In this regard, we are developing a software package, PanPrecipitation, under the framework of integrated computational materials engineering to simulate precipitation kinetics. It is seamlessly integrated with the thermodynamic calculation engine, PanEngine, to obtain accurate thermodynamic properties and atomic mobility data necessary for precipitation simulation.

  20. Making it Easy to Construct Accurate Hydrological Models that Exploit High Performance Computers (Invited)

    NASA Astrophysics Data System (ADS)

    Kees, C. E.; Farthing, M. W.; Terrel, A.; Certik, O.; Seljebotn, D.

    2013-12-01

    This presentation will focus on two barriers to progress in the hydrological modeling community, and on research and development conducted to lessen or eliminate them. The first is a barrier to sharing hydrological models among specialized scientists, caused by intertwining the implementation of numerical methods with the implementation of abstract numerical modeling information. In the Proteus toolkit for computational methods and simulation, we have decoupled these two important parts of a computational model through separate "physics" and "numerics" interfaces. More recently we have begun developing the Strong Form Language for easy and direct representation of the mathematical model formulation in a domain-specific language embedded in Python. The second major barrier is sharing ANY scientific software tools that have complex library or module dependencies, as most parallel, multi-physics hydrological models do. In this setting, users and developers depend on an entire distribution, possibly involving multiple compilers and special instructions depending on the environment of the target machine. To solve these problems we have developed hashdist, a stateless package management tool, and a resulting portable, open source scientific software distribution.
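
    The decoupling idea can be illustrated with a toy sketch (not Proteus's actual classes): the "physics" object only defines the flux of a 1D conservation law, while the "numerics" object advances any such physics with a first-order upwind step.

      import numpy as np

      class Physics:
          """The 'physics' interface: just the flux of a 1D conservation law."""
          def flux(self, u):
              raise NotImplementedError

      class LinearAdvection(Physics):
          def __init__(self, speed=1.0):
              self.speed = speed
          def flux(self, u):
              return self.speed * u

      class UpwindScheme:
          """The 'numerics' interface: a first-order finite-volume upwind step
          that advances ANY Physics object (assumes positive wave speed)."""
          def __init__(self, physics, dx, dt):
              self.physics, self.dx, self.dt = physics, dx, dt
          def step(self, u):
              f = self.physics.flux(u)
              return u - (self.dt / self.dx) * (f - np.roll(f, 1))  # periodic

      x = np.linspace(0.0, 1.0, 200, endpoint=False)
      u = np.exp(-((x - 0.3) ** 2) / 0.002)       # Gaussian pulse
      solver = UpwindScheme(LinearAdvection(speed=1.0), dx=x[1] - x[0], dt=0.002)
      for _ in range(100):                        # swap in any other Physics
          u = solver.step(u)                      # without touching the numerics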

  1. Analysis of the Variable Pressure Growth Chamber using the CASE/A simulation package

    NASA Technical Reports Server (NTRS)

    Mcfadden, Carl D.; Edeen, Marybeth A.

    1992-01-01

    A computer simulation of the Variable Pressure Growth Chamber (VPGC), located at the NASA Johnson Space Center, has been developed using the Computer Aided Systems Engineering and Analysis (CASE/A) package. The model has been used to perform several analyses of the VPGC. The analyses consisted of a study of the effects of a human metabolic load on the VPGC and a study of two new configurations for the temperature and humidity control (THC) subsystem in the VPGC. The objective of the human load analysis was to study the effects of a human metabolic load on the air revitalization and THC subsystems. This included the effects on the quantity of carbon dioxide injected and oxygen removed from the chamber and the effects of the additional sensible and latent heat loads. The objective of the configuration analysis was to compare the two new THC configurations against the current THC configuration to determine which had the best performance.

  2. GENESIS 1.1: A hybrid-parallel molecular dynamics simulator with enhanced sampling algorithms on multiple computational platforms.

    PubMed

    Kobayashi, Chigusa; Jung, Jaewoon; Matsunaga, Yasuhiro; Mori, Takaharu; Ando, Tadashi; Tamura, Koichi; Kamiya, Motoshi; Sugita, Yuji

    2017-09-30

    GENeralized-Ensemble SImulation System (GENESIS) is a software package for molecular dynamics (MD) simulation of biological systems. It is designed to overcome limitations in system size and accessible time scale by adopting highly parallelized schemes and enhanced conformational sampling algorithms. In this new version, GENESIS 1.1, new functions and advanced algorithms have been added. The all-atom and coarse-grained potential energy functions used in the AMBER and GROMACS packages are now available in addition to CHARMM energy functions. The performance of MD simulations has been greatly improved by further optimization, multiple time-step integration, and hybrid (CPU + GPU) computing. The string method and replica-exchange umbrella sampling with flexible collective variable choice are used for finding the minimum free-energy pathway and obtaining free-energy profiles for conformational changes of a macromolecule. These new features increase the usefulness and power of GENESIS for modeling and simulation in biological research. © 2017 Wiley Periodicals, Inc.
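
    At the heart of any replica-exchange scheme is a Metropolis swap test between neighbouring replicas. A minimal temperature-exchange sketch follows (the generic textbook criterion, not GENESIS source code; the temperature ladder and energies are stand-ins):

      import math, random

      KB = 0.0019872041   # Boltzmann constant in kcal/(mol K)

      def accept_swap(E_i, T_i, E_j, T_j):
          """Metropolis criterion for exchanging configurations between two
          temperature replicas: accept with probability min(1, exp(-delta)),
          where delta = (1/kT_i - 1/kT_j) * (E_j - E_i)."""
          delta = (1.0 / (KB * T_i) - 1.0 / (KB * T_j)) * (E_j - E_i)
          return delta <= 0.0 or random.random() < math.exp(-delta)

      # Toy usage: attempt swaps along a temperature ladder.
      temps = [300.0, 320.0, 342.0, 365.0]
      energies = [-1050.0, -1032.0, -1017.0, -998.0]   # stand-in energies
      for i in range(len(temps) - 1):
          if accept_swap(energies[i], temps[i], energies[i + 1], temps[i + 1]):
              energies[i], energies[i + 1] = energies[i + 1], energies[i]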

  3. Increasing Student Retention Through Application of Attitude Change Packages (and) Increasing GPA and Student Retention of Low Income Minority Community College Students Through Application of Nightengale Conant Change Packages; A Pilot STUDY.

    ERIC Educational Resources Information Center

    Preising, Paul P.; Frost, Robert

    The first of two studies reported was conducted to determine whether unemployed aerospace engineers who received computer science training as well as the Nightengale-Conant attitude change packages would have a significantly higher course completion rate than control classes who were given the same training without the attitude change packages.…

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walker, La Tonya Nicole; Malczynski, Leonard A.

    DYNAMO is a computer program for building and running 'continuous' simulation models. It was developed by the Industrial Dynamics Group at the Massachusetts Institute of Technology for simulating dynamic feedback models of business, economic, and social systems. The history of the system dynamics method since 1957 includes many classic models built in DYNAMO. It was not until the late 1980s, when software was built to take advantage of the rise of personal computers and graphical user interfaces, that DYNAMO was supplanted. There is much learning and insight to be gained from examining the DYNAMO models and their accompanying research papers. We believe that it is a worthwhile exercise to convert DYNAMO models to more recent software packages, and we have made an attempt to make it easier to turn these models into a more current system dynamics software language, Powersim Studio, produced by Powersim AS of Bergen, Norway. This guide shows how to convert DYNAMO syntax into Studio syntax.

  5. A 3D visualization and simulation of the individual human jaw.

    PubMed

    Muftić, Osman; Keros, Jadranka; Baksa, Sarajko; Carek, Vlado; Matković, Ivo

    2003-01-01

    A new biomechanical three-dimensional (3D) model of the human mandible, based on a computer-generated virtual model, is proposed. Using maps obtained from special photographs of a real subject's face, it is possible to attribute personality to the virtual character, while computer animation offers movements and characteristics within the confines of the space and time of the virtual world. A simple two-dimensional model of the jaw cannot explain the biomechanics, in which the muscular forces, transmitted through the occlusal and condylar surfaces, are in a state of 3D equilibrium. In the model, all forces are resolved into components according to a selected coordinate system. The muscular forces act on the jaw at the levels necessary for chewing while keeping the mandible in balance, preventing dislocation and loading of non-articular tissues. The work uses a new approach to computer-generated animation of virtual 3D characters (called "Body SABA"), combined in a single object package that is inexpensive and easy to operate.

  6. Skylab earth resources experiment package /EREP/ - Sea surface topography experiment

    NASA Technical Reports Server (NTRS)

    Vonbun, F. O.; Marsh, J. G.; Mcgoogan, J. T.; Leitao, C. D.; Vincent, S.; Wells, W. T.

    1976-01-01

    The S-193 Skylab radar altimeter was operated in a round-the-world pass on Jan. 31, 1974. The main purpose of this experiment was to test and 'measure' the variation of the sea surface topography using the Goddard Space Flight Center (GSFC) geoid model as a reference. This model is based upon 430,000 satellite and 25,000 ground gravity observations. Variations of the sea surface on the order of -40 to +60 m were observed along this pass. The 'computed' and 'measured' sea surfaces have an rms agreement on the order of 7 m. This is quite satisfactory, considering that this was the first time the sea surface has been observed directly over a distance of nearly 35,000 km and compared to a computed model. The Skylab orbit for this global pass was computed using the Goddard Earth Model (GEM 6) and S-band radar tracking data, resulting in an orbital height uncertainty of better than 5 m over one orbital period.

  7. Design of Bioprosthetic Aortic Valves using biaxial test data.

    PubMed

    Dabiri, Y; Paulson, K; Tyberg, J; Ronsky, J; Ali, I; Di Martino, E; Narine, K

    2015-01-01

    Bioprosthetic Aortic Valves (BAVs) do not have the serious limitations of mechanical aortic valves in terms of thrombosis. However, the lifetime of BAVs is too short, often requiring repeated surgeries. The lifetime of BAVs might be improved by using computer simulations of the structural behavior of the leaflets. The goal of this study was to develop a numerical model applicable to the optimization of the durability of BAVs. The constitutive equations were derived from biaxial tensile tests: using a Fung model, stress and strain data were computed from the biaxial test data. SolidWorks was used to develop the geometry of the leaflets, and the ABAQUS finite element software package was used for the finite element calculations. Results showed the model is consistent with experimental observations. Reaction forces computed by the model corresponded with experimental measurements when the biaxial test was simulated, and the locations of maximum stress corresponded to the locations of frequent tearing of BAV leaflets. The results suggest that BAV design can be optimized with respect to durability.
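
    For reference, a common planar Fung-type form fitted to biaxial data can be sketched in a few lines; the strain-energy form below is the standard textbook one, and the coefficients are illustrative, not the study's fitted values:

      import numpy as np

      def fung_stress(E11, E22, c, b1, b2, b3):
          """Second Piola-Kirchhoff stresses from the planar Fung strain energy
          W = (c/2)*(exp(Q) - 1), Q = b1*E11^2 + b2*E22^2 + 2*b3*E11*E22,
          via S11 = dW/dE11 and S22 = dW/dE22."""
          Q = b1 * E11**2 + b2 * E22**2 + 2.0 * b3 * E11 * E22
          S11 = c * np.exp(Q) * (b1 * E11 + b3 * E22)
          S22 = c * np.exp(Q) * (b2 * E22 + b3 * E11)
          return S11, S22

      # Equibiaxial stretch sweep with illustrative coefficients
      lam = np.linspace(1.00, 1.15, 4)
      E = 0.5 * (lam**2 - 1.0)                   # Green-Lagrange strain
      S11, S22 = fung_stress(E, E, c=10.0, b1=1.0, b2=1.5, b3=0.2)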

  8. FREQ: A computational package for multivariable system loop-shaping procedures

    NASA Technical Reports Server (NTRS)

    Giesy, Daniel P.; Armstrong, Ernest S.

    1989-01-01

    Many approaches in the field of linear, multivariable, time-invariant systems analysis and controller synthesis employ loop-shaping procedures wherein design parameters are chosen to shape frequency-response singular value plots of selected transfer matrices. A software package, FREQ, is documented for computing, within one unified framework, many of the most used multivariable transfer matrices for both continuous and discrete systems. The matrices are evaluated at user-selected frequency values, and singular values are plotted against frequency. Example computations are presented to demonstrate the use of the FREQ code.
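
    The quantity being plotted is easy to reproduce. A minimal sketch (a generic state-space example, not the FREQ code itself) evaluates the singular values of G(jw) = C(jwI - A)^-1 B + D over a frequency grid:

      import numpy as np

      def singular_values(A, B, C, D, omegas):
          """Singular values of the transfer matrix G(jw) = C (jwI - A)^-1 B + D
          evaluated over a frequency grid."""
          n = A.shape[0]
          out = []
          for w in omegas:
              G = C @ np.linalg.solve(1j * w * np.eye(n) - A, B) + D
              out.append(np.linalg.svd(G, compute_uv=False))
          return np.array(out)

      # Two-input, two-output continuous-time example
      A = np.array([[-1.0, 0.5],
                    [0.0, -2.0]])
      B = np.eye(2)
      C = np.eye(2)
      D = np.zeros((2, 2))
      omegas = np.logspace(-2, 2, 100)            # rad/s
      sv = singular_values(A, B, C, D, omegas)    # shape (100, 2); plot vs omegas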

  9. The development of an engineering computer graphics laboratory

    NASA Technical Reports Server (NTRS)

    Anderson, D. C.; Garrett, R. E.

    1975-01-01

    Hardware and software systems developed to further research and education in interactive computer graphics were described, as well as several of the ongoing application-oriented projects, educational graphics programs, and graduate research projects. The software system consists of a FORTRAN 4 subroutine package, in conjunction with a PDP 11/40 minicomputer as the primary computation processor and the Imlac PDS-1 as an intelligent display processor. The package comprises a comprehensive set of graphics routines for dynamic, structured two-dimensional display manipulation, and numerous routines to handle a variety of input devices at the Imlac.

  10. NiftySim: A GPU-based nonlinear finite element package for simulation of soft tissue biomechanics.

    PubMed

    Johnsen, Stian F; Taylor, Zeike A; Clarkson, Matthew J; Hipwell, John; Modat, Marc; Eiben, Bjoern; Han, Lianghao; Hu, Yipeng; Mertzanidou, Thomy; Hawkes, David J; Ourselin, Sebastien

    2015-07-01

    NiftySim, an open-source finite element toolkit, has been designed to allow incorporation of high-performance soft tissue simulation capabilities into biomedical applications. The toolkit provides the option of execution on fast graphics processing unit (GPU) hardware, numerous constitutive models and solid-element options, membrane and shell elements, and contact modelling facilities, in a simple-to-use library. The toolkit is founded on the total Lagrangian explicit dynamics (TLED) algorithm, which has been shown to be efficient and accurate for simulation of soft tissues. The base code is written in C++, and GPU execution is achieved using the nVidia CUDA framework. In most cases, interaction with the underlying solvers can be achieved through a single Simulator class, which may be embedded directly in third-party applications such as surgical guidance systems. Advanced capabilities such as contact modelling and nonlinear constitutive models are also provided, as are more experimental technologies like reduced-order modelling. A consistent description of the underlying solution algorithm, its implementation with a focus on GPU execution, and examples of the toolkit's usage in biomedical applications are provided. Efficient mapping of the TLED algorithm to parallel hardware results in very high computational performance, far exceeding that available in commercial packages. The NiftySim toolkit provides high-performance soft tissue simulation capabilities using GPU technology for biomechanical simulation research in medical image computing, surgical simulation, and surgical guidance applications.
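
    The TLED algorithm mentioned above advances nodal displacements with an explicit central-difference rule, which is what makes it map so well to GPUs. The fragment below is a generic central-difference explicit dynamics loop on a one-dimensional chain of nonlinear springs, shown only to illustrate the integration scheme; it is not NiftySim code, and the masses, stiffnesses and damping are invented.

        import numpy as np

        n, dt, steps = 20, 1e-4, 20000
        m = np.full(n, 1e-3)             # lumped nodal masses
        c_damp = 5.0                     # mass-proportional damping
        u_prev = np.zeros(n)
        u = np.zeros(n)
        f_ext = np.zeros(n)
        f_ext[-1] = 0.5                  # constant end load

        def f_int(u):
            # nonlinear springs between successive nodes; node 0 is fixed
            stretch = np.diff(np.concatenate(([0.0], u)))
            force = 100.0*stretch + 1.0e4*stretch**3
            f = -force.copy()            # element pulls back on its right node
            f[:-1] += force[1:]          # and forward on its left node
            return f

        for _ in range(steps):
            vel = (u - u_prev)/dt
            acc = (f_ext + f_int(u))/m - c_damp*vel
            u_prev, u = u, dt*dt*acc + 2.0*u - u_prev   # central difference
        print("tip displacement:", round(float(u[-1]), 4))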

  11. X based interactive computer graphics applications for aerodynamic design and education

    NASA Technical Reports Server (NTRS)

    Benson, Thomas J.; Higgs, C. Fred, III

    1995-01-01

    Six computer applications packages have been developed to solve a variety of aerodynamic problems in an interactive environment on a single workstation. The packages perform classical one-dimensional analysis under the control of a graphical user interface and can be used for preliminary design or educational purposes. The programs were originally developed on a Silicon Graphics workstation and used the GL version of the FORMS library as the graphical user interface. These programs have recently been converted to the XFORMS library of X-based graphics widgets and have been tested on SGI, IBM, Sun, HP and PC-Linux computers. The paper will show results from the new VU-DUCT program as a prime example. VU-DUCT has been developed as an educational package for the study of subsonic open- and closed-loop wind tunnels.

  12. Simulation of the communication system between an AUV group and a surface station

    NASA Astrophysics Data System (ADS)

    Burtovaya, D.; Demin, A.; Demeshko, M.; Moiseev, A.; Kudryashova, A.

    2017-01-01

    An object model for simulating the communications system linking an autonomous underwater vehicle (AUV) group with a surface station is proposed in the paper. The model is implemented on the basis of the software package "Object Distribution Simulation". All structural relationships and behavior details are described. An application was developed on the basis of the proposed model and is now used for computational experiments on the simulation of the communications system between the AUV group and a surface station.

  13. A novel methodology to model the cooling processes of packed horticultural produce using 3D shape models

    NASA Astrophysics Data System (ADS)

    Gruyters, Willem; Verboven, Pieter; Rogge, Seppe; Vanmaercke, Simon; Ramon, Herman; Nicolai, Bart

    2017-10-01

    Freshly harvested horticultural produce requires proper temperature management to maintain its high economic value; to this end, low-temperature storage is of crucial importance for maintaining high product quality. Optimizing both the package design of packed produce and the different steps in the postharvest cold chain can be achieved by numerical modelling of the relevant transport phenomena. This work presents a novel methodology to accurately model both the random filling of produce in a package and the subsequent cooling process. First, a cultivar-specific database of more than 100 realistic CAD models of apple and pear fruit is built with a validated geometrical 3D shape model generator. To represent a realistic picking season, the model generator also takes into account the biological variability of the produce shape. Next, a discrete element model (DEM) randomly chooses surface-meshed bodies from the database to simulate the gravitational filling process of produce in a box or bin, using actual mechanical properties of the fruit. A computational fluid dynamics (CFD) model is then developed with the final stacking arrangement of the produce to study the cooling efficiency of packages under several conditions and configurations. Here, a typical precooling operation is simulated to demonstrate the large differences between using actual 3D shapes of the fruit and an equivalent-spheres approach that simplifies the problem drastically. From this study, it is concluded that using a simplified representation of the actual fruit shape may lead to a severe overestimation of the cooling performance.

  14. BlueSNP: R package for highly scalable genome-wide association studies using Hadoop clusters.

    PubMed

    Huang, Hailiang; Tata, Sandeep; Prill, Robert J

    2013-01-01

    Computational workloads for genome-wide association studies (GWAS) are growing in scale and complexity, outpacing the capabilities of single-threaded software designed for personal computers. The BlueSNP R package implements GWAS statistical tests in the R programming language and executes the calculations across computer clusters configured with Apache Hadoop, a de facto standard framework for distributed data processing using the MapReduce formalism. BlueSNP makes computationally intensive analyses, such as estimating empirical p-values via data permutation and searching for expression quantitative trait loci over thousands of genes, feasible for large genotype-phenotype datasets. http://github.com/ibm-bioinformatics/bluesnp
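
    The permutation-based empirical p-values mentioned above are exactly the kind of embarrassingly parallel computation that MapReduce distributes well, since each permutation is independent. The Python sketch below shows a single-SNP statistic being permuted; BlueSNP itself is an R package on Hadoop, and the correlation statistic and data here are illustrative only.

        import numpy as np

        rng = np.random.default_rng(0)

        def empirical_pvalue(genotype, phenotype, n_perm=1000):
            # Empirical p-value: fraction of permutations whose statistic
            # meets or beats the observed |correlation|.
            obs = abs(np.corrcoef(genotype, phenotype)[0, 1])
            hits = 0
            for _ in range(n_perm):
                perm = rng.permutation(phenotype)
                if abs(np.corrcoef(genotype, perm)[0, 1]) >= obs:
                    hits += 1
            return (hits + 1) / (n_perm + 1)   # add-one avoids p = 0

        genotype = rng.integers(0, 3, size=500)          # 0/1/2 allele counts
        phenotype = 0.2*genotype + rng.normal(size=500)
        print(empirical_pvalue(genotype, phenotype))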

  15. Characterization of open-cycle coal-fired MHD generators. Quarterly technical summary report No. 6, October 1--December 31, 1977. [PACKAGE code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kolb, C.E.; Yousefian, V.; Wormhoudt, J.

    1978-01-30

    Research has included theoretical modeling of important plasma chemical effects such as: conductivity reductions due to condensed slag/electron interactions; conductivity and generator efficiency reductions due to the formation of slag-related negative ion species; and the loss of alkali seed due to chemical combination with condensed slag. A summary of the major conclusions in each of these areas is presented. A major output of the modeling effort has been the development of an MHD plasma chemistry core flow model. This model has been formulated into a computer program designated the PACKAGE code (Plasma Analysis, Chemical Kinetics, And Generator Efficiency). The PACKAGE code is designed to calculate the effect of coal rank, ash percentage, ash composition, air preheat temperatures, equivalence ratio, and various generator channel parameters on the overall efficiency of open-cycle, coal-fired MHD generators. A complete description of the PACKAGE code and a preliminary version of the PACKAGE user's manual are included. A laboratory measurements program involving direct, mass spectrometric sampling of the positive and negative ions formed in a one-atmosphere coal combustion plasma was also completed during the contract's initial phase. The relative ion concentrations formed in a plasma due to the methane-augmented combustion of pulverized Montana Rosebud coal with potassium carbonate seed and preheated air are summarized. Positive ions measured include K⁺, KO⁺, Na⁺, Rb⁺, Cs⁺, and CsO⁺, while negative ions identified include PO₃⁻, PO₂⁻, BO₂⁻, OH⁻, SH⁻, and probably HCrO₃⁻, HMoO₄⁻, and HWO₃⁻. Comparison of the measurements with PACKAGE code predictions is presented. Preliminary design considerations for a mass spectrometric sampling probe capable of characterizing coal combustion plasmas from full-scale combustors and flow trains are presented and discussed.

  16. Community-driven computational biology with Debian Linux.

    PubMed

    Möller, Steffen; Krabbenhöft, Hajo Nils; Tille, Andreas; Paleino, David; Williams, Alan; Wolstencroft, Katy; Goble, Carole; Holland, Richard; Belhachemi, Dominique; Plessy, Charles

    2010-12-21

    The Open Source movement and its technologies are popular in the bioinformatics community because they provide freely available tools and resources for research. In order to feed the steady demand for updates on software and associated data, a service infrastructure is required for sharing and providing these tools to heterogeneous computing environments. The Debian Med initiative provides ready and coherent software packages for medical informatics and bioinformatics. These packages can be used together in Taverna workflows via the UseCase plugin to manage execution on local or remote machines. If such packages are available in cloud computing environments, the underlying hardware and the analysis pipelines can be shared along with the software. Debian Med closes the gap between developers and users. It provides a simple method for offering new releases of software and data resources, thus provisioning a local infrastructure for computational biology. For geographically distributed teams it can ensure they are working on the same versions of tools, in the same conditions. This contributes to the world-wide networking of researchers.

  17. Reconstruction of an Immune Dynamic Model to Simulate the Contrasting Role of Auxin and Cytokinin in Plant Immunity.

    PubMed

    Kaltdorf, Martin; Dandekar, Thomas; Naseem, Muhammad

    2017-01-01

    In order to increase our understanding of biological dependencies in plant immune signaling pathways, the known interactions involved in plant immune networks are modeled. This allows computational analysis to predict the functions of growth-related hormones in plant-pathogen interaction. The SQUAD (Standardized Qualitative Dynamical Systems) algorithm first determines stable system states in the network and then uses them to compute continuous dynamical system states. Our reconstructed Boolean model, encompassing hormone immune networks of Arabidopsis thaliana (Arabidopsis) and pathogenicity factors injected by the model pathogen Pseudomonas syringae pv. tomato DC3000 (Pst DC3000), can be exploited to determine the impact of growth hormones on plant immunity. We describe a detailed working protocol for using the modified SQUAD package, exemplified by the contrasting effects of auxin and cytokinins in shaping plant-pathogen interaction.
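
    The first SQUAD step, finding the stable states of the Boolean network, can be sketched compactly. The toy three-node network below (auxin, cytokinin, immunity, with invented update rules) is purely hypothetical and far smaller than the authors' Arabidopsis model; it only illustrates the exhaustive stable-state search.

        import itertools

        rules = {
            "auxin":     lambda s: s["auxin"] and not s["cytokinin"],
            "cytokinin": lambda s: s["cytokinin"] and not s["auxin"],
            "immunity":  lambda s: s["cytokinin"] and not s["auxin"],
        }

        nodes = list(rules)
        stable = []
        for values in itertools.product([False, True], repeat=len(nodes)):
            state = dict(zip(nodes, values))
            update = {n: rules[n](state) for n in nodes}
            if update == state:    # synchronous update leaves state unchanged
                stable.append(state)
        print(stable)              # the network's point attractors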

  18. Fast randomization of large genomic datasets while preserving alteration counts.

    PubMed

    Gobbi, Andrea; Iorio, Francesco; Dawson, Kevin J; Wedge, David C; Tamborero, David; Alexandrov, Ludmil B; Lopez-Bigas, Nuria; Garnett, Mathew J; Jurman, Giuseppe; Saez-Rodriguez, Julio

    2014-09-01

    Studying combinatorial patterns in cancer genomic datasets has recently emerged as a tool for identifying novel cancer driver networks. Approaches have been devised to quantify, for example, the tendency of a set of genes to be mutated in a 'mutually exclusive' manner. The significance of the proposed metrics is usually evaluated by computing P-values under appropriate null models. To this end, a Monte Carlo method (the switching-algorithm) is used to sample simulated datasets under a null model that preserves patient- and gene-wise mutation rates. In this method, a genomic dataset is represented as a bipartite network, to which Markov chain updates (switching-steps) are applied. These steps modify the network topology, and a minimal number of them must be executed to draw simulated datasets independently under the null model. This number has previously been deduced empirically to be a linear function of the total number of variants, making this process computationally expensive. We present a novel approximate lower bound for the number of switching-steps, derived analytically. Additionally, we have developed the R package BiRewire, including new efficient implementations of the switching-algorithm. We illustrate the performance of BiRewire by applying it to large real cancer genomics datasets, and report vast reductions in running time, with respect to existing implementations and bounds, for equivalent P-value computations. Thus, we propose BiRewire for studying statistical properties in genomic datasets, and in other data that can be modeled as bipartite networks. BiRewire is available on BioConductor at http://www.bioconductor.org/packages/2.13/bioc/html/BiRewire.html. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.
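
    A single switching-step is simple: pick two patient-gene edges and swap their endpoints, rejecting the move if it would create a duplicate edge. The Python sketch below illustrates the Markov chain update that BiRewire implements efficiently; the data are toy, and this shows the general technique rather than the package's API.

        import random

        def switching_step(edges, edge_set):
            # Swap (a,b),(c,d) -> (a,d),(c,b); all degrees are preserved.
            i, j = random.sample(range(len(edges)), 2)
            (a, b), (c, d) = edges[i], edges[j]
            if b == d or (a, d) in edge_set or (c, b) in edge_set:
                return False                   # rejected: duplicate edge
            edge_set -= {(a, b), (c, d)}
            edge_set |= {(a, d), (c, b)}
            edges[i], edges[j] = (a, d), (c, b)
            return True

        # Toy mutation matrix as patient-gene edges.
        edges = [("p1", "g1"), ("p1", "g2"), ("p2", "g2"), ("p3", "g3")]
        edge_set = set(edges)
        random.seed(1)
        accepted = sum(switching_step(edges, edge_set) for _ in range(1000))
        print(accepted, edges)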

  19. Computational solution verification and validation applied to a thermal model of a ruggedized instrumentation package

    DOE PAGES

    Scott, Sarah Nicole; Templeton, Jeremy Alan; Hough, Patricia Diane; ...

    2014-01-01

    This study details a methodology for quantification of errors and uncertainties of a finite element heat transfer model applied to a Ruggedized Instrumentation Package (RIP). The proposed verification and validation (V&V) process includes solution verification to examine errors associated with the code's solution techniques, and model validation to assess the model's predictive capability for quantities of interest. The model was subjected to mesh resolution and numerical parameter sensitivity studies to determine reasonable parameter values and to understand how they change the overall model response and performance criteria. To facilitate quantification of the uncertainty associated with the mesh, automatic meshing and mesh refining/coarsening algorithms were created and implemented on the complex geometry of the RIP. Automated software to vary model inputs was also developed to determine the solution's sensitivity to numerical and physical parameters. The model was compared with an experiment to demonstrate its accuracy and determine the importance of both modelled and unmodelled physics in quantifying the results' uncertainty. An emphasis is placed on automating the V&V process to enable uncertainty quantification within tight development schedules.

  20. MGtoolkit: A python package for implementing metagraphs

    NASA Astrophysics Data System (ADS)

    Ranathunga, D.; Nguyen, H.; Roughan, M.

    In this paper we present MGtoolkit: an open-source Python package for implementing metagraphs - a first of its kind. Metagraphs are commonly used to specify and analyse business and computer-network policies alike. MGtoolkit can help verify such policies and promotes learning and experimentation with metagraphs. The package currently provides purely textual output for visualising metagraphs and their analysis results.
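
    For readers unfamiliar with the structure, a metagraph edge connects a set of elements (the invertex) to another set (the outvertex), and policy questions become set-reachability questions. The tiny Python class below sketches that idea under our own naming; it is not MGtoolkit's API.

        class Metagraph:
            def __init__(self):
                self.edges = []    # list of (invertex, outvertex) pairs

            def add_edge(self, invertex, outvertex):
                self.edges.append((frozenset(invertex), frozenset(outvertex)))

            def reachable(self, source):
                # Fixed-point closure: fire every edge whose invertex
                # is already covered by the known elements.
                known = set(source)
                changed = True
                while changed:
                    changed = False
                    for inv, outv in self.edges:
                        if inv <= known and not outv <= known:
                            known |= outv
                            changed = True
                return known

        mg = Metagraph()
        mg.add_edge({"user", "password"}, {"session"})
        mg.add_edge({"session"}, {"file_access"})
        print(mg.reachable({"user", "password"}))   # includes 'file_access'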

  1. General-Purpose Ada Software Packages

    NASA Technical Reports Server (NTRS)

    Klumpp, Allan R.

    1991-01-01

    Collection of subprograms brings to Ada many features from other programming languages. All generic packages designed to be easily instantiated for types declared in user's facility. Most packages have widespread applicability, although some oriented for avionics applications. All designed to facilitate writing new software in Ada. Written on IBM/AT personal computer running under PC DOS, v.3.1.

  2. CMIP: a software package capable of reconstructing genome-wide regulatory networks using gene expression data.

    PubMed

    Zheng, Guangyong; Xu, Yaochen; Zhang, Xiujun; Liu, Zhi-Ping; Wang, Zhuo; Chen, Luonan; Zhu, Xin-Guang

    2016-12-23

    A gene regulatory network (GRN) represents interactions of genes inside a cell or tissue, in which vertexes and edges stand for genes and their regulatory interactions, respectively. Reconstruction of gene regulatory networks, in particular genome-scale networks, is essential for comparative exploration of different species and mechanistic investigation of biological processes. Currently, most network inference methods are computationally intensive; they are usually effective for small-scale tasks (e.g., networks with a few hundred genes) but have difficulty constructing GRNs at genome scale. Here, we present a software package for gene regulatory network reconstruction at a genomic level, in which gene interaction is measured by the conditional mutual information measurement using a parallel computing framework (hence the package is named CMIP). The package is a greatly improved implementation of our previous PCA-CMI algorithm. In CMIP, we provide not only an automatic threshold determination method but also an effective parallel computing framework for network inference. Performance tests on benchmark datasets show that the accuracy of CMIP is comparable to most current network inference methods. Moreover, running tests on synthetic datasets demonstrate that CMIP can handle large datasets, especially genome-wide datasets, within an acceptable time period. In addition, successful application to a real genomic dataset confirms the practical applicability of the package. This new software package provides a powerful tool for genomic network reconstruction to the biological community. The software can be accessed at http://www.picb.ac.cn/CMIP/ .
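
    The conditional mutual information at the heart of the PCA-CMI family has a closed form for jointly Gaussian variables, built from determinants of covariance submatrices. The Python sketch below shows that computation for illustration only; CMIP's own implementation, thresholding and parallelization are not reproduced here.

        import numpy as np

        def gaussian_cmi(data, x, y, z=()):
            # CMI(X;Y|Z) under a Gaussian assumption:
            # 0.5 * log(|C_xz| * |C_yz| / (|C_z| * |C_xyz|))
            def det_cov(cols):
                if not cols:
                    return 1.0
                c = np.atleast_2d(np.cov(data[:, cols], rowvar=False))
                return float(np.linalg.det(c))
            z = list(z)
            return 0.5*np.log(det_cov([x] + z)*det_cov([y] + z)
                              / (det_cov(z)*det_cov([x, y] + z)))

        rng = np.random.default_rng(0)
        w = rng.normal(size=2000)
        x = w + 0.3*rng.normal(size=2000)   # x and y depend only on w
        y = w + 0.3*rng.normal(size=2000)
        d = np.column_stack([x, y, w])
        print(gaussian_cmi(d, 0, 1))        # large: marginal dependence
        print(gaussian_cmi(d, 0, 1, [2]))   # near zero once w is given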

  3. Overview of NASA Langley's Piezoelectric Ceramic Packaging Technology and Applications

    NASA Technical Reports Server (NTRS)

    Bryant, Robert G.

    2007-01-01

    Over the past decade, NASA Langley Research Center (LaRC) has developed several actuator packaging concepts designed to enhance the performance of commercial electroactive ceramics. NASA LaRC focused on properly designed actuator and sensor packaging for the following reasons: increased durability, protection of the working material from the environment, proper mechanical and electrical contact, "ready to use" mechanisms that are scalable, and fabrication methodology applicable to any active material of the same physical class. It is more cost effective to enhance or tailor the performance of existing systems, through innovative packaging, than to develop, test and manufacture new materials. This approach led to the development of several solid-state actuators that include THUNDER, the Macrofiber Composite (MFC), and the Radial Field Diaphragm (RFD). All these actuators are fabricated using standard materials and processes derived from earlier concepts. NASA's fabrication and packaging technology has yielded piezoelectric actuators and sensors that are easy to implement, reliable, consistent in properties, and of lower cost to manufacture in quantity than their predecessors (as evidenced by their continued commercial availability). These piezoelectric actuators have helped foster new research and development in areas involving computational modeling, actuator-specific refinements, and engineering system redesign, which has led to new applications for piezo-based devices that replace traditional systems currently in use.

  4. Impact of stent mis-sizing and mis-positioning on coronary fluid wall shear and intramural stress

    PubMed Central

    Chen, Henry Y.; Koo, Bon-Kwon; Bhatt, Deepak L.

    2013-01-01

    Stent deployments with geographical miss (GM) are associated with increased risk of target-vessel revascularization and periprocedural myocardial infarction. The aim of the current study was to investigate the underlying biomechanical mechanisms for adverse events with GM. The hypothesis is that stent deployment with GM [longitudinal GM, or LGM (i.e., stent not centered on the lesion); or radial GM, RGM (i.e., stent oversizing)] results in unfavorable fluid wall shear stress (WSS), WSS gradient (WSSG), oscillatory shear index (OSI), and intramural circumferential wall stress (CWS). Three-dimensional computational models of stents and plaque were created using a computer-assisted design package. The models were then solved with validated finite element and computational fluid dynamic packages. The dynamic process of large-deformation stent deployment was modeled to expand the stent to the desired vessel size. Stents deployed with GM resulted in a 45% increase in vessel CWS compared with stents that were centered and fully covered the lesion. A 20% oversized stent resulted in 72% higher CWS than a correctly sized stent. The linkages between the struts had much higher stress than the main struts (i.e., 180 MPa vs. 80 MPa). Additionally, LGM and RGM reduced endothelial WSS and increased WSSG and OSI. The simulations suggest that both LGM and RGM adversely reduce WSS but increase WSSG, OSI, and CWS. These findings highlight the potential mechanical mechanism of the higher adverse event rates and underscore the importance of stent positioning and sizing for improved clinical outcome. PMID:23722708

  5. [Intranarcotic infusion therapy -- a computer interpretation using the program package SPSS (Statistical Package for the Social Sciences)].

    PubMed

    Link, J; Pachaly, J

    1975-08-01

    In a retrospective 18-month study, the infusion therapy applied at a large anesthesia institution was examined. The data on the course of anesthesia, routinely recorded on magnetic tape, were analysed for this purpose by computer with the statistical program package SPSS. It could be shown that the practices of the individual anesthetists differ considerably. Various correlations are discussed.

  6. 1986 Petroleum Software Directory. [800 mini, micro and mainframe computer software packages

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1985-01-01

    Pennwell's 1986 Petroleum Software Directory is a complete listing of software created specifically for the petroleum industry. Details are provided on over 800 mini, micro and mainframe computer software packages from more than 250 different companies. An accountant can locate programs to automate bookkeeping functions in large oil and gas production firms. A pipeline engineer will find programs designed to calculate line flow and wellbore pressure drop.

  7. Mixing, Combustion, and Other Interface Dominated Flows; Paragraphs 3.2.1 A, B, C and 3.2.2 A

    DTIC Science & Technology

    2014-04-09

    [Garbled report extract; only fragments are recoverable: a citation to H. Lim, Y. Yu, J. Glimm, X. L. Li, and D. H. Sharp, "Subgrid Models for Mass and Thermal Diffusion...", Condensed Matter Physics (12/2010): 43401; notes on limitations (2D models only; serial codes running on single processors); and mention of the external parallel packages TAO and Global Arrays, developed within DOE high-performance computing initiatives, used in a Schwartz-type overlapping domain decomposition.]

  8. TEMPEST/N33.5. Computational Fluid Dynamics Package For Incompressible, 3D, Time Dependent Pro

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trent, Dr.D.S.; Eyler, Dr.L.L.

    TEMPEST/N33.5 provides numerical solutions to general incompressible flow problems with coupled heat transfer in fluids and solids. Turbulence is modeled with a k-epsilon model, and gas, liquid or solid constituents may be included with the bulk flow. Problems may be modeled in Cartesian or cylindrical coordinates. Limitations include incompressible flow, the Boussinesq approximation, and passive constituents. No direct steady-state solution is available; steady state is obtained as the limit of a transient.

  9. DISTING: A web application for fast algorithmic computation of alternative indistinguishable linear compartmental models.

    PubMed

    Davidson, Natalie R; Godfrey, Keith R; Alquaddoomi, Faisal; Nola, David; DiStefano, Joseph J

    2017-05-01

    We describe and illustrate the use of DISTING, a novel web application for computing alternative structurally identifiable linear compartmental models that are input-output indistinguishable from a postulated linear compartmental model. Several computer packages are available for analysing the structural identifiability of such models, but DISTING is the first to be made available for assessing indistinguishability. The computational algorithms embedded in DISTING are based on advanced versions of established geometric and algebraic properties of linear compartmental models, presented through a user-friendly graphical model interface. Novel computational tools greatly speed up the overall procedure. These include algorithms for Jacobian matrix reduction, submatrix rank reduction, and parallelization of candidate rank computations in symbolic matrix analysis. The application of DISTING to three postulated models with respectively two, three and four compartments is given. The two-compartment example is used to illustrate the indistinguishability problem; the original (unidentifiable) model is found to have two structurally identifiable models that are indistinguishable from it. The three-compartment example has three structurally identifiable indistinguishable models. It is found from DISTING that the four-compartment example has five structurally identifiable models indistinguishable from the original postulated model. This example shows that care is needed when dealing with models that have two or more compartments which are neither perturbed nor observed, because the numbering of these compartments may be arbitrary. DISTING is universally and freely available via the Internet. It is easy to use and circumvents tedious and complicated algebraic analysis previously done by hand. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Scilab software package for the study of dynamical systems

    NASA Astrophysics Data System (ADS)

    Bordeianu, C. C.; Beşliu, C.; Jipa, Al.; Felea, D.; Grossu, I. V.

    2008-05-01

    This work presents a new software package for the study of chaotic flows and maps. The codes were written using Scilab, a software package for numerical computations providing a powerful open computing environment for engineering and scientific applications. Scilab provides various functions for ordinary differential equation solving, Fast Fourier Transform, autocorrelation, and excellent 2D and 3D graphical capabilities. The chaotic behavior of the nonlinear dynamical systems was analyzed using phase-space maps, autocorrelation functions, power spectra, Lyapunov exponents and Kolmogorov-Sinai entropy. Various well-known examples are implemented, and users may insert their own ODEs. Program summary: Program title: Chaos. Catalogue identifier: AEAP_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEAP_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 885. No. of bytes in distributed program, including test data, etc.: 5925. Distribution format: tar.gz. Programming language: Scilab 3.1.1. Computer: PC-compatible running Scilab on MS Windows or Linux. Operating system: Windows XP, Linux. RAM: below 100 Megabytes. Classification: 6.2. Nature of problem: any physical model containing linear or nonlinear ordinary differential equations (ODEs). Solution method: numerical solving of ordinary differential equations; the chaotic behavior of the nonlinear dynamical system is analyzed using Poincaré sections, phase-space maps, autocorrelation functions, power spectra, Lyapunov exponents and Kolmogorov-Sinai entropies. Restrictions: the package routines are normally able to handle ODE systems of high order (up to order twelve and possibly higher), depending on the nature of the problem. Running time: 10 to 20 seconds for problems that do not involve Lyapunov exponent calculation; 60 to 1000 seconds for problems that involve high-order ODEs and Lyapunov exponent calculation.
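
    Of the diagnostics listed (Lyapunov exponents in particular), the simplest worked example is the logistic map, where the largest Lyapunov exponent is the orbit average of log|f'(x)|. The sketch below is Python rather than Scilab and is independent of the package; it merely illustrates the quantity the package computes for its example systems.

        import numpy as np

        def logistic_lyapunov(r, x0=0.3, n_transient=1000, n_iter=100000):
            # Largest Lyapunov exponent of x -> r*x*(1-x):
            # average of log|f'(x)| = log|r*(1-2x)| along the orbit.
            x = x0
            for _ in range(n_transient):       # discard the transient
                x = r*x*(1.0 - x)
            total = 0.0
            for _ in range(n_iter):
                x = r*x*(1.0 - x)
                total += np.log(abs(r*(1.0 - 2.0*x)))
            return total / n_iter

        print(logistic_lyapunov(3.2))   # negative: periodic regime
        print(logistic_lyapunov(4.0))   # close to ln 2: fully chaotic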

  11. [Not Available].

    PubMed

    Pecevski, Dejan; Natschläger, Thomas; Schuch, Klaus

    2009-01-01

    The Parallel Circuit SIMulator (PCSIM) is a software package for simulation of neural circuits. It is primarily designed for distributed simulation of large scale networks of spiking point neurons. Although its computational core is written in C++, PCSIM's primary interface is implemented in the Python programming language, which is a powerful programming environment and allows the user to easily integrate the neural circuit simulator with data analysis and visualization tools to manage the full neural modeling life cycle. The main focus of this paper is to describe PCSIM's full integration into Python and the benefits thereof. In particular we will investigate how the automatically generated bidirectional interface and PCSIM's object-oriented modular framework enable the user to adopt a hybrid modeling approach: using and extending PCSIM's functionality either employing pure Python or C++ and thus combining the advantages of both worlds. Furthermore, we describe several supplementary PCSIM packages written in pure Python and tailored towards setting up and analyzing neural simulations.

  12. IN-PACKAGE CHEMISTRY ABSTRACTION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    E. Thomas

    2005-07-14

    This report was developed in accordance with the requirements in "Technical Work Plan for Postclosure Waste Form Modeling" (BSC 2005 [DIRS 173246]). The purpose of the in-package chemistry model is to predict the bulk chemistry inside of a breached waste package and to provide simplified expressions of that chemistry as a function of time after breach to Total Systems Performance Assessment for the License Application (TSPA-LA). The scope of this report is to describe the development and validation of the in-package chemistry model. The in-package model is a combination of two models: a batch reactor model, which uses the EQ3/6 geochemistry-modeling tool, and a surface complexation model, which is applied to the results of the batch reactor model. The batch reactor model considers chemical interactions of water with the waste package materials and the waste form for commercial spent nuclear fuel (CSNF) waste packages and codisposed (CDSP) waste packages containing high-level waste glass (HLWG) and DOE spent fuel. The surface complexation model includes the impact of fluid-surface interactions (i.e., surface complexation) on the resulting fluid composition. The model examines two types of water influx: (1) the condensation of water vapor diffusing into the waste package, and (2) seepage water entering the waste package as a liquid from the drift. (1) Vapor-Influx Case: the condensation of vapor onto the waste package internals is simulated as pure H₂O entering at a rate determined by the water vapor pressure for representative temperature and relative humidity conditions. (2) Liquid-Influx Case: the water entering a waste package from the drift is simulated as typical groundwater entering at a rate determined by the amount of seepage available to flow through openings in a breached waste package.

  13. Image-Based Modeling Techniques for Architectural Heritage 3d Digitalization: Limits and Potentialities

    NASA Astrophysics Data System (ADS)

    Santagati, C.; Inzerillo, L.; Di Paola, F.

    2013-07-01

    3D reconstruction from images has undergone a revolution in the last few years. Computer vision techniques use photographs from a data set collection to rapidly build detailed 3D models. The simultaneous application of different algorithms (MVS) and the different techniques of image matching, feature extraction and mesh optimization form an active field of research in computer vision. The results are promising: the obtained models are beginning to challenge the precision of laser-based reconstructions. Among all the possibilities we can mainly distinguish desktop and web-based packages. The latter offer the opportunity to exploit the power of cloud computing in order to carry out semi-automatic data processing, thus allowing users to fulfill other tasks on their computers, whereas desktop systems demand long processing times and heavyweight workflows. Computer vision researchers have explored many applications to verify the visual accuracy of 3D models, but approaches to verifying metric accuracy are few, and none addresses Autodesk 123D Catch applied to architectural heritage documentation. Our approach to this challenging problem is to compare 3D models produced by Autodesk 123D Catch with 3D models produced by terrestrial LIDAR, considering different object sizes, from details (capitals, moldings, bases) to large-scale buildings, for practitioner purposes.

  14. A Component Approach to Collaborative Scientific Software Development: Tools and Techniques Utilized by the Quantum Chemistry Science Application Partnership

    DOE PAGES

    Kenny, Joseph P.; Janssen, Curtis L.; Gordon, Mark S.; ...

    2008-01-01

    Cutting-edge scientific computing software is complex, increasingly involving the coupling of multiple packages to combine advanced algorithms or simulations at multiple physical scales. Component-based software engineering (CBSE) has been advanced as a technique for managing this complexity, and complex component applications have been created in the quantum chemistry domain, as well as several other simulation areas, using the component model advocated by the Common Component Architecture (CCA) Forum. While programming models do indeed enable sound software engineering practices, the selection of programming model is just one building block in a comprehensive approach to large-scale collaborative development, which must also address interface and data standardization, and language and package interoperability. We provide an overview of the development approach utilized within the Quantum Chemistry Science Application Partnership, identifying design challenges, describing the techniques which we have adopted to address these challenges, and highlighting the advantages which the CCA approach offers for collaborative development.

  15. Computer software tool REALM for sustainable water allocation and management.

    PubMed

    Perera, B J C; James, B; Kularathna, M D U

    2005-12-01

    REALM (REsource ALlocation Model) is a generalised computer simulation package that models harvesting and bulk distribution of water resources within a water supply system. It is a modelling tool that can be applied to develop specific water allocation models. Like other water resource simulation software tools, REALM uses mass-balance accounting at nodes, while the movement of water within carriers is subject to capacity constraints. It uses a fast network linear programming algorithm to optimise the water allocation within the network during each simulation time step, in accordance with user-defined operating rules. This paper describes the main features of REALM and provides potential users with an appreciation of its capabilities. In particular, it describes two case studies covering major urban and rural water supply systems. These case studies illustrate REALM's capabilities in the use of stochastically generated data in water supply planning and management, modelling of environmental flows, and assessment of security-of-supply issues.
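
    The per-time-step allocation REALM performs is, at its core, a minimum-cost network flow problem. A heavily simplified Python sketch (REALM itself is a standalone package; the reservoir, carrier capacities, and shortfall penalties below are invented) shows the idea of penalizing unmet demand inside a linear program:

        import numpy as np
        from scipy.optimize import linprog

        demand = np.array([40.0, 25.0])
        inflow = 50.0
        # Decision variables: flows q1, q2 and shortfalls s1, s2;
        # only shortfalls carry a cost.
        cost = [0.0, 0.0, 10.0, 10.0]
        A_ub = [[1.0, 1.0, 0.0, 0.0]]          # q1 + q2 <= reservoir release
        b_ub = [inflow]
        A_eq = [[1.0, 0.0, 1.0, 0.0],          # q_i + s_i = demand_i
                [0.0, 1.0, 0.0, 1.0]]
        b_eq = demand
        bounds = [(0, 35.0), (0, 30.0), (0, None), (0, None)]  # carrier caps

        res = linprog(cost, A_ub=A_ub, b_ub=b_ub,
                      A_eq=A_eq, b_eq=b_eq, bounds=bounds)
        print("flows:", res.x[:2], "shortfalls:", res.x[2:])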

  16. A computer program (MODFLOWP) for estimating parameters of a transient, three-dimensional ground-water flow model using nonlinear regression

    USGS Publications Warehouse

    Hill, Mary Catherine

    1992-01-01

    This report documents a new version of the U.S. Geological Survey modular, three-dimensional, finite-difference, ground-water flow model (MODFLOW) which, with the new Parameter-Estimation Package that also is documented in this report, can be used to estimate parameters by nonlinear regression. The new version of MODFLOW is called MODFLOWP (pronounced MOD-FLOW-P), and functions nearly identically to MODFLOW when the Parameter-Estimation Package is not used. Parameters are estimated by minimizing a weighted least-squares objective function by the modified Gauss-Newton method or by a conjugate-direction method. Parameters used to calculate the following MODFLOW model inputs can be estimated: transmissivity and storage coefficient of confined layers; hydraulic conductivity and specific yield of unconfined layers; vertical leakance; vertical anisotropy (used to calculate vertical leakance); horizontal anisotropy; hydraulic conductance of the River, Streamflow-Routing, General-Head Boundary, and Drain Packages; areal recharge rates; maximum evapotranspiration; pumpage rates; and the hydraulic head at constant-head boundaries. Any spatial variation in parameters can be defined by the user. Data used to estimate parameters can include existing independent estimates of parameter values, observed hydraulic heads or temporal changes in hydraulic heads, and observed gains and losses along head-dependent boundaries (such as streams). Model output includes statistics for analyzing the parameter estimates and the model; these statistics can be used to quantify the reliability of the resulting model, to suggest changes in model construction, and to compare results of models constructed in different ways.
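
    The modified Gauss-Newton iteration named above solves a weighted least-squares normal system at each step, optionally damping the step. A compact Python sketch (a toy exponential 'head decay' model with invented data, not MODFLOWP's Fortran) is:

        import numpy as np

        def gauss_newton(residual, jacobian, theta, weights, n_iter=20):
            # Minimize sum_i w_i * r_i(theta)^2 with damped Gauss-Newton.
            W = np.diag(weights)
            for _ in range(n_iter):
                r = residual(theta)
                J = jacobian(theta)
                step = np.linalg.solve(J.T @ W @ J, -J.T @ W @ r)
                t = 1.0
                while t > 1e-4 and ((weights*residual(theta + t*step)**2).sum()
                                    > (weights*r**2).sum()):
                    t *= 0.5               # halve the step if it overshoots
                theta = theta + t*step
            return theta

        times = np.linspace(0.0, 10.0, 15)
        obs = 5.0*np.exp(-0.3*times)       # synthetic observations
        def residual(th): return th[0]*np.exp(-th[1]*times) - obs
        def jacobian(th):
            return np.column_stack([np.exp(-th[1]*times),
                                    -th[0]*times*np.exp(-th[1]*times)])
        print(gauss_newton(residual, jacobian, np.array([1.0, 1.0]),
                           np.ones_like(times)))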

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grant, Robert

    Under this grant, three significant software packages were developed or improved, all with the goal of improving the ease of use of HPC libraries. The first component is a Python package, named DistArray (originally named Odin), that provides a high-level interface to distributed array computing. This interface is based on the popular and widely used NumPy package and is integrated with the IPython project for enhanced interactive parallel distributed computing. The second Python package is the Distributed Array Protocol (DAP), which enables separate distributed array libraries to share arrays efficiently without copying or sending messages. If a distributed array library supports the DAP, it is then automatically able to communicate with any other library that also supports the protocol. This protocol allows DistArray to communicate with the Trilinos library via PyTrilinos, which was also enhanced during this project. A third package, PyTrilinos, was extended to support distributed structured arrays (in addition to the unstructured arrays of its original design), allow more flexible distributed arrays (i.e., the restriction to double-precision data was lifted), and implement the DAP. DAP support includes both exporting the protocol so that external packages can use distributed Trilinos data structures, and importing the protocol so that PyTrilinos can work with distributed data from external packages.

  18. Description of the NCAR Community Climate Model (CCM3). Technical note

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kiehl, J.T.; Hack, J.J.; Bonan, G.B.

    This report presents the details of the governing equations, physical parameterizations, and numerical algorithms defining the version of the NCAR Community Climate Model designated CCM3. The material provides an overview of the major model components and the way in which they interact as the numerical integration proceeds. This version of the CCM incorporates significant improvements to the physics package, new capabilities such as the incorporation of a slab ocean component, and a number of enhancements to the implementation (e.g., the ability to integrate the model on parallel distributed-memory computational platforms).

  19. AGAMA: Action-based galaxy modeling framework

    NASA Astrophysics Data System (ADS)

    Vasiliev, Eugene

    2018-05-01

    The AGAMA library models galaxies. It computes gravitational potential and forces, performs orbit integration and analysis, and can convert between position/velocity and action/angle coordinates. It offers a framework for finding best-fit parameters of a model from data and self-consistent multi-component galaxy models, and contains useful auxiliary utilities such as various mathematical routines. The core of the library is written in C++, and there are Python and Fortran interfaces. AGAMA may be used as a plugin for the stellar-dynamical software packages galpy (ascl:1411.008), AMUSE (ascl:1107.007), and NEMO (ascl:1010.051).

  20. Sample size calculations for case-control studies

    Cancer.gov

    This R package can be used to calculate the required sample size for unconditional multivariate analyses of unmatched case-control studies. The sample sizes are for a scalar exposure effect, such as binary, ordinal or continuous exposures. The sample sizes can also be computed for scalar interaction effects. The analyses account for the effects of potential confounder variables that are also included in the multivariate logistic model.
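
    For orientation, the classical two-proportion calculation below is a much simpler special case than the package's multivariate logistic setting; it is shown only to make the underlying idea concrete, with illustrative prevalences.

        from math import sqrt
        from scipy.stats import norm

        def n_per_group(p1, p0, alpha=0.05, power=0.80):
            # Cases vs controls with a binary exposure: sample size per
            # group to detect prevalence p1 vs p0 at the given power.
            za = norm.ppf(1.0 - alpha/2.0)
            zb = norm.ppf(power)
            pbar = (p1 + p0)/2.0
            num = (za*sqrt(2.0*pbar*(1.0 - pbar))
                   + zb*sqrt(p1*(1.0 - p1) + p0*(1.0 - p0)))**2
            return num / (p1 - p0)**2

        # exposure prevalence 30% in controls vs 40% in cases
        print(round(n_per_group(0.40, 0.30)))   # roughly 356 per group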

  1. Stan : A Probabilistic Programming Language

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carpenter, Bob; Gelman, Andrew; Hoffman, Matthew D.

    Stan is a probabilistic programming language for specifying statistical models. A Stan program imperatively defines a log probability function over parameters conditioned on specified data and constants. As of version 2.14.0, Stan provides full Bayesian inference for continuous-variable models through Markov chain Monte Carlo methods such as the No-U-Turn sampler, an adaptive form of Hamiltonian Monte Carlo sampling. Penalized maximum likelihood estimates are calculated using optimization methods such as the limited-memory Broyden-Fletcher-Goldfarb-Shanno algorithm. Stan is also a platform for computing log densities and their gradients and Hessians, which can be used in alternative algorithms such as variational Bayes, expectation propagation, and marginal inference using approximate integration. To this end, Stan is set up so that the densities, gradients, and Hessians, along with intermediate quantities of the algorithm such as acceptance probabilities, are easily accessible. Stan can also be called from the command line using the cmdstan package, through R using the rstan package, and through Python using the pystan package. All three interfaces support sampling and optimization-based inference with diagnostics and posterior analysis. rstan and pystan also provide access to log probabilities, gradients, Hessians, parameter transforms, and specialized plotting.
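
    Since the record itself notes the pystan interface, a minimal end-to-end use might look like the following (pystan 2.x API assumed; the tiny normal model is illustrative):

        import pystan

        code = """
        data { int<lower=0> N; vector[N] y; }
        parameters { real mu; real<lower=0> sigma; }
        model { y ~ normal(mu, sigma); }
        """
        sm = pystan.StanModel(model_code=code)       # compiles the model
        fit = sm.sampling(data={"N": 5, "y": [1.2, 0.8, 1.1, 0.9, 1.0]},
                          iter=2000, chains=4)       # NUTS by default
        print(fit)                                   # posterior summary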

  2. Development and Validation of an Interactive Liner Design and Impedance Modeling Tool

    NASA Technical Reports Server (NTRS)

    Howerton, Brian M.; Jones, Michael G.; Buckley, James L.

    2012-01-01

    The Interactive Liner Impedance Analysis and Design (ILIAD) tool is a LabVIEW-based software package used to design the composite surface impedance of a series of small-diameter quarter-wavelength resonators incorporating variable depth and sharp bends. Such structures are useful for packaging broadband acoustic liners into constrained spaces for turbofan engine noise control applications. ILIAD's graphical user interface allows the acoustic channel geometry to be drawn in the liner volume while the surface impedance and absorption coefficient calculations are updated in real time. A one-dimensional transmission line model serves as the basis for the impedance calculation and can be applied to many liner configurations. Experimentally, tonal and broadband acoustic data were acquired in the NASA Langley Normal Incidence Tube over the frequency range of 500 to 3000 Hz at 120 and 140 dB SPL. Normalized impedance spectra were measured using the Two-Microphone Method for the various combinations of channel configurations. Comparisons between the computed and measured impedances show excellent agreement for broadband liners comprised of multiple, variable-depth channels. The software can be used to design arrays of resonators that can be packaged into complex geometries heretofore unsuitable for effective acoustic treatment.
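
    For a straight closed channel of depth L, a one-dimensional transmission line model of the kind mentioned above reduces to a normalized impedance of roughly zeta = r - j*cot(kL), with parallel channels combining through their admittances. The Python sketch below uses that textbook form with a crude constant resistance r standing in for visco-thermal losses; ILIAD's actual model, which handles bends and variable depth, is more elaborate.

        import numpy as np

        def surface_impedance(freq, depths, fractions, c=343.0, r=0.5):
            # Parallel quarter-wave channels: zeta_i = r - 1j/tan(k*L_i);
            # admittances add, weighted by open-area fraction.
            k = 2.0*np.pi*np.asarray(freq)/c
            admittance = 0.0
            for L, phi in zip(depths, fractions):
                admittance = admittance + phi/(r - 1j/np.tan(k*L))
            return 1.0/admittance

        def absorption(zeta):
            # normal-incidence absorption coefficient
            return 1.0 - abs((zeta - 1.0)/(zeta + 1.0))**2

        freq = np.linspace(500.0, 3000.0, 6)
        zeta = surface_impedance(freq, depths=[0.05, 0.08],
                                 fractions=[0.5, 0.5])
        print(absorption(zeta).round(2))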

  3. NetMOD Version 2.0 Mathematical Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Merchant, Bion J.; Young, Christopher J.; Chael, Eric P.

    2015-08-01

    NetMOD (Network Monitoring for Optimal Detection) is a Java-based software package for conducting simulation of seismic, hydroacoustic and infrasonic networks. Network simulations have long been used to study network resilience to station outages and to determine where additional stations are needed to reduce monitoring thresholds. NetMOD makes use of geophysical models to determine the source characteristics, signal attenuation along the path between the source and station, and the performance and noise properties of the station. These geophysical models are combined to simulate the relative amplitudes of signal and noise that are observed at each of the stations. From these signal-to-noise ratios (SNR), the probabilities of signal detection at each station and event detection across the network of stations can be computed given a detection threshold. The purpose of this document is to clearly and comprehensively present the mathematical framework used by NetMOD, the software package developed by Sandia National Laboratories to assess the monitoring capability of ground-based sensor networks. Many of the NetMOD equations used for simulations are inherited from the NetSim network capability assessment package developed in the late 1980s by SAIC (Sereno et al., 1990).
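
    The SNR-to-probability step described above can be sketched simply: treat the predicted SNR (in dB) as Gaussian, integrate the tail above the detection threshold per station, and combine stations. The Python fragment below does this with a Poisson-binomial count; the threshold, scatter and SNR values are illustrative, not NetMOD's calibrated models.

        import numpy as np
        from scipy.stats import norm

        def p_station(snr_db, threshold_db=10.0, sigma_db=3.0):
            # chance the noisy SNR exceeds the detection threshold
            return 1.0 - norm.cdf(threshold_db, loc=snr_db, scale=sigma_db)

        def p_network(probs, n_required=3):
            # P(at least n_required independent stations detect),
            # via dynamic programming over the detection count.
            dist = np.zeros(len(probs) + 1)
            dist[0] = 1.0
            for p in probs:
                dist[1:] = dist[1:]*(1.0 - p) + dist[:-1]*p
                dist[0] *= 1.0 - p
            return dist[n_required:].sum()

        snrs = [14.0, 12.0, 9.0, 7.0, 11.0]   # predicted station SNRs (dB)
        probs = [p_station(s) for s in snrs]
        print(p_network(probs, n_required=3))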

  4. Stan : A Probabilistic Programming Language

    DOE PAGES

    Carpenter, Bob; Gelman, Andrew; Hoffman, Matthew D.; ...

    2017-01-01

    Stan is a probabilistic programming language for specifying statistical models. A Stan program imperatively defines a log probability function over parameters conditioned on specified data and constants. As of version 2.14.0, Stan provides full Bayesian inference for continuous-variable models through Markov chain Monte Carlo methods such as the No-U-Turn sampler, an adaptive form of Hamiltonian Monte Carlo sampling. Penalized maximum likelihood estimates are calculated using optimization methods such as the limited-memory Broyden-Fletcher-Goldfarb-Shanno algorithm. Stan is also a platform for computing log densities and their gradients and Hessians, which can be used in alternative algorithms such as variational Bayes, expectation propagation, and marginal inference using approximate integration. To this end, Stan is set up so that the densities, gradients, and Hessians, along with intermediate quantities of the algorithm such as acceptance probabilities, are easily accessible. Stan can also be called from the command line using the cmdstan package, through R using the rstan package, and through Python using the pystan package. All three interfaces support sampling and optimization-based inference with diagnostics and posterior analysis. rstan and pystan also provide access to log probabilities, gradients, Hessians, parameter transforms, and specialized plotting.

  5. Variational data assimilation system "INM RAS - Black Sea"

    NASA Astrophysics Data System (ADS)

    Parmuzin, Eugene; Agoshkov, Valery; Assovskiy, Maksim; Giniatulin, Sergey; Zakharova, Natalia; Kuimov, Grigory; Fomin, Vladimir

    2013-04-01

    Development of Informational-Computational Systems (ICS) for data assimilation procedures is a multidisciplinary problem. To study and solve such problems one needs to apply modern results from different disciplines and recent developments in mathematical modeling, the theory of adjoint equations and optimal control, inverse problems, numerical methods theory, numerical algebra, and scientific computing. These problems are studied at the Institute of Numerical Mathematics of the Russian Academy of Sciences (INM RAS) in ICS for personal computers (PC). Specific problems and questions arise while effective ICS versions for PC are being developed; they can be solved by applying modern methods of numerical mathematics and by addressing the "parallelism problem" using OpenMP technology and special linear algebra packages. In this work the results of the development of the PC-ICS "INM RAS - Black Sea" are presented, and the following topics are discussed: practical problems that can be studied with the ICS; parallelism problems and their solutions using OpenMP technology and the linear algebra packages used in ICS "INM - Black Sea"; and the interface of the ICS. The results of testing ICS "INM RAS - Black Sea" are presented, and the efficiency of the technologies and methods applied is discussed. The work was supported by RFBR, grants No. 13-01-00753 and 13-05-00715, and by the Ministry of Education and Science of the Russian Federation, projects 8291 and 11.519.11.1005. References: [1] V.I. Agoshkov, M.V. Assovskii, S.A. Lebedev, Numerical simulation of Black Sea hydrothermodynamics taking into account tide-forming forces. Russ. J. Numer. Anal. Math. Modelling (2012) 27, No. 1, 5-31. [2] E.I. Parmuzin, V.I. Agoshkov, Numerical solution of the variational assimilation problem for sea surface temperature in the model of the Black Sea dynamics. Russ. J. Numer. Anal. Math. Modelling (2012) 27, No. 1, 69-94. [3] V.B. Zalesny, N.A. Diansky, V.V. Fomin, S.N. Moshonkin, S.G. Demyshev, Numerical model of the circulation of the Black Sea and the Sea of Azov. Russ. J. Numer. Anal. Math. Modelling (2012) 27, No. 1, 95-111. [4] V.I. Agoshkov, S.V. Giniatulin, G.V. Kuimov, OpenMP technology and linear algebra packages in variational data assimilation systems. Abstracts of the 1st China-Russia Conference on Numerical Algebra with Applications in Radiative Hydrodynamics, Beijing, China, October 16-18, 2012. [5] N.B. Zakharova, V.I. Agoshkov, E.I. Parmuzin, The new method of ARGO buoys system observation data interpolation. Russ. J. Numer. Anal. Math. Modelling (2013) 28, No. 1.

  6. ScrumPy: metabolic modelling with Python.

    PubMed

    Poolman, M G

    2006-09-01

    ScrumPy is a software package for the definition and analysis of metabolic models. It is written in the Python programming language, which is also used as its user interface. ScrumPy has features for both kinetic and structural modelling, but the emphasis is on structural modelling and those features of most relevance to the analysis of large (genome-scale) models. The aim here is to describe ScrumPy's functionality to readers with some knowledge of metabolic modelling; implementation, programming and other computational details are omitted. ScrumPy is released under the GNU Public Licence and is available for download from http://mudshark.brookes.ac.uk/ScrumPy.

  7. Computer program for simulation of variable recharge with the U. S. Geological Survey modular finite-difference ground-water flow model (MODFLOW)

    USGS Publications Warehouse

    Kontis, A.L.

    2001-01-01

    The Variable-Recharge Package is a computerized method designed for use with the U.S. Geological Survey three-dimensional finite-difference ground-water flow model (MODFLOW-88) to simulate areal recharge to an aquifer. It is suitable for simulations of aquifers in which the relation between ground-water levels and land surface can affect the amount and distribution of recharge. The method is based on the premise that recharge to an aquifer cannot occur where the water level is at or above land surface. Consequently, recharge will vary spatially in simulations in which the Variable-Recharge Package is applied, if the water levels are sufficiently high. The input data required by the program for each model cell that can potentially receive recharge include the average land-surface elevation and a quantity termed "water available for recharge," which is equal to precipitation minus evapotranspiration. The Variable-Recharge Package also can be used to simulate recharge to a valley-fill aquifer in which the valley fill and the adjoining uplands are explicitly simulated. Valley-fill aquifers, which are the most common type of aquifer in the glaciated northeastern United States, receive much of their recharge from upland sources as channeled and(or) unchanneled surface runoff and as lateral ground-water flow. Surface runoff in the uplands is generated in the model when the applied water available for recharge is rejected because simulated water levels are at or above land surface. The surface runoff can be distributed to other parts of the model by (1) applying the amount of the surface runoff that flows to upland streams (channeled runoff) to explicitly simulated streams that flow onto the valley floor, and(or) (2) applying the amount that flows downslope toward the valley-fill aquifer (unchanneled runoff) to specified model cells, typically those near the valley wall. An example model of an idealized valley-fill aquifer is presented to demonstrate application of the method and the type of information that can be derived from its use. Documentation of the Variable-Recharge Package is provided in the appendixes and includes listings of model code and of program variables. Comment statements in the program listings provide a narrative of the code. Input-data instructions and printed model output for the package are included.
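
    The package's central premise, rejecting recharge where heads reach land surface and rerouting it as runoff, can be shown in a few lines. The toy one-dimensional cross-section below is schematic only and is not the package's Fortran implementation.

        import numpy as np

        land_surface = np.array([120.0, 115.0, 110.0, 100.0, 95.0])
        head         = np.array([118.0, 116.0, 108.0, 101.0, 90.0])
        available    = np.full(5, 0.002)     # water available for recharge

        accepts = head < land_surface        # recharge only below land surface
        recharge = np.where(accepts, available, 0.0)
        runoff = available[~accepts].sum()   # rejected water becomes runoff
        recharge[-1] += runoff               # e.g., reapplied near the valley wall
        print(recharge, runoff)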

  8. AlgoRun: a Docker-based packaging system for platform-agnostic implemented algorithms.

    PubMed

    Hosny, Abdelrahman; Vera-Licona, Paola; Laubenbacher, Reinhard; Favre, Thibauld

    2016-08-01

    There is a growing need in bioinformatics for easy-to-use software implementations of algorithms that are usable across platforms. At the same time, reproducibility of computational results is critical and often a challenge, due to source code changes over time and dependencies. The approach introduced in this paper addresses both of these needs with AlgoRun, a dedicated packaging system for implemented algorithms, using Docker technology. Implemented algorithms, packaged with AlgoRun, can be executed through a user-friendly interface directly from a web browser or via a standardized RESTful web API to allow easy integration into more complex workflows. The packaged algorithm includes the entire software execution environment, thereby eliminating the common problem of software dependencies and the irreproducibility of computations over time. AlgoRun-packaged algorithms can be published on http://algorun.org, a centralized searchable directory of existing AlgoRun-packaged algorithms. AlgoRun is available at http://algorun.org, and the source code, under GPL license, is available at https://github.com/algorun. Contact: laubenbacher@uchc.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  9. TLM-Tracker: software for cell segmentation, tracking and lineage analysis in time-lapse microscopy movies.

    PubMed

    Klein, Johannes; Leupold, Stefan; Biegler, Ilona; Biedendieck, Rebekka; Münch, Richard; Jahn, Dieter

    2012-09-01

    Time-lapse imaging in combination with fluorescence microscopy techniques enables the investigation of gene regulatory circuits and has uncovered phenomena such as culture heterogeneity. In this context, computational image processing for the analysis of single-cell behaviour plays an increasing role in systems biology and mathematical modelling approaches. Consequently, we developed a software package with a graphical user interface for the analysis of single bacterial cell behaviour. The new software, TLM-Tracker, allows for flexible and user-friendly segmentation, tracking and lineage analysis of microbial cells in time-lapse movies. The software package, including manual, tutorial video and examples, is available as Matlab code or executable binaries at http://www.tlmtracker.tu-bs.de.

  10. diffuStats: an R package to compute diffusion-based scores on biological networks.

    PubMed

    Picart-Armada, Sergio; Thompson, Wesley K; Buil, Alfonso; Perera-Lluna, Alexandre

    2018-02-01

    Label propagation and diffusion over biological networks are a common mathematical formalism in computational biology for giving context to molecular entities and prioritizing novel candidates in the area of study. There are several choices in conceiving the diffusion process (the graph kernel, the score definitions, and the presence of a posterior statistical normalization) which have an impact on the results. This manuscript describes diffuStats, an R package that provides a collection of graph kernels and diffusion scores, as well as a parallel permutation analysis for the normalized scores, easing the computation of the scores and their benchmarking for an optimal choice. The R package diffuStats is publicly available in Bioconductor, https://bioconductor.org, under the GPL-3 license. Contact: sergi.picart@upc.edu. Supplementary data are available at Bioinformatics online.
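
    The core computation is easy to sketch. The following Python lines (diffuStats itself is an R package; the toy graph and diffusion time are invented) score nodes by heat-kernel diffusion from a seed node:

        import numpy as np
        from scipy.linalg import expm

        # Propagate a binary seed label over a small network with the heat
        # kernel exp(-t * L), where L is the unnormalized graph Laplacian.
        A = np.array([[0, 1, 1, 0],
                      [1, 0, 1, 0],
                      [1, 1, 0, 1],
                      [0, 0, 1, 0]], dtype=float)  # adjacency of a toy network
        L = np.diag(A.sum(axis=1)) - A
        seeds = np.array([1.0, 0.0, 0.0, 0.0])     # node 0 is the labeled seed

        t = 0.5                                    # diffusion time (assumed)
        scores = expm(-t * L) @ seeds              # diffusion scores per node
        print(scores)                              # higher = closer to the seed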

  11. Enhancement of the CAVE computer code. [aerodynamic heating package for nose cones and scramjet engine sidewalls]

    NASA Technical Reports Server (NTRS)

    Rathjen, K. A.; Burk, H. O.

    1983-01-01

    The computer code CAVE (Conduction Analysis via Eigenvalues) is a convenient and efficient computer code for predicting two-dimensional temperature histories within thermal protection systems for hypersonic vehicles. The capabilities of CAVE were enhanced by incorporation of the following features into the code: real gas effects in the aerodynamic heating predictions; a geometry and aerodynamic heating package for analyses of cone-shaped bodies; an input option to change from laminar to turbulent heating predictions on leading edges; a modification to account for the reduction in adiabatic wall temperature with increasing leading-edge sweep; a geometry package for two-dimensional scramjet engine sidewalls, with an option for heat transfer to external and internal surfaces; a printout modification to provide tables of selected temperatures for plotting and storage; and modifications to the radiation calculation procedure to eliminate temperature oscillations induced by high heating rates. These new features are described.

  12. Hermetic electronic packaging of an implantable brain-machine-interface with transcutaneous optical data communication.

    PubMed

    Schuettler, Martin; Kohler, Fabian; Ordonez, Juan S; Stieglitz, Thomas

    2012-01-01

    Future brain-computer interfaces (BCIs) for severely impaired patients are implanted to electrically contact the brain tissue. Avoiding percutaneous cables requires the amplifier and telemetry electronics to be implanted too. We developed a hermetic package that protects the electronic circuitry of a BCI from body moisture while permitting infrared communication through the package wall, which is made from alumina ceramic. The ceramic package is cast in medical-grade silicone adhesive, for which we identified MED2-4013 as a promising candidate.

  13. High-Performance Mixed Models Based Genome-Wide Association Analysis with omicABEL software

    PubMed Central

    Fabregat-Traver, Diego; Sharapov, Sodbo Zh.; Hayward, Caroline; Rudan, Igor; Campbell, Harry; Aulchenko, Yurii; Bientinesi, Paolo

    2014-01-01

    To raise the power of genome-wide association studies (GWAS) and avoid false-positive results in structured populations, one can rely on mixed model based tests. When large samples are used, and when multiple traits are to be studied in the ’omics’ context, this approach becomes computationally challenging. Here we consider the problem of mixed-model based GWAS for an arbitrary number of traits, and demonstrate that for the analysis of single-trait and multiple-trait scenarios different computational algorithms are optimal. We implement these optimal algorithms in a high-performance computing framework that uses state-of-the-art linear algebra kernels, incorporates optimizations, and avoids redundant computations, increasing throughput while reducing memory usage and energy consumption. We show that, compared to existing libraries, our algorithms and software achieve considerable speed-ups. The OmicABEL software described in this manuscript is available under the GNU GPL v. 3 license as part of the GenABEL project for statistical genomics at http://www.genabel.org/packages/OmicABEL. PMID:25717363
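
    As a hedged sketch of the underlying computation (not the omicABEL implementation, which is a high-performance compiled library), a mixed-model scan can be reduced to whitening by the phenotypic covariance followed by per-SNP least squares; all sizes, variance components, and data below are invented:

        import numpy as np

        # V = sg2*K + se2*I captures relatedness; whitening by inv(chol(V))
        # turns generalized least squares into ordinary least squares per SNP.
        rng = np.random.default_rng(0)
        n, m = 200, 50
        K = np.corrcoef(rng.normal(size=(n, 500)))         # toy kinship matrix
        V = 0.4 * K + 0.6 * np.eye(n)                      # assumed variance components
        W = np.linalg.inv(np.linalg.cholesky(V))           # whitening transform

        y = rng.normal(size=n)                             # phenotype
        G = rng.integers(0, 3, size=(n, m)).astype(float)  # SNP dosage matrix
        yw, Gw = W @ y, W @ G
        beta = (Gw * yw[:, None]).sum(axis=0) / (Gw**2).sum(axis=0)  # per-SNP effects
        print(beta[:5])   # test statistics would follow from these estimates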

  14. A review of evaluative studies of computer-based learning in nursing education.

    PubMed

    Lewis, M J; Davies, R; Jenkins, D; Tait, M I

    2001-01-01

    Although there have been numerous attempts to evaluate the learning benefits of computer-based learning (CBL) packages in nursing education, the results obtained have been equivocal. A literature search conducted for this review found 25 reports of the evaluation of nursing CBL packages since 1966. Detailed analysis of the evaluation methods used in these reports revealed that most had significant design flaws, including the use of too small a sample group, the lack of a control group, etc. Because of this, the conclusions reached were not always valid. More effort is required in the design of future evaluation studies of nursing CBL packages.

  15. NORTICA—a new code for cyclotron analysis

    NASA Astrophysics Data System (ADS)

    Gorelov, D.; Johnson, D.; Marti, F.

    2001-12-01

    The new package NORTICA (Numerical ORbit Tracking In Cyclotrons with Analysis) of computer codes for beam dynamics simulations is under development at NSCL. The package was started as a replacement for the code MONSTER [1], developed in the laboratory in the past. The new codes are capable of beam dynamics simulations in both CCF (Coupled Cyclotron Facility) accelerators, the K500 and K1200 superconducting cyclotrons. The general purpose of this package is to assist in setting up and tuning the cyclotrons, taking into account the main field and extraction channel imperfections. The computing platform for the package is an Alpha Station with the UNIX operating system and an X-Windows graphical interface. A multiple-programming-language approach was used in order to combine the reliability of the numerical algorithms developed in the laboratory over a long period of time with the friendliness of a modern-style user interface. This paper describes the capabilities and features of the codes in their present state.

  16. Computerized power supply analysis: State equation generation and terminal models

    NASA Technical Reports Server (NTRS)

    Garrett, S. J.

    1978-01-01

    To aid engineers who design power supply systems, two analysis tools were developed for use with the state equation analysis package. These tools include integration routines that start with the description of a power supply in state equation form and yield analytical results. The first tool is a computer program that works with the SUPER SCEPTRE circuit analysis program and prints the state equations for an electrical network. The state equations developed automatically by the computer program are used to develop an algorithm for reducing the number of state variables required to describe an electrical network. In this way a second tool is obtained, in which the order of the network is reduced and a simpler terminal model results.
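
    For orientation, a power-supply stage in state-equation form is simply x' = Ax + Bu, y = Cx. A minimal Python sketch with an invented single-state RC output filter (the component values are illustrative only):

        import numpy as np

        R, C_f = 1.0, 1e-3                     # ohms, farads (assumed values)
        A = np.array([[-1.0 / (R * C_f)]])     # single state: capacitor voltage
        B = np.array([[1.0 / (R * C_f)]])
        C = np.array([[1.0]])                  # output matrix

        x, u, dt = np.zeros(1), 5.0, 1e-5      # step input of 5 V
        for _ in range(2000):                  # forward-Euler integration
            x = x + dt * (A @ x + B.flatten() * u)
        print(float(C @ x))                    # output settles toward 5 V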

  17. Modeling, design, packaging and experimental analysis of liquid-phase shear-horizontal surface acoustic wave sensors

    NASA Astrophysics Data System (ADS)

    Pollard, Thomas B

    Recent advances in microbiology, computational capabilities, and microelectromechanical-system fabrication techniques permit modeling, design, and fabrication of low-cost, miniature, sensitive and selective liquid-phase sensors and lab-on-a-chip systems. Such devices are expected to replace expensive, time-consuming, and bulky laboratory-based testing equipment. Potential applications for devices include: fluid characterization for material science and industry; chemical analysis in medicine and pharmacology; study of biological processes; food analysis; chemical kinetics analysis; and environmental monitoring. When combined with liquid-phase packaging, sensors based on surface-acoustic-wave (SAW) technology are considered strong candidates. For this reason, such devices are the focus of this work, with emphasis placed on device modeling and packaging for liquid-phase operation. Regarding modeling, topics considered include mode excitation efficiency of transducers; mode sensitivity based on guiding structure materials/geometries; and use of new piezoelectric materials. On packaging, topics considered include package interfacing with SAW devices, and minimization of packaging effects on device performance. In this work novel numerical models are theoretically developed and implemented to study propagation and transduction characteristics of sensor designs using wave/constitutive equations, Green's functions, and boundary/finite element methods. Using developed simulation tools that consider the finite thickness of all device electrodes, transduction efficiency for SAW transducers with neighboring uniform or periodic guiding electrodes is reported for the first time. Results indicate that finite electrode thickness strongly affects efficiency. Using dense electrodes, efficiency is shown to approach 92% and 100% for uniform and periodic electrode guiding, respectively, yielding improved sensor detection limits. A numerical sensitivity analysis is presented targeting viscosity using uniform-electrode and shear-horizontal mode configurations on potassium-niobate, langasite, and quartz substrates. Optimum configurations are determined yielding maximum sensitivity. Results show that mode propagation loss and sensitivity to viscosity are correlated by a factor independent of substrate material. The analysis is useful for designing devices meeting sensitivity and signal-level requirements. A novel, rapid and precise microfluidic chamber alignment/bonding method was developed for SAW platforms. The package is shown to have little effect on device performance and permits simple macrofluidic interfacing. Lastly, prototypes were designed, fabricated, and tested for viscosity and biosensor applications; results show the ability to detect as low as 1% glycerol in water and surface-bound DNA crosslinking.

  18. QMCPACK: an open source ab initio quantum Monte Carlo package for the electronic structure of atoms, molecules and solids

    DOE PAGES

    Kim, Jeongnim; Baczewski, Andrew T.; Beaudet, Todd D.; ...

    2018-04-19

    QMCPACK is an open source quantum Monte Carlo package for ab initio electronic structure calculations. It supports calculations of metallic and insulating solids, molecules, atoms, and some model Hamiltonians. Implemented real space quantum Monte Carlo algorithms include variational, diffusion, and reptation Monte Carlo. QMCPACK uses Slater-Jastrow type trial wave functions in conjunction with a sophisticated optimizer capable of optimizing tens of thousands of parameters. The orbital space auxiliary field quantum Monte Carlo method is also implemented, enabling cross validation between different highly accurate methods. The code is specifically optimized for calculations with large numbers of electrons on the latest high performance computing architectures, including multicore central processing unit (CPU) and graphical processing unit (GPU) systems. We detail the program’s capabilities, outline its structure, and give examples of its use in current research calculations. The package is available at http://www.qmcpack.org.
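
    A toy variational Monte Carlo loop, sketched in Python to make the method concrete (it illustrates the technique only, not QMCPACK's production algorithms): the ground-state energy of a 1D harmonic oscillator is estimated from the Gaussian trial wave function psi(x) = exp(-a*x^2), assuming hbar = m = omega = 1.

        import numpy as np

        rng = np.random.default_rng(1)

        def local_energy(x, a):
            # E_L = -psi''/(2 psi) + x^2/2 for psi = exp(-a x^2)
            return a + x**2 * (0.5 - 2.0 * a**2)

        a, x, step = 0.45, 0.0, 1.0
        energies = []
        for i in range(20000):
            x_new = x + step * rng.uniform(-1, 1)
            # Metropolis acceptance with probability |psi(x_new)/psi(x)|^2
            if rng.random() < np.exp(-2.0 * a * (x_new**2 - x**2)):
                x = x_new
            if i > 2000:                      # discard equilibration steps
                energies.append(local_energy(x, a))

        print(np.mean(energies))              # ~0.503; exactly 0.5 at a = 0.5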

  1. DOSE: an R/Bioconductor package for disease ontology semantic and enrichment analysis.

    PubMed

    Yu, Guangchuang; Wang, Li-Gen; Yan, Guang-Rong; He, Qing-Yu

    2015-02-15

    Disease ontology (DO) annotates human genes in the context of disease. DO is an important annotation resource for translating molecular findings from high-throughput data to clinical relevance. DOSE is an R package providing semantic similarity computations among DO terms and genes, which allows biologists to explore the similarities of diseases and of gene functions from a disease perspective. Enrichment analyses, including the hypergeometric model and gene set enrichment analysis, are also implemented to support discovering disease associations in high-throughput biological data. This allows biologists to verify disease relevance in a biological experiment and identify unexpected disease associations. Comparison among gene clusters is also supported. DOSE is released under the Artistic-2.0 License. The source code and documents are freely available through Bioconductor (http://www.bioconductor.org/packages/release/bioc/html/DOSE.html). Contact: gcyu@connect.hku.hk or tqyhe@jnu.edu.cn. Supplementary data are available at Bioinformatics online.
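
    The hypergeometric enrichment test mentioned above is a one-liner; a Python sketch (DOSE itself is an R/Bioconductor package, and the counts below are invented):

        from scipy.stats import hypergeom

        N = 20000   # background genes
        K = 150     # genes annotated to a disease term
        n = 300     # genes in the experimental hit list
        k = 12      # hits carrying the annotation

        # P(X >= k): probability of seeing at least k annotated genes by chance
        p_value = hypergeom.sf(k - 1, N, K, n)
        print(f"enrichment p = {p_value:.3g}")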

  2. Eigensolver for a Sparse, Large Hermitian Matrix

    NASA Technical Reports Server (NTRS)

    Tisdale, E. Robert; Oyafuso, Fabiano; Klimeck, Gerhard; Brown, R. Chris

    2003-01-01

    A parallel-processing computer program finds a few eigenvalues in a sparse Hermitian matrix that contains as many as 100 million diagonal elements. This program finds the eigenvalues faster, using less memory, than do other, comparable eigensolver programs. This program implements a Lanczos algorithm in the American National Standards Institute/ International Organization for Standardization (ANSI/ISO) C computing language, using the Message Passing Interface (MPI) standard to complement an eigensolver in PARPACK. [PARPACK (Parallel Arnoldi Package) is an extension, to parallel-processing computer architectures, of ARPACK (Arnoldi Package), which is a collection of Fortran 77 subroutines that solve large-scale eigenvalue problems.] The eigensolver runs on Beowulf clusters of computers at the Jet Propulsion Laboratory (JPL).
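
    A desktop-scale analogue of the same idea, in Python: SciPy's eigsh wraps ARPACK's implicitly restarted Lanczos/Arnoldi iteration to extract a few eigenvalues of a large sparse symmetric matrix without forming it densely (the matrix here is invented for illustration):

        import numpy as np
        from scipy.sparse import diags
        from scipy.sparse.linalg import eigsh

        n = 100_000
        H = diags([np.full(n - 1, -1.0),
                   np.arange(n, dtype=float),
                   np.full(n - 1, -1.0)], offsets=[-1, 0, 1])  # sparse tridiagonal

        vals = eigsh(H, k=4, which="SA", return_eigenvectors=False)
        print(np.sort(vals))   # the four smallest ("algebraic") eigenvalues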

  3. PRROC: computing and visualizing precision-recall and receiver operating characteristic curves in R.

    PubMed

    Grau, Jan; Grosse, Ivo; Keilwagen, Jens

    2015-08-01

    Precision-recall (PR) and receiver operating characteristic (ROC) curves are valuable measures of classifier performance. Here, we present the R-package PRROC, which allows for computing and visualizing both PR and ROC curves. In contrast to available R-packages, PRROC allows for computing PR and ROC curves and areas under these curves for soft-labeled data using a continuous interpolation between the points of PR curves. In addition, PRROC provides a generic plot function for generating publication-quality graphics of PR and ROC curves.
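
    The same quantities in Python, for comparison (PRROC is an R package; the labels and scores below are invented):

        import numpy as np
        from sklearn.metrics import precision_recall_curve, roc_curve, auc

        labels = np.array([1, 1, 0, 1, 0, 0, 1, 0, 0, 0])
        scores = np.array([0.9, 0.8, 0.7, 0.65, 0.6, 0.4, 0.35, 0.3, 0.2, 0.1])

        prec, rec, _ = precision_recall_curve(labels, scores)  # PR curve points
        fpr, tpr, _ = roc_curve(labels, scores)                # ROC curve points
        print("AUROC =", auc(fpr, tpr))
        print("AUPRC =", auc(rec, prec))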

  4. Quantum Computing Architectural Design

    NASA Astrophysics Data System (ADS)

    West, Jacob; Simms, Geoffrey; Gyure, Mark

    2006-03-01

    Large scale quantum computers will invariably require scalable architectures in addition to high fidelity gate operations. Quantum computing architectural design (QCAD) addresses the problems of actually implementing fault-tolerant algorithms given physical and architectural constraints beyond those of basic gate-level fidelity. Here we introduce a unified framework for QCAD that enables the scientist to study the impact of varying error correction schemes, architectural parameters including layout and scheduling, and physical operations native to a given architecture. Our software package, aptly named QCAD, provides compilation, manipulation/transformation, multi-paradigm simulation, and visualization tools. We demonstrate various features of the QCAD software package through several examples.

  5. Conduction-driven cooling of LED-based automotive lighting systems for abating local hot spots

    NASA Astrophysics Data System (ADS)

    Saati, Ferina; Arik, Mehmet

    2018-02-01

    Light-emitting diode (LED)-based automotive lighting systems pose unique challenges, such as dual-side packaging (front side for LEDs and back side for the driver electronics circuit), size constraints, harsh ambient conditions, and cooling. Packaging for automotive lighting applications combining advanced printed circuit board (PCB) technology with a multifunctional LED-based board is investigated, with a focus on the effect of thermal-conduction-based cooling for hot spot abatement. A baseline study with flame retardant 4 technology, commonly known as FR4 PCB, is first compared with a metal-core PCB technology, both experimentally and computationally. The double-sided advanced PCB that houses both electronics and LEDs is then investigated computationally and compared experimentally with the baseline FR4 PCB. Computational models are first developed with a commercial computational fluid dynamics software package, followed by an advanced PCB technology based on embedded heat pipes, which is studied computationally and experimentally. Attention then turns to different heat pipe orientations and heat pipe placements on the board. Results show that conventional FR4-based light engines experience local hot spots (ΔT>50°C), while advanced PCB technology based on heat pipes and thermal spreaders eliminates these local hot spots (ΔT<10°C), leading to higher lumen extraction with improved reliability. Finally, possible design options are presented with embedded heat pipe structures that further improve PCB performance.

  6. Coal conversion systems design and process modeling. Volume 1: Application of MPPR and Aspen computer models

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The development of a coal gasification system design and mass and energy balance simulation program for the TVA and other similar facilities is described. The materials-process-product model (MPPM) and the Advanced System for Process Engineering (ASPEN) computer program were selected from available steady-state and dynamic models. The MPPM was selected to serve as the basis for development of the system-level design model structure because it provided the capability for process-block material and energy balance and high-level system sizing and costing. The ASPEN simulation serves as the basis for assessing detailed component models for the system design modeling program. The ASPEN components were analyzed to identify particular process blocks and data packages (physical properties) that could be extracted and used in the system design modeling program. While the ASPEN physical property calculation routines are capable of generating the physical properties required for process simulation, not all required physical property data are available; missing data must be user-entered.

  7. A computational approach to climate science education with CLIMLAB

    NASA Astrophysics Data System (ADS)

    Rose, B. E. J.

    2017-12-01

    CLIMLAB is a Python-based software toolkit for interactive, process-oriented climate modeling for use in education and research. It is motivated by the need for simpler tools and more reproducible workflows with which to "fill in the gaps" between blackboard-level theory and the results of comprehensive climate models. With CLIMLAB you can interactively mix and match physical model components, or combine simpler process models together into a more comprehensive model. I use CLIMLAB in the classroom to put models in the hands of students (undergraduate and graduate), and emphasize a hierarchical, process-oriented approach to understanding the key emergent properties of the climate system. CLIMLAB is equally a tool for climate research, where the same needs exist for more robust, process-based understanding and reproducible computational results. I will give an overview of CLIMLAB and an update on recent developments, including: a full-featured, well-documented, interactive implementation of a widely-used radiation model (RRTM); packaging with conda-forge for compiler-free (and hassle-free!) installation on Mac, Windows and Linux; interfacing with xarray for I/O and graphics with gridded model data; and a rich and growing collection of examples and self-computing lecture notes in Jupyter notebook format.
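
    In the spirit of the toolkit, a zero-dimensional energy-balance model takes only a few lines of plain Python. This sketch does not use CLIMLAB's own classes; the transmissivity and heat capacity are assumed tunings:

        import numpy as np

        S0 = 1365.0        # solar constant, W m^-2
        alpha = 0.3        # planetary albedo
        tau = 0.61         # effective longwave transmissivity (assumed)
        sigma = 5.67e-8    # Stefan-Boltzmann constant
        C = 4.0e8          # heat capacity of a ~100 m mixed layer, J m^-2 K^-1

        T = 288.0                          # initial surface temperature, K
        dt = 86400.0                       # one-day time step
        for _ in range(365 * 30):          # integrate ~30 years to equilibrium
            asr = S0 / 4 * (1 - alpha)     # absorbed shortwave radiation
            olr = tau * sigma * T**4       # outgoing longwave radiation
            T += dt * (asr - olr) / C

        print(f"equilibrium T = {T:.1f} K") # close to the observed ~288 K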

  8. An Implemented Strategy for Campus Connectivity and Cooperative Computing.

    ERIC Educational Resources Information Center

    Halaris, Antony S.; Sloan, Lynda W.

    1989-01-01

    ConnectPac, a software package developed at Iona College to allow a computer user to access all services from a single personal computer, is described. ConnectPac uses mainframe computing to support a campus computing network, integrating personal and centralized computing into a menu-driven user environment. (Author/MLW)

  9. Aspects of perturbation theory in quantum mechanics: The BenderWu MATHEMATICA® package

    NASA Astrophysics Data System (ADS)

    Sulejmanpasic, Tin; Ünsal, Mithat

    2018-07-01

    We discuss a general setup which allows the study of the perturbation theory of an arbitrary, locally harmonic 1D quantum mechanical potential as well as its multi-variable (many-body) generalization. The latter may form a prototype for regularized quantum field theory. We first generalize the method of Bender and Wu, and derive exact recursion relations which allow the determination of the perturbative wave-function and energy corrections to an arbitrary order, at least in principle. For 1D systems, we implement these equations in an easy-to-use MATHEMATICA® package we call BenderWu. Our package enables quick home-computer computation of high orders of perturbation theory (about 100 orders in 10-30 s, and 250 orders in 1-2 h) and enables practical study of a large class of problems in quantum mechanics. We have two hopes concerning the BenderWu package. One is that, due to resurgence, a large amount of non-perturbative information, such as non-perturbative energies and wave-functions (e.g. WKB wave functions), can in principle be extracted from the perturbative data. We also hope that the package may be used as a teaching tool, providing an effective bridge between perturbation theory and non-perturbative physics in textbooks. Finally, we show that for the multi-variable case, the recursion relation acquires a geometric character, and has a structure which allows parallelization to computer clusters.
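
    For orientation, the low orders that any such recursion must reproduce are the standard Rayleigh-Schrödinger corrections (standard textbook formulas, not taken from the record):

        E_n^{(1)} = \langle n^{(0)} | V | n^{(0)} \rangle, \qquad
        E_n^{(2)} = \sum_{m \neq n} \frac{|\langle m^{(0)} | V | n^{(0)} \rangle|^2}{E_n^{(0)} - E_m^{(0)}}

    For example, for the quartic perturbation V = g x^4 of the unit-frequency harmonic oscillator (hbar = m = 1), the ground state has <x^2> = 1/2 and <x^4> = 3<x^2>^2 = 3/4, so E_0^{(1)} = 3g/4 is the first term of the series that the package extends to high order.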

  10. PLATSIM: An efficient linear simulation and analysis package for large-order flexible systems

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman; Kenny, Sean P.; Giesy, Daniel P.

    1995-01-01

    PLATSIM is a software package designed to provide efficient time and frequency domain analysis of large-order generic space platforms implemented with any linear time-invariant control system. Time domain analysis provides simulations of the overall spacecraft response levels due to either onboard or external disturbances. The time domain results can then be processed by the jitter analysis module to assess the spacecraft's pointing performance in a computationally efficient manner. The resulting jitter analysis algorithms have produced an increase in speed of several orders of magnitude over the brute force approach of sweeping minima and maxima. Frequency domain analysis produces frequency response functions for uncontrolled and controlled platform configurations. The latter represents an enabling technology for large-order flexible systems. PLATSIM uses a sparse matrix formulation for the spacecraft dynamics model which makes both the time and frequency domain operations quite efficient, particularly when a large number of modes are required to capture the true dynamics of the spacecraft. The package is written in MATLAB script language. A graphical user interface (GUI) is included in the PLATSIM software package. This GUI uses MATLAB's Handle graphics to provide a convenient way for setting simulation and analysis parameters.

  11. Technical note: an R package for fitting sparse neural networks with application in animal breeding.

    PubMed

    Wang, Yangfan; Mi, Xue; Rosa, Guilherme J M; Chen, Zhihui; Lin, Ping; Wang, Shi; Bao, Zhenmin

    2018-05-04

    Neural networks (NNs) have emerged as a new tool for genomic selection (GS) in animal breeding. However, the properties of NNs used in GS for the prediction of phenotypic outcomes are not well characterized, due to the over-parameterization of NNs and the difficulty of using whole-genome marker sets as high-dimensional NN input. In this note, we have developed an R package called snnR that finds an optimal sparse structure of an NN by minimizing the squared error subject to a penalty on the L1-norm of the parameters (weights and biases), thereby addressing the problem of over-parameterization in NNs. We have also tested several models fitted with the snnR package to demonstrate their feasibility and effectiveness in example cases. In comparisons of snnR with the R package brnn (Bayesian regularized single-layer NNs), with both using the entries of a genotype matrix or a genomic relationship matrix as inputs, snnR greatly improves computational efficiency and prediction ability for GS in animal breeding, because snnR implements a sparse NN with many hidden layers.
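
    A minimal sketch of the idea, in Python/NumPy rather than R (not the snnR algorithm itself; data and hyperparameters are invented): an L1 penalty on the weights of a one-hidden-layer network shrinks many weights toward zero, yielding a sparse structure.

        import numpy as np

        rng = np.random.default_rng(0)
        X = rng.normal(size=(300, 20))                 # e.g. marker genotypes
        y = X[:, 0] - 2 * X[:, 3] + rng.normal(scale=0.1, size=300)

        W1 = rng.normal(scale=0.1, size=(20, 8)); b1 = np.zeros(8)
        w2 = rng.normal(scale=0.1, size=8);       b2 = 0.0
        lam, lr = 1e-3, 1e-2                           # L1 strength, learning rate

        for _ in range(2000):
            h = np.tanh(X @ W1 + b1)                   # hidden layer
            pred = h @ w2 + b2
            err = pred - y
            g2 = h.T @ err / len(y)                    # gradient w.r.t. w2
            gh = np.outer(err, w2) * (1 - h**2)        # backprop through tanh
            g1 = X.T @ gh / len(y)                     # gradient w.r.t. W1
            # gradient step plus L1 subgradient (lam * sign(w)) for sparsity
            W1 -= lr * (g1 + lam * np.sign(W1))
            w2 -= lr * (g2 + lam * np.sign(w2))
            b1 -= lr * gh.mean(axis=0); b2 -= lr * err.mean()

        print("MSE:", np.mean((np.tanh(X @ W1 + b1) @ w2 + b2 - y) ** 2))
        print("fraction of near-zero weights:", np.mean(np.abs(W1) < 1e-2))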

  12. Design Aspects of the Rayleigh Convection Code

    NASA Astrophysics Data System (ADS)

    Featherstone, N. A.

    2017-12-01

    Understanding the long-term generation of planetary or stellar magnetic fields requires complementary knowledge of the large-scale fluid dynamics pervading large fractions of the object's interior. Such large-scale motions are sensitive to the system's geometry which, in planets and stars, is spherical to a good approximation. As a result, computational models designed to study such systems often solve the MHD equations in spherical geometry, frequently employing a spectral approach involving spherical harmonics. We present computational and user-interface design aspects of one such modeling tool, the Rayleigh convection code, which is suitable for deployment on desktop and petascale HPC architectures alike. In this poster, we will present an overview of this code's parallel design and its built-in diagnostics-output package. Rayleigh has been developed with NSF support through the Computational Infrastructure for Geodynamics and is expected to be released as open-source software in winter 2017/2018.

  13. Computing resonant frequency of C-shaped compact microstrip antennas by using ANFIS

    NASA Astrophysics Data System (ADS)

    Akdagli, Ali; Kayabasi, Ahmet; Develi, Ibrahim

    2015-03-01

    In this work, the resonant frequency of C-shaped compact microstrip antennas (CCMAs) operating at UHF band is computed by using the adaptive neuro-fuzzy inference system (ANFIS). For this purpose, 144 CCMAs with various relative dielectric constants and different physical dimensions were simulated by the XFDTD software package based on the finite-difference time domain (FDTD) method. One hundred and twenty-nine CCMAs were employed for training, while the remaining 15 CCMAs were used for testing of the ANFIS model. Average percentage error (APE) values were obtained as 0.8413% and 1.259% for training and testing, respectively. In order to demonstrate its validity and accuracy, the proposed ANFIS model was also tested over the simulation data given in the literature, and APE was obtained as 0.916%. These results show that ANFIS can be successfully used to compute the resonant frequency of CCMAs.
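
    The record does not define APE; a plausible reconstruction (our assumption) is the mean absolute percentage deviation of the ANFIS-predicted resonant frequency from the simulated one:

        \mathrm{APE} = \frac{100\%}{N} \sum_{i=1}^{N} \frac{\left| f_i^{\mathrm{sim}} - f_i^{\mathrm{ANFIS}} \right|}{f_i^{\mathrm{sim}}}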

  14. On Parallelizing Single Dynamic Simulation Using HPC Techniques and APIs of Commercial Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Diao, Ruisheng; Jin, Shuangshuang; Howell, Frederic

    Time-domain simulations are heavily used in today's planning and operation practices to assess power system transient stability and post-transient voltage/frequency profiles following severe contingencies, in order to comply with industry standards. Because of the increased modeling complexity, it is several times slower than real time for state-of-the-art commercial packages to complete a dynamic simulation of a large-scale model. With the growing stochastic behavior introduced by emerging technologies, the power industry has seen a growing need for performing security assessment in real time. This paper presents a parallel implementation framework to speed up a single dynamic simulation by leveraging the existing stability model library in commercial tools through their application programming interfaces (APIs). Several high performance computing (HPC) techniques are explored, such as parallelizing the calculation of generator current injection, identifying fast linear solvers for the network solution, and parallelizing data outputs when interacting with the APIs of the commercial package TSAT. The proposed method has been tested on a WECC planning base case with detailed synchronous generator models and exhibits outstanding scalable performance with sufficient accuracy.

  15. The application of virtual reality systems as a support of digital manufacturing and logistics

    NASA Astrophysics Data System (ADS)

    Golda, G.; Kampa, A.; Paprocka, I.

    2016-08-01

    Modern trends in the development of computer-aided techniques are heading toward the integration of the design of competitive products with so-called "digital manufacturing and logistics", supported by computer simulation software. All phases of the product lifecycle should be aided and managed by advanced product lifecycle management software packages: starting from the design of a new product, through planning and control of manufacturing, assembly, internal logistics and repairs, quality control, and distribution to customers and after-sale service, up to recycling or utilization. This paper describes important problems in providing an efficient flow of materials in supply chain management across the whole product lifecycle, using computer simulation. The authors pay attention to the processes of acquiring relevant information and correct data, which are necessary for virtual modeling and computer simulation of integrated manufacturing and logistics systems. The article describes possible applications of virtual reality software for modeling and simulating production and logistics processes in an enterprise across different aspects of product lifecycle management. The authors demonstrate an effective method of creating computer simulations for digital manufacturing and logistics, show modeled and programmed examples and solutions, and point out development trends and application options that go beyond the enterprise.

  16. Efficient computation of the joint sample frequency spectra for multiple populations.

    PubMed

    Kamm, John A; Terhorst, Jonathan; Song, Yun S

    2017-01-01

    A wide range of studies in population genetics have employed the sample frequency spectrum (SFS), a summary statistic which describes the distribution of mutant alleles at a polymorphic site in a sample of DNA sequences and provides a highly efficient dimensional reduction of large-scale population genomic variation data. Recently, there has been much interest in analyzing the joint SFS data from multiple populations to infer parameters of complex demographic histories, including variable population sizes, population split times, migration rates, admixture proportions, and so on. SFS-based inference methods require accurate computation of the expected SFS under a given demographic model. Although much methodological progress has been made, existing methods suffer from numerical instability and high computational complexity when multiple populations are involved and the sample size is large. In this paper, we present new analytic formulas and algorithms that enable accurate, efficient computation of the expected joint SFS for thousands of individuals sampled from hundreds of populations related by a complex demographic model with arbitrary population size histories (including piecewise-exponential growth). Our results are implemented in a new software package called momi (MOran Models for Inference). Through an empirical study we demonstrate our improvements to numerical stability and computational complexity.
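
    As a point of reference for what "expected SFS" means, the single-population constant-size case has the classical coalescent closed form E[xi_i] = theta / i, which the multi-population machinery generalizes. A two-line check in Python (theta is an assumed scaled mutation rate):

        theta = 2.0          # scaled mutation rate (assumed)
        n = 10               # sample size
        expected_sfs = [theta / i for i in range(1, n)]   # entries xi_1 .. xi_{n-1}
        print(expected_sfs)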

  17. Rapid high performance liquid chromatography method development with high prediction accuracy, using 5 cm long narrow-bore columns packed with sub-2 µm particles and Design Space computer modeling.

    PubMed

    Fekete, Szabolcs; Fekete, Jeno; Molnár, Imre; Ganzler, Katalin

    2009-11-06

    Many different strategies for reversed-phase high performance liquid chromatographic (RP-HPLC) method development are used today. This paper describes a strategy for the systematic development of ultrahigh-pressure liquid chromatographic (UHPLC or UPLC) methods using 5 cm × 2.1 mm columns packed with sub-2 µm particles and computer simulation (the DryLab® package). Data on the accuracy of computer modeling in the Design Space under ultrahigh-pressure conditions are reported, and an acceptable accuracy for these predictions of the computer models is presented. This work illustrates a method development strategy focused on time reduction, by a factor of 3-5 compared to conventional HPLC method development, and exhibits parts of the Design Space elaboration as requested by the FDA and ICH Q8R1. Furthermore, this paper demonstrates the accuracy of retention time prediction at elevated pressure (enhanced flow rate) and shows that computer-assisted simulation can be applied with sufficient precision for UHPLC applications (p > 400 bar). Examples of fast and effective method development in pharmaceutical analysis, for both gradient and isocratic separations, are presented.

  18. PyMT: A Python package for model-coupling in the Earth sciences

    NASA Astrophysics Data System (ADS)

    Hutton, E.

    2016-12-01

    The current landscape of Earth-system models is not only broad in scientific scope, but also broad in type. On the one hand, the large variety of models is exciting, as it provides fertile ground for extending or linking models together in novel ways to answer new scientific questions. However, the heterogeneity in model type acts to inhibit model coupling, model development, or even model use. Existing models are written in a variety of programming languages, operate on different grids, use their own file formats (both for input and output), have different user interfaces, have their own time steps, etc. Each of these factors becomes an obstruction to scientists wanting to couple, extend, or simply run existing models. For scientists whose main focus may not be computer science, these barriers become even larger and pose significant logistical hurdles. And this is all before the scientific difficulties of coupling or running models are addressed. The CSDMS Python Modeling Toolkit (PyMT) was developed to help non-computer scientists deal with these sorts of modeling logistics. PyMT is the fundamental package the Community Surface Dynamics Modeling System uses for the coupling of models that expose the Basic Model Interface (BMI). It contains: tools necessary for coupling models of disparate time and space scales (including grid mappers); time-steppers that coordinate the sequencing of coupled models; exchange of data between BMI-enabled models; wrappers that automatically load BMI-enabled models into the PyMT framework; utilities that support open-source interfaces (UGRID, SGRID, CSDMS Standard Names, etc.); a collection of community-submitted models, written in a variety of programming languages and from a variety of process domains, all usable from within the Python programming language; and a plug-in framework for adding additional BMI-enabled models to the framework. In this presentation we introduce the basics of PyMT and provide an example of coupling models of different domains and grid types.
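
    The pattern is easy to illustrate. The Python sketch below mimics the BMI lifecycle; the method names follow the BMI convention, but the toy models, field names, and argument signatures are invented for illustration (a real BMI initialize takes a configuration file, for instance):

        # Two toy BMI-style models and a sequential coupling loop.
        class ExpDecay:
            def initialize(self, rate):          # real BMI passes a config file
                self.rate, self.state = rate, 1.0
            def update(self, dt):
                self.state -= self.rate * self.state * dt
            def get_value(self, name):
                return self.state
            def set_value(self, name, value):
                self.state = value
            def finalize(self):
                pass

        upstream, downstream = ExpDecay(), ExpDecay()
        upstream.initialize(rate=0.10)
        downstream.initialize(rate=0.02)
        for _ in range(100):                     # framework-style coupling loop
            upstream.update(dt=0.1)
            # hand the upstream field to the downstream model each step
            downstream.set_value("inflow", upstream.get_value("outflow"))
            downstream.update(dt=0.1)
        upstream.finalize(); downstream.finalize()
        print(downstream.get_value("state"))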

  19. Testing of Selected Geopotential Models in Terms of GOCE Satellite Orbit Determination Using Simulated GPS Observations

    NASA Astrophysics Data System (ADS)

    Bobojc, Andrzej; Drozyner, Andrzej

    2016-04-01

    This work contains a comparative study of the performance of twenty geopotential models in the orbit estimation process for the satellite of the Gravity Field and Steady-State Ocean Circulation Explorer (GOCE) mission. Models tested include JYY_GOCE02S, ITG-GOCE02, ULUX_CHAMP2013S, GOGRA02S, ITG-GRACE2010S, EIGEN-51C, EGM2008, EGM96, JGM3, OSU91a, and OSU86F. A special software package, called the Orbital Computation System (OCS), based on the classical method of least squares, was used. Within OCS, the initial satellite state vector components are corrected in an iterative process, using the given geopotential model and models describing the remaining gravitational perturbations. An important part of the OCS package is the 8th-order Cowell numerical integration procedure, which enables satellite orbit computation. Different sets of pseudorange simulations along reference GOCE orbital arcs were obtained using real orbits of the Global Positioning System (GPS) satellites. These sets were the basic observation data used in the adjustment. The centimeter-accuracy Precise Science Orbit (PSO) for the GOCE satellite, provided by the European Space Agency (ESA), was adopted as the GOCE reference orbit. By comparing various variants of the orbital solutions, the relative accuracy of the geopotential models in an orbital sense is determined. Full geopotential models were used in the adjustment process, but solutions were also determined using truncated geopotential models; in that case, the accuracy of the estimated orbit was slightly enhanced. The obtained solutions refer to orbital arcs with lengths of 90 minutes and 1 day.
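
    Cowell's method is, at its simplest, direct numerical integration of the satellite state vector in Cartesian coordinates. A Python sketch with only the central term of the geopotential (this is not the OCS package; the initial state is an invented near-circular, GOCE-like orbit):

        import numpy as np
        from scipy.integrate import solve_ivp

        GM = 3.986004418e14                 # Earth's GM, m^3 s^-2

        def two_body(t, state):
            r, v = state[:3], state[3:]
            a = -GM * r / np.linalg.norm(r) ** 3   # add harmonics/drag terms here
            return np.concatenate([v, a])

        r0 = np.array([6_629_000.0, 0.0, 0.0])     # ~250 km altitude
        v0 = np.array([0.0, 7_756.0, 0.0])         # near-circular speed, m/s
        sol = solve_ivp(two_body, (0.0, 5400.0), np.concatenate([r0, v0]),
                        rtol=1e-10, atol=1e-9)
        print(sol.y[:3, -1])                       # position after ~one revolution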
