Sample records for fuel-efficiency software package

  1. Increase of efficiency and reliability of liquid fuel combustion in small-sized boilers

    NASA Astrophysics Data System (ADS)

    Roslyakov, P. V.; Proskurin, Yu V.; Ionkin, I. L.

    2017-11-01

    One of the ways to increase the efficiency of fuel use is to create highly efficient domestic energy equipment, in particular small-sized hot-water boilers for autonomous heating systems. Increasing boiler efficiency requires lowering the temperature of the flue gases leaving the boiler, which, in turn, can be achieved by installing additional heating surfaces. The purpose of this work was to determine the principal design solutions and to develop a draft design for a high-efficiency 3-MW hot-water boiler using crude oil as its main fuel. The high efficiency of the boiler is achieved through the use of an external remote economizer, which makes it possible to reduce the dimensions of the boiler, simplify the layout of equipment in a block-modular boiler house of limited size, and virtually eliminate low-temperature corrosion of the boiler heat-exchange surfaces. The article considers design variants of the hot-water boiler and the remote economizer, and preliminary design calculations of the remote economizer for various boiler layout schemes are performed in the Boiler Designer software package. Based on the results of these studies, a scheme with a three-pass boiler and a two-pass remote economizer was chosen. A design is proposed for a three-pass fire-tube hot-water boiler and an external economizer with an internal arrangement of the collectors, in which the economizer is located above the boiler in the block-modular boiler house with access provided for servicing both the remote economizer and the hot-water boiler. Its mass, dimensional, and design parameters are determined. Thermal, hydraulic, and aerodynamic calculations of the developed fire-tube boiler have been performed in the Boiler Designer software package. The boiler design was optimized to provide the required 94% efficiency for crude oil combustion. A description of the developed flue- and fire-tube hot-water boiler and the values of its main design, technical, and economic parameters are given.

  2. Role of Volatility in the Development of JP-8 Surrogates for Diesel Engine Application

    DTIC Science & Technology

    2014-01-01

    distillation curves of the surrogate fuels were calculated using the Aspen HYSYS [41] software package, and the Peng-Robinson model was chosen to...distillation curves for the surrogate fuels developed in this investigation, the accuracy of Aspen HYSYS software predictions was compared with...and SF3. The distillation curves calculated using Aspen HYSYS software for the five surrogate fuels of Table 1 are shown in Figure 7, along with the

  3. Regional demand forecasting and simulation model: user's manual. Task 4, final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parhizgari, A M

    1978-09-25

    The Department of Energy's Regional Demand Forecasting Model (RDFOR) is an econometric and simulation system designed to estimate annual fuel-sector-region specific consumption of energy for the US. Its purposes are to (1) provide the demand side of the Project Independence Evaluation System (PIES), (2) enhance our empirical insights into the structure of US energy demand, and (3) assist policymakers in their decisions on and formulations of various energy policies and/or scenarios. This report provides a self-contained user's manual for interpreting, utilizing, and implementing RDFOR simulation software packages. Chapters I and II present the theoretical structure and the simulation of RDFOR, respectively. Chapter III describes several potential scenarios which are (or have been) utilized in the RDFOR simulations. Chapter IV presents an overview of the complete software package utilized in simulation. Chapter V provides the detailed explanation and documentation of this package. The last chapter describes step-by-step implementation of the simulation package using the two scenarios detailed in Chapter III. The RDFOR model contains 14 fuels: gasoline, electricity, natural gas, distillate and residual fuels, liquid gases, jet fuel, coal, oil, petroleum products, asphalt, petroleum coke, metallurgical coal, and total fuels, spread over residential, commercial, industrial, and transportation sectors.

  4. Modeling hydraulic regenerative hybrid vehicles using AMESim and Matlab/Simulink

    NASA Astrophysics Data System (ADS)

    Lynn, Alfred; Smid, Edzko; Eshraghi, Moji; Caldwell, Niall; Woody, Dan

    2005-05-01

    This paper presents an overview of the simulation modeling of a hydraulic system with regenerative braking used to improve vehicle emissions and fuel economy. Two simulation software packages were used together to enhance the simulation capability for fuel economy results and development of the vehicle and hybrid control strategy. AMESim, a hydraulic simulation software package, modeled the complex hydraulic circuit and component hardware and was interlinked with a Matlab/Simulink model of the vehicle, engine, and the control strategy required to operate the vehicle and the hydraulic hybrid system through various North American and European drive cycles.

  5. Benchmark tests for a Formula SAE Student car prototyping

    NASA Astrophysics Data System (ADS)

    Mariasiu, Florin

    2011-12-01

    Aerodynamic characteristics of a vehicle are important elements in its design and construction. A low drag coefficient brings significant fuel savings and increased engine power efficiency. In vehicle design and development, dedicated CFD (Computational Fluid Dynamics) software packages are used to determine a vehicle's aerodynamic characteristics through computer simulation. However, the results obtained by this faster and cheaper method are usually validated by wind tunnel experiments, which are expensive and require complex testing equipment at relatively high cost. Therefore, the emergence and development of new low-cost testing methods to validate CFD simulation results would bring great economic benefits to the vehicle prototyping process. This paper presents the initial development process of a Formula SAE Student race-car prototype using CFD simulation and also presents a measurement system based on low-cost sensors through which the CFD simulation results were experimentally validated. The CFD software package used for simulation was SolidWorks with the FloXpress add-on, and the experimental measurement system was built using four FlexiForce piezoresistive force sensors.
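
    A quick way to see how such force measurements validate CFD drag predictions is the standard drag relation Cd = 2F / (rho v^2 A). The short Python sketch below back-calculates a drag coefficient from a measured axial force; the force, air density, speed, and frontal area values are illustrative assumptions, not data from the paper.

        # Back-calculate a drag coefficient from a measured axial force.
        # All numerical values below are illustrative assumptions.
        rho = 1.20        # air density, kg/m^3
        v = 22.0          # test air speed, m/s
        area = 1.1        # vehicle frontal area, m^2
        force = 190.0     # axial force reported by the load sensors, N

        cd = 2.0 * force / (rho * v ** 2 * area)   # Cd = 2F / (rho v^2 A)
        print(f"Estimated drag coefficient: {cd:.2f}")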

  6. "Do-it-yourself" software program calculates boiler efficiency

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1984-03-01

    An easy-to-use software package is described which runs on the IBM Personal Computer. The package calculates boiler efficiency, an important parameter of operating costs and equipment wellbeing. The program stores inputs and calculated results for 20 sets of boiler operating data, called cases. Cases can be displayed and modified on the CRT screen through multiple display pages or copied to a printer. All intermediate calculations are performed by this package. They include: steam enthalpy; water enthalpy; air humidity; gas, oil, coal, and wood heat capacity; and radiation losses.
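
    As an illustration of the kind of calculation such a program performs, the Python sketch below estimates boiler efficiency by the heat-loss (indirect) method: efficiency equals 100% minus the dry-flue-gas, moisture, and radiation losses. The loss formulas and all input values are simplified textbook assumptions, not the correlations used by the package described above.

        # Simplified heat-loss (indirect) boiler efficiency estimate.
        # Formulas and inputs are textbook-style assumptions, not the
        # correlations used by the package described above.
        flue_gas_temp = 220.0    # deg C, flue gas leaving the boiler
        ambient_temp = 25.0      # deg C, combustion air temperature
        flue_gas_mass = 18.0     # kg of flue gas per kg of fuel
        cp_flue_gas = 1.05       # kJ/(kg K), mean flue gas heat capacity
        moisture_mass = 1.1      # kg of water vapour per kg of fuel
        h_vapour = 2440.0        # kJ/kg, heat carried away by the vapour
        fuel_lhv = 42500.0       # kJ/kg, lower heating value of the fuel oil
        radiation_loss = 1.5     # %, assumed surface radiation/convection loss

        dry_gas_loss = 100.0 * flue_gas_mass * cp_flue_gas * (flue_gas_temp - ambient_temp) / fuel_lhv
        moisture_loss = 100.0 * moisture_mass * h_vapour / fuel_lhv
        efficiency = 100.0 - dry_gas_loss - moisture_loss - radiation_loss

        print(f"dry flue gas loss: {dry_gas_loss:.1f}%")
        print(f"moisture loss:     {moisture_loss:.1f}%")
        print(f"radiation loss:    {radiation_loss:.1f}%")
        print(f"boiler efficiency: ~{efficiency:.1f}%")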

  7. Design and Implementation of Scientific Software Components to Enable Multiscale Modeling: The Effective Fragment Potential (QM/EFP) Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gaenko, Alexander; Windus, Theresa L.; Sosonkina, Masha

    2012-10-19

    The design and development of scientific software components to provide an interface to the effective fragment potential (EFP) methods are reported. Multiscale modeling of physical and chemical phenomena demands the merging of software packages developed by research groups in significantly different fields. Componentization offers an efficient way to realize new high performance scientific methods by combining the best models available in different software packages without a need for package readaptation after the initial componentization is complete. The EFP method is an efficient electronic structure theory based model potential that is suitable for predictive modeling of intermolecular interactions in large molecular systems, such as liquids, proteins, atmospheric aerosols, and nanoparticles, with an accuracy that is comparable to that of correlated ab initio methods. The developed components make the EFP functionality accessible for any scientific component-aware software package. The performance of the component is demonstrated on a protein interaction model, and its accuracy is compared with results obtained with coupled cluster methods.

  8. PLATSIM: A Simulation and Analysis Package for Large-Order Flexible Systems. Version 2.0

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.; Kenny, Sean P.; Giesy, Daniel P.

    1997-01-01

    The software package PLATSIM provides efficient time and frequency domain analysis of large-order generic space platforms. PLATSIM can perform open-loop analysis or closed-loop analysis with linear or nonlinear control system models. PLATSIM exploits the particular form of sparsity of the plant matrices for very efficient linear and nonlinear time domain analysis, as well as frequency domain analysis. A new, original algorithm for the efficient computation of open-loop and closed-loop frequency response functions for large-order systems has been developed and is implemented within the package. Furthermore, a novel and efficient jitter analysis routine which determines jitter and stability values from time simulations in a very efficient manner has been developed and is incorporated in the PLATSIM package. In the time domain analysis, PLATSIM simulates the response of the space platform to disturbances and calculates the jitter and stability values from the response time histories. In the frequency domain analysis, PLATSIM calculates frequency response function matrices and provides the corresponding Bode plots. The PLATSIM software package is written in MATLAB script language. A graphical user interface is developed in the package to provide convenient access to its various features.
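
    As a rough illustration of the jitter computation described above, the Python sketch below evaluates a peak-to-peak "jitter" value over a sliding time window from a simulated pointing time history. The window lengths and the definition of jitter as windowed peak-to-peak motion are assumptions made for illustration; PLATSIM's actual jitter and stability definitions may differ.

        import numpy as np

        # Simulated pointing time history: slow drift plus a flexible-mode ripple.
        dt = 0.01                                  # s, simulation time step
        t = np.arange(0.0, 100.0, dt)
        theta = 2e-3 * np.sin(2 * np.pi * 0.01 * t) + 5e-5 * np.sin(2 * np.pi * 3.0 * t)

        def windowed_peak_to_peak(signal, window_s, dt):
            """Worst peak-to-peak excursion inside any window of the given duration."""
            n = int(round(window_s / dt))
            worst = 0.0
            for start in range(len(signal) - n):
                segment = signal[start:start + n]
                worst = max(worst, segment.max() - segment.min())
            return worst

        print("jitter over 1 s windows:    ", windowed_peak_to_peak(theta, 1.0, dt))
        print("stability over 60 s windows:", windowed_peak_to_peak(theta, 60.0, dt))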

  9. Nipype: a flexible, lightweight and extensible neuroimaging data processing framework in python.

    PubMed

    Gorgolewski, Krzysztof; Burns, Christopher D; Madison, Cindee; Clark, Dav; Halchenko, Yaroslav O; Waskom, Michael L; Ghosh, Satrajit S

    2011-01-01

    Current neuroimaging software offer users an incredible opportunity to analyze their data in different ways, with different underlying assumptions. Several sophisticated software packages (e.g., AFNI, BrainVoyager, FSL, FreeSurfer, Nipy, R, SPM) are used to process and analyze large and often diverse (highly multi-dimensional) data. However, this heterogeneous collection of specialized applications creates several issues that hinder replicable, efficient, and optimal use of neuroimaging analysis approaches: (1) No uniform access to neuroimaging analysis software and usage information; (2) No framework for comparative algorithm development and dissemination; (3) Personnel turnover in laboratories often limits methodological continuity and training new personnel takes time; (4) Neuroimaging software packages do not address computational efficiency; and (5) Methods sections in journal articles are inadequate for reproducing results. To address these issues, we present Nipype (Neuroimaging in Python: Pipelines and Interfaces; http://nipy.org/nipype), an open-source, community-developed, software package, and scriptable library. Nipype solves the issues by providing Interfaces to existing neuroimaging software with uniform usage semantics and by facilitating interaction between these packages using Workflows. Nipype provides an environment that encourages interactive exploration of algorithms, eases the design of Workflows within and between packages, allows rapid comparative development of algorithms and reduces the learning curve necessary to use different packages. Nipype supports both local and remote execution on multi-core machines and clusters, without additional scripting. Nipype is Berkeley Software Distribution licensed, allowing anyone unrestricted usage. An open, community-driven development philosophy allows the software to quickly adapt and address the varied needs of the evolving neuroimaging community, especially in the context of increasing demand for reproducible research.

  10. Nipype: A Flexible, Lightweight and Extensible Neuroimaging Data Processing Framework in Python

    PubMed Central

    Gorgolewski, Krzysztof; Burns, Christopher D.; Madison, Cindee; Clark, Dav; Halchenko, Yaroslav O.; Waskom, Michael L.; Ghosh, Satrajit S.

    2011-01-01

    Current neuroimaging software offer users an incredible opportunity to analyze their data in different ways, with different underlying assumptions. Several sophisticated software packages (e.g., AFNI, BrainVoyager, FSL, FreeSurfer, Nipy, R, SPM) are used to process and analyze large and often diverse (highly multi-dimensional) data. However, this heterogeneous collection of specialized applications creates several issues that hinder replicable, efficient, and optimal use of neuroimaging analysis approaches: (1) No uniform access to neuroimaging analysis software and usage information; (2) No framework for comparative algorithm development and dissemination; (3) Personnel turnover in laboratories often limits methodological continuity and training new personnel takes time; (4) Neuroimaging software packages do not address computational efficiency; and (5) Methods sections in journal articles are inadequate for reproducing results. To address these issues, we present Nipype (Neuroimaging in Python: Pipelines and Interfaces; http://nipy.org/nipype), an open-source, community-developed, software package, and scriptable library. Nipype solves the issues by providing Interfaces to existing neuroimaging software with uniform usage semantics and by facilitating interaction between these packages using Workflows. Nipype provides an environment that encourages interactive exploration of algorithms, eases the design of Workflows within and between packages, allows rapid comparative development of algorithms and reduces the learning curve necessary to use different packages. Nipype supports both local and remote execution on multi-core machines and clusters, without additional scripting. Nipype is Berkeley Software Distribution licensed, allowing anyone unrestricted usage. An open, community-driven development philosophy allows the software to quickly adapt and address the varied needs of the evolving neuroimaging community, especially in the context of increasing demand for reproducible research. PMID:21897815

  11. Los Alamos National Security, LLC Request for Information on how industry may partner with the Laboratory on KIVA software.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mcdonald, Kathleen Herrera

    2016-02-29

    KIVA is a family of Fortran-based computational fluid dynamics software developed by LANL. The software predicts complex fuel and air flows as well as ignition, combustion, and pollutant-formation processes in engines. The KIVA models have been used to understand combustion chemistry processes, such as auto-ignition of fuels, and to optimize diesel engines for high efficiency and low emissions. Fuel economy is heavily dependent upon engine efficiency, which in turn depends to a large degree on how fuel is burned within the cylinders of the engine. Higher in-cylinder pressures and temperatures lead to increased fuel economy, but they also create more difficulty in controlling the combustion process. Poorly controlled and incomplete combustion can cause higher levels of emissions and lower engine efficiencies.

  12. G-Guidance Interface Design for Small Body Mission Simulation

    NASA Technical Reports Server (NTRS)

    Acikmese, Behcet; Carson, John; Phan, Linh

    2008-01-01

    The G-Guidance software implements a guidance and control (G and C) algorithm for small-body, autonomous proximity operations, developed under the Small Body GN and C task at JPL. The software is written in Matlab and interfaces with G-OPT, a JPL-developed optimization package written in C that provides G-Guidance with guaranteed convergence to a solution in a finite computation time with a prescribed accuracy. The resulting program is computationally efficient and is a prototype of an onboard, real-time algorithm for autonomous guidance and control. Two thruster firing schemes are available in G-Guidance, allowing tailoring of the software for specific mission maneuvers. For example, descent, landing, or rendezvous benefit from a thruster firing at the maneuver termination to mitigate velocity errors. Conversely, ascent or separation maneuvers benefit from an immediate firing to avoid potential drift toward a second body. The guidance portion of this software explicitly enforces user-defined control constraints and thruster silence times while minimizing total fuel usage. This program is currently specialized to small-body proximity operations, but the underlying method can be generalized to other applications.

  13. Lean Development with the Morpheus Simulation Software

    NASA Technical Reports Server (NTRS)

    Brogley, Aaron C.

    2013-01-01

    The Morpheus project is an autonomous robotic testbed currently in development at NASA's Johnson Space Center (JSC) with support from other centers. Its primary objectives are to test new 'green' fuel propulsion systems and to demonstrate the capability of the Autonomous Lander Hazard Avoidance Technology (ALHAT) sensor, provided by the Jet Propulsion Laboratory (JPL) on a lunar landing trajectory. If successful, these technologies and lessons learned from the Morpheus testing cycle may be incorporated into a landing descent vehicle used on the moon, an asteroid, or Mars. In an effort to reduce development costs and cycle time, the project employs lean development engineering practices in its development of flight and simulation software. The Morpheus simulation makes use of existing software packages where possible to reduce the development time. The development and testing of flight software occurs primarily through the frequent test operation of the vehicle and incrementally increasing the scope of the test. With rapid development cycles, risk of loss of the vehicle and loss of the mission are possible, but efficient progress in development would not be possible without that risk.

  14. Adapting iterative algorithms for solving large sparse linear systems for efficient use on the CDC CYBER 205

    NASA Technical Reports Server (NTRS)

    Kincaid, D. R.; Young, D. M.

    1984-01-01

    Adapting and designing mathematical software to achieve optimum performance on the CYBER 205 is discussed. Comments and observations are made in light of recent work done on modifying the ITPACK software package and on writing new software for vector supercomputers. The goal was to develop very efficient vector algorithms and software for solving large sparse linear systems using iterative methods.
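
    As a minimal illustration of the class of methods involved, the sketch below solves a sparse 1-D Laplacian system with a diagonally preconditioned conjugate gradient iteration, in the spirit of ITPACK's Jacobi-CG option; the vectorized sparse matrix-vector products are the operations that map well onto vector hardware. This is a generic SciPy example, not ITPACK code.

        import numpy as np
        import scipy.sparse as sp
        from scipy.sparse.linalg import cg

        # Sparse 1-D Laplacian test system A x = b (generic example, not ITPACK code).
        n = 2000
        main = 2.0 * np.ones(n)
        off = -1.0 * np.ones(n - 1)
        A = sp.diags([off, main, off], offsets=[-1, 0, 1], format="csr")
        b = np.ones(n)

        # Jacobi (diagonal) preconditioner.
        M = sp.diags(1.0 / A.diagonal())

        iterations = []
        x, info = cg(A, b, M=M, callback=lambda xk: iterations.append(1))
        residual = np.linalg.norm(b - A @ x) / np.linalg.norm(b)
        print(f"converged: {info == 0}, iterations: {len(iterations)}, relative residual: {residual:.2e}")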

  15. EXPERIMENTAL EVALUATION OF FUEL OIL ADDITIVES FOR REDUCING EMISSIONS AND INCREASING EFFICIENCY OF BOILERS

    EPA Science Inventory

    The report gives results of an evaluation of the effectiveness of combustion-type fuel oil additives to reduce emissions and increase efficiency in a 50-bhp (500-kW) commercial oil-fired packaged boiler. Most additive evaluation runs were made during continuous firing, constant-l...

  16. A physical and economic model of the nuclear fuel cycle

    NASA Astrophysics Data System (ADS)

    Schneider, Erich Alfred

    A model of the nuclear fuel cycle that is suitable for use in strategic planning and economic forecasting is presented. The model, to be made available as a stand-alone software package, requires only a small set of fuel cycle and reactor specific input parameters. Critical design criteria include ease of use by nonspecialists, suppression of errors to within a range dictated by unit cost uncertainties, and limitation of runtime to under one minute on a typical desktop computer. Collision probability approximations to the neutron transport equation that lead to a computationally efficient decoupling of the spatial and energy variables are presented and implemented. The energy dependent flux, governed by coupled integral equations, is treated by multigroup or continuous thermalization methods. The model's output includes a comprehensive nuclear materials flowchart that begins with ore requirements, calculates the buildup of 24 actinides as well as fission products, and concludes with spent fuel or reprocessed material composition. The costs, direct and hidden, of the fuel cycle under study are also computed. In addition to direct disposal and plutonium recycling strategies in current use, the model addresses hypothetical cycles. These include cycles chosen for minor actinide burning and for their low weapons-usable content.
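
    The actinide buildup step described above amounts to solving coupled linear decay/transmutation (Bateman) equations, dN/dt = A N. The Python sketch below solves a toy three-nuclide chain with a matrix exponential; the chain, cross sections, and flux are invented for illustration and are unrelated to the model's actual 24-actinide treatment.

        import numpy as np
        from scipy.linalg import expm

        # Toy three-nuclide chain N1 -> N2 -> N3 under a constant neutron flux.
        # All rates are invented for illustration only.
        phi = 1.0e14          # neutron flux, n/cm^2/s
        sigma_1 = 5.0e-24     # capture cross section of nuclide 1, cm^2
        sigma_2 = 2.0e-24     # capture cross section of nuclide 2, cm^2
        lam_2 = 1.0e-9        # decay constant of nuclide 2, 1/s

        # dN/dt = A N, with each nuclide fed by the one before it.
        A = np.array([
            [-sigma_1 * phi,                      0.0, 0.0],
            [ sigma_1 * phi, -(sigma_2 * phi + lam_2), 0.0],
            [           0.0,    sigma_2 * phi + lam_2, 0.0],
        ])

        N0 = np.array([1.0e24, 0.0, 0.0])   # initial atom densities, atoms/cm^3
        t = 3.0e7                           # roughly one year of irradiation, s
        N = expm(A * t) @ N0
        print("end-of-cycle inventories:", N)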

  17. Reference Gene Validation for RT-qPCR, a Note on Different Available Software Packages

    PubMed Central

    De Spiegelaere, Ward; Dern-Wieloch, Jutta; Weigel, Roswitha; Schumacher, Valérie; Schorle, Hubert; Nettersheim, Daniel; Bergmann, Martin; Brehm, Ralph; Kliesch, Sabine; Vandekerckhove, Linos; Fink, Cornelia

    2015-01-01

    Background: An appropriate normalization strategy is crucial for data analysis from real time reverse transcription polymerase chain reactions (RT-qPCR). It is widely supported to identify and validate stable reference genes, since no single biological gene is stably expressed between cell types or within cells under different conditions. Different algorithms exist to validate optimal reference genes for normalization. Applying human cells, we here compare the three main methods to the online available RefFinder tool that integrates these algorithms along with R-based software packages which include the NormFinder and GeNorm algorithms. Results: 14 candidate reference genes were assessed by RT-qPCR in two sample sets, i.e. a set of samples of human testicular tissue containing carcinoma in situ (CIS), and a set of samples from the human adult Sertoli cell line (FS1) either cultured alone or in co-culture with the seminoma like cell line (TCam-2) or with equine bone marrow derived mesenchymal stem cells (eBM-MSC). Expression stabilities of the reference genes were evaluated using geNorm, NormFinder, and BestKeeper. Similar results were obtained by the three approaches for the most and least stably expressed genes. The R-based packages NormqPCR, SLqPCR and the NormFinder for R script gave identical gene rankings. Interestingly, different outputs were obtained between the original software packages and the RefFinder tool, which is based on raw Cq values for input. When the raw data were reanalysed assuming 100% efficiency for all genes, then the outputs of the original software packages were similar to the RefFinder software, indicating that RefFinder outputs may be biased because PCR efficiencies are not taken into account. Conclusions: This report shows that assay efficiency is an important parameter for reference gene validation. New software tools that incorporate these algorithms should be carefully validated prior to use. PMID:25825906
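
    The point about PCR efficiency can be made numerically: the fold difference implied by a quantification-cycle difference is (1 + E)^dCq, which equals 2^dCq only when the efficiency E is 100%. The sketch below shows how an assumed efficiency changes the fold difference between two samples; the Cq values and efficiencies are invented for illustration.

        # Fold difference implied by a Cq difference at different PCR efficiencies.
        # The Cq values and efficiencies below are invented illustration numbers.
        cq_control = 24.0
        cq_treated = 27.0
        delta_cq = cq_treated - cq_control

        for efficiency in (1.00, 0.90, 0.80):          # 100%, 90%, 80%
            fold = (1.0 + efficiency) ** delta_cq      # relative starting-quantity ratio
            print(f"E = {efficiency:.0%}: implied fold difference = {fold:.2f}")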

  18. Reference gene validation for RT-qPCR, a note on different available software packages.

    PubMed

    De Spiegelaere, Ward; Dern-Wieloch, Jutta; Weigel, Roswitha; Schumacher, Valérie; Schorle, Hubert; Nettersheim, Daniel; Bergmann, Martin; Brehm, Ralph; Kliesch, Sabine; Vandekerckhove, Linos; Fink, Cornelia

    2015-01-01

    An appropriate normalization strategy is crucial for data analysis from real time reverse transcription polymerase chain reactions (RT-qPCR). It is widely supported to identify and validate stable reference genes, since no single biological gene is stably expressed between cell types or within cells under different conditions. Different algorithms exist to validate optimal reference genes for normalization. Applying human cells, we here compare the three main methods to the online available RefFinder tool that integrates these algorithms along with R-based software packages which include the NormFinder and GeNorm algorithms. 14 candidate reference genes were assessed by RT-qPCR in two sample sets, i.e. a set of samples of human testicular tissue containing carcinoma in situ (CIS), and a set of samples from the human adult Sertoli cell line (FS1) either cultured alone or in co-culture with the seminoma like cell line (TCam-2) or with equine bone marrow derived mesenchymal stem cells (eBM-MSC). Expression stabilities of the reference genes were evaluated using geNorm, NormFinder, and BestKeeper. Similar results were obtained by the three approaches for the most and least stably expressed genes. The R-based packages NormqPCR, SLqPCR and the NormFinder for R script gave identical gene rankings. Interestingly, different outputs were obtained between the original software packages and the RefFinder tool, which is based on raw Cq values for input. When the raw data were reanalysed assuming 100% efficiency for all genes, then the outputs of the original software packages were similar to the RefFinder software, indicating that RefFinder outputs may be biased because PCR efficiencies are not taken into account. This report shows that assay efficiency is an important parameter for reference gene validation. New software tools that incorporate these algorithms should be carefully validated prior to use.

  19. Detection And Mapping (DAM) package. Volume 4A: Software System Manual, part 1

    NASA Technical Reports Server (NTRS)

    Schlosser, E. H.

    1980-01-01

    The package is an integrated set of manual procedures, computer programs, and graphic devices designed for efficient production of precisely registered and formatted maps from digital LANDSAT multispectral scanner (MSS) data. The software can be readily implemented on any Univac 1100 series computer with standard peripheral equipment. This version of the software includes predefined spectral limits for use in classifying and mapping surface water for LANDSAT-1, LANDSAT-2, and LANDSAT-3. Tape formats supported include X, AM, and PM.
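
    The predefined spectral limits amount to a simple band-threshold classifier: a pixel is labelled water only when each of its MSS band values falls inside the limits for that band. A minimal numpy sketch of that idea follows; the band values and limits are invented and are not the DAM package's actual LANDSAT thresholds.

        import numpy as np

        # Toy 4-band scene (rows x cols x bands); values and limits are invented,
        # not the DAM package's actual LANDSAT MSS water limits.
        scene = np.random.randint(0, 128, size=(100, 100, 4))
        lower = np.array([ 0,  0,  0,  0])    # per-band lower spectral limits
        upper = np.array([30, 25, 20, 15])    # per-band upper spectral limits

        # A pixel is classified as water only if every band is within its limits.
        within = (scene >= lower) & (scene <= upper)   # broadcasts over the band axis
        water_mask = within.all(axis=2)
        print(f"water pixels: {water_mask.sum()} of {water_mask.size}")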

  20. DMRfinder: efficiently identifying differentially methylated regions from MethylC-seq data.

    PubMed

    Gaspar, John M; Hart, Ronald P

    2017-11-29

    DNA methylation is an epigenetic modification that is studied at a single-base resolution with bisulfite treatment followed by high-throughput sequencing. After alignment of the sequence reads to a reference genome, methylation counts are analyzed to determine genomic regions that are differentially methylated between two or more biological conditions. Even though a variety of software packages is available for different aspects of the bioinformatics analysis, they often produce results that are biased or impose excessive computational demands. DMRfinder is a novel computational pipeline that identifies differentially methylated regions efficiently. Following alignment, DMRfinder extracts methylation counts and performs a modified single-linkage clustering of methylation sites into genomic regions. It then compares methylation levels using beta-binomial hierarchical modeling and Wald tests. Among its innovative attributes are the analyses of novel methylation sites and methylation linkage, as well as the simultaneous statistical analysis of multiple sample groups. To demonstrate its efficiency, DMRfinder is benchmarked against other computational approaches using a large published dataset. Contrasting two replicates of the same sample yielded minimal genomic regions with DMRfinder, whereas two alternative software packages reported a substantial number of false positives. Further analyses of biological samples revealed fundamental differences between DMRfinder and another software package, despite the fact that they utilize the same underlying statistical basis. For each step, DMRfinder completed the analysis in a fraction of the time required by other software. Among the computational approaches for identifying differentially methylated regions from high-throughput bisulfite sequencing datasets, DMRfinder is the first that integrates all the post-alignment steps in a single package. Compared to other software, DMRfinder is extremely efficient and unbiased in this process. DMRfinder is free and open-source software, available on GitHub (github.com/jsh58/DMRfinder); it is written in Python and R, and is supported on Linux.
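
    The region-building step can be pictured with a few lines of Python: sorted methylation-site positions are merged into the same candidate region whenever consecutive sites lie within a maximum distance of each other. This is a simplified stand-in for DMRfinder's modified single-linkage clustering; the distance cutoff and minimum-site rule below are assumptions, not the package's defaults.

        # Group sorted methylation-site positions into candidate regions whenever
        # consecutive sites are within max_gap bases. Simplified stand-in for
        # DMRfinder's clustering; parameter values are assumptions.
        def cluster_sites(positions, max_gap=100, min_sites=3):
            positions = sorted(positions)
            regions, current = [], [positions[0]]
            for pos in positions[1:]:
                if pos - current[-1] <= max_gap:
                    current.append(pos)
                else:
                    if len(current) >= min_sites:
                        regions.append((current[0], current[-1], len(current)))
                    current = [pos]
            if len(current) >= min_sites:
                regions.append((current[0], current[-1], len(current)))
            return regions

        sites = [1000, 1040, 1090, 1130, 5000, 5300, 5320, 5390, 5410]
        print(cluster_sites(sites))   # [(1000, 1130, 4), (5300, 5410, 4)]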

  1. Applications of multigrid software in the atmospheric sciences

    NASA Technical Reports Server (NTRS)

    Adams, J.; Garcia, R.; Gross, B.; Hack, J.; Haidvogel, D.; Pizzo, V.

    1992-01-01

    Elliptic partial differential equations from different areas in the atmospheric sciences are efficiently and easily solved utilizing the multigrid software package named MUDPACK. It is demonstrated that the multigrid method is more efficient than other commonly employed techniques, such as Gaussian elimination and fixed-grid relaxation. The efficiency relative to other techniques, both in terms of storage requirement and computational time, increases quickly with grid size.

  2. PLATSIM: An efficient linear simulation and analysis package for large-order flexible systems

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman; Kenny, Sean P.; Giesy, Daniel P.

    1995-01-01

    PLATSIM is a software package designed to provide efficient time and frequency domain analysis of large-order generic space platforms implemented with any linear time-invariant control system. Time domain analysis provides simulations of the overall spacecraft response levels due to either onboard or external disturbances. The time domain results can then be processed by the jitter analysis module to assess the spacecraft's pointing performance in a computationally efficient manner. The resulting jitter analysis algorithms have produced an increase in speed of several orders of magnitude over the brute force approach of sweeping minima and maxima. Frequency domain analysis produces frequency response functions for uncontrolled and controlled platform configurations. The latter represents an enabling technology for large-order flexible systems. PLATSIM uses a sparse matrix formulation for the spacecraft dynamics model which makes both the time and frequency domain operations quite efficient, particularly when a large number of modes are required to capture the true dynamics of the spacecraft. The package is written in MATLAB script language. A graphical user interface (GUI) is included in the PLATSIM software package. This GUI uses MATLAB's Handle graphics to provide a convenient way for setting simulation and analysis parameters.

  3. ATLAS software configuration and build tool optimisation

    NASA Astrophysics Data System (ADS)

    Rybkin, Grigory; Atlas Collaboration

    2014-06-01

    ATLAS software code base is over 6 million lines organised in about 2000 packages. It makes use of some 100 external software packages, is developed by more than 400 developers and used by more than 2500 physicists from over 200 universities and laboratories in 6 continents. To meet the challenge of configuration and building of this software, the Configuration Management Tool (CMT) is used. CMT expects each package to describe its build targets, build and environment setup parameters, and dependencies on other packages in a text file called requirements, and each project (group of packages) to describe its policies and dependencies on other projects in a text project file. Based on the effective set of configuration parameters read from the requirements files of dependent packages and project files, CMT commands build the packages, generate the environment for their use, or query the packages. The main focus was on build time performance, which was optimised through several approaches: reduction of the number of reads of requirements files, which are now read once per package by a CMT build command that generates cached requirements files for subsequent CMT build commands; introduction of more fine-grained build parallelism at package task level, i.e., dependent applications and libraries are compiled in parallel; code optimisation of CMT commands used for build; and introduction of package-level build parallelism, i.e., parallelising the build of independent packages. By default, CMT launches NUMBER-OF-PROCESSORS build commands in parallel. The other focus was on CMT commands optimisation in general, which made them approximately 2 times faster. CMT can generate a cached requirements file for the environment setup command, which is especially useful for deployment on distributed file systems like AFS or CERN VMFS. The use of parallelism, caching and code optimisation reduced software build time and environment setup time several-fold, increased the efficiency of multi-core computing resources utilisation, and considerably improved software developer and user experience.

  4. Evaluation of copy number variation detection for a SNP array platform

    PubMed Central

    2014-01-01

    Background: Copy Number Variations (CNVs) are usually inferred from Single Nucleotide Polymorphism (SNP) arrays by use of software packages based on given algorithms. However, there is no clear understanding of the performance of these software packages; it is therefore difficult to select one or several software packages for CNV detection based on the SNP array platform. We selected four publicly available software packages designed for CNV calling from an Affymetrix SNP array, including Birdsuite, dChip, Genotyping Console (GTC) and PennCNV. The publicly available dataset generated by Array-based Comparative Genomic Hybridization (CGH), with a resolution of 24 million probes per sample, was considered to be the “gold standard”. The success rate, average stability rate, sensitivity, consistency and reproducibility of these four software packages were assessed against this CGH-based “gold standard”. Specifically, we also compared the efficiency of detecting CNVs simultaneously by two, three and all of the software packages with that by a single software package. Results: Simply from the quantity of the detected CNVs, Birdsuite detected the most while GTC detected the least. We found that Birdsuite and dChip had obvious detection bias, and GTC seemed to be inferior because of the small number of CNVs it detected. Thereafter we investigated the detection consistency between each software package and the remaining three. We found that the consistency of dChip was the lowest while that of GTC was the highest. Compared with the CNV detection results of CGH, in the matching group GTC called the most matching CNVs, and PennCNV-Affy ranked second. In the non-overlapping group, GTC called the fewest CNVs. With regard to the reproducibility of CNV calling, larger CNVs were usually replicated better. PennCNV-Affy shows the best consistency while Birdsuite shows the poorest. Conclusion: We found that PennCNV outperformed the other three packages in the sensitivity and specificity of CNV calling. Obviously, each calling method had its own limitations and advantages for different data analyses. Therefore, the optimal calling methods might be identified by using multiple algorithms to evaluate the concordance and discordance of SNP array-based CNV calling. PMID:24555668

  5. Motofit - integrating neutron reflectometry acquisition, reduction and analysis into one, easy to use, package

    NASA Astrophysics Data System (ADS)

    Nelson, Andrew

    2010-11-01

    The efficient use of complex neutron scattering instruments is often hindered by the complex nature of their operating software. This complexity exists at each experimental step: data acquisition, reduction and analysis, with each step being as important as the previous. For example, whilst command line interfaces are powerful at automated acquisition they often reduce accessibility by novice users and sometimes reduce the efficiency for advanced users. One solution to this is the development of a graphical user interface which allows the user to operate the instrument by a simple and intuitive "push button" approach. This approach was taken by the Motofit software package for analysis of multiple contrast reflectometry data. Here we describe the extension of this package to cover the data acquisition and reduction steps for the Platypus time-of-flight neutron reflectometer. Consequently, the complete operation of an instrument is integrated into a single, easy to use, program, leading to efficient instrument usage.

  6. StreamThermal: A software package for calculating thermal metrics from stream temperature data

    USGS Publications Warehouse

    Tsang, Yin-Phan; Infante, Dana M.; Stewart, Jana S.; Wang, Lizhu; Tingly, Ralph; Thornbrugh, Darren; Cooper, Arthur; Wesley, Daniel

    2016-01-01

    Improved quality and availability of continuous stream temperature data allow natural resource managers, particularly in fisheries, to understand associations between different characteristics of stream thermal regimes and stream fishes. However, there is no convenient tool to efficiently characterize multiple metrics reflecting stream thermal regimes with the increasing amount of data. This article describes a software program packaged as a library in R to facilitate this process. With this freely available package, users will be able to quickly summarize metrics that describe five categories of stream thermal regimes: magnitude, variability, frequency, timing, and rate of change. The installation and usage instructions for this package, the definitions of the calculated thermal metrics, and the output format of the package are described, along with an application showing the utility of multiple metrics. We believe this package can be widely utilized by interested stakeholders and greatly assist further studies in fisheries.
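
    For readers without R, the flavour of such thermal metrics is easy to reproduce with pandas: from a continuous temperature series one can summarize magnitude (e.g. a monthly mean), variability (daily range), frequency (days above a threshold), and rate of change. The sketch below is a loose illustration on synthetic data, not a port of StreamThermal's metric definitions.

        import numpy as np
        import pandas as pd

        # Synthetic hourly stream temperature for one year (illustration only).
        idx = pd.date_range("2015-01-01", "2015-12-31 23:00", freq="h")
        doy = idx.dayofyear.values
        temp = 12 + 8 * np.sin(2 * np.pi * (doy - 110) / 365) + np.random.normal(0, 0.5, len(idx))
        series = pd.Series(temp, index=idx)

        daily_mean = series.resample("D").mean()
        daily_range = series.resample("D").max() - series.resample("D").min()

        print("July mean (magnitude):          ", round(daily_mean.loc["2015-07"].mean(), 2))
        print("mean daily range (variability): ", round(daily_range.mean(), 2))
        print("days above 18 C (frequency):    ", int((daily_mean > 18).sum()))
        print("max day-to-day change (rate):   ", round(daily_mean.diff().abs().max(), 2))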

  7. Optimization of polymer electrolyte membrane fuel cell flow channels using a genetic algorithm

    NASA Astrophysics Data System (ADS)

    Catlin, Glenn; Advani, Suresh G.; Prasad, Ajay K.

    The design of the flow channels in PEM fuel cells directly impacts the transport of reactant gases to the electrodes and affects cell performance. This paper presents results from a study to optimize the geometry of the flow channels in a PEM fuel cell. The optimization process implements a genetic algorithm to rapidly converge on the channel geometry that provides the highest net power output from the cell. In addition, this work implements a method for the automatic generation of parameterized channel domains that are evaluated for performance using a commercial computational fluid dynamics package from ANSYS. The software package includes GAMBIT as the solid modeling and meshing software, the solver FLUENT, and a PEMFC Add-on Module capable of modeling the relevant physical and electrochemical mechanisms that describe PEM fuel cell operation. The result of the optimization process is a set of optimal channel geometry values for the single-serpentine channel configuration. The performance of the optimal geometry is contrasted with a sub-optimal one by comparing contour plots of current density, oxygen and hydrogen concentration. In addition, the role of convective bypass in bringing fresh reactant to the catalyst layer is examined in detail. The convergence to the optimal geometry is confirmed by a bracketing study which compares the performance of the best individual to those of its neighbors with adjacent parameter values.
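
    To make the optimization loop concrete, here is a minimal genetic algorithm over two channel-geometry parameters (width and depth) that maximizes a stand-in objective. In the actual study each candidate geometry is evaluated with a full GAMBIT/FLUENT PEMFC simulation; the parameter ranges and analytic fitness function below are invented purely for illustration.

        import random

        # Stand-in objective: pretend net power peaks near width 1.0 mm, depth 1.2 mm.
        # The real study evaluates each geometry with a FLUENT PEMFC simulation.
        def net_power(width_mm, depth_mm):
            return 50.0 - 12.0 * (width_mm - 1.0) ** 2 - 8.0 * (depth_mm - 1.2) ** 2

        BOUNDS = {"width": (0.5, 2.0), "depth": (0.5, 2.0)}   # mm, assumed ranges

        def random_individual():
            return [random.uniform(*BOUNDS["width"]), random.uniform(*BOUNDS["depth"])]

        def crossover(a, b):
            return [random.choice(genes) for genes in zip(a, b)]

        def mutate(ind, rate=0.2):
            return [g + random.gauss(0.0, 0.1) if random.random() < rate else g for g in ind]

        random.seed(1)
        population = [random_individual() for _ in range(20)]
        for generation in range(40):
            population.sort(key=lambda ind: net_power(*ind), reverse=True)
            parents = population[:10]                          # truncation selection
            children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                        for _ in range(10)]
            population = parents + children

        best = max(population, key=lambda ind: net_power(*ind))
        print(f"best geometry: width {best[0]:.2f} mm, depth {best[1]:.2f} mm, "
              f"net power {net_power(*best):.2f} (arbitrary units)")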

  8. Multi-canister overpack project -- verification and validation, MCNP 4A

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldmann, L.H.

    This supporting document contains the software verification and validation (V and V) package used for Phase 2 design of the Spent Nuclear Fuel Multi-Canister Overpack. V and V packages for both ANSYS and MCNP are included. Description of Verification Run(s): This software requires that it be compiled specifically for the machine it is to be used on. Therefore to facilitate ease in the verification process the software automatically runs 25 sample problems to ensure proper installation and compilation. Once the runs are completed the software checks for verification by performing a file comparison on the new output file and the old output file. Any differences between any of the files will cause a verification error. Due to the manner in which the verification is completed a verification error does not necessarily indicate a problem. This indicates that a closer look at the output files is needed to determine the cause of the error.
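
    The verification step described, comparing freshly generated output files against stored reference outputs, is easy to sketch in Python; the directory names and file pattern below are hypothetical placeholders, not part of the MCNP installation package.

        import difflib
        from pathlib import Path

        # Compare each new output file with its stored reference copy and report
        # any differences. Directory names and the file pattern are hypothetical.
        new_dir = Path("outputs_new")
        ref_dir = Path("outputs_reference")

        for ref_file in sorted(ref_dir.glob("*.out")):
            new_file = new_dir / ref_file.name
            diff = list(difflib.unified_diff(
                ref_file.read_text().splitlines(),
                new_file.read_text().splitlines(),
                fromfile=str(ref_file), tofile=str(new_file), lineterm=""))
            status = "MATCH" if not diff else f"DIFFERS ({len(diff)} diff lines)"
            print(f"{ref_file.name}: {status}")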

  9. Study of connected system of automatic control of load and operation efficiency of a steam boiler with extremal controller on a simulation model

    NASA Astrophysics Data System (ADS)

    Sabanin, V. R.; Starostin, A. A.; Repin, A. I.; Popov, A. I.

    2017-02-01

    The problems of increasing the operating efficiency of steam boilers are considered. To maintain optimum fuel combustion modes, it is proposed to use an extremal controller (EC) that determines the airflow rate at which the boiler, while generating the desired amount of heat, consumes a minimum amount of fuel. The EC passes the determined airflow rate value to the airflow rate controller (ARC). Test results are presented for a dynamic nonlinear numerical simulation model of a steam boiler with a connected system of automatic control of load and combustion efficiency using the EC. The model is created in the Simulink modeling package of MATLAB software and can be used to optimize the combustion modes. Based on the modeling results, it is concluded that it is possible in principle to control the boiler load and simultaneously have the EC optimize the combustion modes as the fuel heating value, boiler characteristics, and operating mode change. It is shown that the operation efficiency of steam boilers can be controlled automatically using the EC without applying standard flue gas analyzers. The article considers the dynamic numerical simulation model of the steam boiler with the control schemes for fuel consumption, airflow rate, steam pressure, and the EC; the purpose of using the EC in a scheme with linear controllers and the requirements for the quality of its operation; the results of operating the boiler control schemes without the EC, with an estimate of the influence of the roughness of thermal mode maps on the nature of the static and dynamic coupling of the fuel consumption and airflow rate control loops; the phase trajectories and diagrams of transient processes occurring in the control scheme with the EC under stepwise changes in fuel quality and boiler characteristics; and an analysis of the modeling results and prospects for using the EC in boiler control schemes.
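
    The extremal controller's task, finding the airflow rate at which fuel consumption is minimal for the demanded heat load, can be illustrated with a simple perturb-and-observe search. The fuel-versus-airflow curve and step sizes below are invented; the real EC operates on the boiler model described in the article.

        # Perturb-and-observe search for the airflow rate that minimises fuel flow
        # at a fixed heat load. The fuel-vs-airflow curve below is invented.
        def fuel_consumption(airflow):
            # Too little air gives incomplete combustion; too much raises stack losses.
            return 0.018 * (airflow - 7.5) ** 2 + 1.20   # kg/s of fuel, toy model

        airflow = 5.0        # initial airflow setting (arbitrary units)
        step = 0.4
        last_fuel = fuel_consumption(airflow)

        for _ in range(60):
            airflow += step
            fuel = fuel_consumption(airflow)
            if fuel > last_fuel:        # got worse: reverse direction and shrink the step
                step = -0.5 * step
            last_fuel = fuel

        print(f"optimum airflow ~ {airflow:.2f}, fuel flow ~ {last_fuel:.3f} kg/s")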

  10. Software engineering the mixed model for genome-wide association studies on large samples.

    PubMed

    Zhang, Zhiwu; Buckler, Edward S; Casstevens, Terry M; Bradbury, Peter J

    2009-11-01

    Mixed models improve the ability to detect phenotype-genotype associations in the presence of population stratification and multiple levels of relatedness in genome-wide association studies (GWAS), but for large data sets the resource consumption becomes impractical. At the same time, the sample size and number of markers used for GWAS is increasing dramatically, resulting in greater statistical power to detect those associations. The use of mixed models with increasingly large data sets depends on the availability of software for analyzing those models. While multiple software packages implement the mixed model method, no single package provides the best combination of fast computation, ability to handle large samples, flexible modeling and ease of use. Key elements of association analysis with mixed models are reviewed, including modeling phenotype-genotype associations using mixed models, population stratification, kinship and its estimation, variance component estimation, use of best linear unbiased predictors or residuals in place of raw phenotype, improving efficiency and software-user interaction. The available software packages are evaluated, and suggestions made for future software development.

  11. Hierarchical Petascale Simulation Framework For Stress Corrosion Cracking

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grama, Ananth

    2013-12-18

    A number of major accomplishments resulted from the project. These include:
    • Data Structures, Algorithms, and Numerical Methods for Reactive Molecular Dynamics. We have developed a range of novel data structures, algorithms, and solvers (amortized ILU, Spike) for use with ReaxFF and charge equilibration.
    • Parallel Formulations of Reactive MD (Purdue Reactive Molecular Dynamics Package: PuReMD, PuReMD-GPU, and PG-PuReMD) for Messaging, GPU, and GPU Cluster Platforms. We have developed efficient serial, parallel (MPI), GPU (CUDA), and GPU cluster (MPI/CUDA) implementations. Our implementations have been demonstrated to be significantly better than the state of the art, both in terms of performance and scalability.
    • Comprehensive Validation in the Context of Diverse Applications. We have demonstrated the use of our software in diverse systems, including silica-water and silicon-germanium nanorods, and, as part of other projects, extended it to applications ranging from explosives (RDX) to lipid bilayers (biomembranes under oxidative stress).
    • Open Source Software Packages for Reactive Molecular Dynamics. All versions of our software have been released to the public domain. There are over 100 major research groups worldwide using our software.
    • Implementation into the Department of Energy LAMMPS Software Package. We have also integrated our software into the Department of Energy LAMMPS software package.

  12. Analyses of Field Test Data at the Atucha-1 Spent Fuel Pools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sitaraman, S.

    A field test was conducted at the Atucha-1 spent nuclear fuel pools to validate a software package for gross defect detection that is used in conjunction with the inspection tool, Spent Fuel Neutron Counter (SFNC). A set of measurements was taken with the SFNC and the software predictions were compared with these data and analyzed. The data spanned a wide range of cooling times and a set of burnup levels leading to count rates from the several hundreds to around twenty per second. The current calibration in the software using linear fitting required the use of multiple calibration factors to cover the entire range of count rates recorded. The solution to this was to use power regression data fitting to normalize the predicted response and derive one calibration factor that can be applied to the entire set of data. The resulting comparisons between the predicted and measured responses were generally good and provided a quantitative method of detecting missing fuel in virtually all situations. Since the current version of the software uses the linear calibration method, it would need to be updated with the new power regression method to make it more user-friendly for real time verification and fieldable for the range of responses that will be encountered.
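
    The switch from a linear calibration to power regression can be sketched as fitting measured = a * predicted**b by linear least squares in log-log space. The predicted and measured count-rate pairs below are synthetic stand-ins, not the Atucha-1 field data.

        import numpy as np

        # Fit a power-law calibration, measured = a * predicted**b, in log-log space.
        # These data pairs are synthetic stand-ins, not the Atucha-1 measurements.
        predicted = np.array([20.0, 45.0, 80.0, 150.0, 300.0, 600.0])   # counts/s, model
        measured = np.array([26.0, 55.0, 92.0, 160.0, 305.0, 570.0])    # counts/s, SFNC

        b, log_a = np.polyfit(np.log(predicted), np.log(measured), 1)
        a = np.exp(log_a)
        print(f"calibration: measured ~ {a:.2f} * predicted**{b:.3f}")

        # A single calibration now covers the whole count-rate range.
        calibrated = a * predicted ** b
        print("relative error:", np.round((calibrated - measured) / measured, 3))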

  13. Kokhanok Renewable Energy Retrofit Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baring-Gould, Edward I.; Haase, Scott G.; Jimenez, Antonio

    In 2010, the community of Kokhanok, Alaska, installed two 90-kW wind turbines, battery storage, a converter, and equipment for integration. Researchers at the National Renewable Energy Laboratory performed an analysis and modeling using the HOMER and REopt software modeling packages. The analysis was designed to answer the following questions: 1) What is required to achieve a 50 percent reduction in power plant diesel fuel consumption in a diesel microgrid? 2) What is required to achieve a 50 percent reduction in 'total' (diesel and heating oil) consumption in a remote community? 3) What is the impact and role of energy efficiency? This presentation provides an introduction to the community of Kokhanok, Alaska; a summary of energy data; and an overview of analysis results and conceptual design.

  14. Characterization of fission gas bubbles in irradiated U-10Mo fuel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Casella, Andrew M.; Burkes, Douglas E.; MacFarlan, Paul J.

    2017-09-01

    Irradiated U-10Mo fuel samples were prepared using traditional mechanical potting and polishing methods within a hot cell. They were then removed and imaged with an SEM located outside of the hot cell. The images were then processed with basic imaging techniques from three separate software packages. The results were compared, and a baseline method for characterization of fission gas bubbles in the samples is proposed. It is hoped that, through adoption of or comparison to this baseline method, sample characterization can be somewhat standardized across the field of post-irradiation examination of metal fuels.

  15. Characterization of fission gas bubbles in irradiated U-10Mo fuel

    DOE PAGES

    Casella, Andrew M.; Burkes, Douglas E.; MacFarlan, Paul J.; ...

    2017-06-06

    A simple, repeatable method for characterization of fission gas bubbles in irradiated U-Mo fuels has been developed. This method involves mechanical potting and polishing of samples along with examination with a scanning electron microscope located outside of a hot cell. The commercially available software packages CellProfiler, MATLAB, and Mathematica are used to segment and analyze the captured images. The results are compared and contrasted. Finally, baseline methods for fission gas bubble characterization are suggested for consideration and further development.
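
    A minimal open-source version of that segmentation step, thresholding an image and measuring the resulting bubble regions, is sketched below with scikit-image. The synthetic image, the Otsu threshold, and the reported measurements are illustrative assumptions, not the settings used in the paper.

        import numpy as np
        from skimage import draw, filters, measure

        # Build a synthetic "SEM" image: dark background with a few bright bubbles.
        image = np.random.normal(0.2, 0.03, (256, 256))
        for r, c, radius in [(60, 60, 10), (150, 90, 6), (200, 180, 14)]:
            rr, cc = draw.disk((r, c), radius, shape=image.shape)
            image[rr, cc] = 0.8

        # Threshold, label connected regions, and measure each bubble.
        binary = image > filters.threshold_otsu(image)
        labels = measure.label(binary)
        for region in measure.regionprops(labels):
            diameter = 2.0 * np.sqrt(region.area / np.pi)   # equivalent circular diameter, px
            print(f"bubble {region.label}: area {region.area} px, diameter {diameter:.1f} px")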

  16. Rapid Building Assessment Project

    DTIC Science & Technology

    2014-05-01

    ongoing management of commercial energy efficiency. No other company offers all of these proven services on a seamless, integrated Software-as-a-Service ...FirstFuel has added a suite of additional Software-as-a-Service analytics capabilities to support the entire energy efficiency lifecycle, including...the client side. In this document, we refer to the service-side software as “BUILDER” and the client software as “BuilderRED,” following the Army

  17. Economic Risk Analysis of Agricultural Tillage Systems Using the SMART Stochastic Efficiency Software Package

    USDA-ARS?s Scientific Manuscript database

    Recently, a variant of stochastic dominance called stochastic efficiency with respect to a function (SERF) has been developed and applied. Unlike traditional stochastic dominance approaches, SERF uses the concept of certainty equivalents (CEs) to rank a set of risk-efficient alternatives instead of...
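
    The certainty-equivalent idea behind SERF can be shown in a few lines: under a negative-exponential (constant absolute risk aversion) utility, CE = -ln(E[exp(-r x)]) / r, and the alternatives are ranked by CE across a range of risk-aversion coefficients r. The net-return samples and the r values below are invented and are not taken from the SMART package or the manuscript.

        import numpy as np

        # Net-return samples (e.g. dollars/acre) for two tillage systems; invented numbers.
        returns = {
            "conventional till": np.array([40.0, 55.0, 60.0, 72.0, 90.0]),
            "no-till":           np.array([35.0, 58.0, 66.0, 80.0, 95.0]),
        }

        def certainty_equivalent(x, r):
            """CE under negative-exponential utility U(x) = -exp(-r x)."""
            if r == 0.0:
                return x.mean()                            # risk-neutral case
            return -np.log(np.mean(np.exp(-r * x))) / r

        for r in (0.0, 0.02, 0.05, 0.10):                  # absolute risk-aversion coefficients
            ranked = sorted(returns, key=lambda k: certainty_equivalent(returns[k], r), reverse=True)
            ces = ", ".join(f"{k}: {certainty_equivalent(returns[k], r):.1f}" for k in ranked)
            print(f"r = {r:.2f} -> preferred: {ranked[0]} ({ces})")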

  18. Refuse Derived Fuel (RDF) production and gasification in a pilot plant integrated with an Otto cycle ICE through Aspen plus™ modelling: Thermodynamic and economic viability.

    PubMed

    Násner, Albany Milena Lozano; Lora, Electo Eduardo Silva; Palacio, José Carlos Escobar; Rocha, Mateus Henrique; Restrepo, Julian Camilo; Venturini, Osvaldo José; Ratner, Albert

    2017-11-01

    This work deals with the development of a Refuse Derived Fuel (RDF) gasification pilot plant using air as a gasification agent. A downdraft fixed bed reactor is integrated with an Otto cycle Internal Combustion Engine (ICE). Modelling was carried out using the Aspen Plus™ software to predict the ideal operational conditions for maximum efficiency. The thermodynamic package used in the simulation comprised the Non-Random Two-Liquid (NRTL) model and the Hayden-O'Connell (HOC) equation of state. As expected, the results indicated that the Equivalence Ratio (ER) has a direct influence on the gasification temperature and the composition of the Raw Produced Gas (RPG); the effects of ER on the Lower Heating Value (LHV) and Cold Gasification Efficiency (CGE) of the RPG are also discussed. A maximum CGE of 57-60% was reached for ER values between 0.25 and 0.3, with average reactor temperature values in the range of 680-700°C and a peak LHV of 5.8 MJ/Nm3. The RPG was burned in an ICE, reaching an electrical power of 50 kWel. An economic assessment of the pilot plant implementation was also performed, showing that the project is feasible for power outputs above 120 kWel with an initial investment of approximately US$ 300,000. Copyright © 2017 Elsevier Ltd. All rights reserved.
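
    The cold gasification efficiency quoted above has a simple definition, CGE = (gas yield x gas LHV) / (fuel LHV), evaluated per kilogram of RDF. The sketch below reproduces the order of magnitude of the reported figures; the RDF heating value and gas yield are assumptions chosen for illustration, not numbers taken from the paper.

        # Cold Gasification Efficiency: chemical energy in the raw produced gas
        # divided by the chemical energy in the RDF fed to the gasifier.
        lhv_rdf = 18.0       # MJ/kg, typical RDF lower heating value (assumed)
        lhv_gas = 5.8        # MJ/Nm3, peak RPG heating value reported above
        gas_yield = 1.85     # Nm3 of gas per kg of RDF (assumed for illustration)

        cge = gas_yield * lhv_gas / lhv_rdf
        print(f"cold gasification efficiency ~ {cge:.0%}")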

  19. Mixed Oxide Fresh Fuel Package Auxiliary Equipment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yapuncich, F.; Ross, A.; Clark, R.H.

    2008-07-01

    The United States Department of Energy's National Nuclear Security Administration (NNSA) is overseeing the construction of the Mixed Oxide (MOX) Fuel Fabrication Facility (MFFF) at the Savannah River Site. The new facility, being constructed by NNSA's contractor Shaw AREVA MOX Services, will fabricate fuel assemblies utilizing surplus plutonium as feedstock. The fuel will be used in designated commercial nuclear reactors. The MOX Fresh Fuel Package (MFFP), which has recently been licensed by the Nuclear Regulatory Commission (NRC) as a type B package (USA/9295/B(U)F-96), will be utilized to transport the fabricated fuel assemblies from the MFFF to the nuclear reactors. It was necessary to develop auxiliary equipment that would be able to efficiently handle the high precision fuel assemblies. Also, the physical constraints of the MFFF and the nuclear power plants require that the equipment be capable of loading and unloading the fuel assemblies both vertically and horizontally. The ability to reconfigure the load/unload evolution builds in a large degree of flexibility for the MFFP for the handling of many types of both fuel and non-fuel payloads. The design and analysis met various technical specifications including dynamic and static seismic criteria. The fabrication was completed by three major fabrication facilities within the United States. The testing was conducted by Sandia National Laboratories. The unique design specifications and successful testing sequences will be discussed. (authors)

  20. MOPEX: a software package for astronomical image processing and visualization

    NASA Astrophysics Data System (ADS)

    Makovoz, David; Roby, Trey; Khan, Iffat; Booth, Hartley

    2006-06-01

    We present MOPEX - a software package for astronomical image processing and display. The package is a combination of command-line driven image processing software written in C/C++ with a Java-based GUI. The main image processing capabilities include creating mosaic images, image registration, background matching, point source extraction, as well as a number of minor image processing tasks. The combination of the image processing and display capabilities allows for a much more intuitive and efficient way of performing image processing. The GUI allows the control over the image processing and display to be closely intertwined. Parameter setting, validation, and specific processing options are entered by the user through a set of intuitive dialog boxes. Visualization feeds back into further processing by providing prompt feedback on the processing results. The GUI also allows for further analysis by accessing and displaying data from existing image and catalog servers using a virtual observatory approach. Even though originally designed for the Spitzer Space Telescope mission, many of its functionalities are of general usefulness and can be used for working with existing astronomical data and for new missions. The software used in the package has undergone intensive testing and benefited greatly from effective software reuse. The visualization part has been used for observation planning for both the Spitzer and Herschel Space Telescopes as part of the tool Spot. The visualization capabilities of Spot have been enhanced and integrated with the image processing functionality of the command-line driven MOPEX. The image processing software is used in the Spitzer automated pipeline processing, which has been in operation for nearly 3 years. The image processing capabilities have also been tested in off-line processing by numerous astronomers at various institutions around the world. The package is multi-platform and includes automatic update capabilities. The software package has been developed by a small group of software developers and scientists at the Spitzer Science Center. It is available for distribution at the Spitzer Science Center web page.

  1. A streamlined Python framework for AT-TPC data analysis

    NASA Astrophysics Data System (ADS)

    Taylor, J. Z.; Bradt, J.; Bazin, D.; Kuchera, M. P.

    2017-09-01

    User-friendly data analysis software has been developed for the Active-Target Time Projection Chamber (AT-TPC) experiment at the National Superconducting Cyclotron Laboratory at Michigan State University. The AT-TPC, commissioned in 2014, is a gas-filled detector that acts as both the detector and target for high-efficiency detection of low-intensity, exotic nuclear reactions. The pytpc framework is a Python package for analyzing AT-TPC data. The package was developed for the analysis of 46Ar(p, p) data. The existing software was used to analyze data produced by the 40Ar(p, p) experiment that ran in August, 2015. Usage of the package was documented in an analysis manual both to improve analysis steps and aid in the work of future AT-TPC users. Software features and analysis methods in the pytpc framework will be presented along with the 40Ar results.

  2. Sampling and sensitivity analyses tools (SaSAT) for computational modelling

    PubMed Central

    Hoare, Alexander; Regan, David G; Wilson, David P

    2008-01-01

    SaSAT (Sampling and Sensitivity Analysis Tools) is a user-friendly software package for applying uncertainty and sensitivity analyses to mathematical and computational models of arbitrary complexity and context. The toolbox is built in Matlab®, a numerical mathematical software package, and utilises algorithms contained in the Matlab® Statistics Toolbox. However, Matlab® is not required to use SaSAT as the software package is provided as an executable file with all the necessary supplementary files. The SaSAT package is also designed to work seamlessly with Microsoft Excel but no functionality is forfeited if that software is not available. A comprehensive suite of tools is provided to enable the following tasks to be easily performed: efficient and equitable sampling of parameter space by various methodologies; calculation of correlation coefficients; regression analysis; factor prioritisation; and graphical output of results, including response surfaces, tornado plots, and scatterplots. Use of SaSAT is exemplified by application to a simple epidemic model. To our knowledge, a number of the methods available in SaSAT for performing sensitivity analyses have not previously been used in epidemiological modelling and their usefulness in this context is demonstrated. PMID:18304361
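
    The following minimal Python sketch (not SaSAT itself, which is a compiled MATLAB toolbox) illustrates the core idea of the workflow described above: Latin hypercube sampling of an uncertain parameter space followed by rank-correlation sensitivity measures for a toy model output. The parameter ranges and the R0 = beta/gamma output are invented for illustration.

    import numpy as np
    from scipy.stats import qmc, spearmanr

    # Latin hypercube sample of two uncertain parameters of a toy epidemic model.
    sampler = qmc.LatinHypercube(d=2, seed=0)
    unit = sampler.random(n=500)                              # points in [0, 1]^2
    scaled = qmc.scale(unit, l_bounds=[0.1, 0.05], u_bounds=[0.5, 0.25])
    beta, gamma = scaled[:, 0], scaled[:, 1]                  # transmission, recovery

    r0 = beta / gamma                                         # model output of interest

    # Rank correlation of each input with the output, a simple sensitivity measure.
    for name, param in [("beta", beta), ("gamma", gamma)]:
        rho, _ = spearmanr(param, r0)
        print(f"Spearman correlation of {name} with R0: {rho:+.2f}")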

  3. Detection And Mapping (DAM) package. Volume 4B: Software System Manual, part 2

    NASA Technical Reports Server (NTRS)

    Schlosser, E. H.

    1980-01-01

    Computer programs, graphic devices, and an integrated set of manual procedures designed for efficient production of precisely registered and formatted maps from digital data are presented. The software can be used on any Univac 1100 series computer. The software includes pre-defined spectral limits for use in classifying and mapping surface water for LANDSAT-1, LANDSAT-2, and LANDSAT-3.

  4. Casting technology for manufacturing metal rods from simulated metallic spent fuels

    NASA Astrophysics Data System (ADS)

    Lee, Y. S.; Lee, D. B.; Kim, C. K.; Shin, Y. J.; Lee, J. H.

    2000-09-01

    A uranium metal rod 13.5 mm in diameter and 1,150 mm long was produced from simulated metallic spent fuels with advanced casting equipment using the directional-solidification method. A vacuum casting furnace equipped with a four-zone heater to prevent surface oxidation and the formation of surface shrinkage holes was designed. By controlling the axial temperature gradient of the casting furnace, deformation by the surface shrinkage phenomena was diminished, and a sound rod was manufactured. The cooling behavior of the molten uranium was analyzed using the computer software package MAGMAsoft.

  5. Modern software approaches applied to a Hydrological model: the GEOtop Open-Source Software Project

    NASA Astrophysics Data System (ADS)

    Cozzini, Stefano; Endrizzi, Stefano; Cordano, Emanuele; Bertoldi, Giacomo; Dall'Amico, Matteo

    2017-04-01

    The GEOtop hydrological scientific package is an integrated hydrological model that simulates the heat and water budgets at and below the soil surface. It describes the three-dimensional water flow in the soil and the energy exchange with the atmosphere, considering the radiative and turbulent fluxes. Furthermore, it reproduces the highly non-linear interactions between the water and energy balance during soil freezing and thawing, and simulates the temporal evolution of snow cover, soil temperature and moisture. The core components of the package were presented in the 2.0 version (Endrizzi et al, 2014), which was released as a free, open-source software project. However, despite the high scientific quality of the project, a modern software engineering approach was still missing. This weakness hindered its scientific potential and its use both as a standalone package and, more importantly, in an integrated way with other hydrological software tools. In this contribution we present our recent software re-engineering efforts to create a robust and stable scientific software package open to the hydrological community, easily usable by researchers and experts, and interoperable with other packages. The activity takes as a starting point the 2.0 version, scientifically tested and published. This version, together with several test cases based on recently published or available GEOtop applications (Cordano and Rigon, 2013, WRR; Kollet et al, 2016, WRR), provides the baseline code and a number of referenced results as benchmarks. Comparison and scientific validation can then be performed for each software re-engineering activity carried out on the package. To keep track of every single change, the package is published on its own GitHub repository geotopmodel.github.io/geotop/ under the GPL v3.0 license. A continuous integration mechanism by means of Travis-CI has been enabled on the GitHub repository for the master and main development branches. The use of the CMake configuration tool and the suite of tests (easily manageable by means of ctest tools) greatly reduces the burden of installation and allows us to enhance portability across different compilers and operating system platforms. The package is also complemented by several software tools which provide web-based visualization of results based on R packages, in particular "shiny" (Chang et al, 2016), "geotopbricks" and "geotopOptim2" (Cordano et al, 2016), which allow rapid and efficient scientific validation of new examples and tests. The software re-engineering activities are still under development. However, our first results are promising enough to eventually reach a robust and stable software project that manages in a flexible way a complex state-of-the-art hydrological model like GEOtop and integrates it into wider workflows.

  6. Nonstationary Extreme Value Analysis in a Changing Climate: A Software Package

    NASA Astrophysics Data System (ADS)

    Cheng, L.; AghaKouchak, A.; Gilleland, E.

    2013-12-01

    Numerous studies show that climatic extremes have increased substantially in the second half of the 20th century. For this reason, analysis of extremes under a nonstationary assumption has received a great deal of attention. This paper presents a software package developed for estimation of return levels, return periods, and risks of climatic extremes in a changing climate. This MATLAB software package offers tools for analysis of climate extremes under both stationary and non-stationary assumptions. The Nonstationary Extreme Value Analysis (hereafter, NEVA) provides an efficient and generalized framework for analyzing extremes using Bayesian inference. NEVA estimates the extreme value parameters using a Differential Evolution Markov Chain (DE-MC), which combines the genetic algorithm Differential Evolution (DE) for global optimization over the real parameter space with the Markov Chain Monte Carlo (MCMC) approach, and has the advantage of simplicity, speed of calculation, and convergence over conventional MCMC. NEVA also offers the confidence intervals and uncertainty bounds of estimated return levels based on the sampled parameters. NEVA integrates extreme value design concepts, data analysis tools, optimization and visualization, explicitly designed to facilitate analysis of extremes in geosciences. The generalized input and output files of this software package make it attractive for users from across different fields. Both stationary and nonstationary components of the package are validated for a number of case studies using empirical return levels. The results show that NEVA reliably describes extremes and their return levels.
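
    As a point of comparison with the stationary component of such an analysis, the short Python sketch below fits a GEV distribution to a synthetic series of annual maxima and computes a return level with scipy. It is only an illustrative stationary, frequentist analogue, not NEVA's Bayesian DE-MC machinery, and all numerical values are made up.

    import numpy as np
    from scipy.stats import genextreme

    # Synthetic 60-year record of annual maxima (shape, location, scale are assumed).
    rng = np.random.default_rng(1)
    annual_maxima = genextreme.rvs(-0.1, loc=30.0, scale=5.0, size=60, random_state=rng)

    # Maximum-likelihood GEV fit and the 100-year return level (the quantile
    # exceeded on average once per 100 years).
    shape, loc, scale = genextreme.fit(annual_maxima)
    return_period = 100.0
    return_level = genextreme.ppf(1.0 - 1.0 / return_period, shape, loc=loc, scale=scale)
    print(f"estimated 100-year return level: {return_level:.1f}")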

  7. Investigation of charge dissipation in jet fuel in a dielectric fuel tank

    NASA Astrophysics Data System (ADS)

    Kitanin, E. L.; Kravtsov, P. A.; Trofimov, V. A.; Kitanina, E. E.; Bondarenko, D. A.

    2017-09-01

    The electrostatic charge dissipation process in jet fuel in a polypropylene tank was investigated experimentally. Groundable metallic terminals were installed in the tank walls to accelerate the dissipation process. Several sensors and an electrometer with a current measuring range from 10^-11 to 10^-3 A were specifically designed to study the dissipation rates. It was demonstrated that these sensors and the electrometer provide reliable measurements of the dissipation rate and of how it is influenced by the number and locations of the terminals. In addition, the conductivity of the jet fuel and the effective conductivity of the tank walls were investigated. The experimental data agree well with the numerical simulation results obtained using the COMSOL software package.

  8. Engineering Margin Factors Used in the Design of the VVER Fuel Cycles

    NASA Astrophysics Data System (ADS)

    Lizorkin, M. P.; Shishkov, L. K.

    2017-12-01

    The article describes methods for determination of the engineering margin factors currently used to estimate the uncertainties of the VVER reactor design parameters calculated via the KASKAD software package developed at the National Research Center Kurchatov Institute. These margin factors ensure the meeting of the operating (design) limits and a number of other restrictions under normal operating conditions.

  9. sparse-msrf: A package for sparse modeling and estimation of fossil-fuel CO2 emission fields

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2014-10-06

    The software is used to fit models of emission fields (e.g., fossil-fuel CO2 emissions) to sparse measurements of gaseous concentrations. Its primary aim is to provide an implementation and a demonstration for the algorithms and models developed in J. Ray, V. Yadav, A. M. Michalak, B. van Bloemen Waanders and S. A. McKenna, "A multiresolution spatial parameterization for the estimation of fossil-fuel carbon dioxide emissions via atmospheric inversions", accepted, Geoscientific Model Development, 2014. The software can be used to estimate emissions of non-reactive gases such as fossil-fuel CO2, methane, etc. The software uses a proxy of the emission field being estimated (e.g., for fossil-fuel CO2, a population density map is a good proxy) to construct a wavelet model for the emission field. It then uses a shrinkage regression algorithm called Stagewise Orthogonal Matching Pursuit (StOMP) to fit the wavelet model to concentration measurements, using an atmospheric transport model to relate emission and concentration fields. Algorithmic novelties described in the paper above (1) ensure that the estimated emission fields are non-negative, (2) allow the use of guesses for emission fields to accelerate the estimation processes and (3) ensure that under/overestimates in the guesses do not skew the estimation.
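
    As a hedged illustration of the sparse-regression idea (plain orthogonal matching pursuit rather than the stagewise StOMP variant used in sparse-msrf), the Python sketch below greedily selects a few columns of a stand-in "dictionary" matrix so that their weighted sum reproduces noisy observations. The dictionary and observations are synthetic.

    import numpy as np

    def omp(A, y, n_nonzero):
        """Greedy orthogonal matching pursuit: sparse w such that A @ w ~ y."""
        residual = y.copy()
        support = []
        w = np.zeros(A.shape[1])
        for _ in range(n_nonzero):
            j = int(np.argmax(np.abs(A.T @ residual)))   # most correlated column
            if j not in support:
                support.append(j)
            coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
            w[:] = 0.0
            w[support] = coef                            # least-squares refit on support
            residual = y - A @ w
        return w

    rng = np.random.default_rng(2)
    A = rng.standard_normal((50, 200))                   # stand-in basis/dictionary
    true_w = np.zeros(200)
    true_w[[3, 40, 99]] = [2.0, -1.5, 0.7]
    y = A @ true_w + 0.01 * rng.standard_normal(50)
    print(np.nonzero(omp(A, y, n_nonzero=3))[0])         # indices of recovered terms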

  10. DAM package version 7807: Software fixes and enhancements

    NASA Technical Reports Server (NTRS)

    Schlosser, E.

    1979-01-01

    The Detection and Mapping package is an integrated set of manual procedures, computer programs, and graphic devices designed for efficient production of precisely registered, formatted, and interpreted maps from digital LANDSAT multispectral scanner data. This report documents changes to the DAM package in support of its use by the Corps of Engineers for inventorying impounded surface water. Although these changes are presented in terms of their application to detecting and mapping surface water, they are equally relevant to other land surface materials.

  11. Computer Facilitated Mathematical Methods in Chemical Engineering--Similarity Solution

    ERIC Educational Resources Information Center

    Subramanian, Venkat R.

    2006-01-01

    High-performance computers coupled with highly efficient numerical schemes and user-friendly software packages have helped instructors to teach numerical solutions and analysis of various nonlinear models more efficiently in the classroom. One of the main objectives of a model is to provide insight about the system of interest. Analytical…

  12. Current trends for customized biomedical software tools.

    PubMed

    Khan, Haseeb Ahmad

    2017-01-01

    In the past, biomedical scientists were solely dependent on expensive commercial software packages for various applications. However, the advent of user-friendly programming languages and open source platforms has revolutionized the development of simple and efficient customized software tools for solving specific biomedical problems. Many of these tools are designed and developed by biomedical scientists independently or with the support of computer experts and often made freely available for the benefit of scientific community. The current trends for customized biomedical software tools are highlighted in this short review.

  13. Surface CHEMKIN (Version 4. 0): A Fortran package for analyzing heterogeneous chemical kinetics at a solid-surface---gas-phase interface

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coltrin, M.E.; Kee, R.J.; Rupley, F.M.

    1991-07-01

    Heterogeneous reaction at the interface between a solid surface and adjacent gas is central to many chemical processes. Our purpose for developing the software package SURFACE CHEMKIN was motivated by our need to understand the complex surface chemistry in chemical vapor deposition systems involving silicon, silicon nitride, and gallium arsenide. However, we have developed the approach and implemented the software in a general setting. Thus, we expect it will find use in such diverse applications as chemical vapor deposition, chemical etching, combustion of solids, and catalytic processes, and for a wide range of chemical systems. We believe that it provides a powerful capability to help model, understand, and optimize important industrial and research chemical processes. The SURFACE CHEMKIN software is designed to work in conjunction with the CHEMKIN-2 software, which handles the chemical kinetics in the gas phase. It may also be used in conjunction with the Transport Property Package, which provides information about molecular diffusion. Thus, these three packages provide a foundation on which a user can build applications software to analyze gas-phase and heterogeneous chemistry in flowing systems. These packages should not be considered "programs" in the ordinary sense. That is, they are not designed to accept input, solve a particular problem, and report the answer. Instead, they are software tools intended to help a user work efficiently with large systems of chemical reactions and develop Fortran representations of systems of equations that define a particular problem. It is up to the user to solve the problem and interpret the answer. 11 refs., 15 figs., 5 tabs.

  14. Method of forming a package for MEMS-based fuel cell

    DOEpatents

    Morse, Jeffrey D; Jankowski, Alan F

    2013-05-21

    A MEMS-based fuel cell package and method thereof is disclosed. The fuel cell package comprises seven layers: (1) a sub-package fuel reservoir interface layer, (2) an anode manifold support layer, (3) a fuel/anode manifold and resistive heater layer, (4) a Thick Film Microporous Flow Host Structure layer containing a fuel cell, (5) an air manifold layer, (6) a cathode manifold support structure layer, and (7) a cap. Fuel cell packages with more than one fuel cell are formed by positioning stacks of these layers in series and/or parallel. The fuel cell package materials such as a molded plastic or a ceramic green tape material can be patterned, aligned and stacked to form three dimensional microfluidic channels that provide electrical feedthroughs from various layers which are bonded together and mechanically support a MEMS-based miniature fuel cell. The package incorporates resistive heating elements to control the temperature of the fuel cell stack. The package is fired to form a bond between the layers and one or more microporous flow host structures containing fuel cells are inserted within the Thick Film Microporous Flow Host Structure layer of the package.

  15. Method of forming a package for mems-based fuel cell

    DOEpatents

    Morse, Jeffrey D.; Jankowski, Alan F.

    2004-11-23

    A MEMS-based fuel cell package and method thereof is disclosed. The fuel cell package comprises seven layers: (1) a sub-package fuel reservoir interface layer, (2) an anode manifold support layer, (3) a fuel/anode manifold and resistive heater layer, (4) a Thick Film Microporous Flow Host Structure layer containing a fuel cell, (5) an air manifold layer, (6) a cathode manifold support structure layer, and (7) a cap. Fuel cell packages with more than one fuel cell are formed by positioning stacks of these layers in series and/or parallel. The fuel cell package materials such as a molded plastic or a ceramic green tape material can be patterned, aligned and stacked to form three dimensional microfluidic channels that provide electrical feedthroughs from various layers which are bonded together and mechanically support a MEMS-based miniature fuel cell. The package incorporates resistive heating elements to control the temperature of the fuel cell stack. The package is fired to form a bond between the layers and one or more microporous flow host structures containing fuel cells are inserted within the Thick Film Microporous Flow Host Structure layer of the package.

  16. Design of a fuel-efficient guidance system for a STOL aircraft

    NASA Technical Reports Server (NTRS)

    Mclean, J. D.; Erzberger, H.

    1981-01-01

    In the predictive mode, the system synthesizes a horizontal path from an initial aircraft position and heading to a desired final position and heading and then synthesizes a fuel-efficient speed-altitude profile along the path. In the track mode, the synthesized trajectory is reconstructed and tracked automatically. An analytical basis for the design of the system is presented and a description of the airborne computer implementation is given. A detailed discussion of the software, which should be helpful to those who use the actual software developed for these tests, is also provided.

  17. pyLIMA: An Open-source Package for Microlensing Modeling. I. Presentation of the Software and Analysis of Single-lens Models

    NASA Astrophysics Data System (ADS)

    Bachelet, E.; Norbury, M.; Bozza, V.; Street, R.

    2017-11-01

    Microlensing is a unique tool, capable of detecting the “cold” planets between ˜1 and 10 au from their host stars and even unbound “free-floating” planets. This regime has been poorly sampled to date owing to the limitations of alternative planet-finding methods, but a watershed in discoveries is anticipated in the near future thanks to the planned microlensing surveys of WFIRST-AFTA and Euclid's Extended Mission. Of the many challenges inherent in these missions, the modeling of microlensing events will be of primary importance, yet it is often time-consuming, complex, and perceived as a daunting barrier to participation in the field. The large scale of future survey data products will require thorough but efficient modeling software, but, unlike other areas of exoplanet research, microlensing currently lacks a publicly available, well-documented package to conduct this type of analysis. We present version 1.0 of the python Lightcurve Identification and Microlensing Analysis (pyLIMA). This software is written in Python and uses existing packages as much as possible to make it widely accessible. In this paper, we describe the overall architecture of the software and the core modules for modeling single-lens events. To verify the performance of this software, we use it to model both real data sets from events published in the literature and generated test data produced using pyLIMA's simulation module. The results demonstrate that pyLIMA is an efficient tool for microlensing modeling. We will expand pyLIMA to consider more complex phenomena in the following papers.
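
    For readers unfamiliar with the single-lens models discussed in this first paper, the standard point-source point-lens (Paczynski) magnification curve that such software fits can be written in a few lines of Python. The parameter values below are arbitrary, and this sketch does not use pyLIMA's own API.

    import numpy as np

    def pspl_magnification(t, t0, u0, tE):
        """Point-source point-lens magnification A(t) for impact parameter u0,
        time of peak t0 and Einstein-radius crossing time tE (same units as t)."""
        u = np.sqrt(u0**2 + ((t - t0) / tE)**2)       # lens-source separation
        return (u**2 + 2.0) / (u * np.sqrt(u**2 + 4.0))

    t = np.linspace(-30.0, 30.0, 601)                 # days relative to the peak
    A = pspl_magnification(t, t0=0.0, u0=0.1, tE=20.0)
    print(f"peak magnification ~ {A.max():.1f}")      # ~10 for u0 = 0.1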

  18. Melanie II--a third-generation software package for analysis of two-dimensional electrophoresis images: I. Features and user interface.

    PubMed

    Appel, R D; Palagi, P M; Walther, D; Vargas, J R; Sanchez, J C; Ravier, F; Pasquali, C; Hochstrasser, D F

    1997-12-01

    Although two-dimensional electrophoresis (2-DE) computer analysis software packages have existed ever since 2-DE technology was developed, it is only now that the hardware and software technology allows large-scale studies to be performed on low-cost personal computers or workstations, and that setting up a 2-DE computer analysis system in a small laboratory is no longer considered a luxury. After a first attempt in the seventies and early eighties to develop 2-DE analysis software systems on hardware that had poor or even no graphical capabilities, followed in the late eighties by a wave of innovative software developments that were possible thanks to new graphical interface standards such as XWindows, a third generation of 2-DE analysis software packages has now come to maturity. It can be run on a variety of low-cost, general-purpose personal computers, thus making the purchase of a 2-DE analysis system easily attainable for even the smallest laboratory that is involved in proteome research. Melanie II 2-D PAGE, developed at the University Hospital of Geneva, is such a third-generation software system for 2-DE analysis. Based on unique image processing algorithms, this user-friendly object-oriented software package runs on multiple platforms, including Unix, MS-Windows 95 and NT, and Power Macintosh. It provides efficient spot detection and quantitation, state-of-the-art image comparison, statistical data analysis facilities, and is Internet-ready. Linked to proteome databases such as those available on the World Wide Web, it represents a valuable tool for the "Virtual Lab" of the post-genome area.

  19. Indoor air pollution in slum neighbourhoods of Addis Ababa, Ethiopia

    NASA Astrophysics Data System (ADS)

    Sanbata, Habtamu; Asfaw, Araya; Kumie, Abera

    2014-06-01

    An estimated 95% of the population of Ethiopia uses traditional biomass fuels, such as wood, dung, charcoal, or crop residues, to meet household energy needs. As a result of the harmful smoke emitted from the combustion of biomass fuels, indoor air pollution is responsible for more than 50,000 deaths annually and causes nearly 5% of the burden of disease in Ethiopia. Very limited research on indoor air pollution and its health impacts exists in Ethiopia. This study was, therefore, undertaken to assess the magnitude of indoor air pollution from household fuel use in Addis Ababa, the capital city of Ethiopia. During January and February 2012, the concentration of fine particulate matter (PM2.5) in 59 households was measured using the University of California at Berkeley Particle Monitor (UCB PM). The raw data were analysed using the Statistical Package for the Social Sciences (SPSS version 20.0) software to determine variance between groups and descriptive statistics. The geometric mean of the 24-h indoor PM2.5 concentration is approximately 818 μg m^-3 (standard deviation, SD = 3.61). The highest 24-h geometric means of PM2.5 concentration observed were 1134 μg m^-3 (SD = 3.36), 637 μg m^-3 (SD = 4.44), and 335 μg m^-3 (SD = 2.51), respectively, in households using predominantly solid fuel, kerosene, and clean fuel. Although the 24-h mean PM2.5 concentrations between fuel types differed statistically (P < 0.05), post hoc pairwise comparison indicated no significant difference in mean PM2.5 concentration between improved biomass stoves and traditional stoves (P > 0.05). The study revealed that indoor air pollution is a major environmental and health hazard in Addis Ababa homes using biomass fuels. The use of clean fuels and efficient cooking stoves is recommended.
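
    A hedged sketch of the summary statistics reported above, written in Python rather than SPSS: geometric means by fuel group and a one-way ANOVA on log-transformed concentrations. The numbers generated below are synthetic and are not the study's measurements.

    import numpy as np
    from scipy.stats import f_oneway, gmean

    # Synthetic 24-h PM2.5 samples (ug/m^3) for three fuel groups.
    rng = np.random.default_rng(3)
    solid = rng.lognormal(mean=np.log(1100.0), sigma=1.2, size=25)
    kerosene = rng.lognormal(mean=np.log(640.0), sigma=1.4, size=20)
    clean = rng.lognormal(mean=np.log(330.0), sigma=0.9, size=14)

    for name, x in [("solid", solid), ("kerosene", kerosene), ("clean", clean)]:
        print(f"{name:8s} geometric mean PM2.5: {gmean(x):7.0f} ug/m^3")

    # Compare group means on the log scale, analogous to the between-group test.
    f_stat, p_value = f_oneway(np.log(solid), np.log(kerosene), np.log(clean))
    print(f"one-way ANOVA (log scale): F = {f_stat:.1f}, p = {p_value:.3g}")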

  20. Development of a Solid-Oxide Fuel Cell/Gas Turbine Hybrid System Model for Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Freeh, Joshua E.; Pratt, Joseph W.; Brouwer, Jacob

    2004-01-01

    Recent interest in fuel cell-gas turbine hybrid applications for the aerospace industry has led to the need for accurate computer simulation models to aid in system design and performance evaluation. To meet this requirement, solid oxide fuel cell (SOFC) and fuel processor models have been developed and incorporated into the Numerical Propulsion Systems Simulation (NPSS) software package. The SOFC and reformer models solve systems of equations governing steady-state performance using common theoretical and semi-empirical terms. An example hybrid configuration is presented that demonstrates the new capability as well as the interaction with pre-existing gas turbine and heat exchanger models. Finally, a comparison of calculated SOFC performance with experimental data is presented to demonstrate model validity. Keywords: Solid Oxide Fuel Cell, Reformer, System Model, Aerospace, Hybrid System, NPSS

  1. Geometric modeling for computer aided design

    NASA Technical Reports Server (NTRS)

    Schwing, James L.; Olariu, Stephen

    1995-01-01

    The primary goal of this grant has been the design and implementation of software to be used in the conceptual design of aerospace vehicles, particularly focused on the elements of geometric design, graphical user interfaces, and the interaction of the multitude of software typically used in this engineering environment. This has resulted in the development of several analysis packages and design studies. These include two major software systems currently used in the conceptual-level design of aerospace vehicles. These tools are SMART, the Solid Modeling Aerospace Research Tool, and EASIE, the Environment for Software Integration and Execution. Additional software tools were designed and implemented to address the needs of the engineer working in the conceptual design environment. SMART provides conceptual designers with a rapid prototyping capability and several engineering analysis capabilities. In addition, SMART has a carefully engineered user interface that makes it easy to learn and use. Finally, a number of specialty characteristics have been built into SMART which allow it to be used efficiently as a front-end geometry processor for other analysis packages. EASIE provides a set of interactive utilities that simplify the task of building and executing computer aided design systems consisting of diverse, stand-alone analysis codes, streamlining the exchange of data between programs, reducing errors, and improving efficiency. EASIE provides both a methodology and a collection of software tools to ease the task of coordinating engineering design and analysis codes.

  2. Biochemical Network Stochastic Simulator (BioNetS): software for stochastic modeling of biochemical networks.

    PubMed

    Adalsteinsson, David; McMillen, David; Elston, Timothy C

    2004-03-08

    Intrinsic fluctuations due to the stochastic nature of biochemical reactions can have large effects on the response of biochemical networks. This is particularly true for pathways that involve transcriptional regulation, where generally there are two copies of each gene and the number of messenger RNA (mRNA) molecules can be small. Therefore, there is a need for computational tools for developing and investigating stochastic models of biochemical networks. We have developed the software package Biochemical Network Stochastic Simulator (BioNetS) for efficiently and accurately simulating stochastic models of biochemical networks. BioNetS has a graphical user interface that allows models to be entered in a straightforward manner, and allows the user to specify the type of random variable (discrete or continuous) for each chemical species in the network. The discrete variables are simulated using an efficient implementation of the Gillespie algorithm. For the continuous random variables, BioNetS constructs and numerically solves the appropriate chemical Langevin equations. The software package has been developed to scale efficiently with network size, thereby allowing large systems to be studied. BioNetS runs as a BioSpice agent and can be downloaded from http://www.biospice.org. BioNetS also can be run as a stand alone package. All the required files are accessible from http://x.amath.unc.edu/BioNetS. We have developed BioNetS to be a reliable tool for studying the stochastic dynamics of large biochemical networks. Important features of BioNetS are its ability to handle hybrid models that consist of both continuous and discrete random variables and its ability to model cell growth and division. We have verified the accuracy and efficiency of the numerical methods by considering several test systems.
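
    To make the discrete simulation approach concrete, here is a minimal Gillespie stochastic simulation sketch in Python for a birth-death gene expression model (production at rate k, degradation at rate g per molecule). It illustrates the algorithm BioNetS implements for discrete species, but it is not BioNetS code and the rate constants are assumptions.

    import numpy as np

    def gillespie_birth_death(k=2.0, g=0.1, m0=0, t_end=200.0, seed=4):
        """Exact stochastic simulation of production/degradation of one species."""
        rng = np.random.default_rng(seed)
        t, m = 0.0, m0
        times, counts = [t], [m]
        while t < t_end:
            rates = np.array([k, g * m])              # production, degradation
            total = rates.sum()
            t += rng.exponential(1.0 / total)         # waiting time to next event
            if rng.random() < rates[0] / total:       # pick which reaction fires
                m += 1
            else:
                m -= 1
            times.append(t)
            counts.append(m)
        return np.array(times), np.array(counts)

    times, counts = gillespie_birth_death()
    print(f"mean copy number over the trajectory: {counts.mean():.1f} (k/g = 20)")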

  3. Design and Operation of an Electrochemical Methanol Concentration Sensor for Direct Methanol Fuel Cell Systems

    NASA Technical Reports Server (NTRS)

    Narayanan, S. R.; Valdez, T. I.; Chun, W.

    2000-01-01

    The development of a 150-Watt packaged power source based on liquid-feed direct methanol fuel cells is being pursued currently at the Jet Propulsion Laboratory for defense applications. In our studies we find that the concentration of methanol in the fuel circulation loop significantly affects the electrical performance and efficiency of direct methanol fuel cell systems. The practical operation of direct methanol fuel cell systems, therefore, requires accurate monitoring and control of methanol concentration. The present paper reports on the principle and demonstration of an in-house developed electrochemical sensor suitable for direct methanol fuel cell systems.

  4. 48 CFR 908.7109 - Fuels and packaged petroleum products.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    Title 48, Federal Acquisition Regulations System, Department of Energy, Section 908.7109 - Fuels and packaged petroleum products (2011-10-01). Acquisitions of fuel and packaged petroleum products by DOE...

  5. High-Performance Computing for the Electromagnetic Modeling and Simulation of Interconnects

    NASA Technical Reports Server (NTRS)

    Schutt-Aine, Jose E.

    1996-01-01

    The electromagnetic modeling of packages and interconnects plays a very important role in the design of high-speed digital circuits, and is most efficiently performed by using computer-aided design algorithms. In recent years, packaging has become a critical area in the design of high-speed communication systems and fast computers, and the importance of the software support for their development has increased accordingly. Throughout this project, our efforts have focused on the development of modeling and simulation techniques and algorithms that permit the fast computation of the electrical parameters of interconnects and the efficient simulation of their electrical performance.

  6. Temperature-package power correlations for open-mode geologic disposal concepts.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hardin, Ernest.

    2013-02-01

    Logistical simulation of spent nuclear fuel (SNF) management in the U.S. combines storage, transportation and disposal elements to evaluate schedule, cost and other resources needed for all major operations leading to final geologic disposal. Geologic repository reference options are associated with limits on waste package thermal power output at emplacement, in order to meet limits on peak temperature for certain key engineered and natural barriers. These package power limits are used in logistical simulation software such as CALVIN, as threshold requirements that must be met by means of decay storage or SNF blending in waste packages, before emplacement in a repository. Geologic repository reference options include enclosed modes developed for crystalline rock, clay or shale, and salt. In addition, a further need has been addressed for open modes in which SNF can be emplaced in a repository, then ventilated for decades or longer to remove heat, prior to permanent repository closure. For each open mode disposal concept there are specified durations for surface decay storage (prior to emplacement), repository ventilation, and repository closure operations. This study simulates those steps for several timing cases, and for SNF with three fuel-burnup characteristics, to develop package power limits at which waste packages can be emplaced without exceeding specified temperature limits many years later after permanent closure. The results are presented in the form of correlations that span a range of package power and peak postclosure temperature, for each open-mode disposal concept, and for each timing case. Given a particular temperature limit value, the corresponding package power limit for each case can be selected for use in CALVIN and similar tools.

  7. Impact of waste heat recovery systems on energy efficiency improvement of a heavy-duty diesel engine

    NASA Astrophysics Data System (ADS)

    Ma, Zheshu; Chen, Hua; Zhang, Yong

    2017-09-01

    The increase of ship energy utilization efficiency and the reduction of greenhouse gas emissions have been highlighted in recent years and have become an increasingly important subject for ship designers and owners. The International Maritime Organization (IMO) is seeking measures to reduce the CO2 emissions from ships, and their proposed energy efficiency design index (EEDI) and energy efficiency operational indicator (EEOI) aim at ensuring that future vessels will be more efficient. Waste heat recovery can be employed not only to improve energy utilization efficiency but also to reduce greenhouse gas emissions. In this paper, a typical conceptual large container ship employing a low-speed marine diesel engine as the main propulsion machinery is introduced and three possible types of waste heat recovery systems are designed. To calculate the EEDI and EEOI of the given large container ship, two software packages are developed. From the viewpoint of operation and maintenance, lowering the ship speed and improving the container load rate can greatly reduce the EEOI and further reduce total fuel consumption. Although the large container ship itself can reach the IMO EEDI requirements at the first stage, with a reduction factor of 10% below the reference line value, the proposed waste heat recovery systems can improve the ship's EEDI reduction factor to 20% below the reference line value.
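
    As a rough illustration of the operational indicator mentioned above, the Python sketch below computes an EEOI-style figure (grams of CO2 per tonne-nautical-mile) from per-voyage fuel consumption, cargo and distance. The voyage numbers and the heavy-fuel-oil carbon factor are assumptions for illustration; the IMO guidelines define the terms more precisely.

    def eeoi(voyages, cf=3.114):
        """voyages: iterable of (fuel_tonnes, cargo_tonnes, distance_nm);
        cf: CO2 conversion factor in t CO2 per t fuel (value here assumes HFO)."""
        co2_tonnes = sum(fuel * cf for fuel, _, _ in voyages)
        transport_work = sum(cargo * dist for _, cargo, dist in voyages)
        return 1.0e6 * co2_tonnes / transport_work        # g CO2 per tonne-nm

    # Two hypothetical container-ship legs: (fuel t, cargo t, distance nm).
    voyages = [(450.0, 90000.0, 5500.0),
               (430.0, 110000.0, 5500.0)]
    print(f"EEOI ~ {eeoi(voyages):.1f} g CO2 per tonne-nautical-mile")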

  8. GRID INDEPENDENT FUEL CELL OPERATED SMART HOME

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dr. Mohammad S. Alam

    2003-12-07

    A fuel cell power plant, which utilizes a smart energy management and control (SEMaC) system to supply the power needs of a laboratory-based "home", has been purchased and installed. The "home" consists of two rooms, each approximately 250 sq. ft. Every appliance and power outlet is under the control of a host computer running the SEMaC software package. It is possible to override the computer in the event that an appliance or power outage is required. Detailed analysis and simulation of the fuel cell operated smart home has been performed. Two journal papers have been accepted for publication and another journal paper is under review. Three theses have been completed and three additional theses are in progress.

  9. STEFFY-software to calculate nuclide-specific total counting efficiency in well-type γ-ray detectors.

    PubMed

    Pommé, S

    2012-09-01

    A software package is presented to calculate the total counting efficiency for the decay of radionuclides in a well-type γ-ray detector. It is specifically applied to primary standardisation of activity by means of 4πγ-counting with a NaI(Tl) well-type scintillation detector. As an alternative to Monte Carlo simulations, the software combines good accuracy with superior speed and ease-of-use. It is also well suited to investigate uncertainties associated with the 4πγ-counting method for a variety of radionuclides and detector dimensions. In this paper, the underlying analytical models for the radioactive decay and subsequent counting efficiency of the emitted radiation in the detector are summarised. Copyright © 2012 Elsevier Ltd. All rights reserved.
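
    A deliberately over-simplified Python sketch of the quantity being calculated (not STEFFY's analytical model, which handles full decay schemes and coincident emissions): for a single gamma ray, the total counting efficiency is roughly the solid-angle fraction covered by the well multiplied by the probability of at least one interaction along an effective path length in the NaI. All numbers below are assumed for illustration.

    import math

    def total_efficiency(omega_fraction, mu_per_cm, path_cm):
        """omega_fraction: fraction of 4*pi subtended by the detector (near 1
        for a well geometry); mu_per_cm: linear attenuation coefficient of NaI
        at the gamma energy; path_cm: effective path length through the crystal."""
        return omega_fraction * (1.0 - math.exp(-mu_per_cm * path_cm))

    # Illustrative values for a high-geometry NaI(Tl) well counter at ~662 keV.
    eff = total_efficiency(omega_fraction=0.97, mu_per_cm=0.30, path_cm=7.0)
    print(f"approximate total efficiency per emitted gamma: {eff:.2f}")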

  10. An open-source software package for multivariate modeling and clustering: applications to air quality management.

    PubMed

    Wang, Xiuquan; Huang, Guohe; Zhao, Shan; Guo, Junhong

    2015-09-01

    This paper presents an open-source software package, rSCA, which is developed based upon a stepwise cluster analysis method and serves as a statistical tool for modeling the relationships between multiple dependent and independent variables. The rSCA package is efficient in dealing with both continuous and discrete variables, as well as nonlinear relationships between the variables. It divides the sample sets of dependent variables into different subsets (or subclusters) through a series of cutting and merging operations based upon the theory of multivariate analysis of variance (MANOVA). The modeling results are given by a cluster tree, which includes both intermediate and leaf subclusters as well as the flow paths from the root of the tree to each leaf subcluster specified by a series of cutting and merging actions. The rSCA package is a handy and easy-to-use tool and is freely available at http://cran.r-project.org/package=rSCA . By applying the developed package to air quality management in an urban environment, we demonstrate its effectiveness in dealing with the complicated relationships among multiple variables in real-world problems.

  11. Common Data Models and Efficient Reproducible Workflows for Distributed Ocean Model Skill Assessment

    NASA Astrophysics Data System (ADS)

    Signell, R. P.; Snowden, D. P.; Howlett, E.; Fernandes, F. A.

    2014-12-01

    Model skill assessment requires discovery, access, analysis, and visualization of information from both sensors and models, and traditionally has been possible only for a few experts. The US Integrated Ocean Observing System (US-IOOS) consists of 17 Federal Agencies and 11 Regional Associations that produce data from various sensors and numerical models; exactly the information required for model skill assessment. US-IOOS is seeking to develop documented skill assessment workflows that are standardized, efficient, and reproducible so that a much wider community can participate in the use and assessment of model results. Standardization requires common data models for observational and model data. US-IOOS relies on the CF Conventions for observations and structured grid data, and on the UGRID Conventions for unstructured (e.g. triangular) grid data. This allows applications to obtain only the data they require in a uniform and parsimonious way using web services: OPeNDAP for model output and the OGC Sensor Observation Service (SOS) for observed data. Reproducibility is enabled with IPython Notebooks shared on GitHub (http://github.com/ioos). These capture the entire skill assessment workflow, including user input, search, access, analysis, and visualization, ensuring that workflows are self-documenting and reproducible by anyone, using free software. The Python packages required to run the workflows, including the common-data-model packages pyugrid and pyoos and the British Met Office Iris package, are available on GitHub and on Binstar.org so that users can run scenarios using the free Anaconda Python distribution. Hosted services such as Wakari enable anyone to reproduce these workflows for free, without installing any software locally, using just their web browser. We are also experimenting with Wakari Enterprise, which allows multi-user access from a web browser to an IPython Server running where large quantities of model output reside, increasing the efficiency. The open development and distribution of these workflows, and the software on which they depend, is an educational resource for those new to the field and a center of focus where practitioners can contribute new software and ideas.
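
    The sketch below shows the style of standards-based access these workflows depend on: opening an OPeNDAP endpoint with the netCDF4 Python package and reading a subset of a CF-convention variable without downloading the full file. The URL and variable name are placeholders, not actual US-IOOS endpoints.

    from netCDF4 import Dataset

    # Placeholder OPeNDAP URL; substitute a real THREDDS/OPeNDAP endpoint.
    url = "http://example.org/thredds/dodsC/model/ocean_his.nc"

    with Dataset(url) as nc:                  # OPeNDAP URLs open like local files
        temp = nc.variables["temp"]           # assumed CF variable name
        print(temp.dimensions, temp.shape)
        surface = temp[-1, -1, :, :]          # last time step, top layer only
        print(surface.shape)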

  12. Tractor Mechanics: Maintaining and Servicing the Fuel System. Learning Activity Packages 20-33.

    ERIC Educational Resources Information Center

    Clemson Univ., SC. Vocational Education Media Center.

    Learning activity packages are presented for instruction in tractor mechanics. The packages deal with the duties involved in maintaining the fuel system. The following fourteen learning activity packages are included: servicing fuel and air filters, servicing fuel tanks and lines, adjusting a carburetor, servicing a carburetor, servicing the…

  13. Enhancing the Breadth and Efficacy of Therapeutic Vaccines for Breast Cancer

    DTIC Science & Technology

    2014-10-01

    sequence data produced by the Slansky team following their single-cell emulsion RT-PCR technique; however, it can be packaged and shared for use...cell emulsion RT-PCR. Additional modifications were made to our epitope discovery workflow to increase efficacy of transcript and neoantigen candidate...the MiTCR [8] open source software package developed by MiLaboratory. MiTCR is a highly efficient and fast approach to CDR3 extraction, clonotype

  14. MIDAS: Software for the detection and analysis of lunar impact flashes

    NASA Astrophysics Data System (ADS)

    Madiedo, José M.; Ortiz, José L.; Morales, Nicolás; Cabrera-Caño, Jesús

    2015-06-01

    Since 2009 we are running a project to identify flashes produced by the impact of meteoroids on the surface of the Moon. For this purpose we are employing small telescopes and high-sensitivity CCD video cameras. To automatically identify these events a software package called MIDAS was developed and tested. This package can also perform the photometric analysis of these flashes and estimate the value of the luminous efficiency. Besides, we have implemented in MIDAS a new method to establish which is the likely source of the meteoroids (known meteoroid stream or sporadic background). The main features of this computer program are analyzed here, and some examples of lunar impact events are presented.
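
    A toy Python sketch of automated flash detection (much cruder than the MIDAS pipeline, which also performs photometry and meteoroid-source association): flag video frames whose brightest pixel, after subtracting a median reference frame, exceeds a sigma-based threshold. The synthetic frames and the injected flash are invented for illustration.

    import numpy as np

    def detect_flashes(frames, n_sigma=6.0):
        """frames: array of shape (n_frames, ny, nx); returns indices of candidates."""
        reference = np.median(frames, axis=0)            # quasi-static lunar scene
        events = []
        for i, frame in enumerate(frames):
            residual = frame - reference
            if residual.max() > n_sigma * residual.std():
                events.append(i)
        return events

    rng = np.random.default_rng(5)
    frames = rng.normal(100.0, 3.0, size=(200, 64, 64))  # synthetic noise frames
    frames[120, 30, 30] += 80.0                          # injected "impact flash"
    print(detect_flashes(frames))                        # expected to flag frame 120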

  15. Wide-Field Imaging Telescope-0 (WIT0) with automatic observing system

    NASA Astrophysics Data System (ADS)

    Ji, Tae-Geun; Byeon, Seoyeon; Lee, Hye-In; Park, Woojin; Lee, Sang-Yun; Hwang, Sungyong; Choi, Changsu; Gibson, Coyne Andrew; Kuehne, John W.; Prochaska, Travis; Marshall, Jennifer L.; Im, Myungshin; Pak, Soojong

    2018-01-01

    We introduce Wide-Field Imaging Telescope-0 (WIT0), with an automatic observing system. It is developed for monitoring the variabilities of many sources at a time, e.g. young stellar objects and active galactic nuclei. It can also find the locations of transient sources such as a supernova or gamma-ray bursts. In 2017 February, we installed the wide-field 10-inch telescope (Takahashi CCA-250) as a piggyback system on the 30-inch telescope at the McDonald Observatory in Texas, US. The 10-inch telescope has a 2.35 × 2.35 deg field-of-view with a 4k × 4k CCD Camera (FLI ML16803). To improve the observational efficiency of the system, we developed a new automatic observing software, KAOS30 (KHU Automatic Observing Software for McDonald 30-inch telescope), which was developed by Visual C++ on the basis of a windows operating system. The software consists of four control packages: the Telescope Control Package (TCP), the Data Acquisition Package (DAP), the Auto Focus Package (AFP), and the Script Mode Package (SMP). Since it also supports the instruments that are using the ASCOM driver, the additional hardware installations become quite simplified. We commissioned KAOS30 in 2017 August and are in the process of testing. Based on the WIT0 experiences, we will extend KAOS30 to control multiple telescopes in future projects.

  16. Moving code - Sharing geoprocessing logic on the Web

    NASA Astrophysics Data System (ADS)

    Müller, Matthias; Bernard, Lars; Kadner, Daniel

    2013-09-01

    Efficient data processing is a long-standing challenge in remote sensing. Effective and efficient algorithms are required for product generation in ground processing systems, event-based or on-demand analysis, environmental monitoring, and data mining. Furthermore, the increasing number of survey missions and the exponentially growing data volume in recent years have created demand for better software reuse as well as an efficient use of scalable processing infrastructures. Solutions that address both demands simultaneously have begun to slowly appear, but they seldom consider the possibility to coordinate development and maintenance efforts across different institutions, community projects, and software vendors. This paper presents a new approach to share, reuse, and possibly standardise geoprocessing logic in the field of remote sensing. Drawing from the principles of service-oriented design and distributed processing, this paper introduces moving-code packages as self-describing software components that contain algorithmic code and machine-readable descriptions of the provided functionality, platform, and infrastructure, as well as basic information about exploitation rights. Furthermore, the paper presents a lean publishing mechanism by which to distribute these packages on the Web and to integrate them in different processing environments ranging from monolithic workstations to elastic computational environments or "clouds". The paper concludes with an outlook toward community repositories for reusable geoprocessing logic and their possible impact on data-driven science in general.

  17. Stackfile Database

    NASA Technical Reports Server (NTRS)

    deVarvalho, Robert; Desai, Shailen D.; Haines, Bruce J.; Kruizinga, Gerhard L.; Gilmer, Christopher

    2013-01-01

    This software provides storage, retrieval, and analysis functionality for managing satellite altimetry data. It improves on the efficiency and analysis capabilities of existing database software, with greater flexibility and documentation. It offers flexibility in the type of data that can be stored. There is efficient retrieval either across the spatial domain or the time domain. Built-in analysis tools are provided for frequently performed altimetry tasks. This software package is used for storing and manipulating satellite measurement data. It was developed with a focus on handling the requirements of repeat-track altimetry missions such as Topex and Jason. It was, however, designed to work with a wide variety of satellite measurement data (e.g., the Gravity Recovery And Climate Experiment, GRACE). The software consists of several command-line tools for importing, retrieving, and analyzing satellite measurement data.

  18. Thermodynamic and kinetic modelling of fuel oxidation behaviour in operating defective fuel

    NASA Astrophysics Data System (ADS)

    Lewis, B. J.; Thompson, W. T.; Akbari, F.; Thompson, D. M.; Thurgood, C.; Higgs, J.

    2004-07-01

    A theoretical treatment has been developed to predict the fuel oxidation behaviour in operating defective nuclear fuel elements. The equilibrium stoichiometry deviation in the hyper-stoichiometric fuel has been derived from thermodynamic considerations using a self-consistent set of thermodynamic properties for the U-O system, which emphasizes replication of solubilities and three-phase invariant conditions displayed in the U-O binary phase diagram. The kinetics model accounts for multi-phase transport including interstitial oxygen diffusion in the solid and gas-phase transport of hydrogen and steam in the fuel cracks. The fuel oxidation model is further coupled to a heat conduction model to account for the feedback effect of a reduced thermal conductivity in the hyper-stoichiometric fuel. A numerical solution has been developed using a finite-element technique with the FEMLAB software package. The model has been compared to available data from several in-reactor X-2 loop experiments with defective fuel conducted at the Chalk River Laboratories. The model has also been benchmarked against an O/U profile measurement for a spent defective fuel element discharged from a commercial reactor.
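
    A greatly simplified sketch of the transport part of such a treatment (a one-dimensional explicit finite-difference analogue, not the coupled multi-phase FEMLAB model described above): interstitial-oxygen diffusion of the stoichiometry deviation x in UO2+x, with a fixed oxidizing boundary at one surface and a zero-flux condition at the other. The diffusion coefficient, geometry, and time are assumed values for illustration only.

    import numpy as np

    def diffuse_deviation(n=50, thickness_m=6.0e-3, d_coeff=1.0e-12,
                          x_surface=0.05, t_end_s=1.0e7):
        """Explicit 1-D diffusion of stoichiometry deviation x; returns the profile."""
        dr = thickness_m / (n - 1)
        dt = 0.4 * dr**2 / d_coeff                 # stable explicit time step
        x = np.zeros(n)
        x[-1] = x_surface                          # oxidized surface boundary
        for _ in range(int(t_end_s / dt)):
            x[1:-1] += d_coeff * dt / dr**2 * (x[2:] - 2.0 * x[1:-1] + x[:-2])
            x[0] = x[1]                            # zero-flux inner boundary
            x[-1] = x_surface
        return x

    profile = diffuse_deviation()
    print(f"deviation at the inner boundary after 1e7 s: {profile[0]:.4f}")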

  19. Verification of statistical method CORN for modeling of microfuel in the case of high grain concentration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chukbar, B. K., E-mail: bchukbar@mail.ru

    Two methods of modeling a double-heterogeneity fuel are studied: the deterministic positioning and the statistical method CORN of the MCU software package. The effect of distribution of microfuel in a pebble bed on the calculation results is studied. The results of verification of the statistical method CORN for the cases of the microfuel concentration up to 170 cm^-3 in a pebble bed are presented. The admissibility of homogenization of the microfuel coating with the graphite matrix is studied. The dependence of the reactivity on the relative location of fuel and graphite spheres in a pebble bed is found.

  20. IMS software developments for the detection of chemical warfare agent

    NASA Technical Reports Server (NTRS)

    Klepel, ST.; Graefenhain, U.; Lippe, R.; Stach, J.; Starrock, V.

    1995-01-01

    Interference compounds like gasoline, diesel, burning wood or fuel, etc. are present in common battlefield situations. These compounds can cause detectors to respond with a false positive or interfere with the detector's ability to respond to target compounds such as chemical warfare agents. To ensure proper response of the ion mobility spectrometer to chemical warfare agents, two special software packages were developed and incorporated into the Bruker RAID-1. The programs suppress interfering signals caused by car exhaust or smoke gases resulting from burning materials and correct for the influence of variable sample gas humidity, which is important for the detection and quantification of blister agents like mustard gas or lewisite.

  1. The results of pre-design studies on the development of a new design of gas turbine compressor package of GPA-C-16 type

    NASA Astrophysics Data System (ADS)

    Smirnov, A. V.; Chobenko, V. M.; Shcherbakov, O. M.; Ushakov, S. M.; Parafiynyk, V. P.; Sereda, R. M.

    2017-08-01

    The article summarizes the results of an analysis of data concerning the operation of turbocompressor packages at compressor stations of the natural gas transmission system of Ukraine. The basic requirements for gas turbine compressor packages used for the modernization and reconstruction of compressor stations are considered. Using a 16 MW gas turbine package GPA-C-16S/76-1,44M1 as an example, the results of pre-design studies and some technical solutions that improve the energy efficiency, reliability, and environmental performance of gas turbine compressor packages are given. In particular, the article deals with the matching of the performance characteristics of a centrifugal compressor (hereinafter, compressor) and its gas turbine drive to reduce fuel gas consumption; the application of energy-efficient technologies, in particular exhaust gas heat recovery units and gas-oil heat exchangers in the turbocompressor package oil system; and the reduction of carbon monoxide emissions into the atmosphere using a catalytic exhaust system. The described technical solutions can be used for the development of other types of gas turbine compressor packages.

  2. Software requirements flow-down and preliminary software design for the G-CLEF spectrograph

    NASA Astrophysics Data System (ADS)

    Evans, Ian N.; Budynkiewicz, Jamie A.; DePonte Evans, Janet; Miller, Joseph B.; Onyuksel, Cem; Paxson, Charles; Plummer, David A.

    2016-08-01

    The Giant Magellan Telescope (GMT)-Consortium Large Earth Finder (G-CLEF) is a fiber-fed, precision radial velocity (PRV) optical echelle spectrograph that will be the first-light instrument on the GMT. The G-CLEF instrument device control subsystem (IDCS) provides software control of the instrument hardware, including the active feedback loops that are required to meet the G-CLEF PRV stability requirements. The IDCS is also tasked with providing operational support packages that include data reduction pipelines and proposal preparation tools. A formal, but ultimately pragmatic, approach is being used to establish a complete and correct set of requirements for both the G-CLEF device control and operational support packages. The device control packages must integrate tightly with the state-machine-driven software and controls reference architecture designed by the GMT Organization. A model-based systems engineering methodology is being used to develop a preliminary design that meets these requirements. Through this process we have identified some lessons that have general applicability to the development of software for ground-based instrumentation. For example, tasking an individual with overall responsibility for science/software/hardware integration is a key step to ensuring effective integration between these elements. An operational concept document that includes detailed routine and non-routine operational sequences should be prepared in parallel with the hardware design process to tie together these elements and identify any gaps. Appropriate time-phasing of the hardware and software design phases is important, but revisions to driving requirements that impact software requirements and preliminary design are inevitable. Such revisions must be carefully managed to ensure efficient use of resources.

  3. Long-Haul Truck Sleeper Heating Load Reduction Package for Rest Period Idling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lustbader, Jason Aaron; Kekelia, Bidzina; Tomerlin, Jeff

    Annual fuel use for sleeper cab truck rest period idling is estimated at 667 million gallons in the United States, or 6.8% of long-haul truck fuel use. Truck idling during a rest period represents zero freight efficiency and is largely done to supply accessory power for climate conditioning of the cab. The National Renewable Energy Laboratory's CoolCab project aims to reduce heating, ventilating, and air conditioning (HVAC) loads and resulting fuel use from rest period idling by working closely with industry to design efficient long-haul truck thermal management systems while maintaining occupant comfort. Enhancing the thermal performance of cab/sleepers will enable smaller, lighter, and more cost-effective idle reduction solutions. In addition, if the fuel savings provide a one- to three-year payback period, fleet owners will be economically motivated to incorporate them. For candidate idle reduction technologies to be implemented by original equipment manufacturers and fleets, their effectiveness must be quantified. To address this need, several promising candidate technologies were evaluated through experimentation and modeling to determine their effectiveness in reducing rest period HVAC loads. Load reduction strategies were grouped into the focus areas of solar envelope, occupant environment, conductive pathways, and efficient equipment. Technologies in each of these focus areas were investigated in collaboration with industry partners. The most promising of these technologies were then combined with the goal of exceeding a 30% reduction in HVAC loads. These technologies included 'ultra-white' paint, advanced insulation, and advanced curtain design. Previous testing showed more than a 35.7% reduction in air conditioning loads. This paper describes the overall heat transfer coefficient testing of this advanced load reduction technology package that showed more than a 43% reduction in heating load. Adding an additional layer of advanced insulation with a reflective barrier to the thermal load reduction package resulted in a 53.3% reduction in the overall heat transfer coefficient.
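
    A minimal sketch of the metric quantified in these tests: the overall heat transfer coefficient UA from a steady-state heating test (electrical heater power divided by the inside-outside temperature difference) and the percentage reduction between a baseline cab and one with the load reduction package. The power and temperature values below are illustrative only, chosen merely to reproduce a ~53% reduction as an arithmetic check; they are not NREL's measurements.

    def overall_ua(heater_power_w, t_inside_c, t_outside_c):
        """Overall heat transfer coefficient UA in W/K for the whole cab."""
        return heater_power_w / (t_inside_c - t_outside_c)

    # Illustrative steady-state heating tests at the same temperature difference.
    ua_baseline = overall_ua(heater_power_w=900.0, t_inside_c=22.0, t_outside_c=-10.0)
    ua_package = overall_ua(heater_power_w=420.0, t_inside_c=22.0, t_outside_c=-10.0)

    reduction_pct = 100.0 * (1.0 - ua_package / ua_baseline)
    print(f"UA: {ua_baseline:.1f} -> {ua_package:.1f} W/K ({reduction_pct:.1f}% reduction)")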

  4. Long-Haul Truck Sleeper Heating Load Reduction Package for Rest Period Idling: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lustbader, Jason; Kekelia, Bidzina; Tomerlin, Jeff

    Annual fuel use for sleeper cab truck rest period idling is estimated at 667 million gallons in the United States, or 6.8% of long-haul truck fuel use. Truck idling during a rest period represents zero freight efficiency and is largely done to supply accessory power for climate conditioning of the cab. The National Renewable Energy Laboratory's CoolCab project aims to reduce heating, ventilating, and air conditioning (HVAC) loads and resulting fuel use from rest period idling by working closely with industry to design efficient long-haul truck thermal management systems while maintaining occupant comfort. Enhancing the thermal performance of cab/sleepers will enable smaller, lighter, and more cost-effective idle reduction solutions. In addition, if the fuel savings provide a one- to three-year payback period, fleet owners will be economically motivated to incorporate them. For candidate idle reduction technologies to be implemented by original equipment manufacturers and fleets, their effectiveness must be quantified. To address this need, several promising candidate technologies were evaluated through experimentation and modeling to determine their effectiveness in reducing rest period HVAC loads. Load reduction strategies were grouped into the focus areas of solar envelope, occupant environment, conductive pathways, and efficient equipment. Technologies in each of these focus areas were investigated in collaboration with industry partners. The most promising of these technologies were then combined with the goal of exceeding a 30% reduction in HVAC loads. These technologies included 'ultra-white' paint, advanced insulation, and advanced curtain design. Previous testing showed more than a 35.7% reduction in air conditioning loads. This paper describes the overall heat transfer coefficient testing of this advanced load reduction technology package that showed more than a 43% reduction in heating load. Adding an additional layer of advanced insulation with a reflective barrier to the thermal load reduction package resulted in a 53.3% reduction in the overall heat transfer coefficient.

  5. Reusable software parts and the semi-abstract data type

    NASA Technical Reports Server (NTRS)

    Cohen, Sanford G.

    1986-01-01

    The development of reusable software parts has been an area of intense discussion within the software community for many years. An approach is described for developing reusable parts for missile guidance, navigation, and control applications that meet the following criteria: (1) reusable; (2) tailorable; (3) efficient; (4) simple to use; and (5) protected against misuse. Validating the feasibility of developing reusable parts that possess these characteristics is the basis of the Common Ada Missile Packages (CAMP) program. Under CAMP, over 200 reusable software parts were developed, including parts for navigation, Kalman filtering, signal processing, and autopilot functions. Six different methods are presented for designing reusable software parts.

  6. The software system development for the TAMU real-time fan beam scatterometer data processors

    NASA Technical Reports Server (NTRS)

    Clark, B. V.; Jean, B. R.

    1980-01-01

    A software package was designed and written to process in real time any one quadrature channel pair of radar scatterometer signals from the NASA L- or C-Band radar scatterometer systems. The software was successfully tested in the C-Band processor breadboard hardware using recorded radar and NERDAS (NASA Earth Resources Data Annotation System) signals as the input data sources. The processor development program and the overall processor theory of operation and design are described. The real-time processor software system is documented, and the results of the laboratory software tests are presented along with recommendations for the efficient application of the data processing capabilities.
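
    As a rough illustration of the kind of quadrature-pair processing described above, the sketch below averages I^2 + Q^2 over a processing interval to estimate received power; the signal model, sample rate, and tone frequency are assumptions for demonstration only, not the TAMU processor design.

    ```python
    import numpy as np

    # Hypothetical sketch: estimate received power from one quadrature (I/Q)
    # channel pair by averaging I^2 + Q^2 over a processing interval.

    def average_power(i_samples, q_samples):
        """Mean power of a quadrature channel pair."""
        i = np.asarray(i_samples, dtype=float)
        q = np.asarray(q_samples, dtype=float)
        return np.mean(i**2 + q**2)

    # Synthetic test signal: a tone plus noise, split into I and Q components.
    rng = np.random.default_rng(0)
    t = np.arange(0, 1.0, 1e-4)
    i_ch = np.cos(2 * np.pi * 50 * t) + 0.1 * rng.standard_normal(t.size)
    q_ch = np.sin(2 * np.pi * 50 * t) + 0.1 * rng.standard_normal(t.size)
    print(f"Estimated channel power: {average_power(i_ch, q_ch):.3f}")
    ```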

  7. Evaluation and selection of open-source EMR software packages based on integrated AHP and TOPSIS.

    PubMed

    Zaidan, A A; Zaidan, B B; Al-Haiqi, Ahmed; Kiah, M L M; Hussain, Muzammil; Abdulnabi, Mohamed

    2015-02-01

    Evaluating and selecting software packages that meet the requirements of an organization are difficult aspects of the software engineering process. Selecting the wrong open-source EMR software package can be costly and may adversely affect business processes and the functioning of the organization. This study aims to evaluate and select open-source EMR software packages based on multi-criteria decision-making. A hands-on study was performed, and a set of open-source EMR software packages were implemented locally on separate virtual machines to examine the systems more closely. Several measures were specified as the evaluation basis, and the systems were selected based on a set of metric outcomes using an integrated Analytic Hierarchy Process (AHP) and TOPSIS approach. The experimental results showed that GNUmed and OpenEMR scored better in the ranking than the other open-source EMR software packages. Copyright © 2014 Elsevier Inc. All rights reserved.
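
    A minimal sketch of an integrated AHP + TOPSIS ranking of the kind described above: AHP weights are approximated from a pairwise comparison matrix, then TOPSIS closeness coefficients rank the alternatives. The criteria, pairwise judgments, scores, and package names are made-up placeholders, not the data or exact method of the cited study.

    ```python
    import numpy as np

    def ahp_weights(pairwise):
        """Approximate AHP weights as the normalized geometric mean of each row."""
        pairwise = np.asarray(pairwise, dtype=float)
        gm = np.prod(pairwise, axis=1) ** (1.0 / pairwise.shape[1])
        return gm / gm.sum()

    def topsis(scores, weights, benefit):
        """Return TOPSIS closeness coefficients (higher is better)."""
        scores = np.asarray(scores, dtype=float)
        norm = scores / np.linalg.norm(scores, axis=0)        # vector normalization
        weighted = norm * weights
        ideal = np.where(benefit, weighted.max(axis=0), weighted.min(axis=0))
        anti = np.where(benefit, weighted.min(axis=0), weighted.max(axis=0))
        d_best = np.linalg.norm(weighted - ideal, axis=1)
        d_worst = np.linalg.norm(weighted - anti, axis=1)
        return d_worst / (d_best + d_worst)

    # Illustrative pairwise comparisons for three criteria (e.g., usability,
    # interoperability, cost) and scores for three hypothetical EMR packages.
    w = ahp_weights([[1, 3, 5], [1/3, 1, 2], [1/5, 1/2, 1]])
    cc = topsis([[8, 7, 4], [6, 9, 3], [7, 6, 5]], w, benefit=[True, True, False])
    for name, c in zip(["EMR-A", "EMR-B", "EMR-C"], cc):
        print(f"{name}: closeness = {c:.3f}")
    ```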

  8. GenoMatrix: A Software Package for Pedigree-Based and Genomic Prediction Analyses on Complex Traits.

    PubMed

    Nazarian, Alireza; Gezan, Salvador Alejandro

    2016-07-01

    Genomic and pedigree-based best linear unbiased prediction methodologies (G-BLUP and P-BLUP) have proven themselves efficient for partitioning the phenotypic variance of complex traits into its components, estimating the individuals' genetic merits, and predicting unobserved (or yet-to-be-observed) phenotypes in many species and fields of study. The GenoMatrix software, presented here, is a user-friendly package to facilitate the process of using genome-wide marker data and parentage information for G-BLUP and P-BLUP analyses on complex traits. It provides users with a collection of applications that help them with a set of tasks, from performing quality control on data to constructing and manipulating the genomic and pedigree-based relationship matrices and obtaining their inverses. Such matrices will then be used in downstream analyses by other statistical packages. The package also enables users to obtain predicted values for unobserved individuals based on the genetic values of observed related individuals. GenoMatrix is available to the research community as a Windows 64-bit executable and can be downloaded free of charge at: http://compbio.ufl.edu/software/genomatrix/. © The American Genetic Association. 2016. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
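
    A generic sketch of the kind of genomic relationship matrix used in G-BLUP (a VanRaden-type G matrix built from a 0/1/2 marker matrix); this is a textbook construction with synthetic genotypes, not GenoMatrix's implementation.

    ```python
    import numpy as np

    def genomic_relationship_matrix(markers):
        """markers: individuals x SNPs matrix coded as 0, 1, 2 allele counts."""
        m = np.asarray(markers, dtype=float)
        p = m.mean(axis=0) / 2.0                 # allele frequencies per SNP
        z = m - 2.0 * p                          # center by twice the frequency
        denom = 2.0 * np.sum(p * (1.0 - p))
        return z @ z.T / denom

    # Tiny example: 4 individuals, 5 SNPs (synthetic genotypes).
    geno = np.array([[0, 1, 2, 1, 0],
                     [1, 1, 2, 0, 0],
                     [2, 0, 1, 1, 1],
                     [0, 2, 2, 1, 0]])
    print(np.round(genomic_relationship_matrix(geno), 3))
    ```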

  9. Parallel Fortran-MPI software for numerical inversion of the Laplace transform and its application to oscillatory water levels in groundwater environments

    USGS Publications Warehouse

    Zhan, X.

    2005-01-01

    A parallel Fortran-MPI (Message Passing Interface) software for numerical inversion of the Laplace transform based on a Fourier series method is developed to meet the need of solving computationally intensive problems involving oscillatory water-level responses to hydraulic tests in a groundwater environment. The software is a parallel version of ACM (Association for Computing Machinery) Transactions on Mathematical Software (TOMS) Algorithm 796. Running 38 test examples indicated that implementation of MPI techniques with a distributed-memory architecture speeds up the processing and improves the efficiency. Applications to oscillatory water levels in a well during aquifer tests are presented to illustrate how this package can be applied to solve complicated environmental problems involving differential and integral equations. The package is free and is easy to use for people with little or no previous experience in using MPI but who wish to get off to a quick start in parallel computing. © 2004 Elsevier Ltd. All rights reserved.
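
    A serial, hedged sketch of a Fourier-series (Crump/Dubner-Abate type) numerical Laplace inversion of the general family referenced above; the damping parameter, truncation length, and test transform are illustrative choices, and the parallel TOMS Algorithm 796 implementation is considerably more sophisticated.

    ```python
    import numpy as np

    def invert_laplace_fourier(F, t, n_terms=5000, a_parameter=18.4):
        """Approximate f(t) from its Laplace transform F(s) for t > 0."""
        T = 2.0 * t                        # series period; must exceed t
        a = a_parameter / (2.0 * T)        # damping parameter controlling error
        k = np.arange(1, n_terms + 1)
        s = a + 1j * k * np.pi / T
        series = 0.5 * F(a).real + np.sum((F(s) * np.exp(1j * k * np.pi * t / T)).real)
        return np.exp(a * t) / T * series

    # Check against a known pair: L{sin(t)} = 1 / (s^2 + 1).
    for t in (0.5, 1.0, 2.0):
        approx = invert_laplace_fourier(lambda s: 1.0 / (s**2 + 1.0), t)
        print(f"t={t}: approx={approx:.4f}, exact={np.sin(t):.4f}")
    ```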

  10. Software for the Integration of Multiomics Experiments in Bioconductor.

    PubMed

    Ramos, Marcel; Schiffer, Lucas; Re, Angela; Azhar, Rimsha; Basunia, Azfar; Rodriguez, Carmen; Chan, Tiffany; Chapman, Phil; Davis, Sean R; Gomez-Cabrero, David; Culhane, Aedin C; Haibe-Kains, Benjamin; Hansen, Kasper D; Kodali, Hanish; Louis, Marie S; Mer, Arvind S; Riester, Markus; Morgan, Martin; Carey, Vince; Waldron, Levi

    2017-11-01

    Multiomics experiments are increasingly commonplace in biomedical research and add layers of complexity to experimental design, data integration, and analysis. R and Bioconductor provide a generic framework for statistical analysis and visualization, as well as specialized data classes for a variety of high-throughput data types, but methods are lacking for integrative analysis of multiomics experiments. The MultiAssayExperiment software package, implemented in R and leveraging Bioconductor software and design principles, provides for the coordinated representation of, storage of, and operation on multiple diverse genomics data. We provide the unrestricted multiple 'omics data for each cancer tissue in The Cancer Genome Atlas as ready-to-analyze MultiAssayExperiment objects and demonstrate in these and other datasets how the software simplifies data representation, statistical analysis, and visualization. The MultiAssayExperiment Bioconductor package reduces major obstacles to efficient, scalable, and reproducible statistical analysis of multiomics data and enhances data science applications of multiple omics datasets. Cancer Res; 77(21); e39-42. ©2017 American Association for Cancer Research.

  11. Development problem analysis of correlation leak detector’s software

    NASA Astrophysics Data System (ADS)

    Faerman, V. A.; Avramchuk, V. S.; Marukyan, V. M.

    2018-05-01

    In the article, the practical application and structure of correlation leak detector software are examined and the task of designing such software is analyzed. The first part of the paper makes the case for developing correlation leak detectors to improve the operating efficiency of public utility networks. The functional structure of correlation leak detectors is analyzed and the tasks of their software are defined. The second part examines several steps in the development of the software package, namely requirements definition, program structure definition, and software concept creation, in light of experience with a hardware-software prototype of a correlation leak detector.
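
    A minimal sketch of the core computation such a detector performs: estimate the arrival-time difference of the leak noise at two sensors by cross-correlation and convert it to a leak position. The pipe length, wave speed, sample rate, and synthetic signals below are illustrative assumptions, not the hardware or software described in the paper.

    ```python
    import numpy as np

    def leak_position(sig_a, sig_b, fs_hz, pipe_length_m, wave_speed_mps):
        """Locate a leak between sensors A and B from their noise recordings."""
        a = np.asarray(sig_a, float) - np.mean(sig_a)
        b = np.asarray(sig_b, float) - np.mean(sig_b)
        xcorr = np.correlate(a, b, mode="full")
        # Arrival-time difference (B minus A) in samples, from the peak lag.
        delay_samples = (len(b) - 1) - np.argmax(xcorr)
        tau = delay_samples / fs_hz
        # Leak distance from sensor A: d_a = (L - c * tau) / 2
        return (pipe_length_m - wave_speed_mps * tau) / 2.0

    # Synthetic demo: the leak noise reaches sensor B 5 ms later than sensor A.
    fs, n = 10_000, 5_000
    rng = np.random.default_rng(1)
    noise = rng.standard_normal(n + 100)
    sig_a = noise[50:50 + n]          # earlier arrival at A
    sig_b = noise[0:n]                # delayed copy at B (50 samples = 5 ms)
    d = leak_position(sig_a, sig_b, fs, 100.0, 1000.0)
    print(f"Estimated leak position: {d:.1f} m from sensor A")
    ```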

  12. Deferred Maintenance.

    ERIC Educational Resources Information Center

    DeLong, Richard A.

    1984-01-01

    Unusually hard hit by the 1970s recession, the University of Michigan accumulated more deferred maintenance problems than could be analyzed efficiently either by hand or with existing computer systems. Using an existing microcomputer and a database management software package, the maintenance service developed its own database to support…

  13. A Roadmap to Continuous Integration for ATLAS Software Development

    NASA Astrophysics Data System (ADS)

    Elmsheuser, J.; Krasznahorkay, A.; Obreshkov, E.; Undrus, A.; ATLAS Collaboration

    2017-10-01

    The ATLAS software infrastructure facilitates the efforts of more than 1000 developers working on a code base of 2200 packages with 4 million lines of C++ and 1.4 million lines of Python code. The ATLAS offline code management system is a powerful, flexible framework for processing new package version requests, probing code changes in the Nightly Build System, migrating to new platforms and compilers, deploying production releases for worldwide access, and supporting physicists with tools and interfaces for efficient software use. It maintains a multi-stream, parallel development environment with about 70 multi-platform branches of nightly releases and provides vast opportunities for testing new packages, for verifying patches to existing software, and for migrating to new platforms and compilers. The system's evolution is currently aimed at the adoption of modern continuous integration (CI) practices focused on building nightly releases early and often, with rigorous unit and integration testing. This paper describes the CI incorporation program for the ATLAS software infrastructure. It brings modern open source tools such as Jenkins and GitLab into the ATLAS Nightly System, rationalizes hardware resource allocation and administrative operations, and provides improved feedback and the means for developers to fix broken builds promptly. Once adopted, ATLAS CI practices will improve and accelerate innovation cycles and result in increased confidence in new software deployments. The paper reports the status of Jenkins integration with the ATLAS Nightly System as well as short- and long-term plans for the incorporation of CI practices.

  14. Report of AAPM Task Group 162: Software for planar image quality metrology.

    PubMed

    Samei, Ehsan; Ikejimba, Lynda C; Harrawood, Brian P; Rong, John; Cunningham, Ian A; Flynn, Michael J

    2018-02-01

    The AAPM Task Group 162 aimed to provide a standardized approach for the assessment of image quality in planar imaging systems. This report offers a description of the approach as well as the details of the resultant software bundle to measure detective quantum efficiency (DQE) as well as its basis components and derivatives. The methodology and the associated software include the characterization of the noise power spectrum (NPS) from planar images acquired under specific acquisition conditions, modulation transfer function (MTF) using an edge test object, the DQE, and effective DQE (eDQE). First, a methodological framework is provided to highlight the theoretical basis of the work. Then, a step-by-step guide is included to assist in proper execution of each component of the code. Lastly, an evaluation of the method is included to validate its accuracy against model-based and experimental data. The code was built using a Macintosh OSX operating system. The software package contains all the source codes to permit an experienced user to build the suite on a Linux or other *nix type system. The package further includes manuals and sample images and scripts to demonstrate use of the software for new users. The results of the code are in close alignment with theoretical expectations and published results of experimental data. The methodology and the software package offered in AAPM TG162 can be used as baseline for characterization of inherent image quality attributes of planar imaging systems. © 2017 American Association of Physicists in Medicine.
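
    A simplified sketch of the standard relation tying these quantities together: DQE(f) = MTF(f)^2 / (q * NNPS(f)), where NNPS is the noise power spectrum normalized by the squared mean signal and q is the incident photon fluence. This is the generic textbook form, not the TG162 code itself, and the MTF, NPS, and fluence values below are illustrative.

    ```python
    import numpy as np

    def dqe(mtf, nps, mean_signal, fluence_per_mm2):
        """Return DQE(f) from matched MTF and NPS arrays."""
        nnps = np.asarray(nps, float) / mean_signal**2      # normalized NPS, mm^2
        return np.asarray(mtf, float) ** 2 / (fluence_per_mm2 * nnps)

    # Illustrative inputs: a smoothly falling MTF and a nearly white NPS.
    f = np.linspace(0.05, 3.0, 10)              # spatial frequency, cycles/mm
    mtf = np.exp(-0.6 * f)
    nps = np.full_like(f, 8.0)                  # arbitrary units^2 * mm^2
    print(np.round(dqe(mtf, nps, mean_signal=1.0e3, fluence_per_mm2=2.5e5), 3))
    ```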

  15. Klusters, NeuroScope, NDManager: a free software suite for neurophysiological data processing and visualization.

    PubMed

    Hazan, Lynn; Zugaro, Michaël; Buzsáki, György

    2006-09-15

    Recent technological advances now allow for simultaneous recording of large populations of anatomically distributed neurons in behaving animals. The free software package described here was designed to help neurophysiologists process and view recorded data in an efficient and user-friendly manner. This package consists of several well-integrated applications, including NeuroScope (http://neuroscope.sourceforge.net), an advanced viewer for electrophysiological and behavioral data with limited editing capabilities; Klusters (http://klusters.sourceforge.net), a graphical cluster cutting application for manual and semi-automatic spike sorting; and NDManager, an experimental parameter and data processing manager. All of these programs are distributed under the GNU General Public License (GPL, see http://www.gnu.org/licenses/gpl.html), which gives users legal permission to copy, distribute and/or modify the software. Also included are extensive user manuals and sample data, as well as source code and documentation.

  16. Packaging Software Assets for Reuse

    NASA Astrophysics Data System (ADS)

    Mattmann, C. A.; Marshall, J. J.; Downs, R. R.

    2010-12-01

    The reuse of existing software assets such as code, architecture, libraries, and modules in current software and systems development projects can provide many benefits, including reduced costs, in time and effort, and increased reliability. Many reusable assets are currently available in various online catalogs and repositories, usually broken down by disciplines such as programming language (Ibiblio for Maven/Java developers, PyPI for Python developers, CPAN for Perl developers, etc.). The way these assets are packaged for distribution can play a role in their reuse - an asset that is packaged simply and logically is typically easier to understand, install, and use, thereby increasing its reusability. A well-packaged asset has advantages in being more reusable and thus more likely to provide benefits through its reuse. This presentation will discuss various aspects of software asset packaging and how they can affect the reusability of the assets. The characteristics of well-packaged software will be described. A software packaging domain model will be introduced, and some existing packaging approaches examined. An example case study of a Reuse Enablement System (RES), currently being created by near-term Earth science decadal survey missions, will provide information about the use of the domain model. Awareness of these factors will help software developers package their reusable assets so that they can provide the most benefits for software reuse.

  17. Performance of a natural gas fuel processor for residential PEFC system using a novel CO preferential oxidation catalyst

    NASA Astrophysics Data System (ADS)

    Echigo, Mitsuaki; Shinke, Norihisa; Takami, Susumu; Tabata, Takeshi

    Natural gas fuel processors have been developed for 500 W and 1 kW class residential polymer electrolyte fuel cell (PEFC) systems. These fuel processors contain all the elements—desulfurizers, steam reformers, CO shift converters, CO preferential oxidation (PROX) reactors, steam generators, burners and heat exchangers—in one package. For the PROX reactor, a single-stage PROX process using a novel PROX catalyst was adopted. In the 1 kW class fuel processor, a thermal efficiency of 83% at HHV was achieved at nominal output, assuming an H2 utilization rate in the cell stack of 76%. A CO concentration below 1 ppm in the product gas was achieved even under the condition of [O2]/[CO] = 1.5 at the PROX reactor. The long-term durability of the fuel processor was demonstrated, with almost no deterioration in thermal efficiency or CO concentration over 10,000 h of operation, 1000 start-and-stop cycles, and 25,000 load-change cycles.
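
    A minimal sketch of the HHV-basis thermal efficiency figure quoted here: the higher heating value of the hydrogen delivered divided by that of the natural gas consumed. The molar flow rates below are illustrative assumptions chosen to land near the reported value, not measured data from this fuel processor.

    ```python
    # Higher heating values (standard textbook values, kJ/mol).
    HHV_H2_KJ_PER_MOL = 285.8
    HHV_CH4_KJ_PER_MOL = 890.4

    def thermal_efficiency_hhv(h2_mol_per_s, ch4_mol_per_s):
        """HHV-basis efficiency = H2 chemical energy out / CH4 chemical energy in."""
        return (h2_mol_per_s * HHV_H2_KJ_PER_MOL) / (ch4_mol_per_s * HHV_CH4_KJ_PER_MOL)

    # Illustrative flows: 0.026 mol/s H2 produced from 0.010 mol/s total CH4 feed
    # (reforming plus burner fuel) gives an efficiency of roughly 83%.
    eta = thermal_efficiency_hhv(0.026, 0.010)
    print(f"Thermal efficiency (HHV basis): {eta:.1%}")
    ```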

  18. Substructured multibody molecular dynamics.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grest, Gary Stephen; Stevens, Mark Jackson; Plimpton, Steven James

    2006-11-01

    We have enhanced our parallel molecular dynamics (MD) simulation software LAMMPS (Large-scale Atomic/Molecular Massively Parallel Simulator, lammps.sandia.gov) to include many new features for accelerated simulation including articulated rigid body dynamics via coupling to the Rensselaer Polytechnic Institute code POEMS (Parallelizable Open-source Efficient Multibody Software). We use new features of the LAMMPS software package to investigate rhodopsin photoisomerization, and water model surface tension and capillary waves at the vapor-liquid interface. Finally, we motivate the recipes of MD for practitioners and researchers in numerical analysis and computational mechanics.

  19. A parallel multi-domain solution methodology applied to nonlinear thermal transport problems in nuclear fuel pins

    DOE PAGES

    Philip, Bobby; Berrill, Mark A.; Allu, Srikanth; ...

    2015-01-26

    We describe an efficient and nonlinearly consistent parallel solution methodology for solving coupled nonlinear thermal transport problems that occur in nuclear reactor applications over hundreds of individual 3D physical subdomains. Efficiency is obtained by leveraging knowledge of the physical domains, the physics on individual domains, and the couplings between them for preconditioning within a Jacobian-Free Newton-Krylov method. Details of the computational infrastructure that enabled this work, namely the open source Advanced Multi-Physics (AMP) package developed by the authors, are described. Details of the verification and validation experiments and of the parallel performance analysis in weak and strong scaling studies demonstrating the achieved efficiency of the algorithm are presented. Moreover, numerical experiments demonstrate that the preconditioner developed is independent of the number of fuel subdomains in a fuel rod, which is particularly important when simulating different types of fuel rods. Finally, we demonstrate the power of the coupling methodology by considering problems with couplings between surface and volume physics and coupling of nonlinear thermal transport in fuel rods to an external radiation transport code.
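
    A single-domain, matrix-free sketch of the Jacobian-Free Newton-Krylov idea applied to a nonlinear heat conduction problem, using SciPy's newton_krylov solver. The AMP methodology above couples many subdomains and adds physics-based preconditioning; the grid size, conductivity law, source, and boundary values here are illustrative assumptions.

    ```python
    import numpy as np
    from scipy.optimize import newton_krylov

    N = 100
    x = np.linspace(0.0, 1.0, N)
    h = x[1] - x[0]
    source = 50.0 * np.ones(N)                 # volumetric heat source (arbitrary)

    def residual(u):
        """Residual of -d/dx( k(u) du/dx ) = source with k(u) = 1 + u^2."""
        k = 1.0 + u**2
        r = np.zeros_like(u)
        k_face = 0.5 * (k[1:] + k[:-1])        # arithmetic-mean face conductivity
        flux = k_face * (u[1:] - u[:-1]) / h
        r[1:-1] = -(flux[1:] - flux[:-1]) / h - source[1:-1]
        r[0] = u[0] - 0.0                      # Dirichlet boundary, u(0) = 0
        r[-1] = u[-1] - 1.0                    # Dirichlet boundary, u(1) = 1
        return r

    u0 = np.linspace(0.0, 1.0, N)              # initial guess
    u = newton_krylov(residual, u0, f_tol=1e-7)   # Newton outer, Krylov inner, no explicit Jacobian
    print(f"max temperature: {u.max():.4f}")
    ```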

  20. Computer Science and Technology: Introduction to Software Packages

    DTIC Science & Technology

    1984-04-01

    ...consideration should be given to the acquisition of appropriate software packages to replace or upgrade existing services and to provide services not... Consequently, there are many companies that produce only software packages, and are committed to providing training, service, and support. These vendors...

  1. The International Atomic Energy Agency software package for the analysis of scintigraphic renal dynamic studies: a tool for the clinician, teacher, and researcher.

    PubMed

    Zaknun, John J; Rajabi, Hossein; Piepsz, Amy; Roca, Isabel; Dondi, Maurizio

    2011-01-01

    Under the auspices of the International Atomic Energy Agency, a new-generation, platform-independent, x86-compatible software package was developed for the analysis of scintigraphic renal dynamic imaging studies. It provides nuclear medicine professionals cost-free access to the most recent developments in the field. The software package is a step toward harmonization and standardization. Embedded functionalities render it a suitable tool for education, for research, and for receiving distant experts' opinions. Another objective of this effort is to allow the introduction of clinically useful parameters of drainage, including normalized residual activity and outflow efficiency. Furthermore, it provides an effective teaching tool for young professionals who are being introduced to dynamic kidney studies, through selected teaching case studies. The software facilitates a better understanding by letting users practically explore different variables and settings and their effect on the numerical results. An effort was made to introduce quality assurance instruments at the various levels of the program's execution, including visual inspection, automatic detection and correction of patient motion, automatic placement of regions of interest around the kidneys and cortical regions, and placement of a reproducible background region on both the primary dynamic and the postmicturition studies. The user can calculate the differential renal function through two independent methods, the integral or the Rutland-Patlak approach. Standardized digital reports, storage and retrieval of regions of interest, and built-in database operations allow the generation and tracing of full image reports and of numerical outputs. The software package is undergoing quality assurance procedures to verify the accuracy and the interuser reproducibility with the final aim of launching the program for use by professionals and teaching institutions worldwide. Copyright © 2011 Elsevier Inc. All rights reserved.
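
    A hedged sketch of a Rutland-Patlak estimate of differential renal function, one of the two methods mentioned above: for each kidney, background-subtracted kidney counts divided by the vascular (input) counts are regressed against the integral of the vascular curve divided by the vascular counts, and each kidney's fitted slope (uptake constant) is expressed as a share of the total. The synthetic curves and uptake constants below are placeholders, not the IAEA package's implementation or data.

    ```python
    import numpy as np

    def patlak_slope(t, kidney, vascular):
        """Slope of the Rutland-Patlak plot (uptake constant) for one kidney."""
        cum = np.concatenate(([0.0], np.cumsum(np.diff(t) * 0.5 * (vascular[1:] + vascular[:-1]))))
        x = cum / vascular
        y = kidney / vascular
        slope, _intercept = np.polyfit(x, y, 1)
        return slope

    def differential_renal_function(t, left, right, vascular):
        sl = patlak_slope(t, left, vascular)
        sr = patlak_slope(t, right, vascular)
        return 100.0 * sl / (sl + sr), 100.0 * sr / (sl + sr)

    # Synthetic uptake-phase curves (counts per frame, background already subtracted).
    t = np.linspace(60.0, 120.0, 13)                      # seconds
    vasc = 1000.0 * np.exp(-0.004 * (t - 60.0)) + 200.0   # vascular input curve
    cum_vasc = np.concatenate(([0.0], np.cumsum(np.diff(t) * 0.5 * (vasc[1:] + vasc[:-1]))))
    left = 0.0009 * cum_vasc + 0.3 * vasc                 # kidney = uptake + blood pool
    right = 0.0006 * cum_vasc + 0.3 * vasc
    drf_l, drf_r = differential_renal_function(t, left, right, vasc)
    print(f"Differential renal function: left {drf_l:.1f}%, right {drf_r:.1f}%")
    ```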

  2. Design Optimization Toolkit: Users' Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aguilo Valentin, Miguel Alejandro

    The Design Optimization Toolkit (DOTk) is a stand-alone C++ software package intended to solve complex design optimization problems. The DOTk software package provides a range of solution methods suited for gradient/nongradient-based optimization, large-scale constrained optimization, and topology optimization. DOTk was designed to have a flexible user interface that allows easy access to DOTk solution methods from external engineering software packages. This flexibility makes DOTk minimally intrusive to other engineering software packages. As part of this flexibility, the DOTk software package provides an easy-to-use MATLAB interface that enables users to call DOTk solution methods directly from the MATLAB command window.

  3. Selection of software for mechanical engineering undergraduates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cheah, C. T.; Yin, C. S.; Halim, T.

    A major problem with the undergraduate mechanical engineering course is the limited exposure of students to software packages, coupled with the long learning curve of the existing packages. This work proposes the use of appropriate software packages across the entire mechanical engineering curriculum to ensure students get sufficient exposure to real-life design problems. A variety of software packages are highlighted as being suitable for undergraduate work in mechanical engineering, e.g., solvers for simultaneous non-linear equations; uncertainty analysis; 3-D modeling software with FEA; and analysis tools for the solution of problems in thermodynamics, fluid mechanics, mechanical system design, and solid mechanics.

  4. An Object-Oriented Serial DSMC Simulation Package

    NASA Astrophysics Data System (ADS)

    Liu, Hongli; Cai, Chunpei

    2011-05-01

    A newly developed three-dimensional direct simulation Monte Carlo (DSMC) simulation package, named GRASP ("Generalized Rarefied gAs Simulation Package"), is reported in this paper. This package utilizes the concept of a simulation engine, many C++ features, and software design patterns. The package has an open architecture, which benefits further development and maintenance of the code. In order to reduce the engineering time for three-dimensional models, a hybrid grid scheme, combined with a flexible data structure implemented in C++, is used in this package. This scheme utilizes a local data structure based on the computational cell to achieve high performance on workstation processors. This data structure allows the DSMC algorithm to be parallelized very efficiently with domain decomposition, and it provides much flexibility in terms of grid types. This package can utilize traditional structured, unstructured, or hybrid grids within the framework of a single code to model arbitrarily complex geometries and to simulate rarefied gas flows. Benchmark test cases indicate that this package has satisfactory accuracy for complex rarefied gas flows.

  5. An Integration, Long Range Planning, and Migration Guide for the Stock Point Logistics Integrated Communications Project.

    DTIC Science & Technology

    1986-03-01

    and universal terminal/printer interface mapping (TMAP) software. When the Burroughs HYPERchannel software package (i.e., Burroughs NETEX) provided...and terminal device and security functions placed under the control of the FDC's SAS/TMAP processes. Without processing efficiency enhancements, TAPS...FDC's SAS/TMAP processes. As was also previously indicated, the performance of TAPS II on TANDEM is poor today, and there are questions as to whether

  6. NASA Tech Briefs, February 2008

    NASA Technical Reports Server (NTRS)

    2008-01-01

    Topics discussed include: Optical Measurement of Mass Flow of a Two-Phase Fluid; Selectable-Tip Corrosion-Testing Electrochemical Cell; Piezoelectric Bolt Breakers and Bolt Fatigue Testers; Improved Measurement of B22 of Macromolecules in a Flow Cell; Measurements by a Vector Network Analyzer at 325 to 508 GHz; Using Light to Treat Mucositis and Help Wounds Heal; Increasing Discharge Capacities of Li-(CF)n Cells; Dot-in-Well Quantum-Dot Infrared Photodetectors; Integrated Microbatteries for Implantable Medical Devices; Oxidation Behavior of Carbon Fiber-Reinforced Composites; GIDEP Batching Tool; Generic Spacecraft Model for Real-Time Simulation; Parallel-Processing Software for Creating Mosaic Images; Software for Verifying Image-Correlation Tie Points; Flexcam Image Capture Viewing and Spot Tracking; Low-Pt-Content Anode Catalyst for Direct Methanol Fuel Cells; Graphite/Cyanate Ester Face Sheets for Adaptive Optics; Atomized BaF2-CaF7 for Better-Flowing Plasma-Spray Feedstock; Nanophase Nickel-Zirconium Alloys for Fuel Cells; Vacuum Packaging of MEMS With Multiple Internal Seal Rings; Compact Two-Dimensional Spectrometer Optics; and Fault-Tolerant Coding for State Machines.

  7. An efficient approach to the deployment of complex open source information systems

    PubMed Central

    Cong, Truong Van Chi; Groeneveld, Eildert

    2011-01-01

    Complex open source information systems are usually implemented as component-based software to inherit the available functionality of existing software packages developed by third parties. Consequently, the deployment of these systems not only requires the installation of operating system, application framework and the configuration of services but also needs to resolve the dependencies among components. The problem becomes more challenging when the application must be installed and used on different platforms such as Linux and Windows. To address this, an efficient approach using the virtualization technology is suggested and discussed in this paper. The approach has been applied in our project to deploy a web-based integrated information system in molecular genetics labs. It is a low-cost solution to benefit both software developers and end-users. PMID:22102770

  8. Design and simulation of a novel high-efficiency cooling heat-sink structure using fluid-thermodynamics

    NASA Astrophysics Data System (ADS)

    Hongqi, Jing; Li, Zhong; Yuxi, Ni; Junjie, Zhang; Suping, Liu; Xiaoyu, Ma

    2015-10-01

    A novel high-efficiency cooling mini-channel heat-sink structure has been designed to meet the packaging technology demands of high power density laser diode array stacks. Thermal and water flow characteristics have been simulated using the Ansys-Fluent software. Owing to the increased effective cooling area, this mini-channel heat-sink structure has a better cooling effect when compared with traditional macro-channel heat-sinks. Owing to the lower flow velocity in this novel high-efficiency cooling structure, the chillers' water-pressure requirement is reduced. Meanwhile, the machining process of this high-efficiency cooling mini-channel heat-sink structure is simple and the cost is relatively low; it also offers high durability and a long lifetime. This heat-sink is an ideal choice for the packaging of high power density laser diode array stacks. Project supported by the Defense Industrial Technology Development Program (No. B1320133033).

  9. Efficient Predictions of Excited State for Nanomaterials Using Aces 3 and 4

    DTIC Science & Technology

    2017-12-20

    by first-principles methods in the software package ACES using large parallel computers, growing to the exascale. Subject terms: computer modeling, excited states, optical properties, structure, stability, activation barriers, first-principles methods, parallel computing. Progress with new density functional methods...

  10. UQTools: The Uncertainty Quantification Toolbox - Introduction and Tutorial

    NASA Technical Reports Server (NTRS)

    Kenny, Sean P.; Crespo, Luis G.; Giesy, Daniel P.

    2012-01-01

    UQTools is the short name for the Uncertainty Quantification Toolbox, a software package designed to efficiently quantify the impact of parametric uncertainty on engineering systems. UQTools is a MATLAB-based software package and was designed to be discipline independent, employing very generic representations of the system models and uncertainty. Specifically, UQTools accepts linear and nonlinear system models and permits arbitrary functional dependencies between the system's measures of interest and the probabilistic or non-probabilistic parametric uncertainty. One of the most significant features incorporated into UQTools is the theoretical development centered on homothetic deformations and their application to set bounding and approximating failure probabilities. Beyond the set bounding technique, UQTools provides a wide range of probabilistic and uncertainty-based tools to solve key problems in science and engineering.

  11. D-GENIES: dot plot large genomes in an interactive, efficient and simple way.

    PubMed

    Cabanettes, Floréal; Klopp, Christophe

    2018-01-01

    Dot plots are widely used to quickly compare sequence sets. They provide a synthetic similarity overview, highlighting repetitions, breaks and inversions. Different tools have been developed to easily generate genomic alignment dot plots, but they are often limited in the input sequence size. D-GENIES is a standalone and web application performing large genome alignments using the minimap2 software package and generating interactive dot plots. It enables users to sort query sequences along the reference, zoom in the plot and download several image, alignment or sequence files. D-GENIES is an easy-to-install, open-source software package (GPL) developed in Python and JavaScript. The source code is available at https://github.com/genotoul-bioinfo/dgenies and it can be tested at http://dgenies.toulouse.inra.fr/.
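
    A hedged sketch of the kind of processing D-GENIES automates: invoke minimap2 (assumed to be installed and on PATH) and draw a simple static dot plot from the resulting PAF alignments with matplotlib. The FASTA file names are placeholders, the plot assumes a single reference and a single query sequence (no per-contig offsets), and this is not D-GENIES's own code.

    ```python
    import subprocess
    import matplotlib.pyplot as plt

    def run_minimap2(target_fa, query_fa, out_paf):
        """Align query against target and write PAF output."""
        with open(out_paf, "w") as out:
            subprocess.run(["minimap2", "-x", "asm5", target_fa, query_fa],
                           stdout=out, check=True)

    def plot_paf(paf_path, out_png="dotplot.png"):
        """Plot each alignment block as a segment: target position vs query position."""
        fig, ax = plt.subplots(figsize=(6, 6))
        with open(paf_path) as fh:
            for line in fh:
                c = line.rstrip("\n").split("\t")
                q_start, q_end = int(c[2]), int(c[3])   # PAF columns 3-4: query start/end
                strand = c[4]
                t_start, t_end = int(c[7]), int(c[8])   # PAF columns 8-9: target start/end
                if strand == "-":
                    q_start, q_end = q_end, q_start      # draw reverse hits with negative slope
                ax.plot([t_start, t_end], [q_start, q_end], lw=0.8)
        ax.set_xlabel("target position (bp)")
        ax.set_ylabel("query position (bp)")
        fig.savefig(out_png, dpi=150)

    run_minimap2("reference.fa", "assembly.fa", "aln.paf")
    plot_paf("aln.paf")
    ```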

  12. Reliability and accuracy of three imaging software packages used for 3D analysis of the upper airway on cone beam computed tomography images.

    PubMed

    Chen, Hui; van Eijnatten, Maureen; Wolff, Jan; de Lange, Jan; van der Stelt, Paul F; Lobbezoo, Frank; Aarab, Ghizlane

    2017-08-01

    The aim of this study was to assess the reliability and accuracy of three different imaging software packages for three-dimensional analysis of the upper airway using CBCT images. To assess the reliability of the software packages, 15 NewTom 5G ® (QR Systems, Verona, Italy) CBCT data sets were randomly and retrospectively selected. Two observers measured the volume, minimum cross-sectional area and the length of the upper airway using Amira ® (Visage Imaging Inc., Carlsbad, CA), 3Diagnosys ® (3diemme, Cantu, Italy) and OnDemand3D ® (CyberMed, Seoul, Republic of Korea) software packages. The intra- and inter-observer reliability of the upper airway measurements were determined using intraclass correlation coefficients and Bland & Altman agreement tests. To assess the accuracy of the software packages, one NewTom 5G ® CBCT data set was used to print a three-dimensional anthropomorphic phantom with known dimensions to be used as the "gold standard". This phantom was subsequently scanned using a NewTom 5G ® scanner. Based on the CBCT data set of the phantom, one observer measured the volume, minimum cross-sectional area, and length of the upper airway using Amira ® , 3Diagnosys ® , and OnDemand3D ® , and compared these measurements with the gold standard. The intra- and inter-observer reliability of the measurements of the upper airway using the different software packages were excellent (intraclass correlation coefficient ≥0.75). There was excellent agreement between all three software packages in volume, minimum cross-sectional area and length measurements. All software packages underestimated the upper airway volume by -8.8% to -12.3%, the minimum cross-sectional area by -6.2% to -14.6%, and the length by -1.6% to -2.9%. All three software packages offered reliable volume, minimum cross-sectional area and length measurements of the upper airway. The length measurements of the upper airway were the most accurate results in all software packages. All software packages underestimated the upper airway dimensions of the anthropomorphic phantom.
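
    A minimal sketch of the accuracy comparison against a printed phantom of known dimensions: the percent difference between each package's measurement and the gold standard. The package names and volumes below are hypothetical placeholders, not the study's measurements.

    ```python
    def percent_difference(measured, gold_standard):
        """Signed percent difference of a measurement relative to the gold standard."""
        return 100.0 * (measured - gold_standard) / gold_standard

    # Illustrative upper-airway volumes (cm^3) against a phantom of known 30.0 cm^3.
    gold = 30.0
    measurements = {"PackageA": 27.4, "PackageB": 26.3, "PackageC": 27.0}
    for package, volume in measurements.items():
        print(f"{package}: {percent_difference(volume, gold):+.1f}% vs phantom")
    ```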

  13. Compact gasoline fuel processor for passenger vehicle APU

    NASA Astrophysics Data System (ADS)

    Severin, Christopher; Pischinger, Stefan; Ogrzewalla, Jürgen

    Due to the increasing demand for electrical power in today's passenger vehicles, and with the requirements regarding fuel consumption and environmental sustainability tightening, a fuel cell-based auxiliary power unit (APU) becomes a promising alternative to the conventional generation of electrical energy via internal combustion engine, generator and battery. It is obvious that the on-board stored fuel has to be used for the fuel cell system; thus, gasoline or diesel has to be reformed on board. This makes the auxiliary power unit a complex integrated system of stack, air supply, fuel processor, electrics as well as heat and water management. Aside from proving the technical feasibility of such a system, the development has to address three major barriers: start-up time, costs, and size/weight of the systems. In this paper a packaging concept for an auxiliary power unit is presented. The main emphasis is placed on the fuel processor, as good packaging of this large subsystem has the strongest impact on overall size. The fuel processor system consists of an autothermal reformer in combination with water-gas shift and selective oxidation stages, based on adiabatic reactors with inter-cooling. The configuration was realized in a laboratory set-up and experimentally investigated. The results gained from this confirm a general suitability for mobile applications. A start-up time of 30 min was measured, while a potential reduction to 10 min seems feasible. An overall fuel processor efficiency of about 77% was measured. On the basis of the know-how gained by the experimental investigation of the laboratory set-up, a packaging concept was developed. Using state-of-the-art catalyst and heat exchanger technology, the volumes of these components are fixed. However, the overall volume is higher mainly due to mixing zones and flow ducts, which do not contribute to the chemical or thermal function of the system. Thus, the concept developed mainly focuses on minimization of those component volumes. Therefore, the packaging utilizes rectangular catalyst bricks and integrates flow ducts into the heat exchangers. A concept is presented with a 25 l fuel processor volume, including thermal isolation, for a 3 kWel auxiliary power unit. The overall size of the system, i.e. including stack, air supply and auxiliaries, can be estimated at 44 l.

  14. Design requirements for SRB production control system. Volume 3: Package evaluation, modification and hardware

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The software package evaluation was designed to analyze commercially available, field-proven production control or manufacturing resource planning management technology and software packages. The analysis was conducted by comparing SRB production control software requirements and the conceptual system design to software package capabilities. The methodology of evaluation and the findings at each stage of evaluation are described. Topics covered include: vendor listing; request for information (RFI) document; RFI response rate and quality; RFI evaluation process; and capabilities versus requirements.

  15. Validation of the new code package APOLLO2.8 for accurate PWR neutronics calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Santamarina, A.; Bernard, D.; Blaise, P.

    2013-07-01

    This paper summarizes the qualification work performed to demonstrate the accuracy of the new APOLLO2.8/SHEM-MOC package, based on the JEFF3.1.1 nuclear data file, for the prediction of PWR neutronics parameters. This experimental validation is based on PWR mock-up critical experiments performed in the EOLE/MINERVE zero-power reactors and on P.I.E.s on spent fuel assemblies from the French PWRs. The calculation-experiment comparison for the main design parameters is presented: reactivity of UOX and MOX lattices, depletion calculation and fuel inventory, reactivity loss with burnup, pin-by-pin power maps, Doppler coefficient, Moderator Temperature Coefficient, Void coefficient, UO2-Gd2O3 poisoning worth, efficiency of Ag-In-Cd and B4C control rods, and reflector saving for both the standard 2-cm baffle and the GEN3 advanced thick SS reflector. From this qualification process, calculation biases and associated uncertainties are derived. This code package APOLLO2.8 is already implemented in the ARCADIA new AREVA calculation chain for core physics and is currently under implementation in the future neutronics package of the French utility Electricite de France. (authors)

  16. Design of an integrated fuel processor for residential PEMFCs applications

    NASA Astrophysics Data System (ADS)

    Seo, Yu Taek; Seo, Dong Joo; Jeong, Jin Hyeok; Yoon, Wang Lai

    KIER has been developing a novel fuel processing system to provide hydrogen-rich gas to a residential PEMFC system. For the effective design of a compact hydrogen production system, the unit processes for steam reforming and water gas shift, a steam generator, and internal heat exchangers are thermally and physically integrated into a single packaged hardware system. The newly designed fuel processor (prototype II) showed a thermal efficiency of 78% on an HHV basis with a methane conversion of 89%. The preferential oxidation unit, with its two-stage cascade reactors, reduces the CO concentration to below 10 ppm, the prerequisite CO limit for the PEMFC stack, without complicated temperature control hardware. After achieving the initial performance of the fuel processor, partial-load operation was carried out to test its performance and reliability at various loads. The stability of the fuel processor was also demonstrated for three successive days with a stable product gas composition and thermal efficiency. The CO concentration remained below 10 ppm during the test period, confirming the stable performance of the two-stage PrOx reactors.

  17. Development of compact fuel processor for 2 kW class residential PEMFCs

    NASA Astrophysics Data System (ADS)

    Seo, Yu Taek; Seo, Dong Joo; Jeong, Jin Hyeok; Yoon, Wang Lai

    Korea Institute of Energy Research (KIER) has been developing a novel fuel processing system to provide hydrogen-rich gas to a residential polymer electrolyte membrane fuel cell (PEMFC) cogeneration system. For the effective design of a compact hydrogen production system, the unit processes of steam reforming and high- and low-temperature water gas shift, the steam generator, and internal heat exchangers are thermally and physically integrated into a packaged hardware system. Several prototypes are under development; the prototype I fuel processor showed a thermal efficiency of 73% on an HHV basis with a methane conversion of 81%. The recently tested prototype II showed improved performance, with a thermal efficiency of 76% and a methane conversion of 83%. In both prototypes, two-stage PrOx reactors reduce the CO concentration to less than 10 ppm, the prerequisite CO limit of the product gas for the PEMFC stack. After confirming the initial performance of the prototype I fuel processor, it was coupled with a PEMFC single cell to test durability; the fuel processor operated successfully for 3 days without any failure of the fuel cell voltage. The prototype II fuel processor also showed stable performance during the durability test.

  18. Methodology and Software for Gross Defect Detection of Spent Nuclear Fuel at the Atucha-I Reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sitaraman, Shivakumar; Ham, Young S.; Gharibyan, Narek

    At the Atucha-I pressurized heavy water reactor in Argentina, fuel assemblies in the spent fuel pools are stored by suspending them in two vertically stacked layers. This introduces the unique problem of verifying the presence of fuel in either layer without physically moving the fuel assemblies. Since much of the fuel is very old, Cerenkov viewing devices are often not very useful even for the top layer. Given that the facility uses both natural uranium and slightly enriched uranium at 0.85 w% 235U, and has been in operation since 1974, a wide range of burnups and cooling times can exist in any given pool. A spent fuel neutron counting tool consisting of a fission chamber, SFNC, has been used at the site to verify the presence of fuel up to burnups of 8000 MWd/t. At higher discharge burnups, up to 11,000 MWd/t, the existing signal processing software of the tool was found to fail due to non-linearity of the source term with burnup. A new Graphical User Interface software package based on the LabVIEW platform was developed to predict expected neutron signals covering all ranges of burnups and cooling times and establish maps of expected signals at various pool locations. The algorithm employed in the software uses a set of transfer functions in a 47-energy group structure which are coupled with a 47-energy group neutron source spectrum based on various cooling times and burnups for each of the two enrichment levels. The database of the software consists of these transfer functions for the three different inter-assembly pitches that the fuel is stored in at the site. The transfer functions were developed for a 6 by 6 matrix of fuel assemblies with the detector placed at the center surrounded by four near neighbors, eight next nearest neighbors and so on for the 36 assemblies. These calculations were performed using Monte Carlo radiation transport methods. The basic methodology consisted of starting sources in each of the assemblies and tallying the contribution to the detector by a single neutron in each of the 47 energy groups used. Thus for the single existing symmetric pitch in the pools, where the vertical and horizontal separations are equal, only 6 sets of transfer functions are required. For the two asymmetrical pitches, nine sets of transfer functions are stored. In addition, source spectra at burnups ranging from 4000 to 20000 MWd/t and cooling times up to 40 years are stored. These source terms were established based on CANDU 37-rod fuel that is very similar to the Atucha fuel. Linear interpolation is used by the software for both burnup and cooling time to establish source terms at any intermediate condition. Using the burnup, cooling time and initial enrichment of the surrounding assemblies, a set of source strengths in the 47-group structure for each of the 36 assemblies is established and multiplied group-wise with the appropriate transfer function set. The grand total over the 47 groups for all 36 assemblies is the predicted signal at the detector. The software was initially calibrated against a set of typically 5-6 measurements chosen from among the measured data at each level of the six pools, and calibration factors were established. The set used for calibration is chosen such that it is fairly representative of the range of spent fuel assembly characteristics present in each level. Once established, these calibration factors can be repeatedly used for verification purposes.
Recalibration will be required if the hardware or pool configuration has changed. It will also be required if a long enough time has elapsed since they were established, thus making a cooling time correction necessary. The objective of the inspection is to detect missing fuel from one or more nearest neighbors of the detector. During the verification mode of the software, the predicted and measured signals are compared and the inspector is alerted if the difference between the two signals is beyond a set tolerance limit. Based on the uncertainties associated with both the calculations and measurements, a lower limit of the tolerance will be 15% with an upper limit of 20%. For the most part a 20% tolerance limit will be able to detect a missing assembly since in the vast majority of cases the drop in signal due to a single missing nearest neighbor assembly will be in the range 24-27%. The software was benchmarked against an extensive set of measured data taken at the site in 2004. Overall, 326 data points were examined and the prediction of the calibrated software was compared to the measurements within a set tolerance of ±20%. Of these, 283 of the predicted signals representing 87% of the total matched the measured data within ±10%. A further 27 or 8% were in the range of ±10-15% and 8 or 2.5% were in the range of ±15-20%. Thus, 97.5% of the data matched the measurements within the set tolerance limit of 20%, with 95% matching measured data with the lowest allowed tolerance limit of ±15%. The remaining 2.5% had measured signals that were very different from those at locations with very similar surrounding assemblies and the cause of these discrepancies could not be ascertained from the measurement logs. In summary, 97.5% of the predictions matched the measurements within the set 20% tolerance limit providing proof of the robustness of the software. This software package linked to SFNC will be deployed at the site and will enhance the capability of gross defect verification for the whole range of burnup, cooling time and initial enrichments of the spent fuel being discharged into the various pools at the Atucha-I reactor site.
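
    A hedged sketch of the signal-prediction step described above: each neighboring assembly's 47-group source spectrum, interpolated in burnup and cooling time, is multiplied group-wise with its stored 47-group transfer function to the detector, summed over groups and assemblies, and scaled by a calibration factor. The interpolation grids, transfer functions, and source library below are synthetic placeholders, not the site's data or the actual software.

    ```python
    import numpy as np

    N_GROUPS, N_NEIGHBORS = 47, 36
    rng = np.random.default_rng(42)

    # Placeholder transfer functions: detector response per source neutron,
    # one 47-group vector per assembly position in the 6x6 neighborhood.
    transfer = rng.uniform(1e-6, 1e-4, size=(N_NEIGHBORS, N_GROUPS))

    # Placeholder source library on a (burnup, cooling time) grid, n/s per group.
    burnup_grid = np.array([4000.0, 8000.0, 12000.0, 16000.0, 20000.0])   # MWd/t
    cooling_grid = np.array([1.0, 5.0, 10.0, 20.0, 40.0])                 # years
    source_lib = rng.uniform(1e3, 1e5, size=(burnup_grid.size, cooling_grid.size, N_GROUPS))

    def source_spectrum(burnup, cooling):
        """Bilinear interpolation of the 47-group source spectrum, group by group."""
        spectrum = np.empty(N_GROUPS)
        for g in range(N_GROUPS):
            by_burnup = [np.interp(cooling, cooling_grid, source_lib[i, :, g])
                         for i in range(burnup_grid.size)]
            spectrum[g] = np.interp(burnup, burnup_grid, by_burnup)
        return spectrum

    def predicted_signal(assemblies, calibration=1.0):
        """assemblies: list of (burnup MWd/t, cooling years) for the 36 neighbors."""
        total = 0.0
        for position, (bu, ct) in enumerate(assemblies):
            total += np.dot(transfer[position], source_spectrum(bu, ct))
        return calibration * total

    neighbors = [(9000.0 + 200.0 * i, 12.0) for i in range(N_NEIGHBORS)]
    print(f"Predicted detector signal (arbitrary units): {predicted_signal(neighbors):.3e}")
    ```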

  19. Arrow 227: Air transport system design simulation

    NASA Technical Reports Server (NTRS)

    Bontempi, Michael; Bose, Dave; Brophy, Georgeann; Cashin, Timothy; Kanarios, Michael; Ryan, Steve; Peterson, Timothy

    1992-01-01

    The Arrow 227 is a student-designed commercial transport for use in an overnight package delivery network. The major goal of the concept was to provide the delivery service with the greatest potential return on investment. The design objectives of the Arrow 227 were based on three parameters: production cost, payload weight, and aerodynamic efficiency. Low production cost helps to reduce the initial investment. Increased payload weight allows for a decrease in flight cycles and, therefore, less fuel consumption than an aircraft carrying less payload weight and requiring more flight cycles. In addition, fewer flight cycles will allow a fleet to last longer. Finally, increased aerodynamic efficiency in the form of high L/D will decrease fuel consumption.

  20. Improved heavy-duty vehicle fuel efficiency in India, benefits, costs and environmental impacts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gopal, Anand R.; Karali, Nihan; Sharpe, Ben

    The main objectives of this analysis are to examine the benefits and costs of fuel-saving technologies for new heavy-duty vehicles (HDVs) in India over the next 10 years and to explore how various scenarios for the deployment of vehicles with these technologies will impact petroleum consumption and carbon dioxide (CO2) emissions over the next three decades. The study team developed simulation models for three representative HDV types—a 40-tonne tractor-trailer, 25-tonne rigid truck, and 16-tonne transit bus—based on top-selling vehicle models in the Indian market. The baseline technology profiles for all three vehicles were developed using India-specific engine data and vehicle specification information from manufacturer literature and input from industry experts. For each of the three vehicles we developed a comprehensive set of seven efficiency technology packages drawing from five major areas: engine, transmission and driveline, tires, aerodynamics, and weight reduction. Our analysis finds that India has substantial opportunity to improve HDV fuel efficiency levels using cost-effective technologies. Results from our simulation modeling of three representative HDV types—a tractor-trailer, rigid truck, and transit bus—reveal that per-vehicle fuel consumption reductions between roughly 20% and 35% are possible with technologies that provide a return on the initial capital investment within 1 to 2 years. Though most of these technologies are currently unavailable in India, experiences in other more advanced markets such as the US and EU suggest that with sufficient incentives and robust regulatory design, significant progress can be made in developing and deploying efficiency technologies that can provide real-world fuel savings for new commercial vehicles in India over the next 10 years. Bringing HDVs in India up to world-class technology levels will yield substantial petroleum and GHG reductions. By 2030, the fuel and CO2 reductions of the scenarios range from 10% to 34%, and at the end of 2050, these reductions grow to 13% and 41%. If we constrain the analysis to select the most efficient technology package that provides the fleets with payback times of 3 years or less, there are annual fleet-wide savings of roughly 11 MTOE of diesel and 34 MMT of CO2 in 2030, and this grows to 31 MTOE and 97 MMT by 2050.
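
    A minimal sketch of the simple payback criterion referenced above (incremental technology cost divided by annual fuel-cost savings). All input numbers are illustrative assumptions, not figures from the study.

    ```python
    def simple_payback_years(incremental_cost, annual_km, baseline_l_per_100km,
                             fuel_saving_fraction, fuel_price_per_l):
        """Years for fuel savings to repay the technology package's extra cost."""
        baseline_fuel_l = annual_km * baseline_l_per_100km / 100.0
        annual_savings = baseline_fuel_l * fuel_saving_fraction * fuel_price_per_l
        return incremental_cost / annual_savings

    # Illustrative tractor-trailer: 100,000 km/yr, 35 L/100 km baseline,
    # a 25% fuel saving, diesel at 60 INR/L, and a 600,000 INR package cost.
    years = simple_payback_years(600_000, 100_000, 35.0, 0.25, 60.0)
    print(f"Simple payback: {years:.1f} years")
    ```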

  1. FluxPyt: a Python-based free and open-source software for 13C-metabolic flux analyses.

    PubMed

    Desai, Trunil S; Srivastava, Shireesh

    2018-01-01

    13C-Metabolic flux analysis (MFA) is a powerful approach to estimate intracellular reaction rates which could be used in strain analysis and design. Processing and analysis of labeling data for calculation of fluxes and associated statistics is an essential part of MFA. However, various software currently available for data analysis employ proprietary platforms and thus limit accessibility. We developed FluxPyt, a Python-based truly open-source software package for conducting stationary 13C-MFA data analysis. The software is based on the efficient elementary metabolite unit framework. The standard deviations in the calculated fluxes are estimated using the Monte-Carlo analysis. FluxPyt also automatically creates flux maps based on a template for visualization of the MFA results. The flux distributions calculated by FluxPyt for two separate models: a small tricarboxylic acid cycle model and a larger Corynebacterium glutamicum model, were found to be in good agreement with those calculated by a previously published software. FluxPyt was tested in Microsoft™ Windows 7 and 10, as well as in Linux Mint 18.2. The availability of a free and open 13C-MFA software that works in various operating systems will enable more researchers to perform 13C-MFA and to further modify and develop the package.

  2. FluxPyt: a Python-based free and open-source software for 13C-metabolic flux analyses

    PubMed Central

    Desai, Trunil S.

    2018-01-01

    13C-Metabolic flux analysis (MFA) is a powerful approach to estimate intracellular reaction rates which could be used in strain analysis and design. Processing and analysis of labeling data for calculation of fluxes and associated statistics is an essential part of MFA. However, various software currently available for data analysis employ proprietary platforms and thus limit accessibility. We developed FluxPyt, a Python-based truly open-source software package for conducting stationary 13C-MFA data analysis. The software is based on the efficient elementary metabolite unit framework. The standard deviations in the calculated fluxes are estimated using the Monte-Carlo analysis. FluxPyt also automatically creates flux maps based on a template for visualization of the MFA results. The flux distributions calculated by FluxPyt for two separate models: a small tricarboxylic acid cycle model and a larger Corynebacterium glutamicum model, were found to be in good agreement with those calculated by a previously published software. FluxPyt was tested in Microsoft™ Windows 7 and 10, as well as in Linux Mint 18.2. The availability of a free and open 13C-MFA software that works in various operating systems will enable more researchers to perform 13C-MFA and to further modify and develop the package. PMID:29736347

  3. Ultra Clean 1.1MW High Efficiency Natural Gas Engine Powered System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zurlo, James; Lueck, Steve

    Dresser, Inc. (GE Energy, Waukesha gas engines) will develop, test, demonstrate, and commercialize a 1.1 Megawatt (MW) natural gas fueled combined heat and power reciprocating engine powered package. This package will feature a total efficiency > 75% and ultra low CARB permitting emissions. Our modular design will cover the 1 – 6 MW size range, and this scalable technology can be used in both smaller and larger engine powered CHP packages. To further advance one of the key advantages of reciprocating engines, the engine, generator and CHP package will be optimized for low initial and operating costs. Dresser, Inc. will leverage the knowledge gained in the DOE - ARES program. Dresser, Inc. will work with commercial, regulatory, and government entities to help break down barriers to wider deployment of CHP. The outcome of this project will be a commercially successful 1.1 MW CHP package with high electrical and total efficiency that will significantly reduce emissions compared to the current central power plant paradigm. Principal objectives by phases for Budget Period 1 include: • Phase 1 – market study to determine optimum system performance, target first cost, lifecycle cost, and creation of a detailed product specification. • Phase 2 – Refinement of the Waukesha CHP system design concepts, identification of critical characteristics, initial evaluation of technical solutions, and risk mitigation plans.

  4. TOUGH2_MP: A parallel version of TOUGH2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Keni; Wu, Yu-Shu; Ding, Chris

    2003-04-09

    TOUGH2_MP is a massively parallel version of TOUGH2. It was developed for running on distributed-memory parallel computers to simulate large problems that may not be solvable with the standard, single-CPU TOUGH2 code. The new code implements an efficient massively parallel scheme while preserving the full capacity and flexibility of the original TOUGH2 code. The new software uses the METIS software package for grid partitioning and the AZTEC software package for linear-equation solving. The standard message-passing interface is adopted for communication among processors. Numerical performance of the current version of the code has been tested on CRAY-T3E and IBM RS/6000 SP platforms. In addition, the parallel code has been successfully applied to real field problems of multi-million-cell simulations for three-dimensional multiphase and multicomponent fluid and heat flow, as well as solute transport. In this paper, we review the development of TOUGH2_MP and discuss its basic features, modules, and their applications.

  5. Savannah River Site Spent Nuclear Fuel Management Final Environmental Impact Statement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    N /A

    The proposed DOE action considered in this environmental impact statement (EIS) is to implement appropriate processes for the safe and efficient management of spent nuclear fuel and targets at the Savannah River Site (SRS) in Aiken County, South Carolina, including placing these materials in forms suitable for ultimate disposition. Options to treat, package, and store this material are discussed. The material included in this EIS consists of approximately 68 metric tons heavy metal (MTHM) of spent nuclear fuel: 20 MTHM of aluminum-based spent nuclear fuel at SRS, as much as 28 MTHM of aluminum-clad spent nuclear fuel from foreign and domestic research reactors to be shipped to SRS through 2035, and 20 MTHM of stainless-steel- or zirconium-clad spent nuclear fuel and some americium/curium targets stored at SRS. Alternatives considered in this EIS encompass a range of new packaging, new processing, and conventional processing technologies, as well as the No Action Alternative. A preferred alternative is identified in which DOE would prepare about 97% by volume (about 60% by mass) of the aluminum-based fuel for disposition using a melt-and-dilute treatment process. The remaining 3% by volume (about 40% by mass) would be managed using chemical separation. Impacts are assessed primarily in the areas of water resources, air resources, public and worker health, waste management, socioeconomics, and cumulative impacts.

  6. A comparison of six software packages for evaluation of solid lung nodules using semi-automated volumetry: what is the minimum increase in size to detect growth in repeated CT examinations.

    PubMed

    de Hoop, Bartjan; Gietema, Hester; van Ginneken, Bram; Zanen, Pieter; Groenewegen, Gerard; Prokop, Mathias

    2009-04-01

    We compared interexamination variability of CT lung nodule volumetry with six currently available semi-automated software packages to determine the minimum change needed to detect the growth of solid lung nodules. We had ethics committee approval. To simulate a follow-up examination with zero growth, we performed two low-dose unenhanced CT scans in 20 patients referred for pulmonary metastases. Between examinations, patients got off and on the table. Volumes of all pulmonary nodules were determined on both examinations using six nodule evaluation software packages. Variability (upper limit of the 95% confidence interval of the Bland-Altman plot) was calculated for nodules for which segmentation was visually rated as adequate. We evaluated 214 nodules (mean diameter 10.9 mm, range 3.3-30.0 mm). Software packages provided adequate segmentation in 71% to 86% of nodules (p < 0.001). Where segmentation was adequate, variability in volumetry between scans ranged from 16.4% to 22.3% for the various software packages. Variability in five of the six software packages was significantly less for nodules ≥8 mm in diameter (range 12.9%-17.1%) than for nodules <8 mm (range 18.5%-25.6%). Segmented volumes of each package were compared to each of the other packages. Systematic volume differences were detected in 11/15 comparisons. This hampers comparison of nodule volumes between software packages.
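
    The variability metric above can be sketched as follows (the volumes are invented, not the study's data): relative volume differences between two zero-growth scans are summarized with Bland-Altman 95% limits of agreement, and apparent growth smaller than the upper limit cannot be distinguished from measurement noise.

      # Sketch of the repeatability metric described above: relative volume
      # differences between two zero-growth scans, summarized by Bland-Altman
      # 95% limits of agreement. All values are invented for illustration.
      import numpy as np

      v_scan1 = np.array([512., 260., 1180., 95., 730.])   # nodule volumes, mm^3
      v_scan2 = np.array([530., 248., 1215., 101., 705.])  # same nodules, re-scan

      rel_diff = 100.0 * (v_scan2 - v_scan1) / ((v_scan1 + v_scan2) / 2.0)

      bias = rel_diff.mean()
      sd = rel_diff.std(ddof=1)
      upper_loa = bias + 1.96 * sd   # growth below this is within measurement noise
      lower_loa = bias - 1.96 * sd

      print(f"bias = {bias:.1f}%, 95% LoA = [{lower_loa:.1f}%, {upper_loa:.1f}%]")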

  7. Software and package applicating for network meta-analysis: A usage-based comparative study.

    PubMed

    Xu, Chang; Niu, Yuming; Wu, Junyi; Gu, Huiyun; Zhang, Chao

    2017-12-21

    To compare and analyze the characteristics and functions of software applications for network meta-analysis (NMA). PubMed, EMbase, The Cochrane Library, the official websites of Bayesian inference Using Gibbs Sampling (BUGS), Stata and R, and Google were searched to collect the software and packages for performing NMA; software and packages published up to March 2016 were included. After collecting the software, packages, and their user guides, we used the software and packages to calculate a typical example. All characteristics, functions, and computed results were compared and analyzed. Ten types of software were included, spanning both programming and non-programming tools. They were developed mainly based on Bayesian or frequentist theory. Most types of software have the characteristics of easy operation, easy mastery, exact calculation, or excellent graphing. However, there was no single software that performed accurate calculations with superior graphing; this could only be achieved through the combination of two or more types of software. This study suggests that the user should choose the appropriate software according to personal programming background, operational habits, and financial ability. The combination of BUGS and R (or Stata) can then be considered for performing the NMA. © 2017 Chinese Cochrane Center, West China Hospital of Sichuan University and John Wiley & Sons Australia, Ltd.

  8. Streaming fragment assignment for real-time analysis of sequencing experiments

    PubMed Central

    Roberts, Adam; Pachter, Lior

    2013-01-01

    We present eXpress, a software package for highly efficient probabilistic assignment of ambiguously mapping sequenced fragments. eXpress uses a streaming algorithm with linear run time and constant memory use. It can determine abundances of sequenced molecules in real time, and can be applied to ChIP-seq, metagenomics and other large-scale sequencing data. We demonstrate its use on RNA-seq data, showing greater efficiency than other quantification methods. PMID:23160280
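
    The flavor of streaming probabilistic assignment can be conveyed with a toy sketch (this is not eXpress's actual algorithm or data model; targets and fragments are invented): each ambiguously mapped fragment is split across its compatible targets in proportion to the current abundance estimates, which are updated online so memory stays constant in the number of fragments.

      # Toy illustration (not eXpress itself) of streaming "soft" fragment assignment.
      from collections import defaultdict

      abundance = defaultdict(lambda: 1.0)   # pseudo-count prior per target

      def process_fragment(compatible_targets):
          weights = [abundance[t] for t in compatible_targets]
          total = sum(weights)
          for target, w in zip(compatible_targets, weights):
              abundance[target] += w / total   # fractional assignment, updated online

      # Stream of fragments, each listing the targets it maps to (invented data).
      stream = [["tA"], ["tA", "tB"], ["tA", "tB"], ["tB", "tC"], ["tA"]]
      for frag in stream:
          process_fragment(frag)

      norm = sum(abundance.values())
      print({t: round(a / norm, 3) for t, a in abundance.items()})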

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, William Eugene

    These slides describe different strategies for installing Python software. Although I am a big fan of Python software development, robust strategies for software installation remain a challenge. This talk describes several different installation scenarios. The Good: the user has administrative privileges - Installing on Windows with an installer executable, Installing with a Linux application utility, Installing a Python package from the PyPI repository, and Installing a Python package from source. The Bad: the user does not have administrative privileges - Using a virtual environment to isolate package installations, and Using an installer executable on Windows with a virtual environment. The Ugly: the user needs to install an extension package from source - Installing a Python extension package from source, and PyCoinInstall - Managing builds for Python extension packages. The last item, PyCoinInstall, describes a utility being developed for the COIN-OR software, which is used within the operations research community. COIN-OR includes a variety of Python and C++ software packages, and this script uses a simple plug-in system to support the management of package builds and installation.
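
    A minimal sketch of the "no administrative privileges" scenario above, using only the Python standard library (the paths and the package name are illustrative, not from the slides):

      # Create an isolated virtual environment and install a package into it,
      # without touching the system-wide site-packages.
      import subprocess
      import sys
      import venv
      from pathlib import Path

      env_dir = Path.home() / "sandbox-env"           # user-writable location (illustrative)
      venv.EnvBuilder(with_pip=True).create(env_dir)  # build the isolated environment

      # Install by invoking the environment's own interpreter.
      env_python = env_dir / ("Scripts" if sys.platform == "win32" else "bin") / "python"
      subprocess.run([str(env_python), "-m", "pip", "install", "some-package"], check=True)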

  10. QUENCH: A software package for the determination of quenching curves in Liquid Scintillation counting.

    PubMed

    Cassette, Philippe

    2016-03-01

    In Liquid Scintillation Counting (LSC), the scintillating source is part of the measurement system, and its detection efficiency varies with the scintillator used, the vial, the volume, and the chemistry of the sample. The detection efficiency is generally determined using a quenching curve, describing, for a specific radionuclide, the relationship between a quenching index given by the counter and the detection efficiency. A set of quenched LS standard sources is prepared by adding a quenching agent, and the quenching index and detection efficiency are determined for each source. A simple formula is then fitted to the experimental points to define the quenching curve function. The paper describes a software package specifically devoted to the determination of quenching curves with uncertainties. The experimental measurements are described by their quenching index and detection efficiency, with uncertainties on both quantities. Random Gaussian fluctuations of these experimental measurements are sampled, and a polynomial or logarithmic function is fitted to each fluctuation by χ² minimization. This Monte Carlo procedure is repeated many times, and finally the arithmetic mean and the experimental standard deviation of each parameter are calculated, together with the covariances between these parameters. Using these parameters, the detection efficiency corresponding to an arbitrary quenching index within the measured range can be calculated. The associated uncertainty is calculated with the law of propagation of variances, including the covariance terms. Copyright © 2015 Elsevier Ltd. All rights reserved.
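
    The Monte-Carlo fitting procedure can be sketched as follows (illustrative only, not the QUENCH code itself; the calibration points and uncertainties are invented):

      # Sample Gaussian fluctuations of the calibration points, fit a polynomial
      # to each realization, then propagate the parameter covariance to the
      # efficiency at an arbitrary quenching index.
      import numpy as np

      rng = np.random.default_rng(1)

      q = np.array([300., 400., 500., 600., 700.])      # quenching index (invented)
      eff = np.array([0.95, 0.90, 0.82, 0.71, 0.58])    # detection efficiency (invented)
      u_q, u_eff = 5.0, 0.01                            # uncertainties on both quantities

      params = []
      for _ in range(2000):
          q_i = q + rng.normal(0.0, u_q, q.size)        # fluctuate both coordinates
          e_i = eff + rng.normal(0.0, u_eff, eff.size)
          params.append(np.polyfit(q_i, e_i, 2))        # quadratic quenching curve

      params = np.array(params)
      p_mean = params.mean(axis=0)
      p_cov = np.cov(params, rowvar=False)

      # Efficiency with uncertainty at an arbitrary quenching index q0,
      # using the law of propagation of variances including covariances.
      q0 = 550.0
      jac = np.array([q0**2, q0, 1.0])
      eff_q0 = np.polyval(p_mean, q0)
      u_eff_q0 = np.sqrt(jac @ p_cov @ jac)

      print(f"efficiency at q={q0}: {eff_q0:.3f} +/- {u_eff_q0:.3f}")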

  11. Depleted uranium as a backfill for nuclear fuel waste package

    DOEpatents

    Forsberg, Charles W.

    1998-01-01

    A method for packaging spent nuclear fuel for long-term disposal in a geological repository. At least one spent nuclear fuel assembly is first placed in an unsealed waste package and a depleted uranium fill material is added to the waste package. The depleted uranium fill material comprises flowable particles having a size sufficient to substantially fill any voids in and around the assembly and contains isotopically-depleted uranium in the +4 valence state in an amount sufficient to inhibit dissolution of the spent nuclear fuel from the assembly into a surrounding medium and to lessen the potential for nuclear criticality inside the repository in the event of failure of the waste package. Last, the waste package is sealed, thereby substantially reducing the release of radionuclides into the surrounding medium, while simultaneously providing radiation shielding and increased structural integrity of the waste package.

  12. Depleted uranium as a backfill for nuclear fuel waste package

    DOEpatents

    Forsberg, C.W.

    1998-11-03

    A method is described for packaging spent nuclear fuel for long-term disposal in a geological repository. At least one spent nuclear fuel assembly is first placed in an unsealed waste package and a depleted uranium fill material is added to the waste package. The depleted uranium fill material comprises flowable particles having a size sufficient to substantially fill any voids in and around the assembly and contains isotopically-depleted uranium in the +4 valence state in an amount sufficient to inhibit dissolution of the spent nuclear fuel from the assembly into a surrounding medium and to lessen the potential for nuclear criticality inside the repository in the event of failure of the waste package. Last, the waste package is sealed, thereby substantially reducing the release of radionuclides into the surrounding medium, while simultaneously providing radiation shielding and increased structural integrity of the waste package. 6 figs.

  13. Extending R packages to support 64-bit compiled code: An illustration with spam64 and GIMMS NDVI3g data

    NASA Astrophysics Data System (ADS)

    Gerber, Florian; Mösinger, Kaspar; Furrer, Reinhard

    2017-07-01

    Software packages for spatial data often implement a hybrid approach of interpreted and compiled programming languages. The compiled parts are usually written in C, C++, or Fortran, and are efficient in terms of computational speed and memory usage. Conversely, the interpreted part serves as a convenient user interface and calls the compiled code for computationally demanding operations. The price paid for the user friendliness of the interpreted component is, besides performance, the limited access to low-level and optimized code. An example of such a restriction is the 64-bit vector support of the widely used statistical language R. On the R side, users do not need to change existing code and may not even notice the extension. On the other hand, interfacing 64-bit compiled code efficiently is challenging. Since many R packages for spatial data could benefit from 64-bit vectors, we investigate strategies to efficiently pass 64-bit vectors to compiled languages. More precisely, we show how to simply extend existing R packages using the foreign function interface to seamlessly support 64-bit vectors. This extension is shown with the sparse matrix algebra R package spam. The new capabilities are illustrated with an example of GIMMS NDVI3g data featuring a parametric modeling approach for a non-stationary covariance matrix.

  14. Validation of thermal effects of LED package by using Elmer finite element simulation method

    NASA Astrophysics Data System (ADS)

    Leng, Lai Siang; Retnasamy, Vithyacharan; Mohamad Shahimin, Mukhzeer; Sauli, Zaliman; Taniselass, Steven; Bin Ab Aziz, Muhamad Hafiz; Vairavan, Rajendaran; Kirtsaeng, Supap

    2017-02-01

    The overall performance of a light-emitting diode (LED) package is critically affected by heat. In this study, the open-source software Elmer FEM was utilized for thermal analysis of the LED package. To perform a complete simulation study, the Salome and ParaView software were introduced as pre- and post-processors. The thermal effect of the LED package was evaluated with this software, and the result was validated against a commercially licensed software package based on previous work. The percentage difference between the two simulation results is less than 5%, which is tolerable and comparable.

  15. Hubert: Software for efficient analysis of in-situ nuclear forward scattering experiments

    NASA Astrophysics Data System (ADS)

    Vrba, Vlastimil; Procházka, Vít; Smrčka, David; Miglierini, Marcel

    2016-10-01

    The combination of short data acquisition times and local investigation of the solid state through hyperfine parameters makes nuclear forward scattering (NFS) a unique experimental technique for the investigation of fast processes. However, the total number of acquired NFS time spectra may be very high, so an efficient way of evaluating the data is needed. In this paper we report the development of the Hubert software package as a response to the rapidly developing field of in-situ NFS experiments. Hubert offers several useful features for processing data files and can significantly shorten the evaluation time by using a simple connection between neighboring time spectra through their input and output parameter values.

  16. Computer Grading As an Instructional Tool.

    ERIC Educational Resources Information Center

    Rottmann, Ray M.; Hudson, H. T.

    1983-01-01

    Describes computer grading system providing/storing scores and giving feedback to instructors on how students are performing on a day-to-day basis and how they are handling course concepts. Focuses on the hardware and software of this efficient computerized grading package, which can be used with classes of 250 students (or larger). (Author/JN)

  17. The Importance of Computer Programming Skills to Educational Researchers.

    ERIC Educational Resources Information Center

    Lawson, Stephen

    The use of the modern computer has revolutionized the field of educational research. Software packages are currently available that allow almost anyone to analyze data efficiently and rapidly. Yet, caution must temper the widespread acceptance and use of these programs. It is recommended that the researcher not rely solely on the use of…

  18. NDE Software Developed at NASA Glenn Research Center

    NASA Technical Reports Server (NTRS)

    Roth, Donald J.; Martin, Richard E.; Rauser, Richard W.; Nichols, Charles; Bonacuse, Peter J.

    2014-01-01

    NASA Glenn Research Center has developed several important Nondestructive Evaluation (NDE) related software packages for different projects in the last 10 years. Three of the software packages have been created with commercial-grade user interfaces and are available to United States entities for download on the NASA Technology Transfer and Partnership Office server (https://sr.grc.nasa.gov/). This article provides brief overviews of the software packages.

  19. The dynamic and steady state behavior of a PEM fuel cell as an electric energy source

    NASA Astrophysics Data System (ADS)

    Costa, R. A.; Camacho, J. R.

    The main objective of this work is to extract information on the internal behavior of three small polymer electrolyte membrane fuel cells under static and dynamic load conditions. A computational model was developed using Scilab [SCILAB 4, Scilab-a free scientific software package, http://www.scilab.org/, INRIA, France, December, 2005] to simulate the static and dynamic performance [J.M. Correa, A.F. Farret, L.N. Canha, An analysis of the dynamic performance of proton exchange membrane fuel cells using an electrochemical model, in: 27th Annual Conference of IEEE Industrial Electronics Society, 2001, pp. 141-146] of this particular type of fuel cell. The dynamic model is based on electrochemical equations and takes into consideration most of the chemical and physical characteristics of the device relevant to generating electric power, including the operating and design parameters and the physical material properties. The results show the behavior of the internal losses and concentration effects, which is of interest to power engineers and researchers.
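
    A minimal static polarization sketch in Python conveys the structure of such electrochemical models (the cited model was implemented in Scilab; the constants below are generic illustrative values, not the authors' parameter set):

      # Toy PEM polarization curve: open-circuit voltage minus activation,
      # ohmic, and concentration losses. All constants are illustrative.
      import numpy as np

      E_ocv = 1.18        # open-circuit (Nernst) voltage, V
      i0 = 1e-3           # exchange current density, A/cm^2
      A_tafel = 0.06      # activation-loss coefficient, V (natural-log form)
      r_ohm = 0.2         # area-specific resistance, ohm*cm^2
      i_lim = 1.4         # limiting current density, A/cm^2
      B_conc = 0.05       # concentration-loss coefficient, V

      def cell_voltage(i):
          """Cell voltage (V) at current density i (A/cm^2)."""
          v_act = A_tafel * np.log(i / i0)             # activation loss
          v_ohm = r_ohm * i                            # ohmic loss
          v_conc = -B_conc * np.log(1.0 - i / i_lim)   # concentration loss
          return E_ocv - v_act - v_ohm - v_conc

      for i in (0.1, 0.4, 0.8, 1.2):
          v = cell_voltage(i)
          print(f"i = {i:.1f} A/cm^2  ->  V = {v:.3f} V, P = {i * v:.3f} W/cm^2")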

  20. Three Dimensional Transient Turbulent Simulations of Scramjet Fuel Injection and Combustion

    NASA Astrophysics Data System (ADS)

    Bahbaz, Marwane

    2011-11-01

    A scramjet is a propulsion system best suited to hypersonic flight (M > 5). The main objective of the simulation is to understand both the mixing and the combustion of the air flow with hydrogen fuel in a high-speed environment. This understanding is used to determine the number of fuel injectors required to increase combustion efficiency and energy transfer. Due to the complexity of the simulation, multiple software tools are used. First, SolidWorks is used to draw a scramjet combustor with accurate measurements. The second tool, Gambit, is used to generate several types of meshes for the scramjet combustor. Finally, OpenFOAM and CFD++ are used to process and post-process the scramjet combustor. The simulation is divided into two categories. The cold-flow category is a series of simulations of subsonic and supersonic turbulent air flow across the combustor channel with fuel injection from one or more injectors. The second category, the combustion simulations, involves fluid flow and fuel mixing with ignition. The simulation and modeling of the scramjet combustor will help investigate and understand the combustion process and energy transfer in a hypersonic environment.

  1. NLM microcomputer-based tutorials (for microcomputers). Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perkins, M.

    1990-04-01

    The package consists of TOXLEARN--a microcomputer-based training package for TOXLINE (Toxicology Information Online), CHEMLEARN--a microcomputer-based training package for CHEMLINE (Chemical Information Online), MEDTUTOR--a microcomputer-based training package for MEDLINE (Medical Information Online), and ELHILL LEARN--a microcomputer-based training package for the ELHILL search and retrieval software that supports the above-mentioned databases. Software Description: The programs were developed under PILOTplus using the NLM LEARN Programmer. They run on IBM-PC, XT, AT, PS/2, and fully compatible computers. The programs require 512K of RAM, one disk drive, and DOS 2.0 or higher. The software supports most monochrome, color graphics, enhanced color graphics, or visual graphics displays.

  2. BATS: a Bayesian user-friendly software for analyzing time series microarray experiments.

    PubMed

    Angelini, Claudia; Cutillo, Luisa; De Canditiis, Daniela; Mutarelli, Margherita; Pensky, Marianna

    2008-10-06

    Gene expression levels in a given cell can be influenced by different factors, such as pharmacological or medical treatments. The response to a given stimulus is usually different for different genes and may depend on time. One of the goals of modern molecular biology is the high-throughput identification of genes associated with a particular treatment or a biological process of interest. From a methodological and computational point of view, analyzing high-dimensional time-course microarray data requires a very specific set of tools which are usually not included in standard software packages. Recently, the authors of this paper developed a fully Bayesian approach which allows one to identify differentially expressed genes in a 'one-sample' time-course microarray experiment, to rank them, and to estimate their expression profiles. The method is based on explicit expressions for calculations and is hence very computationally efficient. The software package BATS (Bayesian Analysis of Time Series) presented here implements the methodology described above. It allows a user to automatically identify and rank differentially expressed genes and to estimate their expression profiles when at least 5-6 time points are available. The package has a user-friendly interface. BATS successfully manages various technical difficulties which arise in time-course microarray experiments, such as a small number of observations, non-uniform sampling intervals, and replicated or missing data. BATS is a free, user-friendly software package for the analysis of both simulated and real microarray time-course experiments. The software, the user manual, and a brief illustrative example are freely available online at the BATS website: http://www.na.iac.cnr.it/bats.

  3. NMRbox: A Resource for Biomolecular NMR Computation.

    PubMed

    Maciejewski, Mark W; Schuyler, Adam D; Gryk, Michael R; Moraru, Ion I; Romero, Pedro R; Ulrich, Eldon L; Eghbalnia, Hamid R; Livny, Miron; Delaglio, Frank; Hoch, Jeffrey C

    2017-04-25

    Advances in computation have been enabling many recent advances in biomolecular applications of NMR. Due to the wide diversity of applications of NMR, the number and variety of software packages for processing and analyzing NMR data is quite large, with labs relying on dozens, if not hundreds of software packages. Discovery, acquisition, installation, and maintenance of all these packages is a burdensome task. Because the majority of software packages originate in academic labs, persistence of the software is compromised when developers graduate, funding ceases, or investigators turn to other projects. To simplify access to and use of biomolecular NMR software, foster persistence, and enhance reproducibility of computational workflows, we have developed NMRbox, a shared resource for NMR software and computation. NMRbox employs virtualization to provide a comprehensive software environment preconfigured with hundreds of software packages, available as a downloadable virtual machine or as a Platform-as-a-Service supported by a dedicated compute cloud. Ongoing development includes a metadata harvester to regularize, annotate, and preserve workflows and facilitate and enhance data depositions to BioMagResBank, and tools for Bayesian inference to enhance the robustness and extensibility of computational analyses. In addition to facilitating use and preservation of the rich and dynamic software environment for biomolecular NMR, NMRbox fosters the development and deployment of a new class of metasoftware packages. NMRbox is freely available to not-for-profit users. Copyright © 2017 Biophysical Society. All rights reserved.

  4. Spinoff 2015

    NASA Technical Reports Server (NTRS)

    2015-01-01

    Topics covered include: 3D Endoscope to Boost Safety, Cut Cost of Surgery; Audio App Brings a Better Night's Sleep; Liquid Cooling Technology Increases Exercise Efficiency; Algae-Derived Dietary Ingredients Nourish Animals; Space Grant Research Launches Rehabilitation Chair; Vision Trainer Teaches Focusing Techniques at Home; Aircraft Geared Architecture Reduces Fuel Cost and Noise; Ubiquitous Supercritical Wing Design Cuts Billions in Fuel Costs; Flight Controller Software Protects Lightweight Flexible Aircraft; Cabin Pressure Monitors Notify Pilots to Save Lives; Ionospheric Mapping Software Ensures Accuracy of Pilots' GPS; Water Mapping Technology Rebuilds Lives in Arid Regions; Shock Absorbers Save Structures and Lives during Earthquakes; Software Facilitates Sharing of Water Quality Data Worldwide; Underwater Adhesives Retrofit Pipelines with Advanced Sensors; Laser Imaging Video Camera Sees through Fire, Fog, Smoke; 3D Lasers Increase Efficiency, Safety of Moving Machines; Air Revitalization System Enables Excursions to the Stratosphere; Magnetic Fluids Deliver Better Speaker Sound Quality; Bioreactor Yields Extracts for Skin Cream; Private Astronaut Training Prepares Commercial Crews of Tomorrow; Activity Monitors Help Users Get Optimum Sun Exposure; LEDs Illuminate Bulbs for Better Sleep, Wake Cycles; Charged Particles Kill Pathogens and Round Up Dust; Balance Devices Train Golfers for a Consistent Swing; Landsat Imagery Enables Global Studies of Surface Trends; Ruggedized Spectrometers Are Built for Tough Jobs; Gas Conversion Systems Reclaim Fuel for Industry; Remote Sensing Technologies Mitigate Drought; Satellite Data Inform Forecasts of Crop Growth; Probes Measure Gases for Environmental Research; Cloud Computing Technologies Facilitate Earth Research; Software Cuts Homebuilding Costs, Increases Energy Efficiency; Portable Planetariums Teach Science; Schedule Analysis Software Saves Time for Project Planners; Sound Modeling Simplifies Vehicle Noise Management; Custom 3D Printers Revolutionize Space Supply Chain; Improved Calibration Shows Images' True Colors; Micromachined Parts Advance Medicine, Astrophysics, and More; Metalworking Techniques Unlock a Unique Alloy; Low-Cost Sensors Deliver Nanometer-Accurate Measurements; Electrical Monitoring Devices Save on Time and Cost; Dry Lubricant Smooths the Way for Space Travel, Industry; and Compact Vapor Chamber Cools Critical Components.

  5. Utilizing Visual Effects Software for Efficient and Flexible Isostatic Adjustment Modelling

    NASA Astrophysics Data System (ADS)

    Meldgaard, A.; Nielsen, L.; Iaffaldano, G.

    2017-12-01

    The isostatic adjustment signal generated by transient ice sheet loading is an important indicator of past ice sheet extent and the rheological constitution of the interior of the Earth. Finite element modelling has proved to be a very useful tool in these studies. We present a simple numerical model for 3D visco elastic Earth deformation and a new approach to the design of such models utilizing visual effects software designed for the film and game industry. The software package Houdini offers an assortment of optimized tools and libraries which greatly facilitate the creation of efficient numerical algorithms. In particular, we make use of Houdini's procedural work flow, the SIMD programming language VEX, Houdini's sparse matrix creation and inversion libraries, an inbuilt tetrahedralizer for grid creation, and the user interface, which facilitates effortless manipulation of 3D geometry. We mitigate many of the time consuming steps associated with the authoring of efficient algorithms from scratch while still keeping the flexibility that may be lost with the use of commercial dedicated finite element programs. We test the efficiency of the algorithm by comparing simulation times with off-the-shelf solutions from the Abaqus software package. The algorithm is tailored for the study of local isostatic adjustment patterns, in close vicinity to present ice sheet margins. In particular, we wish to examine possible causes for the considerable spatial differences in the uplift magnitude which are apparent from field observations in these areas. Such features, with spatial scales of tens of kilometres, are not resolvable with current global isostatic adjustment models, and may require the inclusion of local topographic features. We use the presented algorithm to study a near field area where field observations are abundant, namely, Disko Bay in West Greenland with the intention of constraining Earth parameters and ice thickness. In addition, we assess how local topographic features may influence the differential isostatic uplift in the area.

  6. A Microsoft-Excel-based tool for running and critically appraising network meta-analyses--an overview and application of NetMetaXL.

    PubMed

    Brown, Stephen; Hutton, Brian; Clifford, Tammy; Coyle, Doug; Grima, Daniel; Wells, George; Cameron, Chris

    2014-09-29

    The use of network meta-analysis has increased dramatically in recent years. WinBUGS, a freely available Bayesian software package, has been the most widely used software package to conduct network meta-analyses. However, the learning curve for WinBUGS can be daunting, especially for new users. Furthermore, critical appraisal of network meta-analyses conducted in WinBUGS can be challenging given its limited data manipulation capabilities and the fact that generation of graphical output from network meta-analyses often relies on different software packages than the analyses themselves. We developed a freely available Microsoft-Excel-based tool called NetMetaXL, programmed in Visual Basic for Applications, which provides an interface for conducting a Bayesian network meta-analysis using WinBUGS from within Microsoft Excel. This tool allows the user to easily prepare and enter data, set model assumptions, and run the network meta-analysis, with results being automatically displayed in an Excel spreadsheet. It also contains macros that use NetMetaXL's interface to generate evidence network diagrams, forest plots, league tables of pairwise comparisons, probability plots (rankograms), and inconsistency plots within Microsoft Excel. All figures generated are publication quality, thereby increasing the efficiency of knowledge transfer and manuscript preparation. We demonstrate the application of NetMetaXL using data from a network meta-analysis published previously which compares combined resynchronization and implantable defibrillator therapy in left ventricular dysfunction. We replicate results from the previous publication while demonstrating result summaries generated by the software. Use of the freely available NetMetaXL successfully demonstrated its ability to make running network meta-analyses more accessible to novice WinBUGS users by allowing analyses to be conducted entirely within Microsoft Excel. NetMetaXL also allows for more efficient and transparent critical appraisal of network meta-analyses, enhanced standardization of reporting, and integration with health economic evaluations which are frequently Excel-based.

  7. A Microsoft-Excel-based tool for running and critically appraising network meta-analyses—an overview and application of NetMetaXL

    PubMed Central

    2014-01-01

    Background The use of network meta-analysis has increased dramatically in recent years. WinBUGS, a freely available Bayesian software package, has been the most widely used software package to conduct network meta-analyses. However, the learning curve for WinBUGS can be daunting, especially for new users. Furthermore, critical appraisal of network meta-analyses conducted in WinBUGS can be challenging given its limited data manipulation capabilities and the fact that generation of graphical output from network meta-analyses often relies on different software packages than the analyses themselves. Methods We developed a freely available Microsoft-Excel-based tool called NetMetaXL, programmed in Visual Basic for Applications, which provides an interface for conducting a Bayesian network meta-analysis using WinBUGS from within Microsoft Excel. This tool allows the user to easily prepare and enter data, set model assumptions, and run the network meta-analysis, with results being automatically displayed in an Excel spreadsheet. It also contains macros that use NetMetaXL’s interface to generate evidence network diagrams, forest plots, league tables of pairwise comparisons, probability plots (rankograms), and inconsistency plots within Microsoft Excel. All figures generated are publication quality, thereby increasing the efficiency of knowledge transfer and manuscript preparation. Results We demonstrate the application of NetMetaXL using data from a network meta-analysis published previously which compares combined resynchronization and implantable defibrillator therapy in left ventricular dysfunction. We replicate results from the previous publication while demonstrating result summaries generated by the software. Conclusions Use of the freely available NetMetaXL successfully demonstrated its ability to make running network meta-analyses more accessible to novice WinBUGS users by allowing analyses to be conducted entirely within Microsoft Excel. NetMetaXL also allows for more efficient and transparent critical appraisal of network meta-analyses, enhanced standardization of reporting, and integration with health economic evaluations which are frequently Excel-based. PMID:25267416

  8. Efficient analysis of large-scale genome-wide data with two R packages: bigstatsr and bigsnpr.

    PubMed

    Privé, Florian; Aschard, Hugues; Ziyatdinov, Andrey; Blum, Michael G B

    2017-03-30

    Genome-wide datasets produced for association studies have dramatically increased in size over the past few years, with modern datasets commonly including millions of variants measured in tens of thousands of individuals. This increase in data size is a major challenge severely slowing down genomic analyses, leading to some software becoming obsolete and researchers having limited access to diverse analysis tools. Here we present two R packages, bigstatsr and bigsnpr, allowing the analysis of large-scale genomic data to be performed within R. To address large data sizes, the packages use memory-mapping to access data matrices stored on disk instead of in RAM. To perform data pre-processing and data analysis, the packages integrate most of the tools that are commonly used, either through transparent system calls to existing software or through updated or improved implementations of existing methods. In particular, the packages implement fast and accurate computations of principal component analysis and association studies, functions to remove SNPs in linkage disequilibrium, and algorithms to learn polygenic risk scores on millions of SNPs. We illustrate applications of the two R packages by analyzing a case-control genomic dataset for celiac disease, performing an association study and computing polygenic risk scores. Finally, we demonstrate the scalability of the R packages by analyzing a simulated genome-wide dataset including 500,000 individuals and 1 million markers on a single desktop computer. https://privefl.github.io/bigstatsr/ & https://privefl.github.io/bigsnpr/. florian.prive@univ-grenoble-alpes.fr & michael.blum@univ-grenoble-alpes.fr. Supplementary materials are available at Bioinformatics online.
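
    The memory-mapping idea can be sketched with a Python analogue using numpy.memmap (this is not the R packages' implementation; the matrix dimensions and file name are illustrative):

      # A genotype-like matrix lives on disk and is processed in row blocks,
      # so RAM usage stays bounded regardless of the matrix size.
      import numpy as np

      n_variants, n_individuals = 10_000, 2_000        # illustrative sizes
      geno = np.memmap("genotypes.dat", dtype=np.int8, mode="w+",
                       shape=(n_variants, n_individuals))

      freqs = np.empty(n_variants)
      block = 1_000
      for start in range(0, n_variants, block):
          rows = geno[start:start + block, :]           # contiguous block read from disk
          freqs[start:start + block] = rows.mean(axis=1) / 2.0   # per-variant allele frequency

      print(freqs[:5])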

  9. Closing the loop on improvement: Packaging experience in the Software Engineering Laboratory

    NASA Technical Reports Server (NTRS)

    Waligora, Sharon R.; Landis, Linda C.; Doland, Jerry T.

    1994-01-01

    As part of its award-winning software process improvement program, the Software Engineering Laboratory (SEL) has developed an effective method for packaging organizational best practices based on real project experience into useful handbooks and training courses. This paper shares the SEL's experience over the past 12 years creating and updating software process handbooks and training courses. It provides cost models and guidelines for successful experience packaging derived from SEL experience.

  10. Mocking the weak lensing universe: The LensTools Python computing package

    NASA Astrophysics Data System (ADS)

    Petri, A.

    2016-10-01

    We present a newly developed software package which implements a wide range of routines frequently used in Weak Gravitational Lensing (WL). With the continuously increasing size of the WL scientific community, we feel that easy-to-use Application Program Interfaces (APIs) for common calculations are a necessity to ensure efficiency and coordination across different working groups. Coupled with existing open-source codes, such as CAMB (Lewis et al., 2000) and Gadget2 (Springel, 2005), LensTools brings together a cosmic shear simulation pipeline which, complemented with a variety of WL feature measurement tools and parameter sampling routines, provides easy access to the numerics for theoretical studies of WL as well as for experiment forecasts. Being implemented in Python (Rossum, 1995), LensTools takes full advantage of a range of state-of-the-art techniques developed by the large and growing open-source software community (Jones et al., 2001; McKinney, 2010; Astropy Collaboration, 2013; Pedregosa et al., 2011; Foreman-Mackey et al., 2013). We have made the LensTools code available on the Python Package Index and published its documentation at http://lenstools.readthedocs.io.

  11. Packaging Strategies for Criticality Safety for "Other" DOE Fuels in a Repository

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Larry L Taylor

    2004-06-01

    Since 1998, there has been an ongoing effort to gain acceptance of U.S. Department of Energy (DOE)-owned spent nuclear fuel (SNF) in the national repository. To accomplish this goal, the fuel matrix was used as a discriminating feature to segregate fuels into nine distinct groups. From each of those groups, a characteristic fuel was selected and analyzed for criticality safety based on a proposed packaging strategy. This report identifies and quantifies the important criticality parameters for the canisterized fuels within each criticality group to: (1) demonstrate how the “other” fuels in the group are bounded by the baseline calculations or (2) allow identification of individual fuel types that might require special analysis and packaging.

  12. PrimerSuite: A High-Throughput Web-Based Primer Design Program for Multiplex Bisulfite PCR.

    PubMed

    Lu, Jennifer; Johnston, Andrew; Berichon, Philippe; Ru, Ke-Lin; Korbie, Darren; Trau, Matt

    2017-01-24

    The analysis of DNA methylation at CpG dinucleotides has become a major research focus due to its regulatory role in numerous biological processes, but the requisite need for assays which amplify bisulfite-converted DNA represents a major bottleneck due to the unique design constraints imposed on bisulfite-PCR primers. Moreover, a review of the literature indicated no available software solution that accommodates high-throughput primer design, support for multiplex amplification assays, and primer-dimer prediction. In response, the tri-modular software package PrimerSuite was developed to support bisulfite multiplex PCR applications. This software was constructed to (i) design bisulfite primers against multiple regions simultaneously (PrimerSuite), (ii) screen for primer-primer dimerizing artefacts (PrimerDimer), and (iii) support multiplex PCR assays (PrimerPlex). A major focus in the development of this software package was the emphasis on extensive empirical validation; over 1300 unique primer pairs have been successfully designed and screened, with over 94% of them producing amplicons of the expected size and an average mapping efficiency of 93% when screened using bisulfite multiplex resequencing. The potential use of the software in other bisulfite-based applications, such as methylation-specific PCR, is under consideration for future updates. This resource is freely available for use at the PrimerSuite website (www.primer-suite.com).

  13. Mercury⊕: An evidential reasoning image classifier

    NASA Astrophysics Data System (ADS)

    Peddle, Derek R.

    1995-12-01

    MERCURY⊕ is a multisource evidential reasoning classification software system based on the Dempster-Shafer theory of evidence. The design and implementation of this software package are described for improving the classification and analysis of the multisource digital image data necessary for addressing advanced environmental and geoscience applications. In the remote-sensing context, the approach provides a more appropriate framework for classifying modern, multisource, and ancillary data sets which may contain a large number of disparate variables with different statistical properties, scales of measurement, and levels of error which cannot be handled using conventional Bayesian approaches. The software uses a nonparametric, supervised approach to classification and provides a more objective and flexible interface to the evidential reasoning framework using a frequency-based method for computing support values from training data. The MERCURY⊕ software package has been implemented efficiently in the C programming language, with extensive use made of dynamic memory allocation procedures and compound linked-list and hash-table data structures to optimize the storage and retrieval of evidence in a knowledge look-up table. The software is complete with a full user interface and runs under the Unix, Ultrix, VAX/VMS, MS-DOS, and Apple Macintosh operating systems. An example of classifying alpine land cover and permafrost active-layer depth in northern Canada is presented to illustrate the use and application of these ideas.
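
    The core combination step of the Dempster-Shafer framework can be sketched as follows (a generic implementation of Dempster's rule, not MERCURY⊕'s code; the classes and mass values are invented):

      # Dempster's rule of combination: mass functions are dicts keyed by
      # frozensets of class labels; conflicting mass is renormalized away.
      def combine(m1, m2):
          combined = {}
          conflict = 0.0
          for a, w1 in m1.items():
              for b, w2 in m2.items():
                  inter = a & b
                  if inter:
                      combined[inter] = combined.get(inter, 0.0) + w1 * w2
                  else:
                      conflict += w1 * w2            # mass falling on the empty set
          if conflict >= 1.0:
              raise ValueError("totally conflicting evidence")
          return {k: v / (1.0 - conflict) for k, v in combined.items()}

      # Two hypothetical sources of evidence over land-cover classes.
      frame = frozenset({"shrub", "heath", "rock"})
      m_spectral = {frozenset({"shrub"}): 0.6, frozenset({"shrub", "heath"}): 0.3, frame: 0.1}
      m_terrain  = {frozenset({"shrub"}): 0.4, frozenset({"rock"}): 0.4, frame: 0.2}

      for hypothesis, support in sorted(combine(m_spectral, m_terrain).items(),
                                        key=lambda kv: -kv[1]):
          print(set(hypothesis), round(support, 3))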

  14. Radiative transfer through terrestrial atmosphere and ocean: Software package SCIATRAN

    NASA Astrophysics Data System (ADS)

    Rozanov, V. V.; Rozanov, A. V.; Kokhanovsky, A. A.; Burrows, J. P.

    2014-01-01

    SCIATRAN is a comprehensive software package for the modeling of radiative transfer processes in the terrestrial atmosphere and ocean in the spectral range from the ultraviolet to the thermal infrared (0.18 - 40 μm) including multiple scattering processes, polarization, thermal emission and ocean-atmosphere coupling. The software is capable of modeling spectral and angular distributions of the intensity or the Stokes vector of the transmitted, scattered, reflected, and emitted radiation assuming either a plane-parallel or a spherical atmosphere. Simulations are done either in the scalar or in the vector mode (i.e. accounting for the polarization) for observations by space-, air-, ship- and balloon-borne, ground-based, and underwater instruments in various viewing geometries (nadir, off-nadir, limb, occultation, zenith-sky, off-axis). All significant radiative transfer processes are accounted for. These are, e.g. the Rayleigh scattering, scattering by aerosol and cloud particles, absorption by gaseous components, and bidirectional reflection by an underlying surface including Fresnel reflection from a flat or roughened ocean surface. The software package contains several radiative transfer solvers including finite difference and discrete-ordinate techniques, an extensive database, and a specific module for solving inverse problems. In contrast to many other radiative transfer codes, SCIATRAN incorporates an efficient approach to calculate the so-called Jacobians, i.e. derivatives of the intensity with respect to various atmospheric and surface parameters. In this paper we discuss numerical methods used in SCIATRAN to solve the scalar and vector radiative transfer equation, describe databases of atmospheric, oceanic, and surface parameters incorporated in SCIATRAN, and demonstrate how to solve some selected radiative transfer problems using the SCIATRAN package. During the last decades, a lot of studies have been published demonstrating that SCIATRAN is a valuable tool for a wide range of remote sensing applications. Here, we present some selected comparisons of SCIATRAN simulations to published benchmark results, independent radiative transfer models, and various measurements from satellite, ground-based, and ship instruments. Methods for solving inverse problems related to remote sensing of the Earth's atmosphere using the SCIATRAN software are outside the scope of this study and will be discussed in a follow-up paper. The SCIATRAN software package along with a detailed User's Guide is freely available for non-commercial use via the webpage of the Institute of Environmental Physics (IUP), University of Bremen: http://www.iup.physik.uni-bremen.de/sciatran.

  15. MAP - a mapping and analysis program for harvest planning

    Treesearch

    Robert N. Eli; Chris B. LeDoux; Penn A. Peters

    1984-01-01

    The Northeastern Forest Experiment Station and the Department of Civil Engineering at West Virginia University are cooperating in the development of a Mapping and Analysis Program, to be named MAP. The goal of this computer software package is to significantly improve the planning and harvest efficiency of small to moderately sized harvest units located in mountainous...

  16. Marrying Two Existing Software Packages into an Efficient Online Tutoring Tool

    ERIC Educational Resources Information Center

    Byrne, Timothy

    2007-01-01

    Many teachers today use Learning Management Systems (LMS), several of which are open-source. Specific examples are Claroline and Moodle. However, they are not specifically designed for language learning, and hence not entirely suitable. In this article, I will compare two uses of the Claroline LMS available at Louvain-la-Neuve within the framework…

  17. Problems in Analyzing Time Series with Gaps and Their Solution with the WinABD Software Package

    NASA Astrophysics Data System (ADS)

    Desherevskii, A. V.; Zhuravlev, V. I.; Nikolsky, A. N.; Sidorin, A. Ya.

    2017-12-01

    Technologies for the analysis of time series with gaps are considered. Algorithms for signal extraction (purification) and for evaluating signal characteristics, such as rhythmic components, are discussed for series with gaps. Examples are given for the analysis of data obtained during long-term observations at the Garm geophysical test site and in other regions. The technical solutions used in the WinABD software to run the relevant algorithms efficiently in the presence of observational defects are considered.
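
    One standard way to look for rhythmic components in a record with gaps, shown here purely as an illustration (it is not claimed to be WinABD's algorithm), is the Lomb-Scargle periodogram, which does not require evenly spaced samples:

      # Detect a periodic component in an irregularly sampled series with a gap.
      import numpy as np
      from scipy.signal import lombscargle

      rng = np.random.default_rng(2)

      t = np.sort(rng.uniform(0.0, 365.0, 400))          # irregular sampling times, days
      t = t[(t < 120) | (t > 180)]                        # simulate an observation gap
      y = 2.0 * np.sin(2 * np.pi * t / 30.0) + rng.normal(0.0, 0.5, t.size)

      periods = np.linspace(5.0, 90.0, 800)               # candidate periods, days
      ang_freqs = 2 * np.pi / periods
      power = lombscargle(t, y - y.mean(), ang_freqs)

      print(f"strongest period: {periods[np.argmax(power)]:.1f} days")  # about 30 days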

  18. [The need to develop demographic census systems for Latin America].

    PubMed

    Silva, A

    1987-01-01

    The author presents the case for developing new software packages specifically designed to process population census information for Latin America. The focus is on the problems faced by developing countries in handling vast amounts of data in an efficient way. First, the basic methods of census data processing are discussed, then brief descriptions of some of the available software are included. Finally, ways in which data processing programs could be geared toward and utilized for improving the accuracy of Latin American censuses in the 1990s are proposed.

  19. pyam: Python Implementation of YaM

    NASA Technical Reports Server (NTRS)

    Myint, Steven; Jain, Abhinandan

    2012-01-01

    pyam is a software development framework with tools for facilitating the rapid development of software in a concurrent software development environment. pyam provides solutions for development challenges associated with software reuse, managing multiple software configurations, developing software product lines, and multiple platform development and build management. pyam uses release-early, release-often development cycles to allow developers to integrate their changes incrementally into the system on a continual basis. It facilitates the creation and merging of branches to support the isolated development of immature software to avoid impacting the stability of the development effort. It uses modules and packages to organize and share software across multiple software products, and uses the concepts of link and work modules to reduce sandbox setup times even when the code-base is large. One side benefit is the enforcement of a strong module-level encapsulation of a module's functionality and interface. This increases design transparency, system stability, and software reuse. pyam is written in Python and is organized as a set of utilities on top of the open-source SVN software version control package. All development software is organized into a collection of modules. pyam packages are defined as sub-collections of the available modules. Developers can set up private sandboxes for module/package development. All module/package development takes place on private SVN branches. High-level pyam commands support the setup, update, and release of modules and packages. Released and pre-built versions of modules are available to developers. Developers can tailor the source/link module mix for their sandboxes so that new sandboxes (even large ones) can be built up easily and quickly by pointing to pre-existing module releases. All inter-module interfaces are publicly exported via links. A minimal, but uniform, convention is used for building modules.

  20. Craniux: A LabVIEW-Based Modular Software Framework for Brain-Machine Interface Research

    DTIC Science & Technology

    2011-01-01

    open-source BMI software solutions are currently available, we feel that the Craniux software package fills a specific need in the realm of BMI...data, such as cortical source imaging using EEG or MEG recordings. It is with these characteristics in mind that we feel the Craniux software package...

  1. State-of-the-art software for window energy-efficiency rating and labeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arasteh, D.; Finlayson, E.; Huang, J.

    1998-07-01

    Measuring the thermal performance of windows in typical residential buildings is an expensive proposition. Not only is laboratory testing expensive, but each window manufacturer typically offers hundreds of individual products, each of which has different thermal performance properties. With over a thousand window manufacturers nationally, a testing-based rating system would be prohibitively expensive to the industry and to consumers. Beginning in the early 1990s, simulation software began to be used as part of a national program for rating window U-values. The rating program has since been expanded to include Solar Heat Gain Coefficients and is now being extended to annual energy performance. This paper describes four software packages available to the public from Lawrence Berkeley National Laboratory (LBNL). These software packages are used to evaluate window thermal performance: RESFEN (for evaluating annual energy costs), WINDOW (for calculating a product's thermal performance properties), THERM (a preprocessor for WINDOW that determines two-dimensional heat-transfer effects), and Optics (a preprocessor for WINDOW's glass database). Software not only offers a less expensive means than testing to evaluate window performance, it can also be used during the design process to help manufacturers produce windows that will meet target specifications. In addition, software can show small improvements in window performance that might not be detected in actual testing because of large uncertainties in test procedures.

  2. Astronomical Software Directory Service

    NASA Technical Reports Server (NTRS)

    Hanisch, R. J.; Payne, H.; Hayes, J.

    1998-01-01

    This is the final report on the development of the Astronomical Software Directory Service (ASDS), a distributable, searchable, WWW-based database of software packages and their related documentation. ASDS provides integrated access to 56 astronomical software packages, with more than 16,000 URLs indexed for full-text searching.

  3. Quantitative evaluation of software packages for single-molecule localization microscopy.

    PubMed

    Sage, Daniel; Kirshner, Hagai; Pengo, Thomas; Stuurman, Nico; Min, Junhong; Manley, Suliana; Unser, Michael

    2015-08-01

    The quality of super-resolution images obtained by single-molecule localization microscopy (SMLM) depends largely on the software used to detect and accurately localize point sources. In this work, we focus on the computational aspects of super-resolution microscopy and present a comprehensive evaluation of localization software packages. Our philosophy is to evaluate each package as a whole, thus maintaining the integrity of the software. We prepared synthetic data that represent three-dimensional structures modeled after biological components, taking excitation parameters, noise sources, point-spread functions and pixelation into account. We then asked developers to run their software on our data; most responded favorably, allowing us to present a broad picture of the methods available. We evaluated their results using quantitative and user-interpretable criteria: detection rate, accuracy, quality of image reconstruction, resolution, software usability and computational resources. These metrics reflect the various tradeoffs of SMLM software packages and help users to choose the software that fits their needs.

  4. Browndye: A Software Package for Brownian Dynamics

    PubMed Central

    McCammon, J. Andrew

    2010-01-01

    A new software package, Browndye, is presented for simulating the diffusional encounter of two large biological molecules. It can be used to estimate second-order rate constants and encounter probabilities, and to explore reaction trajectories. Browndye builds upon previous knowledge and algorithms from software packages such as UHBD, SDA, and Macrodox, while implementing algorithms that scale to larger systems. PMID:21132109

  5. Development of a software package for solid-angle calculations using the Monte Carlo method

    NASA Astrophysics Data System (ADS)

    Zhang, Jie; Chen, Xiulian; Zhang, Changsheng; Li, Gang; Xu, Jiayun; Sun, Guangai

    2014-02-01

    Solid-angle calculations play an important role in the absolute calibration of radioactivity measurement systems and in the determination of the activity of radioactive sources, and they are often complicated. In the present paper, a software package is developed to provide a convenient tool for solid-angle calculations in nuclear physics. The proposed software calculates solid angles using the Monte Carlo method, in which a new type of variance reduction technique is integrated. The package, developed under Microsoft Foundation Classes (MFC) in Microsoft Visual C++, has a graphical user interface in which the visualization function is integrated using OpenGL. One advantage of the proposed software package is that it can calculate without difficulty the solid angle subtended by a detector of various geometric shapes (e.g., cylinder, square prism, regular triangular prism, or regular hexagonal prism) at a point, circular, or cylindrical source. The results obtained from the proposed software package were compared with those obtained from previous studies and those calculated using Geant4. The comparison shows that the proposed software package produces accurate solid-angle values with a greater computation speed than Geant4.
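
    The Monte Carlo approach to solid-angle estimation can be sketched for the simplest case, a point source on the axis of a circular detector (an illustration only, not the package's algorithm or its variance reduction technique):

      # Sample isotropic directions from a point source; the fraction that hits a
      # coaxial circular detector gives the solid angle, compared with the
      # analytical on-axis result.
      import numpy as np

      rng = np.random.default_rng(3)
      R, d = 2.0, 5.0            # detector radius and source-detector distance, cm
      n = 2_000_000

      cos_t = rng.uniform(-1.0, 1.0, n)          # isotropic: cos(theta) uniform
      phi = rng.uniform(0.0, 2 * np.pi, n)
      sin_t = np.sqrt(1.0 - cos_t**2)

      forward = cos_t > 0                        # rays heading toward the detector plane
      r_plane = d * sin_t[forward] / cos_t[forward]   # radial distance at z = d
      hits = np.count_nonzero(r_plane <= R)

      omega_mc = 4 * np.pi * hits / n
      omega_exact = 2 * np.pi * (1.0 - d / np.hypot(d, R))
      print(f"Monte Carlo: {omega_mc:.4f} sr   analytical: {omega_exact:.4f} sr")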

  6. GST-PRIME: an algorithm for genome-wide primer design.

    PubMed

    Leister, Dario; Varotto, Claudio

    2007-01-01

    The profiling of mRNA expression based on DNA arrays has become a powerful tool to study genome-wide transcription of genes in a number of organisms. GST-PRIME is a software package created to facilitate large-scale primer design for the amplification of probes to be immobilized on arrays for transcriptome analyses, although it can also be applied in low-throughput approaches. GST-PRIME allows highly efficient, direct amplification of gene-sequence tags (GSTs) from genomic DNA (gDNA), starting from annotated genome or transcript sequences. GST-PRIME provides a user-friendly platform for automatic primer design, and despite the relative simplicity of the algorithm, experimental tests in the model plant species Arabidopsis thaliana confirmed the reliability of the software. This chapter describes the algorithm used for primer design, its input and output files, and the installation and use of the standalone package.

  7. A Shifted Block Lanczos Algorithm 1: The Block Recurrence

    NASA Technical Reports Server (NTRS)

    Grimes, Roger G.; Lewis, John G.; Simon, Horst D.

    1990-01-01

    In this paper we describe a block Lanczos algorithm that is used as the key building block of a software package for the extraction of eigenvalues and eigenvectors of large sparse symmetric generalized eigenproblems. The software package comprises a version of the block Lanczos algorithm specialized for spectrally transformed eigenproblems; an adaptive strategy for choosing shifts; and efficient codes for factoring large sparse symmetric indefinite matrices. This paper describes the algorithmic details of our block Lanczos recurrence, which uses a novel combination of block generalizations of several features that have previously been investigated only independently. In particular, new forms of partial reorthogonalization, selective reorthogonalization and local reorthogonalization are used, as is a new algorithm for obtaining the M-orthogonal factorization of a matrix. The heuristic shifting strategy, the integration with sparse linear equation solvers and numerical experience with the code are described in a companion paper.
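
    The spectral transformation ("shift-invert") idea that the block recurrence is built around can be illustrated with an off-the-shelf single-vector Lanczos solver: eigenvalues of the pencil (A, M) closest to a chosen shift are recovered from the transformed operator. The sketch below uses SciPy's ARPACK-based eigsh on toy matrices; it is not the block Lanczos code described in the paper.

```python
# Shift-invert Lanczos on a generalized symmetric eigenproblem A x = lambda M x,
# using toy stiffness-like and mass-like matrices.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigsh

n = 2000
A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csc")  # 1D Laplacian
M = sp.diags(1.0 + 0.01 * np.arange(n), 0, format="csc")                 # positive-definite "mass"

sigma = 0.5  # find eigenvalues of the pencil (A, M) nearest this shift
vals, vecs = eigsh(A, k=6, M=M, sigma=sigma, which="LM")
print(np.sort(vals))
```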

  8. Can I Trust This Software Package? An Exercise in Validation of Computational Results

    ERIC Educational Resources Information Center

    Shacham, Mordechai; Brauner, Neima; Ashurst, W. Robert; Cutlip, Michael B.

    2008-01-01

    Mathematical software packages such as Polymath, MATLAB, and Mathcad are currently widely used for engineering problem solving. Applications of several of these packages to typical chemical engineering problems have been demonstrated by Cutlip, et al. The main characteristic of these packages is that they provide a "problem-solving environment…

  9. International Inventory of Software Packages in the Information Field.

    ERIC Educational Resources Information Center

    Keren, Carl, Ed.; Sered, Irina, Ed.

    Designed to provide guidance in selecting appropriate software for library automation, information storage and retrieval, or management of bibliographic databases, this inventory describes 188 computer software packages. The information was obtained through a questionnaire survey of 600 software suppliers and developers who were asked to describe…

  10. Glioblastoma Segmentation: Comparison of Three Different Software Packages.

    PubMed

    Fyllingen, Even Hovig; Stensjøen, Anne Line; Berntsen, Erik Magnus; Solheim, Ole; Reinertsen, Ingerid

    2016-01-01

    To facilitate a more widespread use of volumetric tumor segmentation in clinical studies, there is an urgent need for reliable, user-friendly segmentation software. The aim of this study was therefore to compare three different software packages for semi-automatic brain tumor segmentation of glioblastoma; namely BrainVoyager QX, ITK-Snap and 3D Slicer, and to make data available for future reference. Pre-operative, contrast enhanced T1-weighted 1.5 or 3 Tesla Magnetic Resonance Imaging (MRI) scans were obtained in 20 consecutive patients who underwent surgery for glioblastoma. MRI scans were segmented twice in each software package by two investigators. Intra-rater, inter-rater and between-software agreement was compared by using differences of means with 95% limits of agreement (LoA), Dice's similarity coefficients (DSC) and Hausdorff distance (HD). Time expenditure of segmentations was measured using a stopwatch. Eighteen tumors were included in the analyses. Inter-rater agreement was highest for BrainVoyager with difference of means of 0.19 mL and 95% LoA from -2.42 mL to 2.81 mL. Between-software agreement and 95% LoA were very similar for the different software packages. Intra-rater, inter-rater and between-software DSC were ≥ 0.93 in all analyses. Time expenditure was approximately 41 min per segmentation in BrainVoyager, and 18 min per segmentation in both 3D Slicer and ITK-Snap. Our main findings were that there is a high agreement within and between the software packages in terms of small intra-rater, inter-rater and between-software differences of means and high Dice's similarity coefficients. Time expenditure was highest for BrainVoyager, but all software packages were relatively time-consuming, which may limit usability in an everyday clinical setting.
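
    The two agreement metrics used above, Dice's similarity coefficient and the Hausdorff distance, are straightforward to compute from binary masks; the sketch below uses toy circular "tumors" rather than study data.

```python
# Dice similarity coefficient and symmetric Hausdorff distance between two binary masks.
import numpy as np
from scipy.spatial.distance import directed_hausdorff

def dice(a, b):
    a, b = a.astype(bool), b.astype(bool)
    inter = np.logical_and(a, b).sum()
    return 2.0 * inter / (a.sum() + b.sum())

def hausdorff(a, b):
    # Symmetric Hausdorff distance between the voxel coordinates of two masks.
    pa = np.argwhere(a).astype(float)
    pb = np.argwhere(b).astype(float)
    return max(directed_hausdorff(pa, pb)[0], directed_hausdorff(pb, pa)[0])

if __name__ == "__main__":
    # Two overlapping toy "tumor" masks on a small grid.
    y, x = np.ogrid[:100, :100]
    m1 = (x - 50) ** 2 + (y - 50) ** 2 < 20 ** 2
    m2 = (x - 53) ** 2 + (y - 48) ** 2 < 19 ** 2
    print(f"DSC = {dice(m1, m2):.3f}, HD = {hausdorff(m1, m2):.1f} voxels")
```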

  11. SeaShark and Starfish operational data processing schemes for AVHRR and SeaWiFS

    NASA Astrophysics Data System (ADS)

    Flowerdew, R. J.; Corlyon, Anaa M.; Greer, W. A. D.; Newby, Steve J.; Winder, C. P.

    1997-02-01

    SeaShark is an operational software package for processing, archiving and cataloguing AVHRR and SeaWiFS data using an operator-friendly GUI. Upon receipt of a customer order, it produces standard AVHRR data products, including Sea Surface Temperature (SST), and it has recently been modified to include SeaWiFS level 2 data processing. This uses an atmospheric correction scheme developed by the Plymouth Marine Laboratory, UK (PML) that builds upon the standard Gordon and Wang approach to be applicable over both case 1 and case 2 waters. Higher level products are then generated using PML algorithms, including chlorophyll a, a CZCS-type pigment, Kd, and suspended particulate matter. Outputs are in CEOS-compatible format. The software also produces fast delivery products (FDPs) of chlorophyll a and SST. These FDPs are combined in the StarFish software package to provide maps indicating the potential location of phytoplankton and the preferred thermal environment of certain pelagic fish species. Fishing vessels may obtain these maps over Inmarsat, allowing them to achieve greater efficiency and hence lower costs.

  12. chimeraviz: a tool for visualizing chimeric RNA.

    PubMed

    Lågstad, Stian; Zhao, Sen; Hoff, Andreas M; Johannessen, Bjarne; Lingjærde, Ole Christian; Skotheim, Rolf I

    2017-09-15

    Advances in high-throughput RNA sequencing have enabled more efficient detection of fusion transcripts, but the technology and associated software used for fusion detection from sequencing data often yield a high false discovery rate. Good prioritization of the results is important, and this can be helped by a visualization framework that automatically integrates RNA data with known genomic features. Here we present chimeraviz, a Bioconductor package that automates the creation of chimeric RNA visualizations. The package supports input from nine different fusion-finder tools: deFuse, EricScript, InFusion, JAFFA, FusionCatcher, FusionMap, PRADA, SOAPfuse and STAR-FUSION. chimeraviz is an R package available via Bioconductor (https://bioconductor.org/packages/release/bioc/html/chimeraviz.html) under Artistic-2.0. Source code and support are available at GitHub (https://github.com/stianlagstad/chimeraviz). Contact: rolf.i.skotheim@rr-research.no. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.

  13. Open source IPSEC software in manned and unmanned space missions

    NASA Astrophysics Data System (ADS)

    Edwards, Jacob

    Network security is a major topic of research because cyber attackers pose a threat to national security. Securing ground-space communications for NASA missions is important because attackers could endanger mission success and human lives. This thesis describes how an open source IPsec software package was used to create a secure and reliable channel for ground-space communications. A cost-efficient, reproducible hardware testbed was also created to simulate ground-space communications. The testbed enables simulation of low-bandwidth and high-latency communication links to examine how the open source IPsec software reacts to these network constraints. Test cases were built that allowed for validation of the testbed and the open source IPsec software. The test cases also simulate using an IPsec connection from mission control ground routers to points of interest in outer space. The tested open source IPsec software did not meet all the requirements. Software changes were suggested to meet the requirements.

  14. Features of free software packages in flow cytometry: a comparison between four non-commercial software sources.

    PubMed

    Sahraneshin Samani, Fazel; Moore, Jodene K; Khosravani, Pardis; Ebrahimi, Marzieh

    2014-08-01

    Flow cytometers designed to analyze large particles are enabling new applications in biology. Data analysis is a critical component of the FCM process. In this article we compare the features of four free software packages: WinMDI, Cyflogic, Flowing Software, and Cytobank.

  15. Development of a Nevada Statewide Database for Safety Analyst Software

    DOT National Transportation Integrated Search

    2017-02-02

    Safety Analyst is a software package developed by the Federal Highway Administration (FHWA) and twenty-seven participating state and local agencies including the Nevada Department of Transportation (NDOT). The software package implemented many of the...

  16. Constraint Network Analysis (CNA): a Python software package for efficiently linking biomacromolecular structure, flexibility, (thermo-)stability, and function.

    PubMed

    Pfleger, Christopher; Rathi, Prakash Chandra; Klein, Doris L; Radestock, Sebastian; Gohlke, Holger

    2013-04-22

    For deriving maximal advantage from information on biomacromolecular flexibility and rigidity, results from rigidity analyses must be linked to biologically relevant characteristics of a structure. Here, we describe the Python-based software package Constraint Network Analysis (CNA) developed for this task. CNA functions as a front- and backend to the graph-based rigidity analysis software FIRST. CNA goes beyond the mere identification of flexible and rigid regions in a biomacromolecule in that it (I) provides a refined modeling of thermal unfolding simulations that also considers the temperature-dependence of hydrophobic tethers, (II) allows performing rigidity analyses on ensembles of network topologies, either generated from structural ensembles or by using the concept of fuzzy noncovalent constraints, and (III) computes a set of global and local indices for quantifying biomacromolecular stability. This leads to more robust results from rigidity analyses and extends the application domain of rigidity analyses in that phase transition points ("melting points") and unfolding nuclei ("structural weak spots") are determined automatically. Furthermore, CNA robustly handles small-molecule ligands in general. Such advancements are important for applying rigidity analysis to data-driven protein engineering and for estimating the influence of ligand molecules on biomacromolecular stability. CNA maintains the efficiency of FIRST such that the analysis of a single protein structure takes a few seconds for systems of several hundred residues on a single core. These features make CNA an interesting tool for linking biomacromolecular structure, flexibility, (thermo-)stability, and function. CNA is available from http://cpclab.uni-duesseldorf.de/software for nonprofit organizations.

  17. Geometric modeling for computer aided design

    NASA Technical Reports Server (NTRS)

    Schwing, James L.

    1992-01-01

    The goal was the design and implementation of software to be used in the conceptual design of aerospace vehicles. Several packages and design studies were completed, including two software tools currently used in the conceptual-level design of aerospace vehicles. These tools are the Solid Modeling Aerospace Research Tool (SMART) and the Environment for Software Integration and Execution (EASIE). SMART provides conceptual designers with a rapid prototyping capability and additionally provides initial mass property analysis. EASIE provides a set of interactive utilities that simplify the task of building and executing computer-aided design systems consisting of diverse, stand-alone analysis codes, streamlining the exchange of data between programs, reducing errors and improving efficiency.

  18. Modelling and validation of Proton exchange membrane fuel cell (PEMFC)

    NASA Astrophysics Data System (ADS)

    Mohiuddin, A. K. M.; Basran, N.; Khan, A. A.

    2018-01-01

    This paper is the outcome of a small-scale fuel cell project. A fuel cell is an electrochemical device that converts energy from a chemical reaction into electrical work. The Proton Exchange Membrane Fuel Cell (PEMFC) is one type of fuel cell; it is comparatively efficient, has a low operating temperature, and offers fast start-up capability and high energy density. In this study, a mathematical model of a 1.2 W PEMFC is developed and simulated using MATLAB software. This model describes the PEMFC behaviour under steady-state conditions. The mathematical model determines the polarization curve, the power generated, and the efficiency of the fuel cell. Simulation results were validated by comparison with experimental results obtained from the test of a single PEMFC with a 3 V motor. The performance of the experimental PEMFC is slightly lower than that of the simulated PEMFC, but both sets of results are in good agreement. Experiments on hydrogen flow rate were also conducted to determine the amount of hydrogen consumed to produce electrical work in the PEMFC.
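
    A polarization-curve model of the kind described typically subtracts activation, ohmic and concentration losses from the open-circuit voltage and derives power and efficiency from the resulting cell voltage. The sketch below uses that standard textbook form with invented parameter values; it is not the authors' MATLAB model of the 1.2 W cell.

```python
# Steady-state PEMFC polarization-curve sketch; all parameters are illustrative.
import numpy as np

E_oc = 1.0     # open-circuit voltage, V
A    = 0.04    # Tafel coefficient, V
i0   = 3e-3    # exchange current density, A/cm^2
r    = 0.1     # area-specific ohmic resistance, ohm*cm^2
i_L  = 1.4     # limiting current density, A/cm^2
B    = 0.05    # concentration-loss coefficient, V
area = 2.0     # active area, cm^2
E_tn = 1.482   # thermoneutral voltage (HHV), V, for the voltage-efficiency estimate

def cell_voltage(i):
    """Cell voltage at current density i (A/cm^2)."""
    act  = A * np.log(i / i0)           # activation loss (Tafel)
    ohm  = r * i                        # ohmic loss
    conc = -B * np.log(1.0 - i / i_L)   # concentration loss
    return E_oc - act - ohm - conc

i = np.linspace(0.01, 1.3, 100)
V = cell_voltage(i)
P = V * i * area                        # cell power, W
eff = V / E_tn                          # voltage efficiency vs. HHV
print(f"peak power ~ {P.max():.2f} W at {i[P.argmax()]:.2f} A/cm^2, "
      f"efficiency there ~ {eff[P.argmax()]:.2f}")
```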

  19. Computer Aided Drafting Packages for Secondary Education. Edition 2. PC DOS Compatible Programs. A MicroSIFT Quarterly Report.

    ERIC Educational Resources Information Center

    Pollard, Jim

    This report reviews eight IBM-compatible software packages that are available to secondary schools to teach computer-aided drafting (CAD). Software packages to be considered were selected following reviews of CAD periodicals, computers in education periodicals, advertisements, and recommendations of teachers. The packages were then rated by…

  20. lme4qtl: linear mixed models with flexible covariance structure for genetic studies of related individuals.

    PubMed

    Ziyatdinov, Andrey; Vázquez-Santiago, Miquel; Brunel, Helena; Martinez-Perez, Angel; Aschard, Hugues; Soria, Jose Manuel

    2018-02-27

    Quantitative trait locus (QTL) mapping in genetic data often involves analysis of correlated observations, which need to be accounted for to avoid false association signals. This is commonly performed by modeling such correlations as random effects in linear mixed models (LMMs). The R package lme4 is a well-established tool that implements major LMM features using sparse matrix methods; however, it is not fully adapted for QTL mapping association and linkage studies. In particular, two LMM features are lacking in the base version of lme4: the definition of random effects by custom covariance matrices; and parameter constraints, which are essential in advanced QTL models. Apart from applications in linkage studies of related individuals, such functionalities are of high interest for association studies in situations where multiple covariance matrices need to be modeled, a scenario not covered by many genome-wide association study (GWAS) software packages. To address the aforementioned limitations, we developed a new R package lme4qtl as an extension of lme4. First, lme4qtl contributes new models for genetic studies within a single tool integrated with lme4 and its companion packages. Second, lme4qtl offers a flexible framework for scenarios with multiple levels of relatedness and becomes efficient when covariance matrices are sparse. We showed the value of our package using real family-based data in the Genetic Analysis of Idiopathic Thrombophilia 2 (GAIT2) project. Our software lme4qtl enables QTL mapping models with a versatile structure of random effects and efficient computation for sparse covariances. lme4qtl is available at https://github.com/variani/lme4qtl.

  1. QuantWorm: a comprehensive software package for Caenorhabditis elegans phenotypic assays.

    PubMed

    Jung, Sang-Kyu; Aleman-Meza, Boanerges; Riepe, Celeste; Zhong, Weiwei

    2014-01-01

    Phenotypic assays are crucial in genetics; however, traditional methods that rely on human observation are unsuitable for quantitative, large-scale experiments. Furthermore, there is an increasing need for comprehensive analyses of multiple phenotypes to provide multidimensional information. Here we developed an automated, high-throughput computer imaging system for quantifying multiple Caenorhabditis elegans phenotypes. Our imaging system is composed of a microscope equipped with a digital camera and a motorized stage connected to a computer running the QuantWorm software package. Currently, the software package contains one data acquisition module and four image analysis programs: WormLifespan, WormLocomotion, WormLength, and WormEgg. The data acquisition module collects images and videos. The WormLifespan software counts the number of moving worms by using two time-lapse images; the WormLocomotion software computes the velocity of moving worms; the WormLength software measures worm body size; and the WormEgg software counts the number of eggs. To evaluate the performance of our software, we compared the results of our software with manual measurements. We then demonstrated the application of the QuantWorm software in a drug assay and a genetic assay. Overall, the QuantWorm software provided accurate measurements at a high speed. Software source code, executable programs, and sample images are available at www.quantworm.org. Our software package has several advantages over current imaging systems for C. elegans. It is an all-in-one package for quantifying multiple phenotypes. The QuantWorm software is written in Java and its source code is freely available, so it does not require use of commercial software or libraries. It can be run on multiple platforms and easily customized to cope with new methods and requirements.

  2. Comparative analysis of film cooling efficiency at coolant supply into a single array of triangular dimples

    NASA Astrophysics Data System (ADS)

    Khalatov, A. A.; Petliak, O. O.; Severin, S. D.; Panchenko, N. A.

    2018-03-01

    The purpose of this work is a comparative study of the physical structure and film cooling efficiency of a single array of inclined holes placed in triangular dimples and in a trench. The ANSYS CFX 17.0 software package was used along with the RANS SST turbulence model. Calculations were performed over a wide range of blowing ratios, from 0.5 to 2.0. The modeling results show the high efficiency of the triangular film cooling configuration. At m ≥ 1.5, the triangular configuration is comparable with the trench configuration in terms of film cooling efficiency.
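
    For reference, the two quantities varied and reported in such studies are usually defined as below (standard textbook definitions; the paper's exact definitions are not reproduced here). The numbers in the example are illustrative only.

```python
# Blowing ratio m and adiabatic film-cooling effectiveness eta, standard forms.

def blowing_ratio(rho_c, u_c, rho_inf, u_inf):
    """m = (rho_c * u_c) / (rho_inf * u_inf): coolant-to-mainstream mass flux ratio."""
    return (rho_c * u_c) / (rho_inf * u_inf)

def film_cooling_effectiveness(T_inf, T_aw, T_c):
    """eta = (T_inf - T_aw) / (T_inf - T_c); eta = 1 means the wall sits at coolant temperature."""
    return (T_inf - T_aw) / (T_inf - T_c)

# Illustrative numbers: hot gas at 1500 K, coolant at 700 K, adiabatic wall at 1100 K.
print(blowing_ratio(5.2, 30.0, 3.5, 90.0))                 # ~0.50, low end of the studied range
print(film_cooling_effectiveness(1500.0, 1100.0, 700.0))   # 0.5
```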

  3. Vertical bone measurements from cone beam computed tomography images using different software packages.

    PubMed

    Vasconcelos, Taruska Ventorini; Neves, Frederico Sampaio; Moraes, Lívia Almeida Bueno; Freitas, Deborah Queiroz

    2015-01-01

    This article aimed to compare the accuracy of the linear measurement tools of different commercial software packages. Eight fully edentulous dry mandibles were selected for this study. Incisor, canine, premolar, first molar and second molar regions were selected. Cone beam computed tomography (CBCT) images were obtained with i-CAT Next Generation. Linear bone measurements were performed by one observer on the cross-sectional images using three different software packages: XoranCat®, OnDemand3D® and KDIS3D®, all able to assess DICOM images. In addition, 25% of the sample was reevaluated for the purpose of reproducibility. The mandibles were sectioned to obtain the gold standard for each region. Intraclass correlation coefficients (ICC) were calculated to examine the agreement between the two periods of evaluation; one-way analysis of variance with the post-hoc Dunnett test was used to compare each of the software-derived measurements with the gold standard. The ICC values were excellent for all software packages. The smallest difference between the software-derived measurements and the gold standard was obtained with OnDemand3D and KDIS3D (-0.11 and -0.14 mm, respectively), and the greatest with XoranCAT (+0.25 mm). However, there was no statistically significant difference between the measurements obtained with the different software packages and the gold standard (p > 0.05). In conclusion, linear bone measurements were not influenced by the software package used to reconstruct the image from CBCT DICOM data.

  4. Automatic Astrometric and Photometric Calibration with SCAMP

    NASA Astrophysics Data System (ADS)

    Bertin, E.

    2006-07-01

    Astrometric and photometric calibrations have remained the most tiresome step in the reduction of large imaging surveys. I present a new software package, SCAMP, which has been written to address this problem. SCAMP efficiently computes accurate astrometric and photometric solutions for any arbitrary sequence of FITS images in a completely automatic way. SCAMP is released under the GNU General Public License.

  5. Differential maneuvering simulator data reduction and analysis software

    NASA Technical Reports Server (NTRS)

    Beasley, G. P.; Sigman, R. S.

    1972-01-01

    A multielement data reduction and analysis software package has been developed for use with the Langley differential maneuvering simulator (DMS). This package, which has several independent elements, was developed to support all phases of DMS aircraft simulation studies with a variety of both graphical and tabular information. The overall software package is considered unique because of the number, diversity, and sophistication of the element programs available for use in a single study. The purpose of this paper is to discuss the overall DMS data reduction and analysis package by reviewing the development of the various elements of the software, showing typical results that can be obtained, and discussing how each element can be used.

  6. Modeling of the electronic power steering system for the IKCO SAMAND vehicle and investigating its performance via CARSIM software

    NASA Astrophysics Data System (ADS)

    Haghgoo, Esmail; Zamani, Mohammad; Sharbati, Ali

    2017-02-01

    This article introduces the use of an electronic power steering (ESP) system in the IKCO SAMAND vehicle and examines its benefits. The operation of the electronic steering system and its performance in the IKCO SAMAND vehicle are described. The optimization of IC engine efficiency and fuel consumption was simulated via the ADVISOR software used within MATLAB. Mechanical and hydraulic steering systems are the types usually produced in Iran; the mechanical type has not been accepted because of its many disadvantages. The hydraulic steering systems that have replaced the mechanical type have essentially the same features, with the difference that they include a hydraulic booster to facilitate rotation of the steering wheel. Despite their advantages, hydraulic systems have some disadvantages, one of the most important being a reduction in the output power of the engine. To recover this dissipated power, ESP systems are used. The output diagrams given by the software show that an IKCO SAMAND vehicle equipped with an ESP system exerts less torque and power on the steering wheel. This improves driver safety as well as vehicle performance at high speeds, and reduces fuel consumption while increasing IC engine efficiency.

  7. The R package 'Luminescence': a history of unexpected complexity and concepts to deal with it

    NASA Astrophysics Data System (ADS)

    Kreutzer, Sebastian; Burow, Christoph; Dietze, Michael; Fuchs, Margret C.; Friedrich, Johannes; Fischer, Manfred; Schmidt, Christoph

    2017-04-01

    Overcoming limitations of the standard software used so far, developing an efficient, lightweight solution for a very specific task, or creating graphs of high quality: the reasons that may have initially led a scientist to work with R are manifold. As long as the solutions developed, e.g., R scripts, are needed for personal use only, code can remain unstructured and documentation is not compulsory. However, this changes with the first friendly request for help after the code has been reused by others. In contrast to single scripts written without any intention of publication, the CRAN policy demands for R packages a more structured and elaborated approach, including a minimum of documentation. Nevertheless, growing projects with thousands of lines of code that need to be maintained can become overwhelming, in particular as researchers are not by definition experts in managing software projects. The R package 'Luminescence' (Kreutzer et al., 2017), a collection of tools dealing with the analysis of luminescence data in a geoscientific, geochronological context, started as a single R script but quickly evolved into a comprehensive solution connected with various other R packages. We present (1) a very brief development history of the package 'Luminescence', before we (2) sketch technical challenges encountered over time and the solutions that have been found to deal with them using various open source tools. Our presentation is intended as a collection of concepts and approaches for setting up R projects in the geosciences. References. Kreutzer, S., Dietze, M., Burow, C., Fuchs, M. C., Schmidt, C., Fischer, M., Friedrich, J., 2017. Luminescence: Comprehensive Luminescence Dating Data Analysis. R package version 0.6.4. https://CRAN.R-project.org/package=Luminescence

  8. EERE-SBIR technology transfer opportunity. H2 Safety Sensors for H2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnston, Mariann R.

    2015-12-01

    The Office of Energy Efficiency and Renewable Energy’s Fuel Cell Technologies Office (FCTO) works in partnership with industry (including small businesses), academia, and DOE's national laboratories to establish fuel cell and hydrogen energy technologies as economically competitive contributors to U.S. transportation needs. The work that is envisioned between the SBIR/STTR grantee and Los Alamos National Laboratory would involve Technical Transfer of Los Alamos Intellectual Property (IP) on Thin-film Mixed Potential Sensor (U.S. Patent 7,264,700) and associated know-how for H2 sensor manufacturing and packaging.

  9. School meals: a nutritional and environmental perspective.

    PubMed

    Demas, Antonia; Kindermann, Dana; Pimentel, David

    2010-01-01

    In light of the rise in childhood obesity rates and the influence of the food system on fossil fuel use, this article analyzes current school meals in Baltimore and makes suggestions for school meal reform based on both childhood nutrition and environmental resource use. The nutrient content and estimated energy costs of a typical school lunch are compared with a proposed alternate meal. The study indicates that healthier meals can significantly limit fossil fuel energy inputs for harvesting, production, processing, packaging, and transportation. The authors also provide strategies for developing menus that are both more nutritious and more energy efficient.

  10. A Direct Algorithm Maple Package of One-Dimensional Optimal System for Group Invariant Solutions

    NASA Astrophysics Data System (ADS)

    Zhang, Lin; Han, Zhong; Chen, Yong

    2018-01-01

    To construct the one-dimensional optimal system of a finite-dimensional Lie algebra automatically, we develop a new Maple package, One Optimal System. Meanwhile, we propose a new method to calculate the adjoint transformation matrix and to find all the invariants of the Lie algebra in spite of the Killing form, checking possible constraints of each classification. In addition, a new concept called the invariance set is introduced. This Maple package is shown to be more efficient and precise than previous approaches by applying it to some classic examples. Supported by the Global Change Research Program of China under Grant No. 2015CB95390, National Natural Science Foundation of China under Grant Nos. 11675054 and 11435005, and Shanghai Collaborative Innovation Center of Trustworthy Software for Internet of Things under Grant No. ZF1213

  11. Large Scale Software Building with CMake in ATLAS

    NASA Astrophysics Data System (ADS)

    Elmsheuser, J.; Krasznahorkay, A.; Obreshkov, E.; Undrus, A.; ATLAS Collaboration

    2017-10-01

    The offline software of the ATLAS experiment at the Large Hadron Collider (LHC) serves as the platform for detector data reconstruction, simulation and analysis. It is also used in the detector’s trigger system to select LHC collision events during data taking. The ATLAS offline software consists of several million lines of C++ and Python code organized in a modular design of more than 2000 specialized packages. Because of different workflows, many stable numbered releases are in parallel production use. To accommodate specific workflow requests, software patches with modified libraries are distributed on top of existing software releases on a daily basis. The different ATLAS software applications also require a flexible build system that strongly supports unit and integration tests. Within the last year this build system was migrated to CMake. A CMake configuration has been developed that allows one to easily set up and build the above mentioned software packages. This also makes it possible to develop and test new and modified packages on top of existing releases. The system also allows one to detect and execute partial rebuilds of the release based on single package changes. The build system makes use of CPack for building RPM packages out of the software releases, and CTest for running unit and integration tests. We report on the migration and integration of the ATLAS software to CMake and show working examples of this large scale project in production.

  12. Western aeronautical test range real-time graphics software package MAGIC

    NASA Technical Reports Server (NTRS)

    Malone, Jacqueline C.; Moore, Archie L.

    1988-01-01

    The master graphics interactive console (MAGIC) software package used on the Western Aeronautical Test Range (WATR) of the NASA Ames Research Center is described. MAGIC is a resident real-time research tool available to flight researchers and scientists in the NASA mission control centers of the WATR at the Dryden Flight Research Facility at Edwards, California. The hardware configuration and capabilities of the real-time software package are also discussed.

  13. CheMentor Software System by H. A. Peoples

    NASA Astrophysics Data System (ADS)

    Reid, Brian P.

    1997-09-01

    CheMentor Software System H. A. Peoples. Computerized Learning Enhancements: http://www.ecis.com/~clehap; email: clehap@ecis.com; 1996 - 1997. CheMentor is a series of software packages for introductory-level chemistry, which includes Practice Items (I), Stoichiometry (I), Calculating Chemical Formulae, and the CheMentor Toolkit. The first three packages provide practice problems for students and various types of help to solve them; the Toolkit includes "calculators" for determining chemical quantities as well as the Practice Items (I) set of problems. The set of software packages is designed so that each individual product acts as a module of a common CheMentor program. As the name CheMentor implies, the software is designed as a "mentor" for students learning introductory chemistry concepts and problems. The typical use of the software would be by individual students (or perhaps small groups) as an adjunct to lectures. CheMentor is a HyperCard application and the modules are HyperCard stacks. The requirements to run the packages include a Macintosh computer with at least 1 MB of RAM, a hard drive with several MB of available space depending upon the packages selected (10 MB were required for all the packages reviewed here), and the Mac operating system 6.0.5 or later.

  14. Improving Energy Efficiency for the Vehicle Assembly Industry: A Discrete Event Simulation Approach

    NASA Astrophysics Data System (ADS)

    Oumer, Abduaziz; Mekbib Atnaw, Samson; Kie Cheng, Jack; Singh, Lakveer

    2016-11-01

    This paper presents a Discrete Event Simulation (DES) model for investigating and improving energy efficiency in a vehicle assembly line. The car manufacturing industry is one of the highest energy-consuming industries. Using the Rockwell Arena DES package, a detailed model was constructed for an actual vehicle assembly plant. The sources of energy considered in this research are electricity and fuel, which are the two main types of energy sources used in a typical vehicle assembly plant. The model depicts the performance measurement for process-specific energy measures of the painting, welding, and assembling processes. A sound energy efficiency model within this industry has a two-fold advantage: reducing CO2 emissions and cutting the costs associated with fuel and electricity consumption. The paper starts with an overview of the challenges in energy consumption within the facilities of an automotive assembly line and highlights the parameters for energy efficiency. The results of the simulation model indicated improvements toward the energy-saving objectives and reduced costs.
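
    The modeling idea, a station whose busy and idle times drive electricity and fuel consumption, can be sketched with the open-source SimPy library instead of Rockwell Arena. The station layout, cycle times and power ratings below are invented for illustration and are not the plant data used in the study.

```python
# Minimal discrete-event sketch: a single painting booth whose processing time
# accumulates electricity and fuel use over one shift.
import simpy

POWER_BUSY_KW = 120.0    # booth electrical load while processing (assumed)
POWER_IDLE_KW = 15.0     # base load while idle (assumed)
FUEL_KWH_PER_BODY = 8.0  # oven fuel use per car body (assumed)

energy = {"electric_kwh": 0.0, "fuel_kwh": 0.0}

def paint_body(env, booth):
    with booth.request() as req:
        yield req
        t = 0.5                                   # hours per body
        yield env.timeout(t)
        energy["electric_kwh"] += POWER_BUSY_KW * t
        energy["fuel_kwh"] += FUEL_KWH_PER_BODY

def body_arrivals(env, booth, interarrival=0.6):
    while True:
        yield env.timeout(interarrival)
        env.process(paint_body(env, booth))

env = simpy.Environment()
booth = simpy.Resource(env, capacity=1)
env.process(body_arrivals(env, booth))
env.run(until=8.0)                                # one 8-hour shift
energy["electric_kwh"] += POWER_IDLE_KW * 8.0     # crude base-load term
print(energy)
```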

  15. Playing with Plug-ins

    ERIC Educational Resources Information Center

    Thompson, Douglas E.

    2013-01-01

    In today's complex music software packages, many features can remain unexplored and unused. Software plug-ins--available in most every music software package, yet easily overlooked in the software's basic operations--are one such feature. In this article, I introduce readers to plug-ins and offer tips for purchasing plug-ins I have…

  16. GUIDEseq: a bioconductor package to analyze GUIDE-Seq datasets for CRISPR-Cas nucleases.

    PubMed

    Zhu, Lihua Julie; Lawrence, Michael; Gupta, Ankit; Pagès, Hervé; Kucukural, Alper; Garber, Manuel; Wolfe, Scot A

    2017-05-15

    Genome editing technologies developed around the CRISPR-Cas9 nuclease system have facilitated the investigation of a broad range of biological questions. These nucleases also hold tremendous promise for treating a variety of genetic disorders. In the context of their therapeutic application, it is important to identify the spectrum of genomic sequences that are cleaved by a candidate nuclease when programmed with a particular guide RNA, as well as the cleavage efficiency of these sites. Powerful new experimental approaches, such as GUIDE-seq, facilitate the sensitive, unbiased genome-wide detection of nuclease cleavage sites within the genome. Flexible bioinformatics analysis tools for processing GUIDE-seq data are needed. Here, we describe an open source, open development software suite, GUIDEseq, for GUIDE-seq data analysis and annotation as a Bioconductor package in R. The GUIDEseq package provides a flexible platform with more than 60 adjustable parameters for the analysis of datasets associated with custom nuclease applications. These parameters allow data analysis to be tailored to different nuclease platforms with different length and complexity in their guide and PAM recognition sequences or their DNA cleavage position. They also enable users to customize sequence aggregation criteria, and vary peak calling thresholds that can influence the number of potential off-target sites recovered. GUIDEseq also annotates potential off-target sites that overlap with genes based on genome annotation information, as these may be the most important off-target sites for further characterization. In addition, GUIDEseq enables the comparison and visualization of off-target site overlap between different datasets for a rapid comparison of different nuclease configurations or experimental conditions. For each identified off-target, the GUIDEseq package outputs the mapped GUIDE-seq read count as well as the cleavage score from a user-specified off-target cleavage score prediction algorithm, permitting the identification of genomic sequences with unexpected cleavage activity. The GUIDEseq package enables analysis of GUIDE-seq data from various nuclease platforms for any species with a defined genomic sequence. This software package has been used successfully to analyze several GUIDE-seq datasets. The software, source code and documentation are freely available at http://www.bioconductor.org/packages/release/bioc/html/GUIDEseq.html.

  17. Quantitative comparison and evaluation of software packages for assessment of abdominal adipose tissue distribution by magnetic resonance imaging.

    PubMed

    Bonekamp, S; Ghosh, P; Crawford, S; Solga, S F; Horska, A; Brancati, F L; Diehl, A M; Smith, S; Clark, J M

    2008-01-01

    To examine five available software packages for the assessment of abdominal adipose tissue with magnetic resonance imaging, compare their features and assess the reliability of measurement results. Feature evaluation and test-retest reliability of software packages (NIHImage, SliceOmatic, Analyze, HippoFat and EasyVision) used in manual, semi-automated or automated segmentation of abdominal adipose tissue. A random sample of 15 obese adults with type 2 diabetes. Axial T1-weighted spin echo images centered at vertebral bodies of L2-L3 were acquired at 1.5 T. Five software packages were evaluated (NIHImage, SliceOmatic, Analyze, HippoFat and EasyVision), comparing manual, semi-automated and automated segmentation approaches. Images were segmented into cross-sectional area (CSA), and the areas of visceral (VAT) and subcutaneous adipose tissue (SAT). Ease of learning and use and the design of the graphical user interface (GUI) were rated. Intra-observer accuracy and agreement between the software packages were calculated using intra-class correlation. Intra-class correlation coefficient was used to obtain test-retest reliability. Three of the five evaluated programs offered a semi-automated technique to segment the images based on histogram values or a user-defined threshold. One software package allowed manual delineation only. One fully automated program demonstrated the drawbacks of uncritical automated processing. The semi-automated approaches reduced variability and measurement error, and improved reproducibility. There was no significant difference in the intra-observer agreement in SAT and CSA. The VAT measurements showed significantly lower test-retest reliability. There were some differences between the software packages in qualitative aspects, such as user-friendliness. Four out of five packages provided essentially the same results with respect to the inter- and intra-rater reproducibility. Our results using SliceOmatic, Analyze or NIHImage were comparable and could be used interchangeably. Newly developed fully automated approaches should be compared to one of the examined software packages.

  18. Quantitative comparison and evaluation of software packages for assessment of abdominal adipose tissue distribution by magnetic resonance imaging

    PubMed Central

    Bonekamp, S; Ghosh, P; Crawford, S; Solga, SF; Horska, A; Brancati, FL; Diehl, AM; Smith, S; Clark, JM

    2009-01-01

    Objective: To examine five available software packages for the assessment of abdominal adipose tissue with magnetic resonance imaging, compare their features and assess the reliability of measurement results. Design: Feature evaluation and test–retest reliability of software packages (NIHImage, SliceOmatic, Analyze, HippoFat and EasyVision) used in manual, semi-automated or automated segmentation of abdominal adipose tissue. Subjects: A random sample of 15 obese adults with type 2 diabetes. Measurements: Axial T1-weighted spin echo images centered at vertebral bodies of L2–L3 were acquired at 1.5 T. Five software packages were evaluated (NIHImage, SliceOmatic, Analyze, HippoFat and EasyVision), comparing manual, semi-automated and automated segmentation approaches. Images were segmented into cross-sectional area (CSA), and the areas of visceral (VAT) and subcutaneous adipose tissue (SAT). Ease of learning and use and the design of the graphical user interface (GUI) were rated. Intra-observer accuracy and agreement between the software packages were calculated using intra-class correlation. Intra-class correlation coefficient was used to obtain test–retest reliability. Results: Three of the five evaluated programs offered a semi-automated technique to segment the images based on histogram values or a user-defined threshold. One software package allowed manual delineation only. One fully automated program demonstrated the drawbacks of uncritical automated processing. The semi-automated approaches reduced variability and measurement error, and improved reproducibility. There was no significant difference in the intra-observer agreement in SAT and CSA. The VAT measurements showed significantly lower test–retest reliability. There were some differences between the software packages in qualitative aspects, such as user-friendliness. Conclusion: Four out of five packages provided essentially the same results with respect to the inter- and intra-rater reproducibility. Our results using SliceOmatic, Analyze or NIHImage were comparable and could be used interchangeably. Newly developed fully automated approaches should be compared to one of the examined software packages. PMID:17700582

  19. Introduction to Software Packages. [Final Report.

    ERIC Educational Resources Information Center

    Frankel, Sheila, Ed.; And Others

    This document provides an introduction to applications computer software packages that support functional managers in government and encourages the use of such packages as an alternative to in-house development. A review of current application areas includes budget/project management, financial management/accounting, payroll, personnel,…

  20. GENERAL PURPOSE ADA PACKAGES

    NASA Technical Reports Server (NTRS)

    Klumpp, A. R.

    1994-01-01

    Ten families of subprograms are bundled together for the General-Purpose Ada Packages. The families bring to Ada many features from HAL/S, PL/I, FORTRAN, and other languages. These families are: string subprograms (INDEX, TRIM, LOAD, etc.); scalar subprograms (MAX, MIN, REM, etc.); array subprograms (MAX, MIN, PROD, SUM, GET, and PUT); numerical subprograms (EXP, CUBIC, etc.); service subprograms (DATE_TIME function, etc.); Linear Algebra II; Runge-Kutta integrators; and three text I/O families of packages. In two cases, a family consists of a single non-generic package. In all other cases, a family comprises a generic package and its instances for a selected group of scalar types. All generic packages are designed to be easily instantiated for the types declared in the user facility. The linear algebra package is LINRAG2. This package includes subprograms supplementing those in NPO-17985, An Ada Linear Algebra Package Modeled After HAL/S (LINRAG). Please note that LINRAG2 cannot be compiled without LINRAG. Most packages have widespread applicability, although some are oriented for avionics applications. All are designed to facilitate writing new software in Ada. Several of the packages use conventions introduced by other programming languages. A package of string subprograms is based on HAL/S (a language designed for the avionics software in the Space Shuttle) and PL/I. Packages of scalar and array subprograms are taken from HAL/S or generalized current Ada subprograms. A package of Runge-Kutta integrators is patterned after a built-in MAC (MIT Algebraic Compiler) integrator. Those packages modeled after HAL/S make it easy to translate existing HAL/S software to Ada. The General-Purpose Ada Packages program source code is available on two 360K 5.25" MS-DOS format diskettes. The software was developed using VAX Ada v1.5 under DEC VMS v4.5. It should be portable to any validated Ada compiler and it should execute either interactively or in batch. The largest package requires 205K of main memory on a DEC VAX running VMS. The software was developed in 1989, and is a copyrighted work with all copyright vested in NASA.

  1. Thermodynamic analysis of engineering solutions aimed at raising the efficiency of integrated gasification combined cycle

    NASA Astrophysics Data System (ADS)

    Gordeev, S. I.; Bogatova, T. F.; Ryzhkov, A. F.

    2017-11-01

    Raising the efficiency and environmental friendliness of electric power generation from coal is the aim of numerous research groups today. The traditional approach based on the steam power cycle has reached its efficiency limit, a limit imposed by materials development and maneuverability requirements. The rival approach based on the combined cycle is also drawing nearer to its efficiency limit. However, there is still room to increase the efficiency of the integrated gasification combined cycle (IGCC), whose energy efficiency is currently at the level of modern steam-turbine power units; the upper bound for this increase is the efficiency of the NGCC. One of the main problems of the IGCC is the higher cost of receiving and preparing fuel gas for the GTU. It would be reasonable to decrease the necessary amount of fuel gas in the power unit to minimize these costs. This effect can be reached by raising the heating value of the fuel gas, its heat content, and the heat content of the cycle air. Using the process flowsheet of a 500 MW IGCC running on Kuznetsk bituminous coal, modeled in the Thermoflex software, the influence of the developed technical solutions on the efficiency of the power plant is considered. It is found that raising the steam-air blast temperature to 900°C increases the conversion efficiency to 84.2%. Raising the temperature level of fuel gas clean-up to 900°C increases the gross/net IGCC efficiency by 3.42%. Cycle air heating reduces the need for fuel gas by 40% and raises the gross/net IGCC efficiency by 0.85-1.22%. The proposed solutions allow the IGCC to exceed the net efficiency of analogous plants by 1.8-2.3%.

  2. A comparison of InVivoStat with other statistical software packages for analysis of data generated from animal experiments.

    PubMed

    Clark, Robin A; Shoaib, Mohammed; Hewitt, Katherine N; Stanford, S Clare; Bate, Simon T

    2012-08-01

    InVivoStat is a free-to-use statistical software package for analysis of data generated from animal experiments. The package is designed specifically for researchers in the behavioural sciences, where exploiting the experimental design is crucial for reliable statistical analyses. This paper compares the analysis of three experiments conducted using InVivoStat with other widely used statistical packages: SPSS (V19), PRISM (V5), UniStat (V5.6) and Statistica (V9). We show that InVivoStat provides results that are similar to those from the other packages and, in some cases, are more advanced. This investigation provides evidence of further validation of InVivoStat and should strengthen users' confidence in this new software package.

  3. Ramses-GPU: Second order MUSCL-Hancock finite volume fluid solver

    NASA Astrophysics Data System (ADS)

    Kestener, Pierre

    2017-10-01

    RamsesGPU is a reimplementation of RAMSES (ascl:1011.007) that drops the adaptive mesh refinement (AMR) features and optimizes 3D uniform-grid algorithms for modern graphics processing units (GPUs), providing an efficient software package for astrophysics applications that do not need AMR but do require a very large number of integration time steps. RamsesGPU provides a very efficient C++/CUDA/MPI implementation of a second-order MUSCL-Hancock finite volume fluid solver for compressible hydrodynamics, as well as a magnetohydrodynamics solver based on the constrained transport technique. Other useful modules include static gravity, dissipative terms (viscosity, resistivity), and a forcing source term for turbulence studies; special care was taken to enhance parallel input/output performance by using state-of-the-art libraries such as HDF5 and Parallel-NetCDF.
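
    As a point of reference for the numerical scheme named above, a one-dimensional scalar version of the MUSCL-Hancock update (limited slopes, half-step prediction, upwind Riemann flux, conservative update) can be written in a few lines. This toy advection example only illustrates the scheme; it is not RamsesGPU code.

```python
# 1D MUSCL-Hancock step for linear advection u_t + a u_x = 0, periodic boundaries.
import numpy as np

def minmod(a, b):
    return np.where(a * b > 0.0, np.sign(a) * np.minimum(np.abs(a), np.abs(b)), 0.0)

def muscl_hancock_step(u, a, dt, dx):
    """One second-order MUSCL-Hancock update for a > 0."""
    # 1) limited slopes
    slope = minmod(u - np.roll(u, 1), np.roll(u, -1) - u)
    # 2) boundary-extrapolated values in each cell
    uL = u - 0.5 * slope
    uR = u + 0.5 * slope
    # 3) half-time-step predictor using the local flux difference f(uL) - f(uR)
    corr = 0.5 * dt / dx * a * (uL - uR)
    uL, uR = uL + corr, uR + corr
    # 4) upwind Riemann flux at interface i+1/2 (left state is uR of cell i)
    flux = a * uR
    # 5) conservative update
    return u - dt / dx * (flux - np.roll(flux, 1))

if __name__ == "__main__":
    n, a = 200, 1.0
    x = (np.arange(n) + 0.5) / n
    dx = 1.0 / n
    u = np.exp(-200.0 * (x - 0.3) ** 2)       # Gaussian pulse
    dt = 0.5 * dx / a                         # CFL = 0.5
    for _ in range(int(1.0 / (a * dt))):      # advect once around the periodic box
        u = muscl_hancock_step(u, a, dt, dx)
    print("peak after one period:", u.max())  # stays close to 1 for a second-order scheme
```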

  4. Development of Fuel Shuffling Module for PHISICS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allan Mabe; Andrea Alfonsi; Cristian Rabiti

    2013-06-01

    The PHISICS (Parallel and Highly Innovative Simulation for the INL Code System) [4] code toolkit has been under development at the Idaho National Laboratory. This package is intended to provide a modern analysis tool for reactor physics investigation. It is designed to maximize accuracy for a given availability of computational resources and to give state-of-the-art tools to the modern nuclear engineer. This is obtained by implementing several different algorithms and meshing approaches among which the user can choose, in order to optimize computational resources and accuracy needs. The software is completely modular in order to simplify the independent development of modules by different teams and future maintenance. The package is coupled with the thermal-hydraulic code RELAP5-3D [3]. In the following, the structure of the different PHISICS modules is briefly recalled, with a focus on the new shuffling module (SHUFFLE), the subject of this paper.

  5. RSEIS and RFOC: Seismic Analysis in R

    NASA Astrophysics Data System (ADS)

    Lees, J. M.

    2015-12-01

    Open software is essential for reproducible scientific exchange. R packages provide a platform for the development of seismological investigation software that can be properly documented and traced for data processing. A suite of R packages designed for a wide range of seismic analysis is currently available in the free software platform called R. R is a software platform based on the S language developed at Bell Labs decades ago. Routines in R can be run as standalone function calls or developed in object-oriented mode. R comes with a base set of routines and thousands of user-developed packages. The packages developed at UNC include subroutines and interactive codes for processing seismic data, analyzing geographic information (GIS) and inverting data involved in a variety of geophysical applications. Packages related to seismic analysis currently available on CRAN (Comprehensive R Archive Network, http://www.r-project.org/) are RSEIS, Rquake, GEOmap, RFOC, zoeppritz, RTOMO, geophys, Rwave, PEIP, hht, and rFDSN. These include signal processing, data management, mapping, earthquake location, deconvolution, focal mechanisms, wavelet transforms, Hilbert-Huang transforms, tomographic inversion, and Mogi deformation, among other useful functionality. All software in R packages is required to have detailed documentation, making the exchange and modification of existing software easy. In this presentation, I will focus on the packages RSEIS and RFOC, showing examples from a variety of seismic analyses. The R approach has similarities to the popular (and expensive) MATLAB platform, although R is open source and free to download.

  6. Laboratory Connections: Review of Two Commercial Interfacing Packages.

    ERIC Educational Resources Information Center

    Powers, Michael H.

    1989-01-01

    Evaluates two Apple II interfacing packages designed to measure pH: (1) "Experiments in Chemistry" by HRM Software and (2) "Voltage Plotter III" by Vernier Software. Provides characteristics and screen dumps of each package. Reports both systems are suitable for high school or beginning college laboratories. (MVL)

  7. MicroSIFT Courseware Evaluations [Set 15 (362-388) and Set 16 (389-441), with an Index Listing the Contents of Each Set (Sets 1-16) and a Cumulative Subject Index (Sets 1-16)].

    ERIC Educational Resources Information Center

    Northwest Regional Educational Lab., Portland, OR.

    This document consists of 80 microcomputer software package evaluations prepared by the MicroSIFT (Microcomputer Software and Information for Teachers) Clearinghouse at the Northwest Regional Education Laboratory. Set 15 consists of 27 packages; set 16 consists of 53 packages. Each software review lists producer, time and place of evaluation,…

  8. Basic analysis of reflectometry data software package for the analysis of multilayered structures according to reflectometry data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Astaf'ev, S. B., E-mail: bard@ns.crys.ras.ru; Shchedrin, B. M.; Yanusova, L. G.

    2012-01-15

    The main principles of developing the Basic Analysis of Reflectometry Data (BARD) software package, which is aimed at obtaining a unified (standardized) tool for analyzing the structure of thin multilayer films and nanostructures of different nature based on reflectometry data, are considered. This software package contains both traditionally used procedures for processing reflectometry data and the authors' original developments on the basis of new methods for carrying out and analyzing reflectometry experiments. The structure of the package, its functional possibilities, examples of application, and prospects of development are reviewed.

  9. Is liver perfusion CT reproducible? A study on intra- and interobserver agreement of normal hepatic haemodynamic parameters obtained with two different software packages.

    PubMed

    Bretas, Elisa Almeida Sathler; Torres, Ulysses S; Torres, Lucas Rios; Bekhor, Daniel; Saito Filho, Celso Fernando; Racy, Douglas Jorge; Faggioni, Lorenzo; D'Ippolito, Giuseppe

    2017-10-01

    To evaluate the agreement between the measurements of perfusion CT parameters in normal livers by using two different software packages. This retrospective study was based on 78 liver perfusion CT examinations acquired for detecting suspected liver metastasis. Patients with any morphological or functional hepatic abnormalities were excluded. The final analysis included 37 patients (59.7 ± 14.9 y). Two readers (1 and 2) independently measured perfusion parameters using different software packages from two major manufacturers (A and B). Arterial perfusion (AP) and portal perfusion (PP) were determined using the dual-input vascular one-compartmental model. Inter-reader agreement for each package and intrareader agreement between both packages were assessed with intraclass correlation coefficients (ICC) and Bland-Altman statistics. Inter-reader agreement was substantial for AP using software A (ICC = 0.82) and B (ICC = 0.85-0.86), fair for PP using software A (ICC = 0.44) and fair to moderate for PP using software B (ICC = 0.56-0.77). Intrareader agreement between software A and B ranged from slight to moderate (ICC = 0.32-0.62) for readers 1 and 2 considering the AP parameters, and from fair to moderate (ICC = 0.40-0.69) for readers 1 and 2 considering the PP parameters. At best there was only moderate agreement between both software packages, resulting in some uncertainty and suboptimal reproducibility. Advances in knowledge: Software-dependent factors may contribute to variance in perfusion measurements, demanding further technical improvements. AP measurements seem to be the most reproducible parameter to be adopted when evaluating liver perfusion CT.
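
    The Bland-Altman statistics quoted above (bias, i.e. difference of means, with 95% limits of agreement) reduce to a few lines of arithmetic; the perfusion readings below are invented for illustration and are not study data.

```python
# Bland-Altman bias and 95% limits of agreement between two paired measurement sets.
import numpy as np

def bland_altman(x, y):
    """Return bias and 95% limits of agreement between paired measurements x and y."""
    diff = np.asarray(x, float) - np.asarray(y, float)
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    ap_software_a = rng.normal(30.0, 8.0, 37)                      # arterial perfusion, illustrative units
    ap_software_b = ap_software_a + rng.normal(1.0, 4.0, 37)       # second package reads slightly higher
    bias, loa = bland_altman(ap_software_a, ap_software_b)
    print(f"bias = {bias:.2f}, 95% LoA = ({loa[0]:.2f}, {loa[1]:.2f})")
```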

  10. Making software get along: integrating optical and mechanical design programs

    NASA Astrophysics Data System (ADS)

    Shackelford, Christie J.; Chinnock, Randal B.

    2001-03-01

    As modern optomechanical engineers, we have the good fortune of having very sophisticated software programs available to us. The current optical design, mechanical design, industrial design, and CAM programs are very powerful tools with some very desirable features. However, no one program can do everything necessary to complete an entire optomechanical system design. Each program has a unique set of features and benefits, and typically two or more will be used during the product development process. At a minimum, an optical design program and a mechanical CAD package will be employed. As we strive for efficient, cost-effective, and rapid progress in our development projects, we must use these programs to their full advantage, while keeping redundant tasks to a minimum. Together, these programs offer the promise of a 'seamless' flow of data from concept all the way to the download of part designs directly to the machine shop for fabrication. In reality, transferring data from one software package to the next is often frustrating. Overcoming these problems takes some know-how, a bit of creativity, and a lot of persistence. This paper describes a complex optomechanical development effort in which a variety of software tools were used from the concept stage to prototyping. It will describe what software was used for each major design task, how we learned to use them together to best advantage, and how we overcame the frustrations of software that didn't get along.

  11. JMorph: Software for performing rapid morphometric measurements on digital images of fossil assemblages

    NASA Astrophysics Data System (ADS)

    Lelièvre, Peter G.; Grey, Melissa

    2017-08-01

    Quantitative morphometric analyses of form are widely used in palaeontology, especially for taxonomic and evolutionary research. These analyses can involve several measurements performed on hundreds or even thousands of samples. Performing measurements of size and shape on large assemblages of macro- or microfossil samples is generally infeasible or impossible with traditional instruments such as vernier calipers. Instead, digital image processing software is required to perform measurements via suitable digital images of samples. Many software packages exist for morphometric analyses but there is not much available for the integral stage of data collection, particularly for the measurement of the outlines of samples. Some software exists to automatically detect the outline of a fossil sample from a digital image. However, automatic outline detection methods may perform inadequately when samples have incomplete outlines or images contain poor contrast between the sample and staging background. Hence, a manual digitization approach may be the only option. We are not aware of any software packages that are designed specifically for efficient digital measurement of fossil assemblages with numerous samples, especially for the purposes of manual outline analysis. Throughout several previous studies, we have developed a new software tool, JMorph, that is custom-built for that task. JMorph provides the means to perform many different types of measurements, which we describe in this manuscript. We focus on JMorph's ability to rapidly and accurately digitize the outlines of fossils. JMorph is freely available from the authors.
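
    As a rough illustration of the kind of measurement that follows outline digitization (this is not JMorph's code, and the outline coordinates are invented), perimeter and area can be computed directly from the digitized (x, y) vertices, with area given by the shoelace formula:

      # Illustrative sketch: size measures from a manually digitized outline.
      import math

      def perimeter(points):
          n = len(points)
          return sum(math.dist(points[i], points[(i + 1) % n]) for i in range(n))

      def area(points):
          # Shoelace formula; assumes a simple, non-self-intersecting polygon.
          n = len(points)
          twice_area = sum(points[i][0] * points[(i + 1) % n][1]
                           - points[(i + 1) % n][0] * points[i][1]
                           for i in range(n))
          return abs(twice_area) / 2.0

      outline = [(0, 0), (4, 0), (4, 3), (0, 3)]  # invented outline, image units
      print(perimeter(outline), area(outline))    # 14.0 12.0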

  12. A Comparison of Authoring Software for Developing Mathematics Self-Learning Software Packages.

    ERIC Educational Resources Information Center

    Suen, Che-yin; Pok, Yang-ming

    Four years ago, the authors started to develop a self-paced mathematics learning software called NPMaths by using an authoring package called Tencore. However, NPMaths had some weak points. A development team was hence formed to develop similar software called Mathematics On Line. This time the team used another development language called…

  13. Kassiopeia: a modern, extensible C++ particle tracking package

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Furse, Daniel; Groh, Stefan; Trost, Nikolaus

    The Kassiopeia particle tracking framework is an object-oriented software package using modern C++ techniques, written originally to meet the needs of the KATRIN collaboration. Kassiopeia features a new algorithmic paradigm for particle tracking simulations which targets experiments containing complex geometries and electromagnetic fields, with high priority put on calculation efficiency, customizability, extensibility, and ease-of-use for novice programmers. To solve Kassiopeia's target physics problem the software is capable of simulating particle trajectories governed by arbitrarily complex differential equations of motion, continuous physics processes that may in part be modeled as terms perturbing that equation of motion, stochastic processes that occur in flight such as bulk scattering and decay, and stochastic surface processes occurring at interfaces, including transmission and reflection effects. This entire set of computations takes place against the backdrop of a rich geometry package which serves a variety of roles, including initialization of electromagnetic field simulations and the support of state-dependent algorithm-swapping and behavioral changes as a particle's state evolves. Thanks to the very general approach taken by Kassiopeia it can be used by other experiments facing similar challenges when calculating particle trajectories in electromagnetic fields. It is publicly available at https://github.com/KATRIN-Experiment/Kassiopeia.

  14. Kassiopeia: a modern, extensible C++ particle tracking package

    DOE PAGES

    Furse, Daniel; Groh, Stefan; Trost, Nikolaus; ...

    2017-05-16

    The Kassiopeia particle tracking framework is an object-oriented software package using modern C++ techniques, written originally to meet the needs of the KATRIN collaboration. Kassiopeia features a new algorithmic paradigm for particle tracking simulations which targets experiments containing complex geometries and electromagnetic fields, with high priority put on calculation efficiency, customizability, extensibility, and ease-of-use for novice programmers. To solve Kassiopeia's target physics problem the software is capable of simulating particle trajectories governed by arbitrarily complex differential equations of motion, continuous physics processes that may in part be modeled as terms perturbing that equation of motion, stochastic processes that occur in flight such as bulk scattering and decay, and stochastic surface processes occurring at interfaces, including transmission and reflection effects. This entire set of computations takes place against the backdrop of a rich geometry package which serves a variety of roles, including initialization of electromagnetic field simulations and the support of state-dependent algorithm-swapping and behavioral changes as a particle's state evolves. Thanks to the very general approach taken by Kassiopeia it can be used by other experiments facing similar challenges when calculating particle trajectories in electromagnetic fields. It is publicly available at https://github.com/KATRIN-Experiment/Kassiopeia.

  15. Kassiopeia: a modern, extensible C++ particle tracking package

    NASA Astrophysics Data System (ADS)

    Furse, Daniel; Groh, Stefan; Trost, Nikolaus; Babutzka, Martin; Barrett, John P.; Behrens, Jan; Buzinsky, Nicholas; Corona, Thomas; Enomoto, Sanshiro; Erhard, Moritz; Formaggio, Joseph A.; Glück, Ferenc; Harms, Fabian; Heizmann, Florian; Hilk, Daniel; Käfer, Wolfgang; Kleesiek, Marco; Leiber, Benjamin; Mertens, Susanne; Oblath, Noah S.; Renschler, Pascal; Schwarz, Johannes; Slocum, Penny L.; Wandkowsky, Nancy; Wierman, Kevin; Zacher, Michael

    2017-05-01

    The Kassiopeia particle tracking framework is an object-oriented software package using modern C++ techniques, written originally to meet the needs of the KATRIN collaboration. Kassiopeia features a new algorithmic paradigm for particle tracking simulations which targets experiments containing complex geometries and electromagnetic fields, with high priority put on calculation efficiency, customizability, extensibility, and ease-of-use for novice programmers. To solve Kassiopeia's target physics problem the software is capable of simulating particle trajectories governed by arbitrarily complex differential equations of motion, continuous physics processes that may in part be modeled as terms perturbing that equation of motion, stochastic processes that occur in flight such as bulk scattering and decay, and stochastic surface processes occurring at interfaces, including transmission and reflection effects. This entire set of computations takes place against the backdrop of a rich geometry package which serves a variety of roles, including initialization of electromagnetic field simulations and the support of state-dependent algorithm-swapping and behavioral changes as a particle’s state evolves. Thanks to the very general approach taken by Kassiopeia it can be used by other experiments facing similar challenges when calculating particle trajectories in electromagnetic fields. It is publicly available at https://github.com/KATRIN-Experiment/Kassiopeia.
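
    The core numerical task such a framework performs is integration of the equation of motion under the Lorentz force. The minimal sketch below (Python/SciPy, not Kassiopeia itself; nonrelativistic motion, with arbitrary uniform fields and initial conditions) integrates a single electron trajectory in static E and B fields:

      # Hedged sketch: nonrelativistic charged-particle tracking in uniform
      # static E and B fields using SciPy's ODE integrator.
      import numpy as np
      from scipy.integrate import solve_ivp

      Q_OVER_M = -1.758820e11                  # electron charge/mass [C/kg]
      E_FIELD = np.array([0.0, 0.0, 1.0e3])    # V/m (assumed uniform)
      B_FIELD = np.array([0.0, 0.0, 1.0e-3])   # T   (assumed uniform)

      def rhs(t, y):
          r, v = y[:3], y[3:]
          a = Q_OVER_M * (E_FIELD + np.cross(v, B_FIELD))  # Lorentz force / m
          return np.concatenate([v, a])

      y0 = np.concatenate([[0.0, 0.0, 0.0], [1.0e5, 0.0, 0.0]])  # r0, v0
      sol = solve_ivp(rhs, (0.0, 1.0e-6), y0, max_step=1.0e-9, rtol=1e-8)
      print(sol.y[:3, -1])  # final position after 1 microsecond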

  16. Integrated optomechanical analysis and testing software development at MIT Lincoln Laboratory

    NASA Astrophysics Data System (ADS)

    Stoeckel, Gerhard P.; Doyle, Keith B.

    2013-09-01

    Advanced analytical software capabilities are being developed to advance the design of prototypical hardware in the Engineering Division at MIT Lincoln Laboratory. The current effort is focused on the integration of analysis tools tailored to the work flow, organizational structure, and current technology demands. These tools are being designed to provide superior insight into the interdisciplinary behavior of optical systems and enable rapid assessment and execution of design trades to optimize the design of optomechanical systems. The custom software architecture is designed to exploit and enhance the functionality of existing industry standard commercial software, provide a framework for centralizing internally developed tools, and deliver greater efficiency, productivity, and accuracy through standardization, automation, and integration. Specific efforts have included the development of a feature-rich software package for Structural-Thermal-Optical Performance (STOP) modeling, advanced Line Of Sight (LOS) jitter simulations, and improved integration of dynamic testing and structural modeling.

  17. Technology Assessment Software Package: Final Report.

    ERIC Educational Resources Information Center

    Hutinger, Patricia L.

    This final report describes the Technology Assessment Software Package (TASP) Project, which produced developmentally appropriate technology assessment software for children from 18 months through 8 years of age who have moderate to severe disabilities that interfere with their interaction with people, objects, tasks, and events in their…

  18. The Hidden Cost of Buying a Computer.

    ERIC Educational Resources Information Center

    Johnson, Michael

    1983-01-01

    In order to process data in a computer, application software must be either developed or purchased. Costs for modifications of the software package and maintenance are often hidden. The decision to buy or develop software packages should be based upon factors of time and maintenance. (MLF)

  19. Software for Managing Personal Files.

    ERIC Educational Resources Information Center

    Lundeen, Gerald

    1989-01-01

    Discusses the special characteristics of personal file management software and compares four microcomputer software packages: Notebook II with Bibliography and Convert, Pro-Cite with Biblio-Links, askSam, and Reference Manager. Each package is evaluated in terms of the user interface, file maintenance, retrieval capabilities, output, and…

  20. Software design for analysis of multichannel intracardial and body surface electrocardiograms.

    PubMed

    Potse, Mark; Linnenbank, André C; Grimbergen, Cornelis A

    2002-11-01

    Analysis of multichannel ECG recordings (body surface maps (BSMs) and intracardial maps) requires special software. We created a software package and a user interface on top of a commercial data analysis package (MATLAB) by a combination of high-level and low-level programming. Our software was created to satisfy the needs of a diverse group of researchers. It can handle a large variety of recording configurations. It allows for interactive usage through a fast and robust user interface, and batch processing for the analysis of large amounts of data. The package is user-extensible, includes routines for both common and experimental data processing tasks, and works on several computer platforms. The source code is made intelligible using software for structured documentation and is available to the users. The package is currently used by more than ten research groups analysing ECG data worldwide.

  1. 1986 Petroleum Software Directory. [800 mini, micro and mainframe computer software packages]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1985-01-01

    Pennwell's 1986 Petroleum Software Directory is a complete listing of software created specifically for the petroleum industry. Details are provided on over 800 mini, micro and mainframe computer software packages from more than 250 different companies. An accountant can locate programs to automate bookkeeping functions in large oil and gas production firms. A pipeline engineer will find programs designed to calculate line flow and wellbore pressure drop.

  2. Diagnostic evaluation of three cardiac software packages using a consecutive group of patients

    PubMed Central

    2011-01-01

    Purpose The aim of this study was to compare the diagnostic performance of the three software packages 4DMSPECT (4DM), Emory Cardiac Toolbox (ECTb), and Cedars Quantitative Perfusion SPECT (QPS) for quantification of myocardial perfusion scintigrams (MPS) using a large group of consecutive patients. Methods We studied 1,052 consecutive patients who underwent 2-day stress/rest 99mTc-sestamibi MPS studies. The reference/gold-standard classifications for the MPS studies were obtained from three physicians, with more than 25 years each of experience in nuclear cardiology, who re-evaluated all MPS images. Automatic processing was carried out using the 4DM, ECTb, and QPS software packages. Total stress defect extent (TDE) and summed stress score (SSS) based on a 17-segment model were obtained from the software packages. Receiver-operating characteristic (ROC) analysis was performed. Results A total of 734 patients were classified as normal and the remaining 318 were classified as having infarction and/or ischemia. The performance of the software packages, calculated as the area under the SSS ROC curve, was 0.87 for 4DM, 0.80 for QPS, and 0.76 for ECTb (QPS vs. ECTb p = 0.03; other differences p < 0.0001). The area under the TDE ROC curve was 0.87 for 4DM, 0.82 for QPS, and 0.76 for ECTb (QPS vs. ECTb p = 0.0005; other differences p < 0.0001). Conclusion There are considerable differences in performance between the three software packages, with 4DM showing the best performance and ECTb the worst. These differences in performance should be taken into consideration when software packages are used in clinical routine or in clinical studies. PMID:22214226
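
    For readers unfamiliar with the comparison metric, the area under the ROC curve for a score such as SSS can be computed as sketched below (Python/scikit-learn; the labels and scores are synthetic, not the study data):

      # Hedged illustration: comparing two packages' stress scores by ROC AUC.
      import numpy as np
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(0)
      truth = rng.integers(0, 2, size=200)             # 1 = infarction/ischemia
      sss_pkg1 = truth * 4 + rng.normal(0, 2, 200)     # hypothetical SSS values
      sss_pkg2 = truth * 3 + rng.normal(0, 2, 200)

      print("AUC package 1:", roc_auc_score(truth, sss_pkg1))
      print("AUC package 2:", roc_auc_score(truth, sss_pkg2))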

  3. Building Information Model: advantages, tools and adoption efficiency

    NASA Astrophysics Data System (ADS)

    Abakumov, R. G.; Naumov, A. E.

    2018-03-01

    The paper expands the definition and essence of Building Information Modeling. It describes the content and effects of applying Information Modeling at different stages of a real property item. An analysis of long-term and short-term advantages is given. The authors include an analytical review of the Revit software package in comparison with Autodesk with respect to features, advantages and disadvantages, cost, and pay cutoff. A prognostic calculation is given for the efficiency of adoption of the Building Information Modeling technology, with examples of its successful adoption in Russia and worldwide.

  4. A Study of Visualization for Mathematics Education

    NASA Technical Reports Server (NTRS)

    Daugherty, Sarah C.

    2008-01-01

    Graphical representations such as figures, illustrations, and diagrams play a critical role in mathematics and they are equally important in mathematics education. However, graphical representations in mathematics textbooks are static, i.e., they are used to illustrate only a specific example or a limited set of examples. By using computer software to visualize mathematical principles, there is virtually no limit to the number of specific cases and examples that can be demonstrated. However, we have not seen widespread adoption of visualization software in mathematics education. There are currently a number of software packages that provide visualization of mathematics for research and also software packages specifically developed for mathematics education. We conducted a survey of mathematics visualization software packages, summarized their features and user bases, and analyzed their limitations. In this survey, we focused on evaluating the software packages for their use with mathematical subjects adopted by institutions of secondary education in the United States (middle schools and high schools), including algebra, geometry, trigonometry, and calculus. We found that cost, complexity, and lack of flexibility are the major factors that hinder the widespread use of mathematics visualization software in education.

  5. Dill: an algorithm and a symbolic software package for doing classical supersymmetry calculations

    NASA Astrophysics Data System (ADS)

    Luc̆ić, Vladan

    1995-11-01

    An algorithm is presented that formalizes different steps in a classical Supersymmetric (SUSY) calculation. Based on the algorithm, Dill, a symbolic software package that can perform the calculations, is developed in the Mathematica programming language. While the algorithm is quite general, the package is created for the 4-D, N = 1 model. Nevertheless, with little modification, the package could be used for other SUSY models. The package has been tested and some of the results are presented.

  6. MPBEC, a Matlab Program for Biomolecular Electrostatic Calculations

    NASA Astrophysics Data System (ADS)

    Vergara-Perez, Sandra; Marucho, Marcelo

    2016-01-01

    One of the most used and efficient approaches to compute electrostatic properties of biological systems is to numerically solve the Poisson-Boltzmann (PB) equation. There are several software packages available that solve the PB equation for molecules in aqueous electrolyte solutions. Most of these software packages are useful for scientists with specialized training and expertise in computational biophysics. However, the user is usually required to manually take several important choices, depending on the complexity of the biological system, to successfully obtain the numerical solution of the PB equation. This may become an obstacle for researchers, experimentalists, even students with no special training in computational methodologies. Aiming to overcome this limitation, in this article we present MPBEC, a free, cross-platform, open-source software that provides non-experts in the field an easy and efficient way to perform biomolecular electrostatic calculations on single processor computers. MPBEC is a Matlab script based on the Adaptative Poisson-Boltzmann Solver, one of the most popular approaches used to solve the PB equation. MPBEC does not require any user programming, text editing or extensive statistical skills, and comes with detailed user-guide documentation. As a unique feature, MPBEC includes a useful graphical user interface (GUI) application which helps and guides users to configure and setup the optimal parameters and approximations to successfully perform the required biomolecular electrostatic calculations. The GUI also incorporates visualization tools to facilitate users pre- and post-analysis of structural and electrical properties of biomolecules.
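
    For reference, the equation such solvers discretize is the (nonlinear) Poisson-Boltzmann equation, written here in a standard form for a fixed biomolecular charge density and mobile ions of bulk concentrations c_i and charges q_i (details such as ion-accessibility factors are omitted):

      \nabla \cdot \left[ \epsilon(\mathbf{r}) \, \nabla \phi(\mathbf{r}) \right]
        = -\rho_{\mathrm{fixed}}(\mathbf{r})
          - \sum_i q_i \, c_i \exp\!\left( -\frac{q_i \, \phi(\mathbf{r})}{k_B T} \right)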

  7. MPBEC, a Matlab Program for Biomolecular Electrostatic Calculations

    PubMed Central

    Vergara-Perez, Sandra; Marucho, Marcelo

    2015-01-01

    One of the most used and efficient approaches to compute electrostatic properties of biological systems is to numerically solve the Poisson-Boltzmann (PB) equation. There are several software packages available that solve the PB equation for molecules in aqueous electrolyte solutions. Most of these software packages are useful for scientists with specialized training and expertise in computational biophysics. However, the user is usually required to manually take several important choices, depending on the complexity of the biological system, to successfully obtain the numerical solution of the PB equation. This may become an obstacle for researchers, experimentalists, even students with no special training in computational methodologies. Aiming to overcome this limitation, in this article we present MPBEC, a free, cross-platform, open-source software that provides non-experts in the field an easy and efficient way to perform biomolecular electrostatic calculations on single processor computers. MPBEC is a Matlab script based on the Adaptative Poisson Boltzmann Solver, one of the most popular approaches used to solve the PB equation. MPBEC does not require any user programming, text editing or extensive statistical skills, and comes with detailed user-guide documentation. As a unique feature, MPBEC includes a useful graphical user interface (GUI) application which helps and guides users to configure and setup the optimal parameters and approximations to successfully perform the required biomolecular electrostatic calculations. The GUI also incorporates visualization tools to facilitate users pre- and post- analysis of structural and electrical properties of biomolecules. PMID:26924848

  8. MPBEC, a Matlab Program for Biomolecular Electrostatic Calculations.

    PubMed

    Vergara-Perez, Sandra; Marucho, Marcelo

    2016-01-01

    One of the most used and efficient approaches to compute electrostatic properties of biological systems is to numerically solve the Poisson-Boltzmann (PB) equation. There are several software packages available that solve the PB equation for molecules in aqueous electrolyte solutions. Most of these software packages are useful for scientists with specialized training and expertise in computational biophysics. However, the user is usually required to manually take several important choices, depending on the complexity of the biological system, to successfully obtain the numerical solution of the PB equation. This may become an obstacle for researchers, experimentalists, even students with no special training in computational methodologies. Aiming to overcome this limitation, in this article we present MPBEC, a free, cross-platform, open-source software that provides non-experts in the field an easy and efficient way to perform biomolecular electrostatic calculations on single processor computers. MPBEC is a Matlab script based on the Adaptative Poisson Boltzmann Solver, one of the most popular approaches used to solve the PB equation. MPBEC does not require any user programming, text editing or extensive statistical skills, and comes with detailed user-guide documentation. As a unique feature, MPBEC includes a useful graphical user interface (GUI) application which helps and guides users to configure and setup the optimal parameters and approximations to successfully perform the required biomolecular electrostatic calculations. The GUI also incorporates visualization tools to facilitate users pre- and post- analysis of structural and electrical properties of biomolecules.

  9. An Ada Linear-Algebra Software Package Modeled After HAL/S

    NASA Technical Reports Server (NTRS)

    Klumpp, Allan R.; Lawson, Charles L.

    1990-01-01

    New avionics software written more easily. Software package extends Ada programming language to include linear-algebra capabilities similar to those of HAL/S programming language. Designed for such avionics applications as Space Station flight software. In addition to built-in functions of HAL/S, package incorporates quaternion functions used in Space Shuttle and Galileo projects and routines from LINPAK solving systems of equations involving general square matrices. Contains two generic programs: one for floating-point computations and one for integer computations. Written on IBM/AT personal computer running under PC DOS, v.3.1.
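
    As an illustration of the quaternion functionality mentioned above (not the Ada package's code; Python is used here, and a scalar-first, Hamilton-product convention is assumed), quaternion multiplication and vector rotation look like this:

      # Illustrative quaternion routines (scalar-first convention assumed).
      import numpy as np

      def quat_mul(q1, q2):
          w1, x1, y1, z1 = q1
          w2, x2, y2, z2 = q2
          return np.array([
              w1*w2 - x1*x2 - y1*y2 - z1*z2,
              w1*x2 + x1*w2 + y1*z2 - z1*y2,
              w1*y2 - x1*z2 + y1*w2 + z1*x2,
              w1*z2 + x1*y2 - y1*x2 + z1*w2,
          ])

      def rotate(q, v):
          # Rotate vector v by unit quaternion q: q * (0, v) * conj(q).
          qv = np.concatenate([[0.0], v])
          qc = q * np.array([1.0, -1.0, -1.0, -1.0])
          return quat_mul(quat_mul(q, qv), qc)[1:]

      q90z = np.array([np.cos(np.pi/4), 0.0, 0.0, np.sin(np.pi/4)])  # 90 deg about z
      print(rotate(q90z, np.array([1.0, 0.0, 0.0])))                 # ~[0, 1, 0]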

  10. On Fitting Generalized Linear Mixed-effects Models for Binary Responses using Different Statistical Packages

    PubMed Central

    Zhang, Hui; Lu, Naiji; Feng, Changyong; Thurston, Sally W.; Xia, Yinglin; Tu, Xin M.

    2011-01-01

    Summary The generalized linear mixed-effects model (GLMM) is a popular paradigm to extend models for cross-sectional data to a longitudinal setting. When applied to modeling binary responses, different software packages and even different procedures within a package may give quite different results. In this report, we describe the statistical approaches that underlie these different procedures and discuss their strengths and weaknesses when applied to fit correlated binary responses. We then illustrate these considerations by applying these procedures implemented in some popular software packages to simulated and real study data. Our simulation results indicate a lack of reliability for most of the procedures considered, which carries significant implications for applying such popular software packages in practice. PMID:21671252
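
    The model in question can be written compactly as below; much of the disagreement among packages is typically attributed to how they approximate the marginal likelihood implied by this specification (e.g., quadrature versus linearization), which is worth confirming for any particular package:

      \operatorname{logit} \Pr(y_{ij} = 1 \mid \mathbf{b}_i)
        = \mathbf{x}_{ij}^{\top} \boldsymbol{\beta} + \mathbf{z}_{ij}^{\top} \mathbf{b}_i,
      \qquad \mathbf{b}_i \sim N(\mathbf{0}, \mathbf{D})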

  11. NDAS Hardware Translation Layer Development

    NASA Technical Reports Server (NTRS)

    Nazaretian, Ryan N.; Holladay, Wendy T.

    2011-01-01

    The NASA Data Acquisition System (NDAS) project aims to replace all DAS software for NASA's rocket testing facilities. There must be a software-hardware translation layer so the software can properly talk to the hardware. Since the hardware at each test stand varies, drivers for each stand have to be made. These drivers act as plugins for the software: if the software is being used at E3, it should point to the E3 driver package; if it is being used at B2, it should point to the B2 driver package. The driver packages should also include hardware drivers that are universal to the DAS system. For example, since A1, A2, and B2 all use Preston 8300AU signal conditioners, the driver for those three stands should be the same and updated collectively.
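
    A minimal sketch of the stand-specific driver-package idea described above is given below; the class names and the load_driver helper are illustrative only and do not reflect the actual NDAS code:

      # Hypothetical plugin-style driver selection per test stand.
      class DriverPackage:
          def read_channel(self, channel: str) -> float:
              raise NotImplementedError

      class E3Driver(DriverPackage):
          def read_channel(self, channel: str) -> float:
              return 0.0  # would talk to E3 test-stand hardware here

      class B2Driver(DriverPackage):
          def read_channel(self, channel: str) -> float:
              return 0.0  # would talk to B2 hardware (e.g. Preston 8300AU)

      DRIVERS = {"E3": E3Driver, "B2": B2Driver}

      def load_driver(stand: str) -> DriverPackage:
          # The acquisition software stays unchanged; only the driver package
          # selected here differs between test stands.
          return DRIVERS[stand]()

      das = load_driver("B2")
      print(das.read_channel("chamber_pressure"))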

  12. Academic Web Authoring Mulitmedia Development and Course Management Tools

    ERIC Educational Resources Information Center

    Halloran, Margaret E.

    2005-01-01

    Course management software enables faculty members to learn one software package for web-based curriculum, assessment, synchronous and asynchronous discussions, collaborative work, multimedia and interactive resource development. There are as many as 109 different course management software packages on the market and several studies have evaluated…

  13. lumpR 2.0.0: an R package facilitating landscape discretisation for hillslope-based hydrological models

    NASA Astrophysics Data System (ADS)

    Pilz, Tobias; Francke, Till; Bronstert, Axel

    2017-08-01

    The characteristics of a landscape are essential factors for hydrological processes. Therefore, an adequate representation of the landscape of a catchment in hydrological models is vital. However, many such models exist, differing, amongst other things, in spatial concept and discretisation. The latter constitutes an essential pre-processing step, for which many different algorithms along with numerous software implementations exist. In that context, existing solutions are often model-specific, commercial, or dependent on commercial back-end software, and allow only limited workflow automation or none at all. Consequently, a new package for the scientific software and scripting environment R, called lumpR, was developed. lumpR employs an algorithm for hillslope-based landscape discretisation directed at large-scale application via a hierarchical multi-scale approach. The package addresses existing limitations as it is free and open source, easily extendible to other hydrological models, and its workflow can be fully automated. Moreover, it is user-friendly as the direct coupling to a GIS allows for immediate visual inspection and manual adjustment. Sufficient control is furthermore retained via parameter specification and the option to include expert knowledge. Conversely, completely automatic operation also allows for extensive analysis of aspects related to landscape discretisation. In a case study, the application of the package is presented. A sensitivity analysis of the most important discretisation parameters demonstrates its efficient workflow automation. Considering multiple streamflow metrics, the employed model proved reasonably robust to the discretisation parameters. However, parameters determining the sizes of subbasins and hillslopes proved to be more important than the others, including the number of representative hillslopes, the number of attributes employed for the lumping algorithm, and the number of sub-discretisations of the representative hillslopes.

  14. On controllability and system constraints of the linear models of proton exchange membrane and solid oxide fuel cells

    NASA Astrophysics Data System (ADS)

    Radisavljevic, Verica

    2011-10-01

    In this paper we first show that the linear models of proton exchange membrane (polymer electrolyte membrane, PEM) and solid oxide (SO) fuel cells, commonly used in the power and energy literature, are not controllable. The source of uncontrollability is the equation for the pressure of the water vapor, which is only affected by the fuel cell current; that current is in fact a disturbance in this system and cannot be controlled by the given model inputs: the inlet molar flow rates of hydrogen and oxygen. Being uncontrollable, these models are not good candidates for studying control of dynamic processes in PEM and SO fuel cells. However, due to their simplicity, they can be used in hybrid configurations with other energy-producing devices such as photovoltaic (solar) cells, wind turbines, micro gas turbines, and batteries (ultracapacitors) to demonstrate other phenomena, but not for control purposes unless the hybrid models formed in such configurations are controllable. Testing the controllability of such hybrid models is mandatory. Secondly, we introduce some algebraic constraints that follow from the model dynamics and the Nernst open-loop fuel cell voltage formula. These constraints must be satisfied in simulations of the considered fuel cell models, for example via MATLAB/Simulink or any other computer software package.
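
    The controllability test referred to is the standard Kalman rank condition: a linear model (A, B) is controllable if and only if the matrix [B, AB, ..., A^(n-1)B] has full rank n. A minimal sketch (Python/NumPy, with placeholder matrices rather than the fuel-cell model itself) is:

      # Kalman rank test for controllability; A and B below are placeholders.
      import numpy as np

      def controllability_matrix(A, B):
          n = A.shape[0]
          blocks = [B]
          for _ in range(n - 1):
              blocks.append(A @ blocks[-1])
          return np.hstack(blocks)

      def is_controllable(A, B, tol=1e-9):
          C = controllability_matrix(A, B)
          return np.linalg.matrix_rank(C, tol=tol) == A.shape[0]

      A = np.array([[0.0, 1.0], [0.0, -2.0]])   # placeholder state matrix
      B = np.array([[0.0], [1.0]])              # placeholder input matrix
      print(is_controllable(A, B))              # True for this toy pair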

  15. Usability study of clinical exome analysis software: top lessons learned and recommendations.

    PubMed

    Shyr, Casper; Kushniruk, Andre; Wasserman, Wyeth W

    2014-10-01

    New DNA sequencing technologies have revolutionized the search for genetic disruptions. Targeted sequencing of all protein coding regions of the genome, called exome analysis, is actively used in research-oriented genetics clinics, with the transition to exomes as a standard procedure underway. This transition is challenging; identification of potentially causal mutation(s) amongst ~10^6 variants requires specialized computation in combination with expert assessment. This study analyzes the usability of user interfaces for clinical exome analysis software. There are two study objectives: (1) To ascertain the key features of successful user interfaces for clinical exome analysis software based on the perspective of expert clinical geneticists, (2) To assess user-system interactions in order to reveal strengths and weaknesses of existing software, inform future design, and accelerate the clinical uptake of exome analysis. Surveys, interviews, and cognitive task analysis were performed for the assessment of two next-generation exome sequence analysis software packages. The subjects included ten clinical geneticists who interacted with the software packages using the "think aloud" method. Subjects' interactions with the software were recorded in their clinical office within an urban research and teaching hospital. All major user interface events (from the user interactions with the packages) were time-stamped and annotated with coding categories to identify usability issues in order to characterize desired features and deficiencies in the user experience. We detected 193 usability issues, the majority of which concern interface layout and navigation, and the resolution of reports. Our study highlights gaps in specific software features typical within exome analysis. The clinicians perform best when the flow of the system is structured into well-defined yet customizable layers for incorporation within the clinical workflow. The results highlight opportunities to dramatically accelerate clinician analysis and interpretation of patient genomic data. We present the first application of usability methods to evaluate software interfaces in the context of exome analysis. Our results highlight how the study of user responses can lead to identification of usability issues and challenges and reveal software reengineering opportunities for improving clinical next-generation sequencing analysis. While the evaluation focused on two distinctive software tools, the results are general and should inform active and future software development for genome analysis software. As large-scale genome analysis becomes increasingly common in healthcare, it is critical that efficient and effective software interfaces are provided to accelerate clinical adoption of the technology. Implications for improved design of such applications are discussed.

  16. Genome-Assisted Prediction of Quantitative Traits Using the R Package sommer.

    PubMed

    Covarrubias-Pazaran, Giovanny

    2016-01-01

    Most traits of agronomic importance are quantitative in nature, and genetic markers have been used for decades to dissect such traits. Recently, genomic selection has earned attention as next generation sequencing technologies became feasible for major and minor crops. Mixed models have become a key tool for fitting genomic selection models, but most current genomic selection software can only include a single variance component other than the error, making hybrid prediction using additive, dominance and epistatic effects unfeasible for species displaying heterotic effects. Moreover, Likelihood-based software for fitting mixed models with multiple random effects that allows the user to specify the variance-covariance structure of random effects has not been fully exploited. A new open-source R package called sommer is presented to facilitate the use of mixed models for genomic selection and hybrid prediction purposes using more than one variance component and allowing specification of covariance structures. The use of sommer for genomic prediction is demonstrated through several examples using maize and wheat genotypic and phenotypic data. At its core, the program contains three algorithms for estimating variance components: Average information (AI), Expectation-Maximization (EM) and Efficient Mixed Model Association (EMMA). Kernels for calculating the additive, dominance and epistatic relationship matrices are included, along with other useful functions for genomic analysis. Results from sommer were comparable to other software, but the analysis was faster than Bayesian counterparts in the magnitude of hours to days. In addition, ability to deal with missing data, combined with greater flexibility and speed than other REML-based software was achieved by putting together some of the most efficient algorithms to fit models in a gentle environment such as R.
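
    The class of models being fitted can be summarized as a multi-kernel linear mixed model of the following form, where each random term u_k may carry its own covariance structure K_k (e.g., additive, dominance, or epistatic relationship matrices); this is a generic statement of the model rather than sommer's exact notation:

      \mathbf{y} = \mathbf{X}\boldsymbol{\beta} + \sum_{k} \mathbf{Z}_k \mathbf{u}_k + \boldsymbol{\varepsilon},
      \qquad \mathbf{u}_k \sim N(\mathbf{0}, \mathbf{K}_k \sigma_k^2),
      \quad \boldsymbol{\varepsilon} \sim N(\mathbf{0}, \mathbf{I} \sigma_e^2)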

  17. Data from fitting Gaussian process models to various data sets using eight Gaussian process software packages.

    PubMed

    Erickson, Collin B; Ankenman, Bruce E; Sanchez, Susan M

    2018-06-01

    This data article provides the summary data from tests comparing various Gaussian process software packages. Each spreadsheet represents a single function or type of function using a particular input sample size. In each spreadsheet, a row gives the results for a particular replication using a single package. Within each spreadsheet there are the results from eight Gaussian process model-fitting packages on five replicates of the surface. There is also one spreadsheet comparing the results from two packages performing stochastic kriging. These data enable comparisons between the packages to determine which package will give users the best results.
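
    Each replication summarized in the spreadsheets amounts to fitting a Gaussian process model to a sampled surface. The hedged sketch below uses scikit-learn (which is not necessarily one of the eight packages compared) on an invented two-input test function:

      # Illustrative Gaussian process fit to a synthetic surface.
      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, WhiteKernel

      rng = np.random.default_rng(1)
      X = rng.uniform(0, 1, size=(40, 2))              # input sample
      y = np.sin(6 * X[:, 0]) + X[:, 1] ** 2           # invented test surface

      gp = GaussianProcessRegressor(RBF() + WhiteKernel(), normalize_y=True)
      gp.fit(X, y)
      X_new = rng.uniform(0, 1, size=(5, 2))
      pred, sd = gp.predict(X_new, return_std=True)    # mean and std. deviation
      print(pred, sd)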

  18. JP-8+100: The development of high-thermal-stability jet fuel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heneghan, S.P.; Zabarnick, S.; Ballal, D.R.

    1996-09-01

    Jet fuel requirements have evolved over the years as a balance of the demands placed by advanced aircraft performance (technological need), fuel cost (economic factors), and fuel availability (strategic factors). In a modern aircraft, the jet fuel not only provides the propulsive energy for flight, but also is the primary coolant for aircraft and engine subsystems. To meet the evolving challenge of improving the cooling potential of jet fuel while maintaining the current availability at a minimal price increase, the US Air Force, industry, and academia have teamed to develop an additive package for JP-8 fuels. This paper describes the development of an additive package for JP-8, to produce JP-8+100. This new fuel offers a 55°C increase in the bulk maximum temperature (from 325°F to 425°F) and improves the heat sink capability by 50%. Major advances made during the development of JP-8+100 fuel include the development of several new quantitative fuel analysis tests, a free radical theory of autooxidation, adaptation of new chemistry models to computational fluid dynamics programs, and a nonparametric statistical analysis to evaluate thermal stability. Hundreds of additives were tested for effectiveness, and a package of additives was then formulated for JP-8 fuel. This package has been tested for fuel system materials compatibility and general fuel applicability. To date, the flight testing has shown an improvement in the thermal stability of JP-8 fuel. This improvement has resulted in a significant reduction in fuel-related maintenance costs and a threefold increase in mean time between fuel-related failures. In this manner, a novel high-thermal-stability jet fuel for the 21st century has been successfully developed.

  19. Modeling and MBL: Software Tools for Science.

    ERIC Educational Resources Information Center

    Tinker, Robert F.

    Recent technological advances and new software packages put unprecedented power for experimenting and theory-building in the hands of students at all levels. Microcomputer-based laboratory (MBL) and model-solving tools illustrate the educational potential of the technology. These tools include modeling software and three MBL packages (which are…

  20. A Characteristics Approach to the Evaluation of Economics Software Packages.

    ERIC Educational Resources Information Center

    Lumsden, Keith; Scott, Alex

    1988-01-01

    Utilizes Bloom's Taxonomy to identify elements of teacher and student interest. Depicts the way in which these interests are developed into characteristics for use in analytically evaluating software. Illustrates the use of this evaluating technique by appraising the much used software package "Running the British Economy." (KO)

  1. Scientific Software: How to Find What You Need and Get What You Pay for.

    ERIC Educational Resources Information Center

    Gabaldon, Diana J.

    1984-01-01

    Provides examples of software for the sciences, including: packages for pathology/toxicology laboratories (costing over $15,000), DNA sequencing, and data acquisition/analysis; general-purpose software for scientific uses; and "custom" packages, including a program to maintain a listing of "Escherichia coli" strains and a…

  2. Learn by Yourself: The Self-Learning Tools for Qualitative Analysis Software Packages

    ERIC Educational Resources Information Center

    Freitas, Fábio; Ribeiro, Jaime; Brandão, Catarina; Reis, Luís Paulo; de Souza, Francislê Neri; Costa, António Pedro

    2017-01-01

    Computer Assisted Qualitative Data Analysis Software (CAQDAS) are tools that help researchers to develop qualitative research projects. These software packages help the users with tasks such as transcription analysis, coding and text interpretation, writing and annotation, content search and analysis, recursive abstraction, grounded theory…

  3. Interactive Visualization of Assessment Data: The Software Package Mondrian

    ERIC Educational Resources Information Center

    Unlu, Ali; Sargin, Anatol

    2009-01-01

    Mondrian is state-of-the-art statistical data visualization software featuring modern interactive visualization techniques for a wide range of data types. This article reviews the capabilities, functionality, and interactive properties of this software package. Key features of Mondrian are illustrated with data from the Programme for International…

  4. An Overview of Software for Conducting Dimensionality Assessment in Multidimensional Models

    ERIC Educational Resources Information Center

    Svetina, Dubravka; Levy, Roy

    2012-01-01

    An overview of popular software packages for conducting dimensionality assessment in multidimensional models is presented. Specifically, five popular software packages are described in terms of their capabilities to conduct dimensionality assessment with respect to the nature of analysis (exploratory or confirmatory), types of data (dichotomous,…

  5. Cognitive Foundry v. 3.0 (OSS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Basilico, Justin; Dixon, Kevin; McClain, Jonathan

    2009-11-18

    The Cognitive Foundry is a unified collection of tools designed for research and applications that use cognitive modeling, machine learning, or pattern recognition. The software library contains design patterns, interface definitions, and default implementations of reusable software components and algorithms designed to support a wide variety of research and development needs. The library contains three main software packages: the Common package that contains basic utilities and linear algebraic methods, the Cognitive Framework package that contains tools to assist in implementing and analyzing theories of cognition, and the Machine Learning package that provides general algorithms and methods for populating Cognitive Framework components from domain-relevant data.

  6. 76 FR 54808 - Agency Information Collection Activities: Submission for the Office of Management and Budget (OMB...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-02

    ... the Independent Storage of Spent Nuclear Fuel, High-Level Radioactive Waste and Reactor-Related... receive, transfer, package and possess power reactor spent fuel, high-level waste, and other radioactive..., package, and possess power reactor spent fuel and high-level radioactive waste, and other associated...

  7. MicroSIFT Courseware Evaluations. [Set 11 (223-259), Set 12 (260-293), and a Special Set of 99 LIBRA Reviews of Junior High School Science Software, Including Subject and Title Indexes Covering Sets 1-12 and Special Set L].

    ERIC Educational Resources Information Center

    Northwest Regional Educational Lab., Portland, OR.

    This document consists of 170 microcomputer software package evaluations prepared by the MicroSIFT (Microcomputer Software and Information for Teachers) Clearinghouse at the Northwest Regional Education Laboratory. Set 11 consists of 37 packages. Set 12 consists of 34 packages. A special unnumbered set, entitled LIBRA Reviews, treats 99 packages…

  8. DRUGDOG 3.0: U.S. Navy Random Urinalysis Software Package

    DTIC Science & Technology

    1994-03-15

    Naval Postgraduate School, Monterey, California. Master's thesis (report AD-A281 748), 15 March 1994. Author: Dale E. Wilson. The thesis documents DRUGDOG 3.0, a U.S. Navy random urinalysis software package.

  9. Progress towards an Optimization Methodology for Combustion-Driven Portable Thermoelectric Power Generation Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krishnan, Shankar; Karri, Naveen K.; Gogna, Pawan K.

    2012-03-13

    Enormous military and commercial interests exist in developing quiet, lightweight, and compact thermoelectric (TE) power generation systems. This paper investigates design integration and analysis of an advanced TE power generation system implementing JP-8 fueled combustion and thermal recuperation. The design and development of a portable TE power system using a JP-8 combustor as a high-temperature heat source, with optimal process flows that depend on efficient heat generation, transfer, and recovery within the system, are explored. Design optimization of the system required considering the combustion system efficiency and TE conversion efficiency simultaneously. The combustor performance and TE sub-system performance were coupled directly through exhaust temperatures, fuel and air mass flow rates, heat exchanger performance, subsequent hot-side temperatures, and cold-side cooling techniques and temperatures. Systematic investigation of this system relied on accurate thermodynamic modeling of complex, high-temperature combustion processes concomitantly with detailed thermoelectric converter thermal/mechanical modeling. To this end, this work reports on design integration of system-level process flow simulations using the commercial software CHEMCAD with in-house thermoelectric converter and module optimization, and heat exchanger analyses using COMSOL software. High-performance, high-temperature TE materials and segmented TE element designs are incorporated in coupled design analyses to achieve predicted TE subsystem-level conversion efficiencies exceeding 10%. These TE advances are integrated with a high-performance microtechnology combustion reactor based on recent advances at the Pacific Northwest National Laboratory (PNNL). Predictions from this coupled simulation established a basis for optimal selection of fuel and air flow rates, thermoelectric module design and operating conditions, and microtechnology heat-exchanger design criteria. This paper discusses this simulation process, which leads directly to system efficiency power maps defining potentially available optimal system operating conditions and regimes. This coupled simulation approach enables pathways for integrated use of high-performance combustor components, high-performance TE devices, and microtechnologies to produce a compact, lightweight, combustion-driven TE power system prototype that operates on common fuels.
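
    For context on the quoted conversion-efficiency figure, the maximum efficiency of a thermoelectric generator operating between hot-side and cold-side temperatures T_h and T_c is commonly estimated from the device figure of merit ZT evaluated at the mean temperature; this textbook relation (not the paper's coupled model) is:

      \eta_{\max} = \frac{T_h - T_c}{T_h} \cdot
        \frac{\sqrt{1 + Z\bar{T}} - 1}{\sqrt{1 + Z\bar{T}} + T_c / T_h}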

  10. Using an architectural approach to integrate heterogeneous, distributed software components

    NASA Technical Reports Server (NTRS)

    Callahan, John R.; Purtilo, James M.

    1995-01-01

    Many computer programs cannot be easily integrated because their components are distributed and heterogeneous, i.e., they are implemented in diverse programming languages, use different data representation formats, or their runtime environments are incompatible. In many cases, programs are integrated by modifying their components or interposing mechanisms that handle communication and conversion tasks. For example, remote procedure call (RPC) helps integrate heterogeneous, distributed programs. When configuring such programs, however, mechanisms like RPC must be used explicitly by software developers in order to integrate collections of diverse components. Each collection may require a unique integration solution. This paper describes improvements to the concepts of software packaging and some of our experiences in constructing complex software systems from a wide variety of components in different execution environments. Software packaging is a process that automatically determines how to integrate a diverse collection of computer programs based on the types of components involved and the capabilities of available translators and adapters in an environment. Software packaging provides a context that relates such mechanisms to software integration processes and reduces the cost of configuring applications whose components are distributed or implemented in different programming languages. Our software packaging tool subsumes traditional integration tools like UNIX make by providing a rule-based approach to software integration that is independent of execution environments.

  11. Molecular Isotopic Distribution Analysis (MIDAs) with Adjustable Mass Accuracy

    NASA Astrophysics Data System (ADS)

    Alves, Gelio; Ogurtsov, Aleksey Y.; Yu, Yi-Kuo

    2014-01-01

    In this paper, we present Molecular Isotopic Distribution Analysis (MIDAs), a new software tool designed to compute molecular isotopic distributions with adjustable accuracies. MIDAs offers two algorithms, one polynomial-based and one Fourier-transform-based, both of which compute molecular isotopic distributions accurately and efficiently. The polynomial-based algorithm contains few novel aspects, whereas the Fourier-transform-based algorithm consists mainly of improvements to other existing Fourier-transform-based algorithms. We have benchmarked the performance of the two algorithms implemented in MIDAs with that of eight software packages (BRAIN, Emass, Mercury, Mercury5, NeutronCluster, Qmass, JFC, IC) using a consensus set of benchmark molecules. Under the proposed evaluation criteria, MIDAs's algorithms, JFC, and Emass compute with comparable accuracy the coarse-grained (low-resolution) isotopic distributions and are more accurate than the other software packages. For fine-grained isotopic distributions, we compared IC, MIDAs's polynomial algorithm, and MIDAs's Fourier transform algorithm. Among the three, IC and MIDAs's polynomial algorithm compute isotopic distributions that better resemble their corresponding exact fine-grained (high-resolution) isotopic distributions. MIDAs can be accessed freely through a user-friendly web-interface at http://www.ncbi.nlm.nih.gov/CBBresearch/Yu/midas/index.html.
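
    The coarse-grained calculation being benchmarked can be pictured as repeated convolution of per-element isotope-abundance vectors (one entry per nominal-mass offset). The sketch below (Python/NumPy) illustrates the idea only; the repeated convolution is quadratic in the number of atoms, whereas tools such as MIDAs use faster polynomial or Fourier-transform schemes:

      # Illustrative coarse-grained isotopic distribution by convolution.
      import numpy as np

      # index = nominal mass offset from the lightest isotope
      ISOTOPES = {
          "C": np.array([0.9893, 0.0107]),               # 12C, 13C
          "H": np.array([0.999885, 0.000115]),           # 1H, 2H
          "O": np.array([0.99757, 0.00038, 0.00205]),    # 16O, 17O, 18O
      }

      def isotopic_distribution(formula):
          dist = np.array([1.0])
          for element, count in formula.items():
              for _ in range(count):
                  dist = np.convolve(dist, ISOTOPES[element])
          return dist / dist.sum()

      # Example: glucose, C6H12O6 (first few nominal-mass peaks).
      print(isotopic_distribution({"C": 6, "H": 12, "O": 6})[:4])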

  12. Molecular Isotopic Distribution Analysis (MIDAs) with adjustable mass accuracy.

    PubMed

    Alves, Gelio; Ogurtsov, Aleksey Y; Yu, Yi-Kuo

    2014-01-01

    In this paper, we present Molecular Isotopic Distribution Analysis (MIDAs), a new software tool designed to compute molecular isotopic distributions with adjustable accuracies. MIDAs offers two algorithms, one polynomial-based and one Fourier-transform-based, both of which compute molecular isotopic distributions accurately and efficiently. The polynomial-based algorithm contains few novel aspects, whereas the Fourier-transform-based algorithm consists mainly of improvements to other existing Fourier-transform-based algorithms. We have benchmarked the performance of the two algorithms implemented in MIDAs with that of eight software packages (BRAIN, Emass, Mercury, Mercury5, NeutronCluster, Qmass, JFC, IC) using a consensus set of benchmark molecules. Under the proposed evaluation criteria, MIDAs's algorithms, JFC, and Emass compute with comparable accuracy the coarse-grained (low-resolution) isotopic distributions and are more accurate than the other software packages. For fine-grained isotopic distributions, we compared IC, MIDAs's polynomial algorithm, and MIDAs's Fourier transform algorithm. Among the three, IC and MIDAs's polynomial algorithm compute isotopic distributions that better resemble their corresponding exact fine-grained (high-resolution) isotopic distributions. MIDAs can be accessed freely through a user-friendly web-interface at http://www.ncbi.nlm.nih.gov/CBBresearch/Yu/midas/index.html.

  13. Improving the quality of EHR recording in primary care: a data quality feedback tool.

    PubMed

    van der Bij, Sjoukje; Khan, Nasra; Ten Veen, Petra; de Bakker, Dinny H; Verheij, Robert A

    2017-01-01

    Electronic health record (EHR) data are used to exchange information among health care providers. For this purpose, the quality of the data is essential. We developed a data quality feedback tool that evaluates differences in EHR data quality among practices and software packages as part of a larger intervention. The tool was applied in 92 practices in the Netherlands using different software packages. Practices received data quality feedback in 2010 and 2012. We observed large differences in the quality of recording. For example, the percentage of episodes of care that had a meaningful diagnostic code ranged from 30% to 100%. Differences were highly related to the software package. A year after the first measurement, the quality of recording had improved significantly and differences decreased, with 67% of the physicians indicating that they had actively changed their recording habits based on the results of the first measurement. About 80% found the feedback helpful in pinpointing recording problems. One of the software vendors made changes in functionality as a result of the feedback. Our EHR data quality feedback tool is capable of highlighting differences among practices and software packages. As such, it also stimulates improvements. As substantial variability in recording is related to the software package, our study strengthens the evidence that data quality can be improved substantially by standardizing the functionalities of EHR software packages.

  14. PHYLUCE is a software package for the analysis of conserved genomic loci.

    PubMed

    Faircloth, Brant C

    2016-03-01

    Targeted enrichment of conserved and ultraconserved genomic elements allows universal collection of phylogenomic data from hundreds of species at multiple time scales (<5 Ma to >300 Ma). Prior to downstream inference, data from these types of targeted enrichment studies must undergo preprocessing to assemble contigs from sequence data; identify targeted, enriched loci from the off-target background data; align enriched contigs representing conserved loci to one another; and prepare and manipulate these alignments for subsequent phylogenomic inference. PHYLUCE is an efficient and easy-to-install software package that accomplishes these tasks across hundreds of taxa and thousands of enriched loci. PHYLUCE is written for Python 2.7. PHYLUCE is supported on OSX and Linux (RedHat/CentOS) operating systems. PHYLUCE source code is distributed under a BSD-style license from https://www.github.com/faircloth-lab/phyluce/. PHYLUCE is also available as a package (https://binstar.org/faircloth-lab/phyluce) for the Anaconda Python distribution that installs all dependencies, and users can request a PHYLUCE instance on iPlant Atmosphere (tag: phyluce). The software manual and a tutorial are available from http://phyluce.readthedocs.org/en/latest/ and test data are available from doi: 10.6084/m9.figshare.1284521. Contact: brant@faircloth-lab.org. Supplementary data are available at Bioinformatics online.

  15. Efficient Calculation of Exact Exchange Within the Quantum Espresso Software Package

    NASA Astrophysics Data System (ADS)

    Barnes, Taylor; Kurth, Thorsten; Carrier, Pierre; Wichmann, Nathan; Prendergast, David; Kent, Paul; Deslippe, Jack

    Accurate simulation of condensed matter at the nanoscale requires careful treatment of the exchange interaction between electrons. In the context of plane-wave DFT, these interactions are typically represented through the use of approximate functionals. Greater accuracy can often be obtained through the use of functionals that incorporate some fraction of exact exchange; however, evaluation of the exact exchange potential is often prohibitively expensive. We present an improved algorithm for the parallel computation of exact exchange in Quantum Espresso, an open-source software package for plane-wave DFT simulation. Through the use of aggressive load balancing and on-the-fly transformation of internal data structures, our code exhibits speedups of approximately an order of magnitude for practical calculations. Additional optimizations are presented targeting the many-core Intel Xeon Phi "Knights Landing" architecture, which largely powers NERSC's new Cori system. We demonstrate the successful application of the code to difficult problems, including simulation of water at a platinum interface and computation of the X-ray absorption spectra of transition metal oxides.
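
    The expensive quantity referred to is the Fock (exact) exchange energy, which for occupied orbitals can be written, in atomic units, as the double integral below; hybrid functionals mix a fraction of this term into the exchange-correlation energy:

      E_x = -\frac{1}{2} \sum_{i,j}^{\mathrm{occ}} \iint
        \frac{\psi_i^{*}(\mathbf{r}) \, \psi_j(\mathbf{r}) \, \psi_j^{*}(\mathbf{r}') \, \psi_i(\mathbf{r}')}
             {\lvert \mathbf{r} - \mathbf{r}' \rvert} \, d\mathbf{r} \, d\mathbf{r}'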

  16. AstroBlend: An astrophysical visualization package for Blender

    NASA Astrophysics Data System (ADS)

    Naiman, J. P.

    2016-04-01

    The rapid growth in scale and complexity of both computational and observational astrophysics over the past decade necessitates efficient and intuitive methods for examining and visualizing large datasets. Here, I present AstroBlend, an open-source Python library for use within the three dimensional modeling software, Blender. While Blender has been a popular open-source software among animators and visual effects artists, in recent years it has also become a tool for visualizing astrophysical datasets. AstroBlend combines the three dimensional capabilities of Blender with the analysis tools of the widely used astrophysical toolset, yt, to afford both computational and observational astrophysicists the ability to simultaneously analyze their data and create informative and appealing visualizations. The introduction of this package includes a description of features, work flow, and various example visualizations. A website - www.astroblend.com - has been developed which includes tutorials, and a gallery of example images and movies, along with links to downloadable data, three dimensional artistic models, and various other resources.

  17. An interactive environment for the analysis of large Earth observation and model data sets

    NASA Technical Reports Server (NTRS)

    Bowman, Kenneth P.; Walsh, John E.; Wilhelmson, Robert B.

    1994-01-01

    Envision is an interactive environment that provides researchers in the earth sciences convenient ways to manage, browse, and visualize large observed or model data sets. Its main features are support for the netCDF and HDF file formats, an easy to use X/Motif user interface, a client-server configuration, and portability to many UNIX workstations. The Envision package also provides new ways to view and change metadata in a set of data files. It permits a scientist to conveniently and efficiently manage large data sets consisting of many data files. It also provides links to popular visualization tools so that data can be quickly browsed. Envision is a public domain package, freely available to the scientific community. Envision software (binaries and source code) and documentation can be obtained from either of these servers: ftp://vista.atmos.uiuc.edu/pub/envision/ and ftp://csrp.tamu.edu/pub/envision/. Detailed descriptions of Envision capabilities and operations can be found in the User's Guide and Reference Manuals distributed with Envision software.
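
    The kind of metadata browsing described can be approximated in a few lines with the netCDF4 Python package; this is an independent illustration rather than Envision code, and the file name is hypothetical.

      from netCDF4 import Dataset

      ds = Dataset("example.nc")                      # hypothetical data file

      print("Dimensions:")
      for name, dim in ds.dimensions.items():
          print(f"  {name}: {len(dim)}")

      print("Variables:")
      for name, var in ds.variables.items():
          attrs = {a: var.getncattr(a) for a in var.ncattrs()}
          print(f"  {name} {var.dimensions} {attrs}")

      ds.close()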

  18. SimVascular: An Open Source Pipeline for Cardiovascular Simulation.

    PubMed

    Updegrove, Adam; Wilson, Nathan M; Merkow, Jameson; Lan, Hongzhi; Marsden, Alison L; Shadden, Shawn C

    2017-03-01

    Patient-specific cardiovascular simulation has become a paradigm in cardiovascular research and is emerging as a powerful tool in basic, translational and clinical research. In this paper we discuss the recent development of a fully open-source SimVascular software package, which provides a complete pipeline from medical image data segmentation to patient-specific blood flow simulation and analysis. This package serves as a research tool for cardiovascular modeling and simulation, and has contributed to numerous advances in personalized medicine, surgical planning and medical device design. The SimVascular software has recently been refactored and expanded to enhance functionality, usability, efficiency and accuracy of image-based patient-specific modeling tools. Moreover, SimVascular previously required several licensed components that hindered new user adoption and code management and our recent developments have replaced these commercial components to create a fully open source pipeline. These developments foster advances in cardiovascular modeling research, increased collaboration, standardization of methods, and a growing developer community.

  19. Information Metacatalog for a Grid

    NASA Technical Reports Server (NTRS)

    Kolano, Paul

    2007-01-01

    SWIM is a Software Information Metacatalog that gathers detailed information about the software components and packages installed on a grid resource. Information is currently gathered for Executable and Linking Format (ELF) executables and shared libraries, Java classes, shell scripts, and Perl and Python modules. SWIM is built on top of the POUR framework, which is described in the preceding article. SWIM consists of a set of Perl modules for extracting software information from a system, an XML schema defining the format of data that can be added by users, and a POUR XML configuration file that describes how these elements are used to generate periodic, on-demand, and user-specified information. Periodic software information is derived mainly from the package managers used on each system. SWIM collects information from native package managers in FreeBSD, Solaris, and IRIX as well as the RPM, Perl, and Python package managers on multiple platforms. Because not all software is available or installed in package form, SWIM also crawls the set of relevant paths from the File System Hierarchy Standard that defines the standard file system structure used by all major UNIX distributions. Using these two techniques, the vast majority of software installed on a system can be located. SWIM computes the same information gathered by the periodic routines for specific files on specific hosts, and locates software on a system given only its name and type.
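
    A rough sense of the crawling technique, reduced to a short Python sketch (SWIM itself is a set of Perl modules; this is purely illustrative): classify files found under a couple of standard paths by magic bytes, shebang lines, and extensions.

      import os

      def classify(path):
          if path.endswith(".py"):
              return "Python module"
          if path.endswith((".pm", ".pl")):
              return "Perl module or script"
          try:
              with open(path, "rb") as f:
                  head = f.read(4)
          except OSError:
              return None
          if head[:4] == b"\x7fELF":
              return "ELF executable or shared library"
          if head[:2] == b"#!":
              return "script with interpreter line"
          return None

      for root in ("/usr/bin", "/usr/lib"):            # a subset of FHS paths
          for dirpath, _, files in os.walk(root):
              for name in files:
                  full = os.path.join(dirpath, name)
                  kind = classify(full)
                  if kind:
                      print(kind, full)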

  20. The Package-Based Development Process in the Flight Dynamics Division

    NASA Technical Reports Server (NTRS)

    Parra, Amalia; Seaman, Carolyn; Basili, Victor; Kraft, Stephen; Condon, Steven; Burke, Steven; Yakimovich, Daniil

    1997-01-01

    The Software Engineering Laboratory (SEL) has been operating for more than two decades in the Flight Dynamics Division (FDD) and has adapted to the constant movement of the software development environment. The SEL's Improvement Paradigm shows that process improvement is an iterative process. Understanding, Assessing and Packaging are the three steps that are followed in this cyclical paradigm. As the improvement process cycles back to the first step, after having packaged some experience, the level of understanding will be greater. In the past, products resulting from the packaging step have been large process documents, guidebooks, and training programs. As the technical world moves toward more modularized software, we have made a move toward more modularized software development process documentation; as such, the products of the packaging step are becoming smaller and more frequent. In this manner, the Quality Improvement Paradigm (QIP) takes on more of a spiral approach than a waterfall. This paper describes the state of the FDD in the area of software development processes, as revealed through the understanding and assessing activities conducted by the COTS study team. The insights presented include: (1) a characterization of a typical FDD Commercial Off the Shelf (COTS) intensive software development life-cycle process, (2) lessons learned through the COTS study interviews, and (3) a description of changes in the SEL due to the changing and accelerating nature of software development in the FDD.

  1. Cantera Integration with the Toolbox for Modeling and Analysis of Thermodynamic Systems (T-MATS)

    NASA Technical Reports Server (NTRS)

    Lavelle, Thomas M.; Chapman, Jeffryes W.; May, Ryan D.; Litt, Jonathan S.; Guo, Ten-Huei

    2014-01-01

    NASA Glenn Research Center (GRC) has recently developed a software package for modeling generic thermodynamic systems called the Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS). T-MATS is a library of building blocks that can be assembled to represent any thermodynamic system in the Simulink(Registered TradeMark) (The MathWorks, Inc.) environment. These elements, along with a Newton Raphson solver (also provided as part of the T-MATS package), enable users to create models of a wide variety of systems. The current version of T-MATS (v1.0.1) uses tabular data for providing information about a specific mixture of air, water (humidity), and hydrocarbon fuel in calculations of thermodynamic properties. The capabilities of T-MATS can be expanded by integrating it with the Cantera thermodynamic package. Cantera is an object-oriented analysis package that calculates thermodynamic solutions for any mixture defined by the user. Integration of Cantera with T-MATS extends the range of systems that may be modeled using the toolbox. In addition, the library of elements released with Cantera were developed using MATLAB native M-files, allowing for quicker prototyping of elements. This paper discusses how the new Cantera-based elements are created and provides examples for using T-MATS integrated with Cantera.
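
    For orientation, the snippet below shows the kind of mixture-property calculation Cantera performs, using its Python interface rather than the MATLAB interface that T-MATS integrates; the GRI-Mech 3.0 mechanism file, the choice of methane as fuel, and the inlet conditions are only examples.

      import cantera as ct

      gas = ct.Solution("gri30.yaml")                  # GRI-Mech 3.0, distributed with Cantera
      gas.TP = 500.0, ct.one_atm                       # inlet temperature [K] and pressure [Pa]
      gas.set_equivalence_ratio(0.5, "CH4", "O2:1.0, N2:3.76")

      gas.equilibrate("HP")                            # constant-enthalpy, constant-pressure burn
      print("Equilibrium temperature [K]:", gas.T)
      print("cp of products [J/(kg K)]:", gas.cp_mass)
      print("Major products:", gas.mole_fraction_dict(1e-3))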

  3. Global review of open access risk assessment software packages valid for global or continental scale analysis

    NASA Astrophysics Data System (ADS)

    Daniell, James; Simpson, Alanna; Gunasekara, Rashmin; Baca, Abigail; Schaefer, Andreas; Ishizawa, Oscar; Murnane, Rick; Tijssen, Annegien; Deparday, Vivien; Forni, Marc; Himmelfarb, Anne; Leder, Jan

    2015-04-01

    Over the past few decades, a plethora of open access software packages for the calculation of earthquake, volcanic, tsunami, storm surge, wind and flood risk have been produced globally. As part of the World Bank GFDRR Review released at the Understanding Risk 2014 Conference, over 80 such open access risk assessment software packages were examined. Commercial software was not considered in the evaluation. A preliminary analysis was used to determine whether the 80 models were currently supported and if they were open access. This process was used to select a subset of 31 models that include 8 earthquake models, 4 cyclone models, 11 flood models, and 8 storm surge/tsunami models for more detailed analysis. By using multi-criteria decision analysis (MCDA) and simple descriptions of the software uses, the review allows users to select a few relevant software packages for their own testing and development. The detailed analysis evaluated the models on the basis of over 100 criteria and provides a synopsis of available open access natural hazard risk modelling tools. In addition, volcano software packages have since been added, bringing the compendium of risk software tools to more than 100. There has been a huge increase in the quality and availability of open access/source software over the past few years. For example, private entities such as Deltares now have an open source policy regarding some flood models (NGHS). In addition, leaders in developing risk models in the public sector, such as Geoscience Australia (EQRM, TCRM, TsuDAT, AnuGA) or CAPRA (ERN-Flood, Hurricane, CRISIS2007 etc.), are launching and/or helping many other initiatives. As we achieve greater interoperability between modelling tools, different open source and open access modelling tools will become increasingly connected and adapted towards unified multi-risk model platforms and highly customised solutions. It was seen that many software tools could be improved by enabling user-defined exposure and vulnerability. Without this function, many tools can only be used regionally and not at global or continental scale. It is becoming increasingly easy to use multiple packages for a single region and/or hazard to characterize the uncertainty in the risk, or to use them as checks on the sensitivities in the analysis. There is a potential for valuable synergy between existing software. A number of open source software packages could be combined to generate a multi-risk model with multiple views of a hazard. This extensive review has attempted to provide a platform for dialogue between all open source and open access software packages and to inspire collaboration between developers, given the great work done by all open access and open source developers.
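
    As a toy illustration of the weighted multi-criteria scoring idea used in such reviews (the criteria, weights, and scores below are invented and are not those of the GFDRR review):

      criteria = {"documentation": 0.2, "hazard coverage": 0.3,
                  "user-defined exposure": 0.3, "active support": 0.2}

      packages = {
          "ToolA": {"documentation": 4, "hazard coverage": 3,
                    "user-defined exposure": 5, "active support": 4},
          "ToolB": {"documentation": 5, "hazard coverage": 4,
                    "user-defined exposure": 2, "active support": 3},
      }

      for name, scores in packages.items():
          total = sum(criteria[c] * scores[c] for c in criteria)
          print(f"{name}: weighted score {total:.2f}")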

  4. Design Package for Fuel Retrieval System Fuel Handling Tool Modification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    TEDESCHI, D.J.

    This design package documents the design, fabrication, and testing of a new stinger tool. Future revisions will document further development of the stinger tool, incorporating the various developmental stages and final test results.

  5. On fitting generalized linear mixed-effects models for binary responses using different statistical packages.

    PubMed

    Zhang, Hui; Lu, Naiji; Feng, Changyong; Thurston, Sally W; Xia, Yinglin; Zhu, Liang; Tu, Xin M

    2011-09-10

    The generalized linear mixed-effects model (GLMM) is a popular paradigm to extend models for cross-sectional data to a longitudinal setting. When applied to modeling binary responses, different software packages and even different procedures within a package may give quite different results. In this report, we describe the statistical approaches that underlie these different procedures and discuss their strengths and weaknesses when applied to fit correlated binary responses. We then illustrate these considerations by applying these procedures implemented in some popular software packages to simulated and real study data. Our simulation results indicate a lack of reliability for most of the procedures considered, which carries significant implications for applying such popular software packages in practice. Copyright © 2011 John Wiley & Sons, Ltd.

  6. Current And Future Directions Of Lens Design Software

    NASA Astrophysics Data System (ADS)

    Gustafson, Darryl E.

    1983-10-01

    The most effective environment for doing lens design continues to evolve as new computer hardware and software tools become available. Important recent hardware developments include: Low-cost but powerful interactive multi-user 32-bit computers with virtual memory that are totally software-compatible with prior larger and more expensive members of the family. A rapidly growing variety of graphics devices for both hard-copy and screen graphics, including many with color capability. In addition, with optical design software readily accessible in many forms, optical design has become a part-time activity for a large number of engineers instead of being restricted to a small number of full-time specialists. A designer interface that is friendly for the part-time user while remaining efficient for the full-time designer is thus becoming more important as well as more practical. Along with these developments, software tools in other scientific and engineering disciplines are proliferating. Thus, the optical designer is less and less unique in his use of computer-aided techniques and faces the challenge and opportunity of efficiently communicating his designs to other computer-aided-design (CAD), computer-aided-manufacturing (CAM), structural, thermal, and mechanical software tools. This paper will address the impact of these developments on the current and future directions of the CODE V™ optical design software package, its implementation, and the resulting lens design environment.

  7. General-Purpose Ada Software Packages

    NASA Technical Reports Server (NTRS)

    Klumpp, Allan R.

    1991-01-01

    Collection of subprograms brings to Ada many features from other programming languages. All generic packages designed to be easily instantiated for types declared in user's facility. Most packages have widespread applicability, although some oriented for avionics applications. All designed to facilitate writing new software in Ada. Written on IBM/AT personal computer running under PC DOS, v.3.1.

  8. Advance Directives and Do Not Resuscitate Orders

    MedlinePlus

    ... a form. Call a lawyer. Use a computer software package for legal documents. Advance directives and living ... you write by yourself or with a computer software package should follow your state laws. You may ...

  9. Nested Cohort - R software package

    Cancer.gov

    NestedCohort is an R software package for fitting Kaplan-Meier and Cox Models to estimate standardized survival and attributable risks for studies where covariates of interest are observed on only a sample of the cohort.

  10. Rough mill simulator version 3.0: an analysis tool for refining rough mill operations

    Treesearch

    Edward Thomas; Joel Weiss

    2006-01-01

    ROMI-3 is a rough mill computer simulation package designed to be used by both rip-first and chop-first rough mill operators and researchers. ROMI-3 allows users to model and examine the complex relationships among cutting bill, lumber grade mix, processing options, and their impact on rough mill yield and efficiency. Integrated into the ROMI-3 software is a new least-...

  11. An Integrated Software Package to Enable Predictive Simulation Capabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yousu; Fitzhenry, Erin B.; Jin, Shuangshuang

    The power grid is increasing in complexity due to the deployment of smart grid technologies. Such technologies vastly increase the size and complexity of power grid systems for simulation and modeling. This increasing complexity necessitates not only the use of high-performance-computing (HPC) techniques, but a smooth, well-integrated interplay between HPC applications. This paper presents a new integrated software package that integrates HPC applications and a web-based visualization tool based on a middleware framework. This framework can support the data communication between different applications. Case studies with a large power system demonstrate the predictive capability brought by the integrated software package, as well as the better situational awareness provided by the web-based visualization tool in a live mode. Test results validate the effectiveness and usability of the integrated software package.

  12. Software packager user's guide

    NASA Technical Reports Server (NTRS)

    Callahan, John R.

    1995-01-01

    Software integration is a growing area of concern for many programmers and software managers because the need to build new programs quickly from existing components is greater than ever. This includes building versions of software products for multiple hardware platforms and operating systems, building programs from components written in different languages, and building systems from components that must execute on different machines in a distributed network. The goal of software integration is to make building new programs from existing components more seamless -- programmers should pay minimal attention to the underlying configuration issues involved. Libraries of reusable components and classes are important tools but only partial solutions to software development problems. Even though software components may have compatible interfaces, there may be other reasons, such as differences between execution environments, why they cannot be integrated. Often, components must be adapted or reimplemented to fit into another application because of implementation differences -- they are implemented in different programming languages, dependent on different operating system resources, or must execute on different physical machines. The software packager is a tool that allows programmers to deal with interfaces between software components and ignore complex integration details. The packager takes modular descriptions of the structure of a software system written in the package specification language and produces an integration program in the form of a makefile. If complex integration tools are needed to integrate a set of components, such as remote procedure call stubs, their use is implied by the packager automatically and stub generation tools are invoked in the corresponding makefile. The programmer deals only with the components themselves and not the details of how to build the system on any given platform.
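
    The core idea can be reduced to a toy Python sketch (the specification format below is invented and is not the actual package specification language): a declarative component description is turned into a makefile.

      components = {
          "app": {"sources": ["main.c", "util.c"], "libs": ["m"]},
      }

      def emit_makefile(spec):
          lines = []
          for target, info in spec.items():
              objs = [s.replace(".c", ".o") for s in info["sources"]]
              libs = " ".join("-l" + lib for lib in info["libs"])
              lines.append(f"{target}: {' '.join(objs)}")
              lines.append(f"\t$(CC) -o {target} {' '.join(objs)} {libs}")
              for src, obj in zip(info["sources"], objs):
                  lines.append(f"{obj}: {src}")
                  lines.append(f"\t$(CC) -c {src}")
          return "\n".join(lines) + "\n"

      print(emit_makefile(components))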

  13. MicroSIFT Courseware Evaluations (169-198). Set 9. Including Subject and Title Indexes Covering Sets 1-9.

    ERIC Educational Resources Information Center

    Weaver, Dave, Ed.

    This document consists of 30 microcomputer software package evaluations prepared for the MicroSIFT (Microcomputer Software and Information for Teachers) Clearinghouse at the Northwest Regional Educational Laboratory (NWREL). The concise, single-sheet resume describing and evaluating each software package includes source, cost, ability level,…

  14. Annotated Bibliography of Computer Software for Teaching Early Reading and Spelling. Project RIMES 2000.

    ERIC Educational Resources Information Center

    Rhein, Deborah; Alibrandi, Mary; Lyons, Mary; Sammons, Janice; Doyle, Luther

    This bibliography, developed by Project RIMES (Reading Instructional Methods of Efficacy with Students) lists 80 software packages for teaching early reading and spelling to students at risk for reading and spelling failure. The software packages are presented alphabetically by title. Entries usually include a grade level indicator, a brief…

  15. The application of Dow Chemical's perfluorinated membranes in proton-exchange membrane fuel cells

    NASA Technical Reports Server (NTRS)

    Eisman, G. A.

    1989-01-01

    Dow Chemical's research activities in fuel cells revolve around the development of perfluorosulfonic acid membranes useful as the proton transport medium and separator. Some of the performance characteristics which are typical for such membranes are outlined. The results of tests utilizing a new experimental membrane useful in proton-exchange membrane fuel cells are presented. The high voltage at low current densities can lead to higher system efficiencies while, at the same time, not sacrificing other critical properties pertinent to membrane fuel cell operation. A series of tests to determine response times indicated that on-off cycles are on the order of 80 milliseconds to reach 90 percent of full power. The IR-free voltage at 100 amps/sq ft was determined, and the results indicate a membrane/electrode package resistance of 0.15 ohm-sq cm at 100 amps/sq ft.

  16. GAPIT: genome association and prediction integrated tool.

    PubMed

    Lipka, Alexander E; Tian, Feng; Wang, Qishan; Peiffer, Jason; Li, Meng; Bradbury, Peter J; Gore, Michael A; Buckler, Edward S; Zhang, Zhiwu

    2012-09-15

    Software programs that conduct genome-wide association studies and genomic prediction and selection need to use methodologies that maximize statistical power, provide high prediction accuracy and run in a computationally efficient manner. We developed an R package called Genome Association and Prediction Integrated Tool (GAPIT) that implements advanced statistical methods including the compressed mixed linear model (CMLM) and CMLM-based genomic prediction and selection. The GAPIT package can handle large datasets in excess of 10 000 individuals and 1 million single-nucleotide polymorphisms with minimal computational time, while providing user-friendly access and concise tables and graphs to interpret results. http://www.maizegenetics.net/GAPIT. zhiwu.zhang@cornell.edu Supplementary data are available at Bioinformatics online.

  17. On Designing Lightweight Threads for Substrate Software

    NASA Technical Reports Server (NTRS)

    Haines, Matthew

    1997-01-01

    Existing user-level thread packages employ a 'black box' design approach, where the implementation of the threads is hidden from the user. While this approach is often sufficient for application-level programmers, it hides critical design decisions that system-level programmers must be able to change in order to provide efficient service for high-level systems. By applying the principles of Open Implementation Analysis and Design, we construct a new user-level threads package that supports common thread abstractions and a well-defined meta-interface for altering the behavior of these abstractions. As a result, system-level programmers will have the advantages of using high-level thread abstractions without having to sacrifice performance, flexibility or portability.

  18. Error Prevention Aid

    NASA Technical Reports Server (NTRS)

    1987-01-01

    In a complex computer environment there is ample opportunity for error: a mistake by a programmer or a software-induced undesirable side effect. In insurance, errors can cost a company heavily, so protection against inadvertent change is a must for the efficient firm. The data processing center at Transport Life Insurance Company has taken a step to guard against accidental changes by adopting a software package called EQNINT (Equations Interpreter Program). EQNINT cross-checks the basic formulas in a program against the formulas that make up the major production system. EQNINT assures that formulas are coded correctly and helps catch errors before they affect customer service or profitability.

  19. SPSS and SAS programs for determining the number of components using parallel analysis and velicer's MAP test.

    PubMed

    O'Connor, B P

    2000-08-01

    Popular statistical software packages do not have the proper procedures for determining the number of components in factor and principal components analyses. Parallel analysis and Velicer's minimum average partial (MAP) test are validated procedures, recommended widely by statisticians. However, many researchers continue to use alternative, simpler, but flawed procedures, such as the eigenvalues-greater-than-one rule. Use of the proper procedures might be increased if these procedures could be conducted within familiar software environments. This paper describes brief and efficient programs for using SPSS and SAS to conduct parallel analyses and the MAP test.
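
    A minimal NumPy sketch of Horn's parallel analysis follows for readers who prefer a language-neutral reference; it is illustrative only and is not the SPSS or SAS program described in the paper.

      import numpy as np

      def parallel_analysis(data, n_sims=500, percentile=95, seed=0):
          rng = np.random.default_rng(seed)
          n, p = data.shape
          obs = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]
          rand = np.empty((n_sims, p))
          for i in range(n_sims):
              r = rng.standard_normal((n, p))
              rand[i] = np.linalg.eigvalsh(np.corrcoef(r, rowvar=False))[::-1]
          threshold = np.percentile(rand, percentile, axis=0)
          # Retain components whose observed eigenvalue exceeds the random threshold.
          return int(np.sum(obs > threshold)), obs, threshold

      x = np.random.default_rng(1).standard_normal((200, 10))
      print("components to retain:", parallel_analysis(x)[0])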

  20. Hybrid PV/diesel solar power system design using multi-level factor analysis optimization

    NASA Astrophysics Data System (ADS)

    Drake, Joshua P.

    Solar power systems represent a large area of interest across a spectrum of organizations at a global level. It was determined that a clear understanding of current state of the art software and design methods, as well as optimization methods, could be used to improve the design methodology. Solar power design literature was researched for an in depth understanding of solar power system design methods and algorithms. Multiple software packages for the design and optimization of solar power systems were analyzed for a critical understanding of their design workflow. In addition, several methods of optimization were studied, including brute force, Pareto analysis, Monte Carlo, linear and nonlinear programming, and multi-way factor analysis. Factor analysis was selected as the most efficient optimization method for engineering design as it applied to solar power system design. The solar power design algorithms, software work flow analysis, and factor analysis optimization were combined to develop a solar power system design optimization software package called FireDrake. This software was used for the design of multiple solar power systems in conjunction with an energy audit case study performed in seven Tibetan refugee camps located in Mainpat, India. A report of solar system designs for the camps, as well as a proposed schedule for future installations was generated. It was determined that there were several improvements that could be made to the state of the art in modern solar power system design, though the complexity of current applications is significant.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Salama, A.; Mikhail, M.

    Comprehensive software packages have been developed at the Western Research Centre as tools to help coal preparation engineers analyze, evaluate, and control coal cleaning processes. The COal Preparation Software package (COPS) performs three functions: (1) data handling and manipulation, (2) data analysis, including the generation of washability data, performance evaluation and prediction, density and size modeling, evaluation of density and size partition characteristics and attrition curves, and (3) generation of graphics output. The Separation ChARacteristics Estimation software packages (SCARE) are developed to balance raw density or size separation data. The cases of density and size separation data are considered. The generated balanced data can take the balanced or normalized forms. The scaled form is desirable for direct determination of the partition functions (curves). The raw and generated separation data are displayed in tabular and/or graphical forms. The computer software packages described in this paper are valuable tools for coal preparation plant engineers and operators for evaluating process performance, adjusting plant parameters, and balancing raw density or size separation data. These packages have been applied very successfully in many projects carried out by WRC for the Canadian coal preparation industry. The software packages are designed to run on a personal computer (PC).

  2. Software Library for Bruker TopSpin NMR Data Files

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    A software library for parsing and manipulating frequency-domain data files that have been processed using the Bruker TopSpin NMR software package. In the context of NMR, the term "processed" indicates that the end-user of the Bruker TopSpin NMR software package has (a) Fourier transformed the raw, time-domain data (the Free Induction Decay) into the frequency-domain and (b) has extracted the list of NMR peaks.

  3. Investigation into the development of computer aided design software for space based sensors

    NASA Technical Reports Server (NTRS)

    Pender, C. W.; Clark, W. L.

    1987-01-01

    The described effort is phase one of the development of Computer Aided Design (CAD) software to be used to perform radiometric sensor design. The software package will be referred to as SCAD and is directed toward the preliminary phase of the design of space-based sensor systems. The approach being followed is to develop a modern, graphics-intensive, user-friendly software package using existing software as building blocks. The emphasis will be directed toward the development of a shell containing menus, smart defaults, and interfaces, which can accommodate a wide variety of existing application software packages. The shell will offer expected utilities such as graphics, tailored menus, and a variety of drivers for I/O devices. Following the development of the shell, the development of SCAD is planned as chiefly selection and integration of appropriate building blocks. The phase one development activities have included: the selection of hardware which will be used with SCAD; the determination of the scope of SCAD; the preliminary evaluation of a number of software packages for applicability to SCAD; determination of a method for achieving required capabilities where voids exist; and then establishing a strategy for binding the software modules into an easy-to-use tool kit.

  4. Software Review. Macintosh Laboratory Automation: Three Software Packages.

    ERIC Educational Resources Information Center

    Jezl, Barbara Ann

    1990-01-01

    Reviewed are "LABTECH NOTEBOOK,""LabVIEW," and "Parameter Manager pmPLUS/pmTALK." Each package is described including functions, uses, hardware, and costs. Advantages and disadvantages of this type of laboratory approach are discussed. (CW)

  5. Computing and software

    USGS Publications Warehouse

    White, Gary C.; Hines, J.E.

    2004-01-01

    The reality is that the statistical methods used for analysis of data depend upon the availability of software. Analysis of marked animal data is no different than the rest of the statistical field. The methods used for analysis are those that are available in reliable software packages. Thus, the critical importance of having reliable, up-to-date software available to biologists is obvious. Statisticians have continued to develop more robust models, ever expanding the suite of potential analysis methods available. But without software to implement these newer methods, they will languish in the abstract, and not be applied to the problems deserving them. In the Computers and Software Session, two new software packages are described, a comparison of implementation of methods for the estimation of nest survival is provided, and a more speculative paper about how the next generation of software might be structured is presented. Rotella et al. (2004) compare nest survival estimation with different software packages: SAS logistic regression, SAS non-linear mixed models, and Program MARK. Nests are assumed to be visited at various, possibly infrequent, intervals. All of the approaches described compute nest survival with the same likelihood, and require that the age of the nest is known to account for nests that eventually hatch. However, each approach offers advantages and disadvantages, explored by Rotella et al. (2004). Efford et al. (2004) present a new software package called DENSITY. The package computes population abundance and density from trapping arrays and other detection methods with a new and unique approach. DENSITY represents the first major addition to the analysis of trapping arrays in 20 years. Barker & White (2004) discuss how existing software such as Program MARK requires that each new model's likelihood must be programmed specifically for that model. They wishfully think that future software might allow the user to combine pieces of likelihood functions together to generate estimates. The idea is interesting, and maybe some bright young statistician can work out the specifics to implement the procedure. Choquet et al. (2004) describe MSURGE, a software package that implements the multistate capture-recapture models. The unique feature of MSURGE is that the design matrix is constructed with an interpreted language called GEMACO. Because MSURGE is limited to just multistate models, the special requirements of these likelihoods can be provided. The software and methods presented in these papers give biologists and wildlife managers an expanding range of possibilities for data analysis. Although ease-of-use is generally getting better, it does not replace the need for understanding of the requirements and structure of the models being computed. The internet provides access to many free software packages as well as user-discussion groups to share knowledge and ideas. (A starting point for wildlife-related applications is http://www.phidot.org.)

  6. Stencil computations for PDE-based applications with examples from DUNE and hypre

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Engwer, C.; Falgout, R. D.; Yang, U. M.

    Here, stencils are commonly used to implement efficient on-the-fly computations of linear operators arising from partial differential equations. At the same time the term “stencil” is not fully defined and can be interpreted differently depending on the application domain and the background of the software developers. Common features in stencil codes are the preservation of the structure given by the discretization of the partial differential equation and the benefit of minimal data storage. We discuss stencil concepts of different complexity, show how they are used in modern software packages like hypre and DUNE, and discuss recent efforts to extend the software to enable stencil computations of more complex problems and methods such as inf-sup-stable Stokes discretizations and mixed finite element discretizations.

  7. Stencil computations for PDE-based applications with examples from DUNE and hypre

    DOE PAGES

    Engwer, C.; Falgout, R. D.; Yang, U. M.

    2017-02-24

    Here, stencils are commonly used to implement efficient on-the-fly computations of linear operators arising from partial differential equations. At the same time the term “stencil” is not fully defined and can be interpreted differently depending on the application domain and the background of the software developers. Common features in stencil codes are the preservation of the structure given by the discretization of the partial differential equation and the benefit of minimal data storage. We discuss stencil concepts of different complexity, show how they are used in modern software packages like hypre and DUNE, and discuss recent efforts to extend the software to enable stencil computations of more complex problems and methods such as inf-sup-stable Stokes discretizations and mixed finite element discretizations.

  8. Modal Analysis for Grid Operation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    The MANGO software provides a solution for improving the small signal stability of power systems by adjusting operator-controllable variables using PMU measurements. System oscillation problems are one of the major threats to grid stability and reliability in California and the Western Interconnection. These problems result in power fluctuations and lower grid operation efficiency, and may even lead to large-scale grid breakup and outages. The MANGO software aims to solve this problem by automatically generating recommended operating procedures, termed Modal Analysis for Grid Operation (MANGO), to improve damping of inter-area oscillation modes. The MANGO procedure includes three steps: recognizing small signal stability problems, implementing operating point adjustment using modal sensitivity, and evaluating the effectiveness of the adjustment. The MANGO software package is designed to help implement the MANGO procedure.

  9. Computer-intensive simulation of solid-state NMR experiments using SIMPSON.

    PubMed

    Tošner, Zdeněk; Andersen, Rasmus; Stevensson, Baltzar; Edén, Mattias; Nielsen, Niels Chr; Vosegaard, Thomas

    2014-09-01

    Conducting large-scale solid-state NMR simulations requires fast computer software potentially in combination with efficient computational resources to complete within a reasonable time frame. Such simulations may involve large spin systems, multiple-parameter fitting of experimental spectra, or multiple-pulse experiment design using parameter scan, non-linear optimization, or optimal control procedures. To efficiently accommodate such simulations, we here present an improved version of the widely distributed open-source SIMPSON NMR simulation software package adapted to contemporary high performance hardware setups. The software is optimized for fast performance on standard stand-alone computers, multi-core processors, and large clusters of identical nodes. We describe the novel features for fast computation including internal matrix manipulations, propagator setups and acquisition strategies. For efficient calculation of powder averages, we implemented the interpolation method of Alderman, Solum, and Grant, as well as the recently introduced fast Wigner transform interpolation technique. The potential of the optimal control toolbox is greatly enhanced by higher precision gradients in combination with the efficient optimization algorithm known as limited-memory Broyden-Fletcher-Goldfarb-Shanno. In addition, advanced parallelization can be used in all types of calculations, providing significant time reductions. SIMPSON thus reflects current knowledge in the field of numerical simulations of solid-state NMR experiments. The efficiency and novel features are demonstrated on representative simulations. Copyright © 2014 Elsevier Inc. All rights reserved.

  10. Multi-dimensional transport modelling of corrosive agents through a bentonite buffer in a Canadian deep geological repository.

    PubMed

    Briggs, Scott; McKelvie, Jennifer; Sleep, Brent; Krol, Magdalena

    2017-12-01

    The use of a deep geological repository (DGR) for the long-term disposal of used nuclear fuel is an approach currently being investigated by several agencies worldwide, including Canada's Nuclear Waste Management Organization (NWMO). Within the DGR, used nuclear fuel will be placed in copper-coated steel containers and surrounded by a bentonite clay buffer. While copper is generally thermodynamically stable, corrosion can occur due to the presence of sulphide under anaerobic conditions. As such, understanding transport of sulphide through the engineered barrier system to the used fuel container is an important consideration in DGR design. In this study, a three-dimensional (3D) model of sulphide transport in a DGR was developed. The numerical model is implemented using COMSOL Multiphysics, a commercial finite element software package. Previous sulphide transport models of the NWMO repository used a simplified one-dimensional system. This work illustrates the importance of 3D modelling to capture non-uniform effects, as results showed locations of maximum sulphide flux are 1.7 times higher than the average flux to the used fuel container. Copyright © 2017. Published by Elsevier B.V.
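
    To give a feel for the underlying transport calculation, the following is a deliberately simplified one-dimensional explicit finite-difference diffusion sketch with placeholder parameter values; the study itself solves the full three-dimensional problem in COMSOL.

      import numpy as np

      D = 1.0e-10                          # effective diffusion coefficient [m^2/s] (placeholder)
      thickness = 0.35                     # bentonite buffer thickness [m] (placeholder)
      nx = 70
      dx = thickness / nx
      dt = 0.4 * dx**2 / D                 # stable explicit time step
      c = np.zeros(nx + 1)                 # normalized sulphide concentration profile

      seconds = 1000 * 365.25 * 24 * 3600  # simulate 1000 years
      for _ in range(int(seconds / dt)):
          c[1:-1] += D * dt / dx**2 * (c[2:] - 2.0 * c[1:-1] + c[:-2])
          c[0], c[-1] = 1.0, 0.0           # fixed source at rock side; container as perfect sink

      flux = D * c[-2] / dx                # Fick's first law at the container surface
      print("normalized sulphide flux to the container:", flux)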

  11. Probabilistic analysis of the efficiency of the damping devices against nuclear fuel container falling

    NASA Astrophysics Data System (ADS)

    Králik, Juraj

    2017-07-01

    The paper presents a probabilistic and sensitivity analysis of the efficiency of the damping devices that protect the nuclear power plant cover against the drop of a TK C30 nuclear fuel container. A three-dimensional finite element idealization of the nuclear power plant structure is used. A steel pipe damper system is proposed to dissipate the kinetic energy of the container's free fall. Experimental results on the behavior of the basic shock-damper element under impact loads are presented. The Newmark integration method is used to solve the dynamic equations. The sensitivity and probabilistic analysis of the damping devices was carried out in the AntHILL and ANSYS software.
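
    Because the abstract mentions Newmark integration, a minimal single-degree-of-freedom Newmark-beta (average acceleration) integrator is sketched below; the mass, damping, stiffness, and load values are placeholders, and this is not the paper's finite element model.

      import numpy as np

      m, c, k = 1.0e4, 2.0e3, 5.0e6        # mass [kg], damping [N s/m], stiffness [N/m]
      beta, gamma = 0.25, 0.5              # average-acceleration Newmark parameters
      dt, n = 1.0e-3, 2000

      p = np.zeros(n)
      p[:50] = 2.0e5                       # short impact-like force pulse [N]
      u = v = 0.0
      a = (p[0] - c * v - k * u) / m       # initial acceleration from equilibrium
      keff = k + gamma / (beta * dt) * c + m / (beta * dt**2)

      history = []
      for i in range(1, n):
          dp = (p[i] - p[i - 1]
                + (m / (beta * dt) + gamma / beta * c) * v
                + (m / (2 * beta) + dt * (gamma / (2 * beta) - 1) * c) * a)
          du = dp / keff
          dv = gamma / (beta * dt) * du - gamma / beta * v + dt * (1 - gamma / (2 * beta)) * a
          da = du / (beta * dt**2) - v / (beta * dt) - a / (2 * beta)
          u, v, a = u + du, v + dv, a + da
          history.append(u)

      print("displacement extremes [m]:", max(history), min(history))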

  12. 40 CFR 80.614 - What are the alternative defense requirements in lieu of § 80.613(a)(1)(vi)?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... the additive package has been added, together with supporting data which includes one of the following... a MVNRLM diesel fuel additive package into MVNRLM diesel fuel subject to the 15 ppm sulfur standards... alternative to the defense element under § 80.613(a)(1)(vi): (a)(1) The blender of the additive package has a...

  13. 40 CFR 80.614 - What are the alternative defense requirements in lieu of § 80.613(a)(1)(vi)?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... the additive package has been added, together with supporting data which includes one of the following... a MVNRLM diesel fuel additive package into MVNRLM diesel fuel subject to the 15 ppm sulfur standards... alternative to the defense element under § 80.613(a)(1)(vi): (a)(1) The blender of the additive package has a...

  14. 40 CFR 80.614 - What are the alternative defense requirements in lieu of § 80.613(a)(1)(vi)?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... the additive package has been added, together with supporting data which includes one of the following... a MVNRLM diesel fuel additive package into MVNRLM diesel fuel subject to the 15 ppm sulfur standards... alternative to the defense element under § 80.613(a)(1)(vi): (a)(1) The blender of the additive package has a...

  15. 40 CFR 80.614 - What are the alternative defense requirements in lieu of § 80.613(a)(1)(vi)?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... the additive package has been added, together with supporting data which includes one of the following... a MVNRLM diesel fuel additive package into MVNRLM diesel fuel subject to the 15 ppm sulfur standards... alternative to the defense element under § 80.613(a)(1)(vi): (a)(1) The blender of the additive package has a...

  16. 40 CFR 80.614 - What are the alternative defense requirements in lieu of § 80.613(a)(1)(vi)?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... the additive package has been added, together with supporting data which includes one of the following... a MVNRLM diesel fuel additive package into MVNRLM diesel fuel subject to the 15 ppm sulfur standards... alternative to the defense element under § 80.613(a)(1)(vi): (a)(1) The blender of the additive package has a...

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grant, Robert

    Under this grant, three significant software packages were developed or improved, all with the goal of improving the ease-of-use of HPC libraries. The first component is a Python package, named DistArray (originally named Odin), that provides a high-level interface to distributed array computing. This interface is based on the popular and widely used NumPy package and is integrated with the IPython project for enhanced interactive parallel distributed computing. The second Python package is the Distributed Array Protocol (DAP) that enables separate distributed array libraries to share arrays efficiently without copying or sending messages. If a distributed array library supports the DAP, it is then automatically able to communicate with any other library that also supports the protocol. This protocol allows DistArray to communicate with the Trilinos library via PyTrilinos, which was also enhanced during this project. A third package, PyTrilinos, was extended to support distributed structured arrays (in addition to the unstructured arrays of its original design), allow more flexible distributed arrays (i.e., the restriction to double precision data was lifted), and implement the DAP. DAP support includes both exporting the protocol so that external packages can use distributed Trilinos data structures, and importing the protocol so that PyTrilinos can work with distributed data from external packages.

  18. Comparison of estimates of left ventricular ejection fraction obtained from gated blood pool imaging, different software packages and cameras.

    PubMed

    Steyn, Rachelle; Boniaszczuk, John; Geldenhuys, Theodore

    2014-01-01

    To determine how two software packages, supplied by Siemens and Hermes, for processing gated blood pool (GBP) studies should be used in our department and whether the use of different cameras for the acquisition of raw data influences the results. The study had two components. For the first component, 200 studies were acquired on a General Electric (GE) camera and processed three times by three operators using the Siemens and Hermes software packages. For the second part, 200 studies were acquired on two different cameras (GE and Siemens). The matched pairs of raw data were processed by one operator using the Siemens and Hermes software packages. The Siemens method consistently gave estimates that were 4.3% higher than the Hermes method (p < 0.001). The differences were not associated with any particular level of left ventricular ejection fraction (LVEF). There was no difference in the estimates of LVEF obtained by the three operators (p = 0.1794). The reproducibility of estimates was good. In 95% of patients, using the Siemens method, the SD of the three estimates of LVEF by operator 1 was ≤ 1.7, operator 2 was ≤ 2.1 and operator 3 was ≤ 1.3. The corresponding values for the Hermes method were ≤ 2.5, ≤ 2.0 and ≤ 2.1. There was no difference in the results of matched pairs of data acquired on different cameras (p = 0.4933). CONCLUSION: Software packages for processing GBP studies are not interchangeable. The report should include the name and version of the software package used. Wherever possible, the same package should be used for serial studies. If this is not possible, the report should include the limits of agreement of the different packages. Data acquisition on different cameras did not influence the results.
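
    The limits-of-agreement calculation recommended in the conclusion can be illustrated in a few lines; the paired LVEF values below are made up and are not data from the study.

      import numpy as np

      siemens = np.array([55.0, 62.0, 48.0, 70.0, 59.0, 45.0, 66.0])
      hermes = np.array([51.0, 58.0, 44.0, 65.0, 55.0, 41.0, 61.0])

      diff = siemens - hermes
      bias = diff.mean()
      sd = diff.std(ddof=1)
      low, high = bias - 1.96 * sd, bias + 1.96 * sd
      print(f"bias = {bias:.1f}%, 95% limits of agreement = [{low:.1f}, {high:.1f}]")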

  19. Analysis of Phenolic Antioxidants in Navy Mobility Fuels by Gas Chromatography-Mass Spectrometry

    DTIC Science & Technology

    2013-06-19

    APPENDIX A: Calibration Curves for...chromatogram from an F-76 diesel fuel containing 24 ppm of the AO-37 additive package, analyzed using single column GC-MS-SIM method...sulfur diesel fuel containing 6.25 ppm of the AO-37 additive package, analyzed using dual column Deans switch GC-MS-SIM method

  20. Rule-based optimization and multicriteria decision support for packaging a truck chassis

    NASA Astrophysics Data System (ADS)

    Berger, Martin; Lindroth, Peter; Welke, Richard

    2017-06-01

    Trucks are highly individualized products where exchangeable parts are flexibly combined to suit different customer requirements, leading to great complexity in product development. Therefore, an optimization approach based on constraint programming is proposed for automatically packaging parts of a truck chassis by following packaging rules expressed as constraints. A multicriteria decision support system is developed where a database of truck layouts is computed, among which interactive navigation then can be performed. The work was performed in cooperation with Volvo Group Trucks Technology (GTT), whose specific packaging rules were used. Several scenarios are described where the methods developed can be successfully applied and lead to less time-consuming manual work, fewer mistakes, and greater flexibility in configuring trucks. A numerical evaluation is also presented showing the efficiency and practical relevance of the methods, which are implemented in a software tool.
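
    The rule-as-constraint idea can be illustrated with a toy Python search; the parts, slots, and rules below are invented and are far simpler than the actual Volvo GTT rule base.

      from itertools import permutations

      parts = ["fuel_tank", "battery_box", "exhaust_aftertreatment", "air_tank"]
      slots = [0, 1, 2, 3]                 # positions along the frame rail

      def satisfies(assignment):
          pos = dict(zip(parts, assignment))
          # Rule 1: keep the fuel tank away from the hot aftertreatment unit.
          if abs(pos["fuel_tank"] - pos["exhaust_aftertreatment"]) < 2:
              return False
          # Rule 2: the battery box must sit ahead of the air tank for cable routing.
          if pos["battery_box"] > pos["air_tank"]:
              return False
          return True

      layouts = [dict(zip(parts, p)) for p in permutations(slots) if satisfies(p)]
      print(len(layouts), "feasible layouts, e.g.", layouts[0])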

  1. ELAS: A powerful, general purpose image processing package

    NASA Technical Reports Server (NTRS)

    Walters, David; Rickman, Douglas

    1991-01-01

    ELAS is a software package which has been utilized as an image processing tool for more than a decade. It has been the source of several commercial packages. Now available on UNIX workstations, it is a very powerful, flexible set of software. Applications at Stennis Space Center have included a very wide range of areas including medicine, forestry, geology, ecological modeling, and sonar imagery. It remains one of the most powerful image processing packages available, either commercially or in the public domain.

  2. On the release of cppxfel for processing X-ray free-electron laser images.

    PubMed

    Ginn, Helen Mary; Evans, Gwyndaf; Sauter, Nicholas K; Stuart, David Ian

    2016-06-01

    As serial femtosecond crystallography expands towards a variety of delivery methods, including chip-based methods, and smaller collected data sets, the requirement to optimize the data analysis to produce maximum structure quality is becoming increasingly pressing. Here cppxfel, a software package primarily written in C++, which showcases several data analysis techniques, is released. This software package presently indexes images using DIALS (diffraction integration for advanced light sources) and performs an initial orientation matrix refinement, followed by post-refinement of individual images against a reference data set. Cppxfel is released with the hope that the unique and useful elements of this package can be repurposed for existing software packages. However, as released, it produces high-quality crystal structures and is therefore likely to be also useful to experienced users of X-ray free-electron laser (XFEL) software who wish to maximize the information extracted from a limited number of XFEL images.

  3. Analyzing longitudinal data with the linear mixed models procedure in SPSS.

    PubMed

    West, Brady T

    2009-09-01

    Many applied researchers analyzing longitudinal data share a common misconception: that specialized statistical software is necessary to fit hierarchical linear models (also known as linear mixed models [LMMs], or multilevel models) to longitudinal data sets. Although several specialized statistical software programs of high quality are available that allow researchers to fit these models to longitudinal data sets (e.g., HLM), rapid advances in general purpose statistical software packages have recently enabled analysts to fit these same models when using preferred packages that also enable other more common analyses. One of these general purpose statistical packages is SPSS, which includes a very flexible and powerful procedure for fitting LMMs to longitudinal data sets with continuous outcomes. This article aims to present readers with a practical discussion of how to analyze longitudinal data using the LMMs procedure in the SPSS statistical software package.
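
    An analogous random-intercept fit can be run in Python's statsmodels, another general-purpose package; the simulated data and column names below are purely illustrative.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(0)
      n_subj, n_obs = 30, 5
      subject = np.repeat(np.arange(n_subj), n_obs)
      time = np.tile(np.arange(n_obs), n_subj)
      u = rng.normal(0.0, 2.0, n_subj)[subject]         # subject-level random intercepts
      y = 10 + 1.5 * time + u + rng.normal(0.0, 1.0, n_subj * n_obs)
      df = pd.DataFrame({"y": y, "time": time, "subject": subject})

      # Linear mixed model with a random intercept per subject, comparable to SPSS MIXED.
      result = smf.mixedlm("y ~ time", df, groups=df["subject"]).fit()
      print(result.summary())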

  4. On the release of cppxfel for processing X-ray free-electron laser images

    DOE PAGES

    Ginn, Helen Mary; Evans, Gwyndaf; Sauter, Nicholas K.; ...

    2016-05-11

    As serial femtosecond crystallography expands towards a variety of delivery methods, including chip-based methods, and smaller collected data sets, the requirement to optimize the data analysis to produce maximum structure quality is becoming increasingly pressing. Here cppxfel, a software package primarily written in C++, which showcases several data analysis techniques, is released. This software package presently indexes images using DIALS (diffraction integration for advanced light sources) and performs an initial orientation matrix refinement, followed by post-refinement of individual images against a reference data set. Cppxfel is released with the hope that the unique and useful elements of this package can be repurposed for existing software packages. However, as released, it produces high-quality crystal structures and is therefore likely to be also useful to experienced users of X-ray free-electron laser (XFEL) software who wish to maximize the information extracted from a limited number of XFEL images.

  5. The Multi-User Droplet Combustion Apparatus: the Development and Integration Concept for Droplet Combustion Payloads in the Fluids and Combustion Facility Combustion Integrated Rack

    NASA Astrophysics Data System (ADS)

    Myhre, C. A.

    2002-01-01

    The Multi-user Droplet Combustion Apparatus (MDCA) is a multi-user facility designed to accommodate four different droplet combustion science experiments. The MDCA will conduct experiments using the Combustion Integrated Rack (CIR) of the NASA Glenn Research Center's Fluids and Combustion Facility (FCF). The payload is planned for the International Space Station. The MDCA, in conjunction with the CIR, will allow for cost effective extended access to the microgravity environment, not possible on previous space flights. It is currently in the Engineering Model build phase with a planned flight launch with CIR in 2004. This paper provides an overview of the capabilities and development status of the MDCA. The MDCA contains the hardware and software required to conduct unique droplet combustion experiments in space. It consists of a Chamber Insert Assembly, an Avionics Package, and a multiple array of diagnostics. Its modular approach permits on-orbit changes for accommodating different fuels, fuel flow rates, soot sampling mechanisms, and varying droplet support and translation mechanisms to accommodate multiple investigations. Unique diagnostic measurement capabilities for each investigation are also provided. Additional hardware provided by the CIR facility includes the structural support, a combustion chamber, utilities for the avionics and diagnostic packages, and the fuel mixing capability for PI specific combustion chamber environments. Common diagnostics provided by the CIR will also be utilized by the MDCA. Single combustible fuel droplets of varying sizes, freely deployed or supported by a tether, are planned for study using the MDCA. Such research supports understanding of how liquid fuel droplets ignite, spread, and extinguish under quiescent microgravity conditions. This understanding will help us develop more efficient energy production and propulsion systems on Earth and in space, deal better with combustion generated pollution, and address fire hazards associated with using liquid combustibles on Earth and in space. As a result of the concurrent design process of MDCA and CIR, the MDCA team continues to work closely with the CIR team, developing Integration Agreements and an Interface Control Document during preliminary integration activities. Integrated testing of hardware and software systems will occur at the Engineering Model and Flight Model phases. Because the engineering model is a high fidelity unit, it will be upgraded to a flight equivalent Ground Integration Unit (GIU) when the engineering model phase is completed. The GIU will be available on the ground for troubleshooting of any on-orbit problems. Integrated verification testing will be conducted with the MDCA flight unit and the CIR flight unit. Upon successful testing, the MDCA will be shipped to the Kennedy Space Center for a post-shipment checkout and final turn-over to CIR for final processing and launch to the International Space Station. Once on-orbit, the MDCA is managed from the GRC Telescience Support Center (TSC). The MDCA operations team resides at the TSC. Data is transmitted to the PIs at their home sites by means of TREK workstations, allowing direct interaction between the PI and operations staff to maximize science. Upon completion of a PI's experiment, the MDCA is reconfigured for the next of the three follow-on experiments or ultimately removed from the CIR, placed into stowage, and returned to Earth.

  6. Computing Interactions Of Free-Space Radiation With Matter

    NASA Technical Reports Server (NTRS)

    Wilson, J. W.; Cucinotta, F. A.; Shinn, J. L.; Townsend, L. W.; Badavi, F. F.; Tripathi, R. K.; Silberberg, R.; Tsao, C. H.; Badwar, G. D.

    1995-01-01

    High Charge and Energy Transport (HZETRN) computer program computationally efficient, user-friendly package of software addressing problem of transport of, and shielding against, radiation in free space. Designed as "black box" for design engineers not concerned with physics of underlying atomic and nuclear radiation processes in free-space environment, but rather primarily interested in obtaining fast and accurate dosimetric information for design and construction of modules and devices for use in free space. Computational efficiency achieved by unique algorithm based on deterministic approach to solution of Boltzmann equation rather than computationally intensive statistical Monte Carlo method. Written in FORTRAN.
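    As a rough illustration (not from the record above) of the deterministic approach mentioned, the sketch below marches a simplified one-dimensional, straight-ahead transport equation through a shield; the constant cross section and source term are assumptions chosen purely for demonstration, not HZETRN's actual multi-species physics.

    ```python
    import numpy as np

    def march_fluence(depths_cm, sigma_per_cm, source, phi0):
        """Deterministically march d(phi)/dx = -sigma*phi + source through a slab."""
        phi = np.empty_like(depths_cm)
        phi[0] = phi0
        for i in range(1, len(depths_cm)):
            dx = depths_cm[i] - depths_cm[i - 1]
            # explicit step: attenuation plus secondary-particle production
            phi[i] = phi[i - 1] + dx * (-sigma_per_cm * phi[i - 1] + source)
        return phi

    x = np.linspace(0.0, 30.0, 301)                      # shield depth grid [cm]
    print(march_fluence(x, sigma_per_cm=0.1, source=0.0, phi0=1.0)[-1])  # ~exp(-3)
    ```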

  7. Improving the environmental and performance characteristics of vehicles by introducing the surfactant additive into gasoline.

    PubMed

    Magaril, Elena; Magaril, Romen

    2016-09-01

    The operation of modern vehicles requires the introduction of a package of fuel additives to ensure the required level of operating characteristics, some of which cannot be achieved by current oil refining methods. The use of additives allows flexible control of fuel properties at minimal cost, increasing the efficiency and environmental safety of vehicles. Among the wide assortment of additives available on the world market, many are surfactants. It has been shown that the introduction of some surfactants into gasoline concurrently reduces losses from gasoline evaporation, improves the mixture formation during injection of gasoline into the engine and improves detergent and anticorrosive properties. A surfactant gasoline additive that provides significant improvement in the quality of the gasoline used and in the environmental and operating characteristics of vehicles has been developed and thoroughly investigated. The results of studies confirming the efficiency of the gasoline additive application are herein presented.

  8. GROMACS 4.5: a high-throughput and highly parallel open source molecular simulation toolkit

    PubMed Central

    Pronk, Sander; Páll, Szilárd; Schulz, Roland; Larsson, Per; Bjelkmar, Pär; Apostolov, Rossen; Shirts, Michael R.; Smith, Jeremy C.; Kasson, Peter M.; van der Spoel, David; Hess, Berk; Lindahl, Erik

    2013-01-01

    Motivation: Molecular simulation has historically been a low-throughput technique, but faster computers and increasing amounts of genomic and structural data are changing this by enabling large-scale automated simulation of, for instance, many conformers or mutants of biomolecules with or without a range of ligands. At the same time, advances in performance and scaling now make it possible to model complex biomolecular interaction and function in a manner directly testable by experiment. These applications share a need for fast and efficient software that can be deployed on massive scale in clusters, web servers, distributed computing or cloud resources. Results: Here, we present a range of new simulation algorithms and features developed during the past 4 years, leading up to the GROMACS 4.5 software package. The software now automatically handles wide classes of biomolecules, such as proteins, nucleic acids and lipids, and comes with all commonly used force fields for these molecules built-in. GROMACS supports several implicit solvent models, as well as new free-energy algorithms, and the software now uses multithreading for efficient parallelization even on low-end systems, including windows-based workstations. Together with hand-tuned assembly kernels and state-of-the-art parallelization, this provides extremely high performance and cost efficiency for high-throughput as well as massively parallel simulations. Availability: GROMACS is an open source and free software available from http://www.gromacs.org. Contact: erik.lindahl@scilifelab.se Supplementary information: Supplementary data are available at Bioinformatics online. PMID:23407358
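    As an aside (not from the paper), a GROMACS 4.5-era run is typically scripted around the standalone `grompp` and `mdrun` tools; the sketch below assumes those binaries are on the PATH and that the input files named here (`md.mdp`, `conf.gro`, `topol.top`) already exist.

    ```python
    import subprocess

    def run_md(nthreads=4):
        """Preprocess and run a GROMACS 4.5-style simulation (illustrative file names)."""
        subprocess.run(["grompp", "-f", "md.mdp", "-c", "conf.gro",
                        "-p", "topol.top", "-o", "md.tpr"], check=True)
        # -nt selects the number of threads used by the built-in multithreading
        subprocess.run(["mdrun", "-deffnm", "md", "-nt", str(nthreads)], check=True)

    if __name__ == "__main__":
        run_md()
    ```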

  9. Q-nexus: a comprehensive and efficient analysis pipeline designed for ChIP-nexus.

    PubMed

    Hansen, Peter; Hecht, Jochen; Ibn-Salem, Jonas; Menkuec, Benjamin S; Roskosch, Sebastian; Truss, Matthias; Robinson, Peter N

    2016-11-04

    ChIP-nexus, an extension of the ChIP-exo protocol, can be used to map the borders of protein-bound DNA sequences at nucleotide resolution, requires less input DNA and enables selective PCR duplicate removal using random barcodes. However, the use of random barcodes requires additional preprocessing of the mapping data, which complicates the computational analysis. To date, only a very limited number of software packages are available for the analysis of ChIP-exo data, which have not yet been systematically tested and compared on ChIP-nexus data. Here, we present a comprehensive software package for ChIP-nexus data that exploits the random barcodes for selective removal of PCR duplicates and for quality control. Furthermore, we developed bespoke methods to estimate the width of the protected region resulting from protein-DNA binding and to infer binding positions from ChIP-nexus data. Finally, we applied our peak calling method as well as the two other methods MACE and MACS2 to the available ChIP-nexus data. The Q-nexus software is efficient and easy to use. Novel statistics about duplication rates in consideration of random barcodes are calculated. Our method for the estimation of the width of the protected region yields unbiased signatures that are highly reproducible for biological replicates and at the same time very specific for the respective factors analyzed. As judged by the irreproducible discovery rate (IDR), our peak calling algorithm shows a substantially better reproducibility. An implementation of Q-nexus is available at http://charite.github.io/Q/ .
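    Not the Q-nexus implementation itself, but a minimal illustration of the selective, barcode-aware duplicate removal described above: reads are collapsed only when mapping position, strand, and random barcode all agree (the read records here are toy stand-ins for parsed alignments).

    ```python
    def dedup_reads(reads):
        """Keep one read per (chromosome, 5'-position, strand, random barcode)."""
        seen, kept = set(), []
        for r in reads:
            key = (r["chrom"], r["pos"], r["strand"], r["barcode"])
            if key not in seen:
                seen.add(key)
                kept.append(r)
        return kept

    reads = [
        {"chrom": "chr1", "pos": 100, "strand": "+", "barcode": "ACGTG", "name": "r1"},
        {"chrom": "chr1", "pos": 100, "strand": "+", "barcode": "ACGTG", "name": "r2"},  # PCR duplicate
        {"chrom": "chr1", "pos": 100, "strand": "+", "barcode": "TTACG", "name": "r3"},  # kept: new barcode
    ]
    print([r["name"] for r in dedup_reads(reads)])   # ['r1', 'r3']
    ```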

  10. Development of a new software for analyzing 3-D fracture network

    NASA Astrophysics Data System (ADS)

    Um, Jeong-Gi; Noh, Young-Hwan; Choi, Yosoon

    2014-05-01

    A new software package is presented for analyzing fracture networks in 3-D. Recently, we completed the software package based on information given in EGU2013. The software consists of several modules that play roles in management of borehole data, stochastic modelling of fracture networks, construction of the analysis domain, visualization of fracture geometry in 3-D, calculation of equivalent pipes and production of cross-section diagrams. Intel Parallel Studio XE 2013, Visual Studio.NET 2010 and the open source VTK library were utilized as development tools to efficiently implement the modules and the graphical user interface of the software. A case study was performed to analyze the 3-D fracture network system at the Upper Devonian Grosmont Formation in Alberta, Canada. The results have suggested that the developed software is effective in modelling and visualizing 3-D fracture network systems, and can provide useful information to tackle the geomechanical problems related to strength, deformability and hydraulic behaviours of the fractured rock masses. This presentation describes the concept and details of the development and implementation of the software.
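    The record does not give implementation details; as a hedged sketch of one common stochastic fracture-network ingredient, the snippet below generates randomly located, randomly oriented circular (Baecher-type) discs inside a box-shaped analysis domain, with an assumed exponential size distribution.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def sample_disc_fractures(n, box=(100.0, 100.0, 50.0), mean_radius=5.0):
        """Return disc centres, unit normals and radii for a simple stochastic model."""
        centres = rng.uniform(0.0, 1.0, size=(n, 3)) * np.asarray(box)
        # uniformly distributed orientations on the unit sphere
        normals = rng.normal(size=(n, 3))
        normals /= np.linalg.norm(normals, axis=1, keepdims=True)
        radii = rng.exponential(mean_radius, size=n)     # assumed size distribution
        return centres, normals, radii

    c, nrm, r = sample_disc_fractures(500)
    print(c.shape, nrm.shape, round(r.mean(), 2))
    ```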

  11. Wood-fired fuel cells in selected buildings

    NASA Astrophysics Data System (ADS)

    McIlveen-Wright, D. R.; McMullan, J. T.; Guiney, D. J.

    The positive attributes of fuel cells for high efficiency power generation at any scale and of biomass as a renewable energy source which is not intermittent, location-dependent or very difficult to store, suggest that a combined heat and power (CHP) system consisting of a fuel cell integrated with a wood gasifier (FCIWG) may offer a combination for delivering heat and electricity cleanly and efficiently. Phosphoric acid fuel cell (PAFC) systems, fuelled by natural gas, have already been used in a range of CHP applications in urban settings. Some of these applications are examined here using integrated biomass gasification/fuel cell systems in CHP configurations. Five building systems, which have different energy demand profiles, are assessed. These are a hospital, a hotel, a leisure centre, a multi-residential community and a university hall of residence. Heat and electricity use profiles for typical examples of these buildings were obtained and the FCIWG system was scaled to the power demand. The FCIWG system was modelled for two different types of fuel cell, the molten carbonate and the phosphoric acid. In each case an oxygen-fired gasification system is proposed, in order to eliminate the need for a methane reformer. Technical, environmental and economic analyses of each version were made, using the ECLIPSE process simulation package. Since fuel cell lifetimes are not yet precisely known, economics for a range of fuel cell lifetimes have been produced. The wood-fired PAFC system was found to have low electrical efficiency (13-16%), but much of the heat could be recovered, so that the overall efficiency was 64-67%, suitable where high heat/electricity values are required. The wood-fired molten carbonate fuel cell (MCFC) system was found to be quite efficient for electricity generation (24-27%), with an overall energy efficiency of 60-63%. The expected capital costs of both systems would currently make them uncompetitive for general use, but the specific features of selected buildings in rural areas, with regard to the high cost of importing other fuel, and/or lack of grid electricity, could still make these systems attractive options. Any economic analysis of these systems is beset with severe difficulties. Capital costs of the major system components are not known with any great precision. However, a guideline assessment of the payback period for such CHP systems was made. When the best available capital costs for system components were used, most of these systems were found to have unacceptably long payback periods, particularly where the fuel cell lifetimes are short, but the larger systems show the potential for a reasonable economic return.
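    For orientation only (none of the numbers below come from the study except the MCFC-like efficiencies quoted in the text), the simple-payback arithmetic behind such assessments looks like this; the capital cost, tariffs, fuel price, and running hours are placeholder assumptions.

    ```python
    def chp_summary(fuel_in_kw, eta_el, eta_total, capital_cost,
                    elec_price, heat_price, fuel_price, hours_per_year):
        """Simple-payback estimate for a CHP plant from assumed (illustrative) inputs."""
        p_el = fuel_in_kw * eta_el                       # electrical output [kW]
        p_heat = fuel_in_kw * (eta_total - eta_el)       # recovered heat [kW]
        annual_margin = hours_per_year * (p_el * elec_price + p_heat * heat_price
                                          - fuel_in_kw * fuel_price)
        return p_el, p_heat, capital_cost / annual_margin

    p_el, p_heat, payback = chp_summary(
        fuel_in_kw=1000.0, eta_el=0.25, eta_total=0.62,   # MCFC-like figures from the text
        capital_cost=2.5e6, elec_price=0.10, heat_price=0.03,
        fuel_price=0.02, hours_per_year=6000)             # placeholder economics
    print(f"{p_el:.0f} kWe, {p_heat:.0f} kWth, payback ~ {payback:.1f} years")
    ```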

  12. Office Computer Software: A Comprehensive Review of Software Programs.

    ERIC Educational Resources Information Center

    Secretary, 1992

    1992-01-01

    Describes types of software including system software, application software, spreadsheets, accounting software, graphics packages, desktop publishing software, database, desktop and personal information management software, project and records management software, groupware, and shareware. (JOW)

  13. Three-Dimensional (3D) Nanometrology Based on Scanning Electron Microscope (SEM) Stereophotogrammetry.

    PubMed

    Tondare, Vipin N; Villarrubia, John S; Vladár, András E

    2017-10-01

    Three-dimensional (3D) reconstruction of a sample surface from scanning electron microscope (SEM) images taken at two perspectives has been known for decades. Nowadays, there exist several commercially available stereophotogrammetry software packages. For testing these software packages, in this study we used Monte Carlo simulated SEM images of virtual samples. A virtual sample is a model in a computer, and its true dimensions are known exactly, which is impossible for real SEM samples due to measurement uncertainty. The simulated SEM images can be used for algorithm testing, development, and validation. We tested two stereophotogrammetry software packages and compared their reconstructed 3D models with the known geometry of the virtual samples used to create the simulated SEM images. Both packages performed relatively well with simulated SEM images of a sample with a rough surface. However, in a sample containing nearly uniform and therefore low-contrast zones, the height reconstruction error was ≈46%. The present stereophotogrammetry software packages need further improvement before they can be used reliably with SEM images with uniform zones.
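    As a hedged aside (not from the paper), packages of this kind usually recover relative height from the parallax between a symmetric ±α tilt pair; one commonly used relation is z = p / (2 sin α), with the parallax p already scaled to sample units, as in the sketch below.

    ```python
    import math

    def height_from_parallax(parallax_nm, half_tilt_deg):
        """Relative height for a symmetric +/- alpha eucentric tilt pair:
        z = p / (2*sin(alpha)); parallax must already be in sample units."""
        return parallax_nm / (2.0 * math.sin(math.radians(half_tilt_deg)))

    # e.g. 35 nm of measured parallax for images taken at +/- 5 degrees
    print(f"{height_from_parallax(35.0, 5.0):.1f} nm")
    ```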

  14. Research related to improved computer aided design software package. [comparative efficiency of finite, boundary, and hybrid element methods in elastostatics

    NASA Technical Reports Server (NTRS)

    Walston, W. H., Jr.

    1986-01-01

    The comparative computational efficiencies of the finite element (FEM), boundary element (BEM), and hybrid boundary element-finite element (HVFEM) analysis techniques are evaluated for representative bounded domain interior and unbounded domain exterior problems in elastostatics. Computational efficiency is carefully defined in this study as the computer time required to attain a specified level of solution accuracy. The study found the FEM superior to the BEM for the interior problem, while the reverse was true for the exterior problem. The hybrid analysis technique was found to be comparable or superior to both the FEM and BEM for both the interior and exterior problems.

  15. Simulation and animation of sensor-driven robots.

    PubMed

    Chen, C; Trivedi, M M; Bidlack, C R

    1994-10-01

    Most simulation and animation systems utilized in robotics are concerned with simulation of the robot and its environment without simulation of sensors. These systems have difficulty in handling robots that utilize sensory feedback in their operation. In this paper, a new design of an environment for simulation, animation, and visualization of sensor-driven robots is presented. As sensor technology advances, increasing numbers of robots are equipped with various types of sophisticated sensors. The main goal of creating the visualization environment is to aid the automatic robot programming and off-line programming capabilities of sensor-driven robots. The software system will help the users visualize the motion and reaction of the sensor-driven robot under their control program. Therefore, the efficiency of the software development is increased, the reliability of the software and the operation safety of the robot are ensured, and the cost of new software development is reduced. Conventional computer-graphics-based robot simulation and animation software packages lack capabilities for robot sensing simulation. This paper describes a system designed to overcome this deficiency.
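    A toy sketch, not the system described: the point of simulating sensors is that the control loop closes on the *sensed* value, as below, where the range-sensor model, noise level, and controller gains are all invented for illustration.

    ```python
    import random

    def simulated_range_sensor(true_distance, noise_sd=0.01):
        """Stand-in sensor model: true distance to an obstacle plus Gaussian noise."""
        return true_distance + random.gauss(0.0, noise_sd)

    def step_towards_wall(distance, stop_at=0.20, gain=0.5, dt=0.1):
        """Simple proportional controller driven by the *sensed* distance."""
        reading = simulated_range_sensor(distance)
        speed = max(0.0, gain * (reading - stop_at))
        return distance - speed * dt

    d = 1.0                       # metres to the wall
    for _ in range(200):
        d = step_towards_wall(d)
    print(f"final simulated distance: {d:.3f} m")
    ```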

  16. User Documentation for Multiple Software Releases

    NASA Technical Reports Server (NTRS)

    Humphrey, R.

    1982-01-01

    In proposed solution to problems of frequent software releases and updates, documentation would be divided into smaller packages, each of which contains data relating to only one of several software components. Changes would not affect entire document. Concept would improve dissemination of information regarding changes and would improve quality of data supporting packages. Would help to ensure both timeliness and more thorough scrutiny of changes.

  17. Versatile Software Package For Near Real-Time Analysis of Experimental Data

    NASA Technical Reports Server (NTRS)

    Wieseman, Carol D.; Hoadley, Sherwood T.

    1998-01-01

    This paper provides an overview of a versatile software package developed for time- and frequency-domain analyses of experimental wind-tunnel data. This package, originally developed for analyzing data in the NASA Langley Transonic Dynamics Tunnel (TDT), is applicable for analyzing any time-domain data. A Matlab-based software package, TDT-analyzer, provides a compendium of commonly-required dynamic analysis functions in a user-friendly interactive and batch processing environment. TDT-analyzer has been used extensively to provide on-line near real-time and post-test examination and reduction of measured data acquired during wind tunnel tests of aeroelastically-scaled models of aircraft and rotorcraft as well as a flight test of the NASA High Alpha Research Vehicle (HARV) F-18. The package provides near real-time results in an informative and timely manner far exceeding prior methods of data reduction at the TDT.
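    TDT-analyzer itself is MATLAB-based and its internals are not given here; as a stand-in, the snippet below shows the kind of frequency-domain reduction (a Welch power spectral density of one measured channel) such near real-time analysis typically performs, using a synthetic signal.

    ```python
    import numpy as np
    from scipy.signal import welch

    fs = 1000.0                                   # assumed sampling rate [Hz]
    t = np.arange(0, 10, 1 / fs)
    channel = np.sin(2 * np.pi * 12.5 * t) + 0.3 * np.random.randn(t.size)  # mock data

    freqs, psd = welch(channel, fs=fs, nperseg=2048)
    print(f"dominant frequency ~ {freqs[np.argmax(psd)]:.1f} Hz")
    ```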

  18. Evaluation of 3-D graphics software: A case study

    NASA Technical Reports Server (NTRS)

    Lores, M. E.; Chasen, S. H.; Garner, J. M.

    1984-01-01

    An efficient 3-D geometry graphics software package which is suitable for advanced design studies was developed. The advanced design system is called GRADE--Graphics for Advanced Design. Efficiency and ease of use are gained by sacrificing flexibility in surface representation. The immediate options were either to continue development of GRADE or to acquire a commercially available system which would replace or complement GRADE. Test cases which would reveal the ability of each system to satisfy the requirements were developed. A scoring method which adequately captured the relative capabilities of the three systems was presented. While more complex multi-attribute decision methods could be used, the selected method provides all the needed information without being so complex that it is difficult to understand. If the value factors are modestly perturbed, system Z is a clear winner based on its overall capabilities. System Z is superior in two vital areas: surfacing and ease of interface with application programs.
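    A sketch of the weighted-sum scoring idea described above, not the study's actual score sheet; the attributes, weights, and scores are invented, but chosen so that "system Z" comes out ahead, as in the text.

    ```python
    def weighted_score(scores, weights):
        """Weighted-sum figure of merit; inputs are dicts keyed by attribute name."""
        return sum(scores[k] * weights[k] for k in weights)

    weights = {"surfacing": 0.4, "application interface": 0.3, "ease of use": 0.3}
    candidates = {
        "GRADE":    {"surfacing": 6, "application interface": 8, "ease of use": 9},
        "system Z": {"surfacing": 9, "application interface": 9, "ease of use": 7},
    }
    for name, s in candidates.items():
        print(name, weighted_score(s, weights))
    ```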

  19. CFD Fuel Slosh Modeling of Fluid-Structure Interaction in Spacecraft Propellant Tanks with Diaphragms

    NASA Technical Reports Server (NTRS)

    Sances, Dillon J.; Gangadharan, Sathya N.; Sudermann, James E.; Marsell, Brandon

    2010-01-01

    Liquid sloshing within spacecraft propellant tanks causes rapid energy dissipation at resonant modes, which can result in attitude destabilization of the vehicle. Identifying resonant slosh modes currently requires experimental testing and mechanical pendulum analogs to characterize the slosh dynamics. Computational Fluid Dynamics (CFD) techniques have recently been validated as an effective tool for simulating fuel slosh within free-surface propellant tanks. Propellant tanks often incorporate an internal flexible diaphragm to separate ullage and propellant which increases modeling complexity. A coupled fluid-structure CFD model is required to capture the damping effects of a flexible diaphragm on the propellant. ANSYS multidisciplinary engineering software employs a coupled solver for analyzing two-way Fluid Structure Interaction (FSI) cases such as the diaphragm propellant tank system. Slosh models generated by ANSYS software are validated by experimental lateral slosh test results. Accurate data correlation would produce an innovative technique for modeling fuel slosh within diaphragm tanks and provide an accurate and efficient tool for identifying resonant modes and the slosh dynamic response.
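    As a reminder of the mechanical pendulum analog mentioned above (an illustration, not part of the paper's CFD workflow): a lateral slosh mode with angular frequency ω is represented by an equivalent pendulum of length L = g/ω².

    ```python
    import math

    def pendulum_length(slosh_freq_hz, g=9.81):
        """Equivalent pendulum arm length for a lateral slosh mode: L = g / omega^2."""
        omega = 2.0 * math.pi * slosh_freq_hz
        return g / omega**2

    print(f"L = {pendulum_length(0.5):.3f} m for a 0.5 Hz slosh mode")
    ```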

  20. Lin4Neuro: a customized Linux distribution ready for neuroimaging analysis

    PubMed Central

    2011-01-01

    Background A variety of neuroimaging software packages have been released from various laboratories worldwide, and many researchers use these packages in combination. Though most of these software packages are freely available, some people find them difficult to install and configure because they are mostly based on UNIX-like operating systems. We developed a live USB-bootable Linux package named "Lin4Neuro." This system includes popular neuroimaging analysis tools. The user interface is customized so that even Windows users can use it intuitively. Results The boot time of this system was only around 40 seconds. We performed a benchmark test of inhomogeneity correction on 10 subjects of three-dimensional T1-weighted MRI scans. The processing speed of USB-booted Lin4Neuro was as fast as that of the package installed on the hard disk drive. We also installed Lin4Neuro on a virtualization software package that emulates the Linux environment on a Windows-based operating system. Although the processing speed was slower than that under other conditions, it remained comparable. Conclusions With Lin4Neuro in one's hand, one can access neuroimaging software packages easily, and immediately focus on analyzing data. Lin4Neuro can be a good primer for beginners of neuroimaging analysis or students who are interested in neuroimaging analysis. It also provides a practical means of sharing analysis environments across sites. PMID:21266047

  1. Lin4Neuro: a customized Linux distribution ready for neuroimaging analysis.

    PubMed

    Nemoto, Kiyotaka; Dan, Ippeita; Rorden, Christopher; Ohnishi, Takashi; Tsuzuki, Daisuke; Okamoto, Masako; Yamashita, Fumio; Asada, Takashi

    2011-01-25

    A variety of neuroimaging software packages have been released from various laboratories worldwide, and many researchers use these packages in combination. Though most of these software packages are freely available, some people find them difficult to install and configure because they are mostly based on UNIX-like operating systems. We developed a live USB-bootable Linux package named "Lin4Neuro." This system includes popular neuroimaging analysis tools. The user interface is customized so that even Windows users can use it intuitively. The boot time of this system was only around 40 seconds. We performed a benchmark test of inhomogeneity correction on 10 subjects of three-dimensional T1-weighted MRI scans. The processing speed of USB-booted Lin4Neuro was as fast as that of the package installed on the hard disk drive. We also installed Lin4Neuro on a virtualization software package that emulates the Linux environment on a Windows-based operating system. Although the processing speed was slower than that under other conditions, it remained comparable. With Lin4Neuro in one's hand, one can access neuroimaging software packages easily, and immediately focus on analyzing data. Lin4Neuro can be a good primer for beginners of neuroimaging analysis or students who are interested in neuroimaging analysis. It also provides a practical means of sharing analysis environments across sites.

  2. 49 CFR 1104.3 - Copies.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... fully evaluate evidence, all spreadsheets must be fully accessible and manipulable. Electronic databases... Microsoft Open Database Connectivity (ODBC) standard. ODBC is a Windows technology that allows a database software package to import data from a database created using a different software package. We currently...
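    As an illustration only (not part of the regulation text), an ODBC data source can be read from Python with the third-party `pyodbc` module; the DSN, credentials, and table name below are hypothetical.

    ```python
    import pyodbc  # third-party ODBC bridge, assumed installed and configured

    conn = pyodbc.connect("DSN=evidence;UID=user;PWD=secret")  # hypothetical DSN
    cursor = conn.cursor()
    cursor.execute("SELECT * FROM shipments")                  # hypothetical table
    for row in cursor.fetchmany(5):
        print(row)
    conn.close()
    ```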

  3. 49 CFR 1104.3 - Copies.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... fully evaluate evidence, all spreadsheets must be fully accessible and manipulable. Electronic databases... Microsoft Open Database Connectivity (ODBC) standard. ODBC is a Windows technology that allows a database software package to import data from a database created using a different software package. We currently...

  4. 49 CFR 1104.3 - Copies.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... fully evaluate evidence, all spreadsheets must be fully accessible and manipulable. Electronic databases... Microsoft Open Database Connectivity (ODBC) standard. ODBC is a Windows technology that allows a database software package to import data from a database created using a different software package. We currently...

  5. 49 CFR 1104.3 - Copies.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... fully evaluate evidence, all spreadsheets must be fully accessible and manipulable. Electronic databases... Microsoft Open Database Connectivity (ODBC) standard. ODBC is a Windows technology that allows a database software package to import data from a database created using a different software package. We currently...

  6. Environmental databases and other computerized information tools

    NASA Technical Reports Server (NTRS)

    Clark-Ingram, Marceia

    1995-01-01

    Increasing environmental legislation has brought about the development of many new environmental databases and software application packages to aid in the quest for environmental compliance. These databases and software packages are useful tools and applicable to a wide range of environmental areas from atmospheric modeling to materials replacement technology. The great abundance of such products and services can be very overwhelming when trying to identify the tools which best meet specific needs. This paper will discuss the types of environmental databases and software packages available. This discussion will also encompass the affected environmental areas of concern, product capabilities, and hardware requirements for product utilization.

  7. eddy4R 0.2.0: a DevOps model for community-extensible processing and analysis of eddy-covariance data based on R, Git, Docker, and HDF5

    NASA Astrophysics Data System (ADS)

    Metzger, Stefan; Durden, David; Sturtevant, Cove; Luo, Hongyan; Pingintha-Durden, Natchaya; Sachs, Torsten; Serafimovich, Andrei; Hartmann, Jörg; Li, Jiahong; Xu, Ke; Desai, Ankur R.

    2017-08-01

    Large differences in instrumentation, site setup, data format, and operating system stymie the adoption of a universal computational environment for processing and analyzing eddy-covariance (EC) data. This results in limited software applicability and extensibility in addition to often substantial inconsistencies in flux estimates. Addressing these concerns, this paper presents the systematic development of portable, reproducible, and extensible EC software achieved by adopting a development and systems operation (DevOps) approach. This software development model is used for the creation of the eddy4R family of EC code packages in the open-source R language for statistical computing. These packages are community developed, iterated via the Git distributed version control system, and wrapped into a portable and reproducible Docker filesystem that is independent of the underlying host operating system. The HDF5 hierarchical data format then provides a streamlined mechanism for highly compressed and fully self-documented data ingest and output. The usefulness of the DevOps approach was evaluated for three test applications. First, the resultant EC processing software was used to analyze standard flux tower data from the first EC instruments installed at a National Ecological Observatory Network (NEON) field site. Second, through an aircraft test application, we demonstrate the modular extensibility of eddy4R to analyze EC data from other platforms. Third, an intercomparison with commercial-grade software showed excellent agreement (R2 = 1.0 for CO2 flux). In conjunction with this study, a Docker image containing the first two eddy4R packages and an executable example workflow, as well as first NEON EC data products are released publicly. We conclude by describing the work remaining to arrive at the automated generation of science-grade EC fluxes and benefits to the science community at large. This software development model is applicable beyond EC and more generally builds the capacity to deploy complex algorithms developed by scientists in an efficient and scalable manner. In addition, modularity permits meeting project milestones while retaining extensibility with time.
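    eddy4R itself is written in R; as a language-neutral illustration of the self-documented HDF5 ingest emphasized above, the sketch below reads one dataset and its attributes with `h5py`, assuming a purely hypothetical file layout.

    ```python
    import h5py  # assumes an HDF5 file with the hypothetical layout shown below exists

    with h5py.File("site_fluxes.h5", "r") as f:
        co2_flux = f["/site01/fluxCo2"][:]                    # hypothetical dataset path
        units = f["/site01/fluxCo2"].attrs.get("units", "unknown")
    print(co2_flux.shape, units)
    ```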

  8. Reviews.

    ERIC Educational Resources Information Center

    Radcliffe, George; And Others

    1988-01-01

    Reviews three software packages: 1) a package containing 68 programs covering general topics in chemistry; 2) a package dealing with acid-base titration curves that allows variables to be changed; and 3) a chemistry tutorial and drill package. (MVL)

  9. Wall adjustment strategy software for use with the NASA Langley 0.3-meter transonic cryogenic tunnel adaptive wall test section

    NASA Technical Reports Server (NTRS)

    Wolf, Stephen W. D.

    1988-01-01

    The Wall Adjustment Strategy (WAS) software provides successful on-line control of the 2-D flexible walled test section of the Langley 0.3-m Transonic Cryogenic Tunnel. This software package allows the level of operator intervention to be regulated as necessary for research and production type 2-D testing using an Adaptive Wall Test Section (AWTS). The software is designed to accept modification for future requirements, such as 3-D testing, with a minimum of complexity. The WAS software described is an attempt to provide a user-friendly package which could be used to control any flexible walled AWTS. Control system constraints influence the details of data transfer, not the data type. Hence this entire software package could be used in different control systems, if suitable interface software is available. A complete overview of the software highlights the data flow paths, the modular architecture of the software and the various operating and analysis modes available. A detailed description of the software modules includes listings of the code. A user's manual is provided to explain task generation, the operating environment, user options and what to expect at execution.

  10. 77 FR 28406 - Spent Fuel Transportation Risk Assessment

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-14

    ... Regulations (10 CFR) part 71, ``Packaging and Transportation of Radioactive Waste,'' dated January 26, 2004) for the packaging and transport of spent nuclear fuel (and other large quantities of radioactive... NUREG- 0170, ``Final Environmental Statement on the Transportation of Radioactive Material by Air and...

  11. TargetSearch--a Bioconductor package for the efficient preprocessing of GC-MS metabolite profiling data.

    PubMed

    Cuadros-Inostroza, Alvaro; Caldana, Camila; Redestig, Henning; Kusano, Miyako; Lisec, Jan; Peña-Cortés, Hugo; Willmitzer, Lothar; Hannah, Matthew A

    2009-12-16

    Metabolite profiling, the simultaneous quantification of multiple metabolites in an experiment, is becoming increasingly popular, particularly with the rise of systems-level biology. The workhorse in this field is gas-chromatography hyphenated with mass spectrometry (GC-MS). The high-throughput of this technology coupled with a demand for large experiments has led to data pre-processing, i.e. the quantification of metabolites across samples, becoming a major bottleneck. Existing software has several limitations, including restricted maximum sample size, systematic errors and low flexibility. However, the biggest limitation is that the resulting data usually require extensive hand-curation, which is subjective and can typically take several days to weeks. We introduce the TargetSearch package, an open source tool which is a flexible and accurate method for pre-processing even very large numbers of GC-MS samples within hours. We developed a novel strategy to iteratively correct and update retention time indices for searching and identifying metabolites. The package is written in the R programming language with computationally intensive functions written in C for speed and performance. The package includes a graphical user interface to allow easy use by those unfamiliar with R. TargetSearch allows fast and accurate data pre-processing for GC-MS experiments and overcomes the sample number limitations and manual curation requirements of existing software. We validate our method by carrying out an analysis against both a set of known chemical standard mixtures and of a biological experiment. In addition we demonstrate its capabilities and speed by comparing it with other GC-MS pre-processing tools. We believe this package will greatly ease current bottlenecks and facilitate the analysis of metabolic profiling data.
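    TargetSearch is an R/Bioconductor package; purely to illustrate the retention-index idea behind its correction step, the sketch below maps retention times to retention indices by linear interpolation between reference markers (the marker values are made up).

    ```python
    import numpy as np

    # hypothetical retention-time markers (seconds) and their assigned retention indices
    marker_rt = np.array([210.0, 480.0, 900.0, 1500.0])
    marker_ri = np.array([1000.0, 1500.0, 2000.0, 2500.0])

    def retention_index(rt_seconds):
        """Linear interpolation of retention indices between flanking markers."""
        return np.interp(rt_seconds, marker_rt, marker_ri)

    print(retention_index([300.0, 700.0, 1200.0]))
    ```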

  12. TargetSearch - a Bioconductor package for the efficient preprocessing of GC-MS metabolite profiling data

    PubMed Central

    2009-01-01

    Background Metabolite profiling, the simultaneous quantification of multiple metabolites in an experiment, is becoming increasingly popular, particularly with the rise of systems-level biology. The workhorse in this field is gas-chromatography hyphenated with mass spectrometry (GC-MS). The high-throughput of this technology coupled with a demand for large experiments has led to data pre-processing, i.e. the quantification of metabolites across samples, becoming a major bottleneck. Existing software has several limitations, including restricted maximum sample size, systematic errors and low flexibility. However, the biggest limitation is that the resulting data usually require extensive hand-curation, which is subjective and can typically take several days to weeks. Results We introduce the TargetSearch package, an open source tool which is a flexible and accurate method for pre-processing even very large numbers of GC-MS samples within hours. We developed a novel strategy to iteratively correct and update retention time indices for searching and identifying metabolites. The package is written in the R programming language with computationally intensive functions written in C for speed and performance. The package includes a graphical user interface to allow easy use by those unfamiliar with R. Conclusions TargetSearch allows fast and accurate data pre-processing for GC-MS experiments and overcomes the sample number limitations and manual curation requirements of existing software. We validate our method by carrying out an analysis against both a set of known chemical standard mixtures and of a biological experiment. In addition we demonstrate its capabilities and speed by comparing it with other GC-MS pre-processing tools. We believe this package will greatly ease current bottlenecks and facilitate the analysis of metabolic profiling data. PMID:20015393

  13. Turbocharged Diesels

    NASA Technical Reports Server (NTRS)

    1984-01-01

    In a number of feasibility studies of turbine rotor designs, engineers of Cummins Engine Company, Inc.'s turbocharger group have utilized a computer program from COSMIC. Part of Cummins research effort is aimed toward introduction of advanced turbocharged engines that deliver extra power with greater fuel efficiency. Company claims use of COSMIC program substantially reduced software development costs.

  14. User’s guide for MapMark4—An R package for the probability calculations in three-part mineral resource assessments

    USGS Publications Warehouse

    Ellefsen, Karl J.

    2017-06-27

    MapMark4 is a software package that implements the probability calculations in three-part mineral resource assessments. Functions within the software package are written in the R statistical programming language. These functions, their documentation, and a copy of this user’s guide are bundled together in R’s unit of shareable code, which is called a “package.” This user’s guide includes step-by-step instructions showing how the functions are used to carry out the probability calculations. The calculations are demonstrated using test data, which are included in the package.
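    Not MapMark4 itself: a minimal Monte Carlo sketch of the three-part calculation it implements, combining an assumed distribution for the number of undiscovered deposits with an assumed lognormal tonnage model; all inputs are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def simulate_total_tonnage(n_sims=100_000):
        """Monte Carlo total undiscovered tonnage for one permissive tract (toy inputs)."""
        # assumed elicited probabilities for 0..3 undiscovered deposits
        n_deposits = rng.choice([0, 1, 2, 3], size=n_sims, p=[0.3, 0.4, 0.2, 0.1])
        totals = np.zeros(n_sims)
        for i, n in enumerate(n_deposits):
            if n:
                # toy grade-tonnage model, in kt
                totals[i] = rng.lognormal(mean=2.0, sigma=1.0, size=n).sum()
        return totals

    t = simulate_total_tonnage()
    print(f"mean = {t.mean():.1f} kt, P90 = {np.percentile(t, 10):.1f} kt, "
          f"P10 = {np.percentile(t, 90):.1f} kt")
    ```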

  15. Multifit / Polydefix : a framework for the analysis of polycrystal deformation using X-rays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Merkel, Sébastien; Hilairet, Nadège

    2015-06-27

    Multifit/Polydefix is an open source IDL software package for the efficient processing of diffraction data obtained in deformation apparatuses at synchrotron beamlines. Multifit allows users to decompose two-dimensional diffraction images into azimuthal slices, fit peak positions, shapes and intensities, and propagate the results to other azimuths and images. Polydefix is for analysis of deformation experiments. Starting from output files created in Multifit or other packages, it will extract elastic lattice strains, evaluate sample pressure and differential stress, and prepare input files for further texture analysis. The Multifit/Polydefix package is designed to make the tedious data analysis of synchrotron-based plasticity, rheology or other time-dependent experiments very straightforward and accessible to a wider community.
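    Outside the record itself: a relation widely used in this kind of analysis is d(ψ) = d0·[1 + (1 − 3cos²ψ)·Q], where ψ is the angle between the diffraction vector and the compression axis; the sketch below fits d0 and Q to synthetic azimuthal data by linear least squares (synthetic values only, not Polydefix's implementation).

    ```python
    import numpy as np

    def fit_lattice_strain(psi_deg, d_obs):
        """Least-squares fit of d(psi) = d0 * (1 + (1 - 3*cos(psi)**2) * Q)."""
        x = 1.0 - 3.0 * np.cos(np.radians(psi_deg)) ** 2
        A = np.column_stack([np.ones_like(x), x])          # columns: d0, d0*Q
        coef, *_ = np.linalg.lstsq(A, d_obs, rcond=None)
        d0, d0Q = coef
        return d0, d0Q / d0

    # synthetic data: d0 = 2.10 A, Q = 0.004, plus a little noise
    psi = np.linspace(0.0, 90.0, 19)
    d = 2.10 * (1.0 + (1.0 - 3.0 * np.cos(np.radians(psi)) ** 2) * 0.004)
    d += np.random.default_rng(1).normal(0.0, 1e-4, psi.size)
    print(fit_lattice_strain(psi, d))     # approximately (2.10, 0.004)
    ```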

  16. 49 CFR 1104.3 - Copies.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Microsoft Open Database Connectivity (ODBC) standard. ODBC is a Windows technology that allows a database software package to import data from a database created using a different software package. We currently...-compatible format. All databases must be supported with adequate documentation on data attributes, SQL...

  17. Scout 2008 Version 1.0 User Guide

    EPA Science Inventory

    The Scout 2008 version 1.0 software package provides a wide variety of classical and robust statistical methods that are not typically available in other commercial software packages. A major part of Scout deals with classical, robust, and resistant univariate and multivariate ou...

  18. INTERFACING SAS TO ORACLE IN THE UNIX ENVIRONMENT

    EPA Science Inventory

    SAS is an EPA standard data and statistical analysis software package while ORACLE is EPA's standard data base management system software package. ORACLE has the advantage over SAS in data retrieval and storage capabilities but has limited data and statistical analysis capability....

  19. Mirion--a software package for automatic processing of mass spectrometric images.

    PubMed

    Paschke, C; Leisner, A; Hester, A; Maass, K; Guenther, S; Bouschen, W; Spengler, B

    2013-08-01

    Mass spectrometric imaging (MSI) techniques are of growing interest for the Life Sciences. In recent years, the development of new instruments employing ion sources that are tailored for spatial scanning allowed the acquisition of large data sets. A subsequent data processing, however, is still a bottleneck in the analytical process, as a manual data interpretation is impossible within a reasonable time frame. The transformation of mass spectrometric data into spatial distribution images of detected compounds turned out to be the most appropriate method to visualize the results of such scans, as humans are able to interpret images faster and easier than plain numbers. Image generation, thus, is a time-consuming and complex yet very efficient task. The free software package "Mirion," presented in this paper, allows the handling and analysis of data sets acquired by mass spectrometry imaging. Mirion can be used for image processing of MSI data obtained from many different sources, as it uses the HUPO-PSI-based standard data format imzML, which is implemented in the proprietary software of most of the mass spectrometer companies. Different graphical representations of the recorded data are available. Furthermore, automatic calculation and overlay of mass spectrometric images promotes direct comparison of different analytes for data evaluation. The program also includes tools for image processing and image analysis.
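    Independent of Mirion's implementation: producing an ion-distribution image amounts to summing, for each pixel, the peak intensities within a narrow m/z window, as in this generic sketch on synthetic spectra (no imzML parsing shown).

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    nx, ny = 64, 48

    def fake_spectrum():
        """Synthetic stand-in for one pixel's (m/z, intensity) peak list."""
        mz = rng.uniform(100.0, 1000.0, size=200)
        inten = rng.exponential(1.0, size=200)
        return mz, inten

    def ion_image(target_mz, tol=0.25):
        """Per-pixel summed intensity inside the window target_mz +/- tol."""
        img = np.zeros((ny, nx))
        for iy in range(ny):
            for ix in range(nx):
                mz, inten = fake_spectrum()
                img[iy, ix] = inten[np.abs(mz - target_mz) <= tol].sum()
        return img

    print(ion_image(500.0).shape)
    ```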

  20. TomoPhantom, a software package to generate 2D-4D analytical phantoms for CT image reconstruction algorithm benchmarks

    NASA Astrophysics Data System (ADS)

    Kazantsev, Daniil; Pickalov, Valery; Nagella, Srikanth; Pasca, Edoardo; Withers, Philip J.

    2018-01-01

    In the field of computerized tomographic imaging, many novel reconstruction techniques are routinely tested using simplistic numerical phantoms, e.g. the well-known Shepp-Logan phantom. These phantoms cannot sufficiently cover the broad spectrum of applications in CT imaging where, for instance, smooth or piecewise-smooth 3D objects are common. TomoPhantom provides quick access to an external library of modular analytical 2D/3D phantoms with temporal extensions. In TomoPhantom, quite complex phantoms can be built using additive combinations of geometrical objects, such as Gaussians, parabolas, cones, ellipses, rectangles and volumetric extensions of them. Newly designed phantoms are better suited for benchmarking and testing of different image processing techniques. Specifically, tomographic reconstruction algorithms which employ 2D and 3D scanning geometries can be rigorously analyzed using the software. TomoPhantom also provides a capability for obtaining analytical tomographic projections, which further extends the applicability of the software towards more realistic testing that is free from the "inverse crime". All core modules of the package are written in the C-OpenMP language and wrappers for Python and MATLAB are provided to enable easy access. Due to the C-based multi-threaded implementation, volumetric phantoms of high spatial resolution can be obtained with computational efficiency.
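    Not TomoPhantom's own API: a minimal illustration of the additive-object idea, building a 2-D phantom from two Gaussians and then taking numerical projections with scikit-image's `radon` for comparison (TomoPhantom itself would supply the matching analytical projections).

    ```python
    import numpy as np
    from skimage.transform import radon

    n = 256
    y, x = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]

    def gaussian(cx, cy, sx, sy, amp):
        return amp * np.exp(-(((x - cx) / sx) ** 2 + ((y - cy) / sy) ** 2))

    # additive combination of simple objects, in the spirit of modular phantoms
    phantom = gaussian(-0.3, 0.1, 0.15, 0.25, 1.0) + gaussian(0.35, -0.2, 0.30, 0.10, 0.6)

    angles = np.linspace(0.0, 180.0, 180, endpoint=False)
    sinogram = radon(phantom, theta=angles)       # numerical projections for benchmarking
    print(phantom.shape, sinogram.shape)
    ```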

  1. Atomdroid: a computational chemistry tool for mobile platforms.

    PubMed

    Feldt, Jonas; Mata, Ricardo A; Dieterich, Johannes M

    2012-04-23

    We present the implementation of a new molecular mechanics program designed for use in mobile platforms, the first specifically built for these devices. The software is designed to run on Android operating systems and is compatible with several modern tablet-PCs and smartphones available in the market. It includes molecular viewer/builder capabilities with integrated routines for geometry optimizations and Monte Carlo simulations. These functionalities allow it to work as a stand-alone tool. We discuss some particular development aspects, as well as the overall feasibility of using computational chemistry software packages in mobile platforms. Benchmark calculations show that through efficient implementation techniques even hand-held devices can be used to simulate midsized systems using force fields.
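    A sketch of the kind of routine mentioned above, not Atomdroid's code: a single-particle Metropolis Monte Carlo displacement step for a small Lennard-Jones cluster in reduced units.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def lj_energy(coords):
        """Total Lennard-Jones energy (reduced units, epsilon = sigma = 1)."""
        diff = coords[:, None, :] - coords[None, :, :]
        r2 = (diff ** 2).sum(-1)
        iu = np.triu_indices(len(coords), k=1)
        inv6 = 1.0 / r2[iu] ** 3
        return float(np.sum(4.0 * (inv6 ** 2 - inv6)))

    def mc_step(coords, beta=2.0, max_disp=0.1):
        """Single-particle Metropolis move; returns (possibly updated) coordinates."""
        trial = coords.copy()
        i = rng.integers(len(coords))
        trial[i] += rng.uniform(-max_disp, max_disp, size=3)
        dE = lj_energy(trial) - lj_energy(coords)
        return trial if dE < 0 or rng.random() < np.exp(-beta * dE) else coords

    coords = rng.uniform(0.0, 2.0, size=(13, 3))       # small random cluster
    for _ in range(1000):
        coords = mc_step(coords)
    print(f"final energy: {lj_energy(coords):.2f}")
    ```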

  2. ’Pushing a Big Rock Up a Steep Hill’: Acquisition Lessons Learned from DoD Applications Storefront

    DTIC Science & Technology

    2014-04-30

    ... automated delivery of software patches, web applications, widgets, and mobile application packages. The envisioned application store will deliver software from a central... mobile technologies, hoping to enhance warfighter situational awareness and access to information. Unfortunately, the Defense Acquisition System has not

  3. MicroSIFT Courseware Evaluation. [Set 13 (294-319), Set 14 (320-361), with Hardware (HRD) and Subject (SBJ) Indexes to Both Sets.

    ERIC Educational Resources Information Center

    Northwest Regional Educational Lab., Portland, OR.

    This document consists of 68 microcomputer software package evaluations prepared by MicroSIFT (Microcomputer Software and Information for Teachers) Clearinghouse at the Northwest Regional Education Laboratory. There are 26 packages in set 13 and 42 in set 14. Each software review lists producer, time and place of evaluation, cost, ability level,…

  4. PIV Data Validation Software Package

    NASA Technical Reports Server (NTRS)

    Blackshire, James L.

    1997-01-01

    A PIV data validation and post-processing software package was developed to provide semi-automated data validation and data reduction capabilities for Particle Image Velocimetry data sets. The software provides three primary capabilities including (1) removal of spurious vector data, (2) filtering, smoothing, and interpolating of PIV data, and (3) calculations of out-of-plane vorticity, ensemble statistics, and turbulence statistics information. The software runs on an IBM PC/AT host computer working either under Microsoft Windows 3.1 or Windows 95 operating systems.
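    Not the package's code: out-of-plane vorticity on a regular PIV grid is the curl component ω_z = ∂v/∂x − ∂u/∂y, which can be formed with finite differences as below (the solid-body-rotation field is a self-check, not real PIV data).

    ```python
    import numpy as np

    def vorticity_z(u, v, dx, dy):
        """Out-of-plane vorticity omega_z = dv/dx - du/dy on a regular grid.
        u, v are 2-D arrays indexed [row=y, col=x]."""
        dv_dx = np.gradient(v, dx, axis=1)
        du_dy = np.gradient(u, dy, axis=0)
        return dv_dx - du_dy

    # solid-body rotation test: u = -omega*y, v = omega*x  ->  vorticity = 2*omega
    dx = dy = 1e-3
    y, x = np.mgrid[-0.05:0.05:101j, -0.05:0.05:101j]
    omega = 10.0
    print(vorticity_z(-omega * y, omega * x, dx, dy).mean())   # ~20.0
    ```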

  5. Measuring the electric activity of chick embryos heart through 16 bit audio card monitored by the Goldwavetm software

    NASA Astrophysics Data System (ADS)

    Silva, Dilson; Cortez, Celia Martins

    2015-12-01

    In the present work we used a high-resolution, low-cost apparatus capable of detecting waves that fit inside the audio bandwidth, and the software package Goldwave(TM) for graphical display, processing and monitoring of the signals, to study aspects of the electric heart activity of early avian embryos, specifically at the 18th Hamburger & Hamilton stage of embryo development. The species used was the domestic chick (Gallus gallus), and we carried out 23 experiments in which cardiographic spectra of QRS complex waves, representing the propagation of depolarization waves through the ventricles, were recorded using microprobes and reference electrodes placed directly on the embryos. The results show that the technique using a 16-bit audio card monitored by the Goldwave(TM) software was efficient for studying signal aspects of the heart electric activity of early avian embryos.
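    As an illustration only (the original analysis used the Goldwave tool, not Python): QRS-like deflections in a digitized audio-card trace can be located with a simple amplitude-and-spacing peak search; the trace below is synthetic.

    ```python
    import numpy as np
    from scipy.signal import find_peaks

    fs = 44100                                   # audio-card sampling rate [Hz]
    t = np.arange(0, 5.0, 1.0 / fs)
    # synthetic stand-in for the recorded trace: sharp spikes every 0.4 s plus noise
    ecg = np.zeros_like(t)
    ecg[(np.arange(12) * 0.4 * fs).astype(int) + int(0.1 * fs)] = 1.0
    ecg = np.convolve(ecg, np.hanning(101), mode="same") + 0.02 * np.random.randn(t.size)

    peaks, _ = find_peaks(ecg, height=0.5, distance=int(0.2 * fs))
    print(f"detected {peaks.size} beats, rate ~ {peaks.size / t[-1]:.1f} Hz")
    ```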

  6. Analyzing gene perturbation screens with nested effects models in R and bioconductor.

    PubMed

    Fröhlich, Holger; Beissbarth, Tim; Tresch, Achim; Kostka, Dennis; Jacob, Juby; Spang, Rainer; Markowetz, F

    2008-11-01

    Nested effects models (NEMs) are a class of probabilistic models introduced to analyze the effects of gene perturbation screens visible in high-dimensional phenotypes like microarrays or cell morphology. NEMs reverse engineer upstream/downstream relations of cellular signaling cascades. NEMs take as input a set of candidate pathway genes and phenotypic profiles of perturbing these genes. NEMs return a pathway structure explaining the observed perturbation effects. Here, we describe the package nem, open-source software to efficiently infer NEMs from data. Our software implements several search algorithms for model fitting and is applicable to a wide range of different data types and representations. The methods we present summarize the current state-of-the-art in NEMs. Our software is written in the R language and freely available via the Bioconductor project at http://www.bioconductor.org.

  7. Use of the NetBeans Platform for NASA Robotic Conjunction Assessment Risk Analysis

    NASA Technical Reports Server (NTRS)

    Sabey, Nickolas J.

    2014-01-01

    The latest Java and JavaFX technologies are very attractive software platforms for customers involved in space mission operations such as those of NASA and the US Air Force. For NASA Robotic Conjunction Assessment Risk Analysis (CARA), the NetBeans platform provided an environment in which scalable software solutions could be developed quickly and efficiently. Both Java 8 and the NetBeans platform are in the process of simplifying CARA development in secure environments by providing a significant amount of capability in a single accredited package, where accreditation alone can account for 6-8 months for each library or software application. Capabilities either in use or being investigated by CARA include: 2D and 3D displays with JavaFX, parallelization with the new Streams API, and scalability through the NetBeans plugin architecture.

  8. Improvements to the APBS biomolecular solvation software suite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jurrus, Elizabeth; Engel, Dave; Star, Keith

    The Adaptive Poisson-Boltzmann Solver (APBS) software was developed to solve the equations of continuum electrostatics for large biomolecular assemblages and has had an impact on the study of a broad range of chemical, biological, and biomedical applications. APBS addresses three key technology challenges for understanding solvation and electrostatics in biomedical applications: accurate and efficient models for biomolecular solvation and electrostatics, robust and scalable software for applying those theories to biomolecular systems, and mechanisms for sharing and analyzing biomolecular electrostatics data in the scientific community. To address new research applications and advances in computational capabilities, we have continually updated APBS and its suite of accompanying software since its release in 2001. In this manuscript, we discuss the models and capabilities that have recently been implemented within the APBS software package including: a Poisson-Boltzmann analytical and a semi-analytical solver, an optimized boundary element solver, a geometry-based geometric flow solvation model, a graph theory based algorithm for determining pKa values, and an improved web-based visualization tool for viewing electrostatics.
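    Outside the scope of the record: in one dimension the linearized Poisson-Boltzmann (Debye-Hückel) equation reduces to φ'' = κ²φ, and its finite-difference solution is a short linear solve; this is a sketch of the continuum-electrostatics idea only, not APBS's 3-D solvers.

    ```python
    import numpy as np

    def linear_pb_1d(phi0, kappa, length=10.0, n=200):
        """Solve phi'' = kappa^2 * phi on [0, length] with phi(0)=phi0, phi(L)=0."""
        h = length / (n + 1)
        main = -(2.0 + (kappa * h) ** 2) * np.ones(n)
        off = np.ones(n - 1)
        A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
        b = np.zeros(n)
        b[0] = -phi0                      # left boundary condition folded into the RHS
        phi = np.linalg.solve(A, b)
        x = np.linspace(h, length - h, n)
        return x, phi

    x, phi = linear_pb_1d(phi0=1.0, kappa=1.0)
    print(np.abs(phi - np.exp(-x)).max())   # small residual vs. the exp(-kappa*x) decay
    ```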

  9. AN ADA LINEAR ALGEBRA PACKAGE MODELED AFTER HAL/S

    NASA Technical Reports Server (NTRS)

    Klumpp, A. R.

    1994-01-01

    This package extends the Ada programming language to include linear algebra capabilities similar to those of the HAL/S programming language. The package is designed for avionics applications such as Space Station flight software. In addition to the HAL/S built-in functions, the package incorporates the quaternion functions used in the Shuttle and Galileo projects, and routines from LINPACK that solve systems of equations involving general square matrices. Language conventions in this package follow those of HAL/S to the maximum extent practical and minimize the effort required for writing new avionics software and translating existent software into Ada. Valid numeric types in this package include scalar, vector, matrix, and quaternion declarations. (Quaternions are four-component vectors used in representing motion between two coordinate frames). Single precision and double precision floating point arithmetic is available in addition to the standard double precision integer manipulation. Infix operators are used instead of function calls to define dot products, cross products, quaternion products, and mixed scalar-vector, scalar-matrix, and vector-matrix products. The package contains two generic programs: one for floating point, and one for integer. The actual component type is passed as a formal parameter to the generic linear algebra package. The procedures for solving systems of linear equations defined by general matrices include GEFA, GECO, GESL, and GIDI. The HAL/S functions include ABVAL, UNIT, TRACE, DET, INVERSE, TRANSPOSE, GET, PUT, FETCH, PLACE, and IDENTITY. This package is written in Ada (Version 1.2) for batch execution and is machine independent. The linear algebra software depends on nothing outside the Ada language except for a call to a square root function for floating point scalars (such as SQRT in the DEC VAX MATHLIB library). This program was developed in 1989, and is a copyrighted work with all copyright vested in NASA.

  10. SIMA: Python software for analysis of dynamic fluorescence imaging data.

    PubMed

    Kaifosh, Patrick; Zaremba, Jeffrey D; Danielson, Nathan B; Losonczy, Attila

    2014-01-01

    Fluorescence imaging is a powerful method for monitoring dynamic signals in the nervous system. However, analysis of dynamic fluorescence imaging data remains burdensome, in part due to the shortage of available software tools. To address this need, we have developed SIMA, an open source Python package that facilitates common analysis tasks related to fluorescence imaging. Functionality of this package includes correction of motion artifacts occurring during in vivo imaging with laser-scanning microscopy, segmentation of imaged fields into regions of interest (ROIs), and extraction of signals from the segmented ROIs. We have also developed a graphical user interface (GUI) for manual editing of the automatically segmented ROIs and automated registration of ROIs across multiple imaging datasets. This software has been designed with flexibility in mind to allow for future extension with different analysis methods and potential integration with other packages. Software, documentation, and source code for the SIMA package and ROI Buddy GUI are freely available at http://www.losonczylab.org/sima/.
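    Not SIMA's actual API: the ROI signal-extraction step it automates reduces, conceptually, to averaging each frame over a boolean ROI mask, as in this toy sketch on synthetic data.

    ```python
    import numpy as np

    def extract_roi_signals(movie, masks):
        """movie: (frames, height, width) array; masks: list of boolean ROI masks.
        Returns an array of shape (n_rois, frames) with mean fluorescence per frame."""
        return np.stack([movie[:, m].mean(axis=1) for m in masks])

    rng = np.random.default_rng(0)
    movie = rng.poisson(5.0, size=(200, 64, 64)).astype(float)   # synthetic imaging data
    mask = np.zeros((64, 64), dtype=bool)
    mask[20:30, 20:30] = True
    print(extract_roi_signals(movie, [mask]).shape)   # (1, 200)
    ```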

  11. New generation of exploration tools: interactive modeling software and microcomputers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krajewski, S.A.

    1986-08-01

    Software packages offering interactive modeling techniques are now available for use on microcomputer hardware systems. These packages are reasonably priced for both company and independent explorationists; they do not require users to have high levels of computer literacy; they are capable of rapidly completing complex ranges of sophisticated geologic and geophysical modeling tasks; and they can produce presentation-quality output for comparison with real-world data. For example, interactive packages are available for mapping, log analysis, seismic modeling, reservoir studies, and financial projects as well as for applying a variety of statistical and geostatistical techniques to analysis of exploration data. More importantly, these packages enable explorationists to directly apply their geologic expertise when developing and fine-tuning models for identifying new prospects and for extending producing fields. As a result of these features, microcomputers and interactive modeling software are becoming common tools in many exploration offices. Gravity and magnetics software programs illustrate some of the capabilities of such exploration tools.

  12. RxGen General Optical Model Prescription Generator

    NASA Technical Reports Server (NTRS)

    Sigrist, Norbert

    2012-01-01

    RxGen is a prescription generator for JPL's in-house optical modeling software package called MACOS (Modeling and Analysis for Controlled Optical Systems), which is an expert optical analysis software package focusing on modeling optics on dynamic structures, deformable optics, and controlled optics. The objectives of RxGen are to simplify and automate MACOS prescription generations, reducing errors associated with creating such optical prescriptions, and improving user efficiency without requiring MACOS proficiency. RxGen uses MATLAB (a high-level language and interactive environment developed by MathWorks) as the development and deployment platform, but RxGen can easily be ported to another optical modeling/analysis platform. Running RxGen within the modeling environment has the huge benefit that variations in optical models can be made an integral part of the modeling state. For instance, optical prescription parameters determined as external functional dependencies, optical variations by controlling the in-/exclusion of optical components like sub-systems, and/or controlling the state of all components. Combining the mentioned capabilities and flexibilities with RxGen's optical abstraction layer completely eliminates the hindering aspects for requiring proficiency in writing/editing MACOS prescriptions, allowing users to focus on the modeling aspects of optical systems, i.e., increasing productivity and efficiency. RxGen provides significant enhancements to MACOS and delivers a framework for fast prototyping as well as for developing very complex controlled optical systems.

  13. Optimizing the Performance of Reactive Molecular Dynamics Simulations for Multi-core Architectures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aktulga, Hasan Metin; Coffman, Paul; Shan, Tzu-Ray

    2015-12-01

    Hybrid parallelism allows high performance computing applications to better leverage the increasing on-node parallelism of modern supercomputers. In this paper, we present a hybrid parallel implementation of the widely used LAMMPS/ReaxC package, where the construction of bonded and nonbonded lists and evaluation of complex ReaxFF interactions are implemented efficiently using OpenMP parallelism. Additionally, the performance of the QEq charge equilibration scheme is examined and a dual-solver is implemented. We present the performance of the resulting ReaxC-OMP package on a state-of-the-art multi-core architecture Mira, an IBM BlueGene/Q supercomputer. For system sizes ranging from 32 thousand to 16.6 million particles, speedups in the range of 1.5-4.5x are observed using the new ReaxC-OMP software. Sustained performance improvements have been observed for up to 262,144 cores (1,048,576 processes) of Mira with a weak scaling efficiency of 91.5% in larger simulations containing 16.6 million particles.
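    For orientation (the timings below are invented, chosen only to reproduce the quoted 91.5% figure): weak-scaling efficiency compares a reference runtime against the runtime at scale, with the problem size grown in proportion to the process count.

    ```python
    def weak_scaling_efficiency(t_ref, t_n):
        """Weak-scaling efficiency in percent: reference runtime divided by runtime
        at scale, the problem size having grown in proportion to the process count."""
        return 100.0 * t_ref / t_n

    print(f"{weak_scaling_efficiency(t_ref=100.0, t_n=109.3):.1f} %")   # illustrative timings
    ```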

  14. Component-based integration of chemistry and optimization software.

    PubMed

    Kenny, Joseph P; Benson, Steven J; Alexeev, Yuri; Sarich, Jason; Janssen, Curtis L; McInnes, Lois Curfman; Krishnan, Manojkumar; Nieplocha, Jarek; Jurrus, Elizabeth; Fahlstrom, Carl; Windus, Theresa L

    2004-11-15

    Typical scientific software designs make rigid assumptions regarding programming language and data structures, frustrating software interoperability and scientific collaboration. Component-based software engineering is an emerging approach to managing the increasing complexity of scientific software. Component technology facilitates code interoperability and reuse. Through the adoption of methodology and tools developed by the Common Component Architecture Forum, we have developed a component architecture for molecular structure optimization. Using the NWChem and Massively Parallel Quantum Chemistry packages, we have produced chemistry components that provide capacity for energy and energy derivative evaluation. We have constructed geometry optimization applications by integrating the Toolkit for Advanced Optimization, Portable Extensible Toolkit for Scientific Computation, and Global Arrays packages, which provide optimization and linear algebra capabilities. We present a brief overview of the component development process and a description of abstract interfaces for chemical optimizations. The components conforming to these abstract interfaces allow the construction of applications using different chemistry and mathematics packages interchangeably. Initial numerical results for the component software demonstrate good performance, and highlight potential research enabled by this platform.
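    A hedged sketch, not the Common Component Architecture interfaces themselves: the idea of an abstract energy/gradient interface that lets different chemistry back-ends plug into a single optimizer can be written as follows (the harmonic back-end is a placeholder for a real package wrapper).

    ```python
    from abc import ABC, abstractmethod
    import numpy as np

    class EnergyEvaluator(ABC):
        """Abstract interface an optimizer can program against, independent of back-end."""
        @abstractmethod
        def energy(self, coords: np.ndarray) -> float: ...
        @abstractmethod
        def gradient(self, coords: np.ndarray) -> np.ndarray: ...

    class HarmonicToy(EnergyEvaluator):
        """Placeholder back-end (a real one would wrap a quantum chemistry package)."""
        def energy(self, coords):
            return 0.5 * float(np.sum(coords ** 2))
        def gradient(self, coords):
            return coords

    def steepest_descent(model: EnergyEvaluator, coords, step=0.1, iters=100):
        for _ in range(iters):
            coords = coords - step * model.gradient(coords)
        return coords, model.energy(coords)

    print(steepest_descent(HarmonicToy(), np.array([1.0, -2.0, 0.5])))
    ```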

  15. Diagnosis diagrams for passing signals on an automatic block signaling railway section

    NASA Astrophysics Data System (ADS)

    Spunei, E.; Piroi, I.; Chioncel, C. P.; Piroi, F.

    2018-01-01

    This work presents a diagnosis method for railway traffic security installations. More specifically, the authors present a series of diagnosis charts for passing signals on a railway block equipped with an automatic block signaling installation. These charts are based on the exploitation electric schemes, and are subsequently used to develop a diagnosis software package. The thus developed software package contributes substantially to a reduction of failure detection and remedy for these types of installation faults. The use of the software package eliminates making wrong decisions in the fault detection process, decisions that may result in longer remedy times and, sometimes, to railway traffic events.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tsai, Hanchung; Liu, Yung Y.; Shuler, James

    The ability to monitor critical environment parameters of nuclear plants at all times, particularly during and after a disruptive accident, is vital for the safety of plant personnel, rescue and recovery crews, and the surrounding communities. Conventional hard-wired assets that depend on supplied power may be decimated as a result of such events, as witnessed in the Japanese Fukushima nuclear power plant in March 2011. Self-powered monitoring devices operating on a wireless platform, on the other hand, may survive such calamity and remain functional. The devices would be pre-positioned at strategic locations, particularly where the dangerous build-up of contamination and radiation may preclude subsequent manned entrance and surveillance. Equipped with sensors for β-γ radiation, neutrons, hydrogen gas, temperature, humidity, pressure, and water level, as well as with criticality alarms and imaging equipment for heat, video, and other capabilities, these devices can provide vital surveillance information for assessing the extent of plant damage, mandating responses (e.g., evacuation before impending hydrogen explosion), and enabling overall safe and efficient recovery in a disaster. A radio frequency identification (RFID)-based system - called ARG-US - may be modified and adapted for this task. Developed by Argonne for DOE, ARG-US (meaning 'watchful guardian') has been used successfully to monitor and track sensitive nuclear materials packages at DOE sites. It utilizes sensors in the tags to continuously monitor the state of health of the packaging and promptly disseminates alarms to authorized users when any of the preset sensor thresholds is violated. By adding plant-specific monitoring sensors to the already strong sensor suite and adopting modular hardware, firmware, and software subsystems that are tailored for specific subsystems of a plant, a Remote Area Modular Monitoring (RAMM) system, built on a wireless sensor network (WSN) platform, is being developed by Argonne National Laboratory. ARG-US RAMM, powered by on-board battery, can sustain extended autonomous surveillance operation during and following an incident. The benefits could be invaluable to such critical facilities as nuclear power plants, research and test reactors, fuel cycle manufacturing centers, spent-fuel dry-cask storage facilities, and other nuclear installations. (authors)

  17. Orbit determination for ISRO satellite missions

    NASA Astrophysics Data System (ADS)

    Rao, Ch. Sreehari; Sinha, S. K.

    Indian Space Research Organisation (ISRO) has been successful in using the in-house developed orbit determination and prediction software for satellite missions of Bhaskara, Rohini and APPLE. Considering the requirements of satellite missions, software packages are developed, tested and their accuracies are assessed. Orbit determination packages developed are SOIP, for low earth orbits of Bhaskara and Rohini missions, ORIGIN and ODPM, for orbits related to all phases of geo-stationary missions and SEGNIP, for drift and geo-stationary orbits. Software is tested and qualified using tracking data of SIGNE-3, D5-B, OTS, SYMPHONIE satellites with the help of software available with CNES, ESA and DFVLR. The results match well with those available from these agencies. These packages have supported orbit determination successfully throughout the mission life for all ISRO satellite missions.

  18. Prototyping with Data Dictionaries for Requirements Analysis.

    DTIC Science & Technology

    1985-03-01

    statistical packages and software for screen layout. These items work at a higher level than another category of prototyping tool, program generators... Program generators are software packages which, when given specifications, produce source listings, usually in a high order language such as COBOL...with users and this will not happen if he must stop to develop a detailed program. [Ref. 241] Hardware as well as software should be considered in

  19. Electronic and software subsystems for an autonomous roving vehicle. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Doig, G. A.

    1980-01-01

    The complete electronics packaging which controls the Mars roving vehicle is described in order to provide a broad overview of the systems that are part of that package. Some software debugging tools are also discussed. Particular emphasis is given to those systems that are controlled by the microprocessor. These include the laser mast, the telemetry system, the command link prime interface board, and the prime software.

  20. The Shock and Vibration Digest. Volume 17, Number 4

    DTIC Science & Technology

    1985-04-01

    software packages for engineering computations which were specifically written for use on microcomputers. Software packages related to shock and vibration are available for both experimental and for analytical applications. Typical software ... designed to be easy to use from the outset, and this design philosophy is largely responsible for their increasing popularity; this same design philosophy appears to have been carried over to the design of today's ...

  1. Developing a Virtual Physics World

    ERIC Educational Resources Information Center

    Wegener, Margaret; McIntyre, Timothy J.; McGrath, Dominic; Savage, Craig M.; Williamson, Michael

    2012-01-01

    In this article, the successful implementation of a development cycle for a physics teaching package based on game-like virtual reality software is reported. The cycle involved several iterations of evaluating students' use of the package followed by instructional and software development. The evaluation used a variety of techniques, including…

  2. Teaching Science and Mathematics Subjects Using the Excel Spreadsheet Package

    ERIC Educational Resources Information Center

    Ibrahim, Dogan

    2009-01-01

    The teaching of scientific subjects usually require laboratories where students can put the theory they have learned into practice. Traditionally, electronic programmable calculators, dedicated software, or expensive software simulation packages, such as MATLAB have been used to simulate scientific experiments. Recently, spreadsheet programs have…

  3. Description of the IV + V System Software Package.

    ERIC Educational Resources Information Center

    Microcomputers for Information Management: An International Journal for Library and Information Services, 1984

    1984-01-01

    Describes the IV + V System, a software package designed by the Institut fur Maschinelle Dokumentation for the United Nations General Information Programme and UNISIST to support automation of local information and documentation services. Principle program features and functions outlined include input/output, databank, text image, output, and…

  4. Roots Air Management System with Integrated Expander

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stretch, Dale; Wright, Brad; Fortini, Matt

    2016-07-06

    PEM fuel cells remain an emerging technology in the vehicle market with several cost and reliability challenges that must be overcome in order to increase market penetration and acceptance. The DOE has identified the lack of a cost effective, reliable, and efficient air supply system that meets the operational requirements of a pressurized PEM 80kW fuel cell as one of the major technological barriers that must be overcome. This project leveraged Roots positive displacement development advancements and demonstrated an efficient and low cost fuel cell air management system. Eaton built upon its P-Series Roots positive displacement design and shifted the peak efficiency making it ideal for use on an 80kW PEM stack. Advantages to this solution include: • Lower speed of the Roots device eliminates complex air bearings present on other systems. • Broad efficiency map of Roots based systems provides an overall higher drive cycle fuel economy. • Core Roots technology has been developed and validated for other transportation applications. Eaton modified their novel R340 Twin Vortices Series (TVS) Roots-type supercharger for this application. The TVS delivers more power and better fuel economy in a smaller package as compared to other supercharger technologies. By properly matching the helix angle with the rotor’s physical aspect ratio, the supercharger’s peak efficiency can be moved to the operating range where it is most beneficial for the application. The compressor was designed to meet the 90 g/s flow at a pressure ratio of 2.5, similar in design to the P-Series 340. A net shape plastic expander housing with integrated motor and compressor was developed to significantly reduce the cost of the system. This integrated design reduced part count by incorporating an overhung expander and motor rotors into the design such that only four bearings and two shafts were utilized.

  5. User’s guide for GcClust—An R package for clustering of regional geochemical data

    USGS Publications Warehouse

    Ellefsen, Karl J.; Smith, David B.

    2016-04-08

    GcClust is a software package developed by the U.S. Geological Survey for statistical clustering of regional geochemical data, and similar data such as regional mineralogical data. Functions within the software package are written in the R statistical programming language. These functions, their documentation, and a copy of the user’s guide are bundled together in R’s unit of sharable code, which is called a “package.” The user’s guide includes step-by-step instructions showing how the functions are used to cluster data and to evaluate the clustering results. These functions are demonstrated in this report using test data, which are included in the package.

  6. Toward a leaner and greener transportation system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ross, M.

    1993-04-01

    Transportation is responsible for 25% of CO2 emissions in the U.S. and is largely responsible for excessive ozone or carbon monoxide in several metropolitan areas. It turns out that emissions from new cars are much higher in use than laboratory tests and standards would appear to suggest. Transportation is also responsible for the lion's share of U.S. petroleum consumption; and, although growth in the use of petroleum has been constrained by improvements in fuel economy, it is set to start again as the benefits of the CAFE standards are fully exploited, and travel continues to increase. In the short term, more efficient petroleum-fueled vehicles, based, e.g., on lean burn engines, sophisticated transmission management, idle off, efficient accessories and more lightweight materials, would help. In the medium term, natural gas vehicles might provide a lower-emissions alternative with good performance and costs, and, if vehicle efficiency is high, good range. In the long term, fuel cells appear very attractive, and might profit from experience with a gaseous fuel. There are of course other interesting possibilities. R & D challenges will be discussed. One need is support for fundamental research at universities. Policies to encourage adoption of such technologies will also be addressed, including the issue of excessive reliance on regulations that are based on vehicle tests. To improve the environmental performance of such a pervasive activity as transportation, a multifaceted package of policies is needed, including correcting policies on the books that encourage automotive travel.

  7. InterFace: A software package for face image warping, averaging, and principal components analysis.

    PubMed

    Kramer, Robin S S; Jenkins, Rob; Burton, A Mike

    2017-12-01

    We describe InterFace, a software package for research in face recognition. The package supports image warping, reshaping, averaging of multiple face images, and morphing between faces. It also supports principal components analysis (PCA) of face images, along with tools for exploring the "face space" produced by PCA. The package uses a simple graphical user interface, allowing users to perform these sophisticated image manipulations without any need for programming knowledge. The program is available for download in the form of an app, which requires that users also have access to the (freely available) MATLAB Runtime environment.
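
    To make the "face space" idea concrete, the following is a minimal sketch of PCA applied to flattened face images. It is a generic Python/scikit-learn illustration under assumed array shapes, not InterFace itself (which is a MATLAB application), and random arrays stand in for aligned face photographs.

```python
# Minimal PCA "face space" sketch: flatten aligned face images into vectors, fit PCA,
# and project faces into the reduced space. Random arrays stand in for real images.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
n_faces, height, width = 40, 64, 48
faces = rng.random((n_faces, height, width))      # placeholder for aligned face images

X = faces.reshape(n_faces, -1)                    # one row per face
pca = PCA(n_components=10)
coords = pca.fit_transform(X)                     # coordinates in the "face space"

print("explained variance ratio:", pca.explained_variance_ratio_.round(3))

# Reconstruct the first face from its 10 principal-component coordinates.
reconstruction = pca.inverse_transform(coords[:1]).reshape(height, width)
print("mean reconstruction error:", float(np.abs(reconstruction - faces[0]).mean()))
```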

  8. Evolution of a modular software network

    PubMed Central

    Fortuna, Miguel A.; Bonachela, Juan A.; Levin, Simon A.

    2011-01-01

    “Evolution behaves like a tinkerer” (François Jacob, Science, 1977). Software systems provide a singular opportunity to understand biological processes using concepts from network theory. The Debian GNU/Linux operating system allows us to explore the evolution of a complex network in a unique way. The modular design detected during its growth is based on the reuse of existing code in order to minimize costs during programming. The increase of modularity experienced by the system over time has not counterbalanced the increase in incompatibilities between software packages within modules. This negative effect is far from being a failure of design. A random process of package installation shows that the higher the modularity, the larger the fraction of packages working properly in a local computer. The decrease in the relative number of conflicts between packages from different modules avoids a failure in the functionality of one package spreading throughout the entire system. Some potential analogies with the evolutionary and ecological processes determining the structure of ecological networks of interacting species are discussed. PMID:22106260
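
    The notions of modules and modularity used above can be made concrete with a small graph example. This is an illustrative Python/networkx sketch on a toy dependency network, not the paper's Debian data or analysis code.

```python
# Illustrative sketch: build a small package-dependency graph, detect modules
# (communities), and quantify modularity with networkx.
import networkx as nx
from networkx.algorithms import community

# Toy dependency network: nodes are packages, edges mean "depends on".
edges = [
    ("editor", "libgui"), ("editor", "libtext"), ("libgui", "libcore"),
    ("libtext", "libcore"), ("webserver", "libnet"), ("webserver", "libssl"),
    ("libnet", "libcore2"), ("libssl", "libcore2"),
]
G = nx.Graph(edges)

modules = community.greedy_modularity_communities(G)
Q = community.modularity(G, modules)
print("modules:", [sorted(m) for m in modules])
print(f"modularity Q = {Q:.3f}")
```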

  9. High speed finite element simulations on the graphics card

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huthwaite, P.; Lowe, M. J. S.

    A software package is developed to perform explicit time domain finite element simulations of ultrasonic propagation on the graphical processing unit, using Nvidia’s CUDA. Of critical importance for this problem is the arrangement of nodes in memory, allowing data to be loaded efficiently and minimising communication between the independently executed blocks of threads. The initial stage of memory arrangement is partitioning the mesh; both a well established ‘greedy’ partitioner and a new, more efficient ‘aligned’ partitioner are investigated. A method is then developed to efficiently arrange the memory within each partition. The technique is compared to a commercial CPU equivalent, demonstrating an overall speedup of at least 100 for a non-destructive testing weld model.
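
    As background for the partitioning step mentioned above, here is a hedged sketch of one common flavour of "greedy" mesh partitioning: grow each partition from a seed element by breadth-first search over the element adjacency graph until it reaches its target size. This is a generic Python illustration, not the paper's GPU code, and the function and variable names are invented for the example.

```python
# Hedged sketch of a greedy, BFS-based mesh partitioner over an element adjacency graph.
from collections import deque

def greedy_partition(adjacency, n_parts):
    """adjacency: dict mapping element id -> list of neighbouring element ids."""
    n = len(adjacency)
    target = n // n_parts
    part_of = {}
    unassigned = set(adjacency)
    parts = []
    for p in range(n_parts):
        size_goal = target if p < n_parts - 1 else len(unassigned)
        seed = min(unassigned)                 # deterministic seed choice
        frontier, members = deque([seed]), []
        while frontier and len(members) < size_goal:
            elem = frontier.popleft()
            if elem not in unassigned:
                continue
            unassigned.discard(elem)
            members.append(elem)
            part_of[elem] = p
            frontier.extend(nb for nb in adjacency[elem] if nb in unassigned)
        # If the grown region was disconnected, top up from any remaining elements.
        while len(members) < size_goal:
            extra = unassigned.pop()
            members.append(extra)
            part_of[extra] = p
        parts.append(members)
    return parts, part_of

# 1-D chain of 8 elements split into 2 partitions.
adj = {i: [j for j in (i - 1, i + 1) if 0 <= j < 8] for i in range(8)}
print(greedy_partition(adj, 2)[0])
```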

  10. TkPl_SU: An Open-source Perl Script Builder for Seismic Unix

    NASA Astrophysics Data System (ADS)

    Lorenzo, J. M.

    2017-12-01

    TkPl_SU (beta) is a graphical user interface (GUI) to select parameters for Seismic Unix (SU) modules. Seismic Unix (Stockwell, 1999) is a widely distributed free software package for seismic reflection processing and signal processing. Perl/Tk is a mature, well-documented and free object-oriented graphical user interface toolkit for Perl. In a classroom environment, shell scripting of SU modules engages students and helps focus on the theoretical limitations and strengths of signal processing. However, complex interactive processing stages, e.g., selection of optimal stacking velocities, killing bad data traces, or spectral analysis, require advanced flows beyond the scope of introductory classes. In a research setting, special functionality from other free seismic processing software such as SioSeis (UCSD-NSF) can be incorporated readily via an object-oriented style of programming. An object-oriented approach is a first step toward efficient extensible programming of multi-step processes, and a simple GUI simplifies parameter selection and decision making. Currently, in TkPl_SU, Perl 5 packages wrap 19 of the most common SU modules that are used in teaching undergraduate and first-year graduate student classes (e.g., filtering, display, velocity analysis and stacking). Perl packages (classes) can advantageously add new functionality around each module and clarify parameter names for easier usage. For example, through the use of methods, packages can isolate the user from repetitive control structures, as well as replace the names of abbreviated parameters with self-describing names. Moose, an extension of the Perl 5 object system, greatly facilitates an object-oriented style. Perl wrappers are self-documenting via Perl's plain old documentation (POD) markup language.

  11. Wood Combustion Behaviour in a Fixed Bed Combustor

    NASA Astrophysics Data System (ADS)

    Tokit, Ernie Mat; Aziz, Azhar Abdul; Ghazali, Normah Mohd

    2010-06-01

    Waste wood is used as feedstock for Universiti Teknologi Malaysia's newly-developed two-stage incinerator system. The research goals are to optimize the operation of the thermal system to the primary chamber, to improve its combustion efficiency and to minimize its pollutant formation. The combustion process is evaluated as the fuel's moisture content is varied. For the optimum operating condition, where the gasification efficiency is 95.53%, the moisture content of the fuel is best set at 17%, giving an outlet operating temperature of 550°C and exhaust gas concentrations of 1213 ppm CO, 6% CO2 and 14% O2, respectively. In line with the experimental work, the computational fluid dynamics software Fluent is used to simulate the performance of the primary chamber. Here the predicted optimum gasification efficiency stands at 95.49% with CO, CO2 and O2 concentrations of 1301 ppm, 6.5% and 13.5%, respectively.

  12. Image analysis software versus direct anthropometry for breast measurements.

    PubMed

    Quieregatto, Paulo Rogério; Hochman, Bernardo; Furtado, Fabianne; Machado, Aline Fernanda Perez; Sabino Neto, Miguel; Ferreira, Lydia Masako

    2014-10-01

    The aim was to compare breast measurements performed using the software packages ImageTool®, AutoCAD® and Adobe Photoshop® with direct anthropometric measurements. Points were marked on the breasts and arms of 40 volunteer women aged between 18 and 60 years. When connecting the points, seven linear segments and one angular measurement on each half of the body, and one medial segment common to both body halves were defined. The volunteers were photographed in a standardized manner. Photogrammetric measurements were performed by three independent observers using the three software packages and compared to direct anthropometric measurements made with calipers and a protractor. Measurements obtained with AutoCAD® were the most reproducible and those made with ImageTool® were the most similar to direct anthropometry, while measurements with Adobe Photoshop® showed the largest differences. Except for angular measurements, significant differences were found between measurements of line segments made using the three software packages and those obtained by direct anthropometry. AutoCAD® provided the highest precision and intermediate accuracy; ImageTool® had the highest accuracy and lowest precision; and Adobe Photoshop® showed intermediate precision and the worst accuracy among the three software packages.
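
    The photogrammetric measurements being compared above boil down to distances and angles computed from landmark coordinates on a standardized photograph. The sketch below illustrates that computation in Python; the landmark names, pixel coordinates, and pixel-to-centimeter scale are assumptions made for the example, not values from the study.

```python
# Generic landmark-based measurement sketch: segment lengths and an angle from
# 2-D point coordinates marked on a calibrated photograph.
import numpy as np

def distance(p, q, scale_cm_per_px=0.05):
    """Length of a segment between two landmarks, converted from pixels to cm."""
    return float(np.linalg.norm(np.subtract(p, q))) * scale_cm_per_px

def angle_deg(vertex, a, b):
    """Angle at `vertex` formed by the rays toward landmarks a and b."""
    u, v = np.subtract(a, vertex), np.subtract(b, vertex)
    cos_t = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return float(np.degrees(np.arccos(np.clip(cos_t, -1.0, 1.0))))

# Hypothetical landmark pixel coordinates (x, y).
sternal_notch, nipple, inframammary = (500, 300), (420, 620), (460, 760)
print(f"notch-to-nipple: {distance(sternal_notch, nipple):.1f} cm")
print(f"angle at nipple: {angle_deg(nipple, sternal_notch, inframammary):.1f} deg")
```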

  13. System and Method for Dynamic Aeroelastic Control

    NASA Technical Reports Server (NTRS)

    Suh, Peter M. (Inventor)

    2015-01-01

    The present invention proposes a hardware and software architecture for dynamic modal structural monitoring that uses a robust modal filter to monitor a potentially very large-scale array of sensors in real time, that is tolerant of asymmetric sensor noise and sensor failures, and that achieves aircraft performance optimization such as minimizing aircraft flutter and drag and maximizing fuel efficiency.

  14. PyPedal, an open source software package for pedigree analysis

    USDA-ARS?s Scientific Manuscript database

    The open source software package PyPedal (http://pypedal.sourceforge.net/) was first released in 2002, and provided users with a set of simple tools for manipulating pedigrees. Its flexibility has been demonstrated by its use in a number of settings for large and small populations. After substantia...

  15. A Simple Interactive Software Package for Plotting, Animating, and Calculating

    ERIC Educational Resources Information Center

    Engelhardt, Larry

    2012-01-01

    We introduce a new open source (free) software package that provides a simple, highly interactive interface for carrying out certain mathematical tasks that are commonly encountered in physics. These tasks include plotting and animating functions, solving systems of coupled algebraic equations, and basic calculus (differentiating and integrating…

  16. A Software Development Approach for Computer Assisted Language Learning

    ERIC Educational Resources Information Center

    Cushion, Steve

    2005-01-01

    Over the last 5 years we have developed, produced, tested, and evaluated an authoring software package to produce web-based, interactive, audio-enhanced language-learning material. That authoring package has been used to produce language-learning material in French, Spanish, German, Arabic, and Tamil. We are currently working on increasing…

  17. Analysis of Variance: What Is Your Statistical Software Actually Doing?

    ERIC Educational Resources Information Center

    Li, Jian; Lomax, Richard G.

    2011-01-01

    Users assume statistical software packages produce accurate results. In this article, the authors systematically examined Statistical Package for the Social Sciences (SPSS) and Statistical Analysis System (SAS) for 3 analysis of variance (ANOVA) designs, mixed-effects ANOVA, fixed-effects analysis of covariance (ANCOVA), and nested ANOVA. For each…

  18. A Multi-User Microcomputer System for Small Libraries.

    ERIC Educational Resources Information Center

    Leggate, Peter

    1988-01-01

    Describes the development of Bookshelf, a multi-user microcomputer system for small libraries that uses an integrated software package. The discussion covers the design parameters of the package, which were based on a survey of seven small libraries, and some characteristics of the software. (three notes with references) (CLB)

  19. Microcomputer Software Programs for Vocational Education.

    ERIC Educational Resources Information Center

    Rodenstein, Judith, Ed.; Lambert, Roger, Ed.

    Over 200 microcomputer software packages applicable to vocational education are listed. Most of the programs are available for the Apple, TRS-80, and Commodore microcomputers. The packages have been reviewed, but have not been formally evaluated. Titles of the programs with names and addresses of the distributors are provided. Telephone numbers…

  20. Software, Copyright, and Site-License Agreements: Publishers' Perspective of Library Practice.

    ERIC Educational Resources Information Center

    Happer, Stephanie K.

    Thirty-one academic publishers of stand-alone software and book/disk packages were surveyed to determine whether publishers have addressed the copyright issues inherent in circulating these packages within the library environment. Twenty-two questionnaires were returned, providing a 71% return rate. There were 18 usable questionnaires. Publishers…

  1. Computerised data reduction.

    PubMed

    Datson, D J; Carter, N G

    1988-10-01

    The use of personal computers in accountancy and business generally has been stimulated by the availability of flexible software packages. We describe the implementation of a commercial software package designed for interfacing with laboratory instruments and highlight the ease with which it can be implemented, without the need for specialist computer programming staff.

  2. wbstats

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piburn, Jesse

    2016-04-22

    Access to the World Bank Data API from the R language was previously limited to a single existing package with limited capabilities. This software provides access to all of the features of the World Bank API in one software package for the R language and provides functions for searching and downloading data from the World Bank API.

  3. Propensity Score Analysis in R: A Software Review

    ERIC Educational Resources Information Center

    Keller, Bryan; Tipton, Elizabeth

    2016-01-01

    In this article, we review four software packages for implementing propensity score analysis in R: "Matching, MatchIt, PSAgraphics," and "twang." After briefly discussing essential elements for propensity score analysis, we apply each package to a data set from the Early Childhood Longitudinal Study in order to estimate the…

  4. Development Of A Centrifugal Hydrogen Pipeline Gas Compressor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Di Bella, Francis A.

    2015-04-16

    Concepts NREC (CN) has completed a Department of Energy (DOE) sponsored project to analyze, design, and fabricate a pipeline capacity hydrogen compressor. The pipeline compressor is a critical component in the DOE strategy to provide sufficient quantities of hydrogen to support the expected shift in transportation fuels from liquid and natural gas to hydrogen. The hydrogen would be generated by renewable energy (solar, wind, and perhaps even tidal or ocean), and would be electrolyzed from water. The hydrogen would then be transported to the population centers in the U.S., where fuel-cell vehicles are expected to become popular and necessary to relieve dependency on fossil fuels. The specifications for the required pipeline hydrogen compressor indicate a need for a small package that is efficient, less costly, and more reliable than what is available in the form of a multi-cylinder, reciprocating (positive displacement) compressor for compressing hydrogen in the gas industry.

  5. New Automotive Air Conditioning System Simulation Tool Developed in MATLAB/Simulink

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kiss, T.; Chaney, L.; Meyer, J.

    Further improvements in vehicle fuel efficiency require accurate evaluation of the vehicle's transient total power requirement. When operated, the air conditioning (A/C) system is the largest auxiliary load on a vehicle; therefore, accurate evaluation of the load it places on the vehicle's engine and/or energy storage system is especially important. Vehicle simulation software, such as 'Autonomie,' has been used by OEMs to evaluate vehicles' energy performance. A transient A/C simulation tool incorporated into vehicle simulation models would also provide a tool for developing more efficient A/C systems through a thorough consideration of the transient A/C system performance. The dynamic system simulation software Matlab/Simulink was used to develop new and more efficient vehicle energy system controls. The various modeling methods used for the new simulation tool are described in detail. Comparison with measured data is provided to demonstrate the validity of the model.

  6. BEST Winery Guidebook: Benchmarking and Energy and Water SavingsTool for the Wine Industry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Galitsky, Christina; Worrell, Ernst; Radspieler, Anthony

    2005-10-15

    Not all industrial facilities have the staff or the opportunity to perform a detailed audit of their operations. The lack of knowledge of energy efficiency opportunities provides an important barrier to improving efficiency. Benchmarking has been demonstrated to help energy users understand energy use and the potential for energy efficiency improvement, reducing the information barrier. In California, the wine making industry is not only one of the economic pillars of the economy; it is also a large energy consumer, with a considerable potential for energy-efficiency improvement. Lawrence Berkeley National Laboratory and Fetzer Vineyards developed an integrated benchmarking and self-assessment tool for the California wine industry called "BEST" (Benchmarking and Energy and water Savings Tool) Winery. BEST Winery enables a winery to compare its energy efficiency to a best practice winery, accounting for differences in product mix and other characteristics of the winery. The tool enables the user to evaluate the impact of implementing energy and water efficiency measures. The tool facilitates strategic planning of efficiency measures, based on the estimated impact of the measures, their costs and savings. BEST Winery is available as a software tool in an Excel environment. This report serves as background material, documenting assumptions and information on the included energy and water efficiency measures. It also serves as a user guide for the software package.

  7. Software engineering and data management for automated payload experiment tool

    NASA Technical Reports Server (NTRS)

    Maddux, Gary A.; Provancha, Anna; Chattam, David

    1994-01-01

    The Microgravity Projects Office identified a need to develop a software package that will lead experiment developers through the development planning process, obtain necessary information, establish an electronic data exchange avenue, and allow easier manipulation/reformatting of the collected information. An MS-DOS compatible software package called the Automated Payload Experiment Tool (APET) has been developed and delivered. The objective of this task is to expand on the results of the APET work previously performed by UAH and provide versions of the software in a Macintosh and Windows compatible format.

  8. Advanced fingerprint verification software

    NASA Astrophysics Data System (ADS)

    Baradarani, A.; Taylor, J. R. B.; Severin, F.; Maev, R. Gr.

    2016-05-01

    We have developed a fingerprint software package that can be used in a wide range of applications from law enforcement to public and private security systems, and to personal devices such as laptops, vehicles, and door-locks. The software and processing units are a unique implementation of new and sophisticated algorithms that compete with the current best systems in the world. Development of the software package has been in line with the third generation of our ultrasonic fingerprinting machine [1]. Solid and robust performance is achieved in the presence of misplaced and low quality fingerprints.

  9. Efficient and Robust Optimization for Building Energy Simulation

    PubMed Central

    Pourarian, Shokouh; Kearsley, Anthony; Wen, Jin; Pertzborn, Amanda

    2016-01-01

    Efficiently, robustly and accurately solving large sets of structured, non-linear algebraic and differential equations is one of the most computationally expensive steps in the dynamic simulation of building energy systems. Here, the efficiency, robustness and accuracy of two commonly employed solution methods are compared. The comparison is conducted using the HVACSIM+ software package, a component based building system simulation tool. The HVACSIM+ software presently employs Powell’s Hybrid method to solve systems of nonlinear algebraic equations that model the dynamics of energy states and interactions within buildings. It is shown here that the Powell’s method does not always converge to a solution. Since a myriad of other numerical methods are available, the question arises as to which method is most appropriate for building energy simulation. This paper finds considerable computational benefits result from replacing the Powell’s Hybrid method solver in HVACSIM+ with a solver more appropriate for the challenges particular to numerical simulations of buildings. Evidence is provided that a variant of the Levenberg-Marquardt solver has superior accuracy and robustness compared to the Powell’s Hybrid method presently used in HVACSIM+. PMID:27325907
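
    The solver comparison above can be illustrated outside of HVACSIM+ with a small system of nonlinear algebraic equations. The sketch below uses SciPy's root finder, whose "hybr" method is an implementation of Powell's hybrid method and whose "lm" method is a Levenberg-Marquardt variant; the example system is invented for illustration and is not a building-energy model.

```python
# Illustrative comparison of Powell's hybrid method ("hybr") and a Levenberg-Marquardt
# solver ("lm") on a small coupled nonlinear system, via scipy.optimize.root.
import numpy as np
from scipy.optimize import root

def residuals(x):
    """A small coupled nonlinear system standing in for a set of model equations."""
    x1, x2 = x
    return [x1**2 + x2**2 - 4.0,       # the states must lie on a circle of radius 2
            np.exp(x1) + x2 - 1.0]     # nonlinear coupling between the two states

x0 = [1.0, 1.0]
for method in ("hybr", "lm"):
    sol = root(residuals, x0, method=method)
    print(f"{method:>4}: success={sol.success}, x={np.round(sol.x, 6)}")
```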

  10. Efficient and Robust Optimization for Building Energy Simulation.

    PubMed

    Pourarian, Shokouh; Kearsley, Anthony; Wen, Jin; Pertzborn, Amanda

    2016-06-15

    Efficiently, robustly and accurately solving large sets of structured, non-linear algebraic and differential equations is one of the most computationally expensive steps in the dynamic simulation of building energy systems. Here, the efficiency, robustness and accuracy of two commonly employed solution methods are compared. The comparison is conducted using the HVACSIM+ software package, a component based building system simulation tool. The HVACSIM+ software presently employs Powell's Hybrid method to solve systems of nonlinear algebraic equations that model the dynamics of energy states and interactions within buildings. It is shown here that the Powell's method does not always converge to a solution. Since a myriad of other numerical methods are available, the question arises as to which method is most appropriate for building energy simulation. This paper finds considerable computational benefits result from replacing the Powell's Hybrid method solver in HVACSIM+ with a solver more appropriate for the challenges particular to numerical simulations of buildings. Evidence is provided that a variant of the Levenberg-Marquardt solver has superior accuracy and robustness compared to the Powell's Hybrid method presently used in HVACSIM+.

  11. Report: Scientific Software.

    ERIC Educational Resources Information Center

    Borman, Stuart A.

    1985-01-01

    Discusses various aspects of scientific software, including evaluation and selection of commercial software products; program exchanges, catalogs, and other information sources; major data analysis packages; statistics and chemometrics software; and artificial intelligence. (JN)

  12. Mixing enhancement in a scramjet combustor using fuel jet injection swirl

    NASA Astrophysics Data System (ADS)

    Flesberg, Sonja M.

    The scramjet engine has proven to be a viable means of powering a hypersonic vehicle, especially after successful flights of the X-51 WaveRider and various Hy-SHOT test vehicles. The major challenge associated with operating a scramjet engine is the short residence time of the fuel and oxidizer in the combustor. The fuel and oxidizer have only milliseconds to mix, ignite and combust in the combustion chamber. Combustion cannot occur until the fuel and oxidizer are mixed on a molecular level. Therefore, improving mixing is of utmost interest, since this can increase combustion efficiency. This study investigated mixing enhancement of fuel and oxidizer within the combustion chamber of a scramjet by introducing swirl to the fuel jet. The investigation was accomplished with numerical simulations using the STAR-CCM+ computational fluid dynamics software. The geometry of the University of Virginia Supersonic Combustion Facility was used to model the isolator, combustor and nozzle of a scramjet engine for simulation purposes. Experimental data from previous research at the facility was used to verify the simulation model before investigating the effect of fuel jet swirl on mixing. The model used a coaxial fuel jet with a swirling annular jet. Single coaxial fuel jet and dual coaxial fuel jet configurations were simulated for the investigation. The coaxial fuel jets were modelled with a swirling annular jet and a non-swirling core jet. Numerical analysis showed that fuel jet swirl not only increased mixing and entrainment of the fuel with the oxidizer but also caused the mixing to occur further upstream than without fuel jet swirl. The burning efficiency was calculated for all the configurations. An increase in burning efficiency indicated an increase in the mixing of H2 with O2. In the case of the single fuel jet models, the maximum burning efficiency increase due to fuel injection jet swirl was 23.3%. The research also investigated whether interaction between two swirling jets would produce increased mixing and how the distance between the two fuel injector exits would affect mixing. Three swirl patterns were investigated: 1) the first swirl pattern as viewed by an observer looking downstream had the right fuel annular jet swirling counter clockwise and the left fuel annular jet swirling clockwise, 2) the second swirl pattern as viewed by an observer looking downstream had the right fuel jet swirling clockwise and the left fuel jet swirling counter clockwise, 3) the third swirl pattern as viewed by an observer looking downstream had both the right and left fuel jets swirling in the same clockwise direction. Each of the swirl patterns was simulated with the distance between the center points of the fuel jets set to 3, 4, and 5 times the fuel injector radius. The swirl pattern that produced the greatest increase in burning efficiency differed according to the fuel injector spacing. The maximum increase in burning efficiency compared to the corresponding non-swirling two-jet baseline case was 24.6% and was produced by the first swirl pattern with the distance between the center points of the fuel jets being 5 times the fuel injector radius. The burning efficiencies for the single jet non-swirling baseline case and the first swirl pattern with the distance between the center points of the fuel jets being 5 times the fuel injector radius were 0.70 and 0.90, respectively, indicating a 29% increase due to dual fuel injection swirl.

  13. Electronic nose for space program applications

    NASA Technical Reports Server (NTRS)

    Young, Rebecca C.; Buttner, William J.; Linnell, Bruce R.; Ramesham, Rajeshuni

    2003-01-01

    The ability to monitor air contaminants in the shuttle and the International Space Station is important to ensure the health and safety of astronauts, and equipment integrity. Three specific space applications have been identified that would benefit from a chemical monitor: (a) organic contaminants in space cabin air; (b) hypergolic propellant contaminants in the shuttle airlock; (c) pre-combustion signature vapors from electrical fires. NASA at Kennedy Space Center (KSC) is assessing several commercial and developing electronic noses (E-noses) for these applications. A short series of tests identified those E-noses that exhibited sufficient sensitivity to the vapors of interest. Only two E-noses exhibited sufficient sensitivity for hypergolic fuels at the required levels, while several commercial E-noses showed sufficient sensitivity to common organic vapors. These E-noses were subjected to further tests to assess their ability to identify vapors. Development and testing of E-nose models using vendor supplied software packages correctly identified vapors with an accuracy of 70-90%. In-house software improvements increased the identification rates to between 90 and 100%. Further software enhancements are under development. Details on the experimental setup, test protocols, and results on E-nose performance are presented in this paper along with special emphasis on specific software enhancements. © 2003 Elsevier Science B.V. All rights reserved.

  14. The IUE Science Operations Ground System

    NASA Technical Reports Server (NTRS)

    Pitts, Ronald E.; Arquilla, Richard

    1994-01-01

    The International Ultraviolet Explorer (IUE) Science Operations System provides full realtime operations capabilities and support to the operations staff and astronomer users. The components of this very diverse and extremely flexible hardware and software system have played a major role in maintaining the scientific efficiency and productivity of the IUE. The software provides the staff and user with all the tools necessary for pre-visit and real-time planning and operations analysis for any day of the year. Examples of such tools include the effects of spacecraft constraints on target availability, maneuver times between targets, availability of guide stars, target identification, coordinate transforms, e-mail transfer of Observatory forms and messages, and quick-look analysis of image data. Most of this extensive software package can also be accessed remotely by individual users for information, scheduling of shifts, pre-visit planning, and actual observing program execution. Astronomers, with a modest investment in hardware and software, may establish remote observing sites. We currently have over 20 such sites in our remote observers' network.

  15. Open Marketplace for Simulation Software on the Basis of a Web Platform

    NASA Astrophysics Data System (ADS)

    Kryukov, A. P.; Demichev, A. P.

    2016-02-01

    The focus in the development of a new generation of middleware is shifting from global grid systems to building convenient and efficient web platforms for remote access to individual computing resources. A further line of their development, suggested in this work, is related not only to a quantitative increase in their number and to the expansion of the scientific, engineering, and manufacturing areas in which they are used, but also to improved technology for the remote deployment of application software on the resources interacting with the web platforms. Currently, services for providers of application software in the context of science-oriented web platforms are not sufficiently developed. The new application-software marketplace web platforms proposed in this work should have all the features of existing web platforms for submitting jobs to remote resources, plus specific web services for market-based interaction between providers and consumers of application packages. The suggested approach will be validated using the example of simulation applications in the field of nonlinear optics.

  16. Development of an e-VLBI Data Transport Software Suite with VDIF

    NASA Technical Reports Server (NTRS)

    Sekido, Mamoru; Takefuji, Kazuhiro; Kimura, Moritaka; Hobiger, Thomas; Kokado, Kensuke; Nozawa, Kentarou; Kurihara, Shinobu; Shinno, Takuya; Takahashi, Fujinobu

    2010-01-01

    We have developed a software library (KVTP-lib) for VLBI data transmission over the network with the VDIF (VLBI Data Interchange Format), which is the newly proposed standard VLBI data format designed for electronic data transfer over the network. The software package keeps the application layer (VDIF frame) and the transmission layer separate, so that each layer can be developed efficiently. The real-time VLBI data transmission tool sudp-send is an application tool based on the KVTP-lib library. sudp-send captures the VLBI data stream from the VSI-H interface with the K5/VSI PC-board and writes the data to file in standard Linux file format or transmits it to the network using the simple-UDP (SUDP) protocol. Another tool, sudp-recv, receives the data stream from the network and writes the data to file in a specific VLBI format (K5/VSSP, VDIF, or Mark 5B). This software system has been implemented on the Wettzell Tsukuba baseline; evaluation before operational employment is under way.

  17. Novel optimization technique of isolated microgrid with hydrogen energy storage.

    PubMed

    Beshr, Eman Hassan; Abdelghany, Hazem; Eteiba, Mahmoud

    2018-01-01

    This paper presents a novel optimization technique for energy management studies of an isolated microgrid. The system is supplied by various Distributed Energy Resources (DERs), a Diesel Generator (DG), a Wind Turbine Generator (WTG), and Photovoltaic (PV) arrays, and is supported by a fuel cell/electrolyzer hydrogen storage system for short term storage. Multi-objective optimization is used through a non-dominated sorting genetic algorithm to suit the load requirements under the given constraints. A novel multi-objective flower pollination algorithm is utilized to check the results. The pros and cons of the two optimization techniques are compared and evaluated. An isolated microgrid is modelled using the MATLAB software package; dispatch of active/reactive power and optimal load flow analysis with slack bus selection are carried out to minimize fuel cost and line losses under realistic constraints. The performance of the system is studied and analyzed during both summer and winter conditions and three case studies are presented for each condition. The modified IEEE 15 bus system is used to validate the proposed algorithm.
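
    The non-dominated sorting step at the heart of NSGA-II-style multi-objective optimization (here, fuel cost versus line losses, both to be minimized) can be sketched briefly. The candidate dispatch points below are made-up numbers for illustration, not results from the paper.

```python
# Minimal sketch of Pareto dominance and extraction of the first non-dominated front.
def dominates(a, b):
    """a dominates b if it is no worse in every objective and better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Return the non-dominated subset (first front) of a list of objective vectors."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q is not p)]

# (fuel cost in $/h, line losses in kW) for hypothetical dispatch candidates.
candidates = [(120.0, 8.5), (95.0, 11.0), (130.0, 7.9), (95.0, 9.8), (110.0, 9.0)]
print(pareto_front(candidates))
```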

  18. Novel optimization technique of isolated microgrid with hydrogen energy storage

    PubMed Central

    Abdelghany, Hazem; Eteiba, Mahmoud

    2018-01-01

    This paper presents a novel optimization technique for energy management studies of an isolated microgrid. The system is supplied by various Distributed Energy Resources (DERs), a Diesel Generator (DG), a Wind Turbine Generator (WTG), and Photovoltaic (PV) arrays, and is supported by a fuel cell/electrolyzer hydrogen storage system for short term storage. Multi-objective optimization is used through a non-dominated sorting genetic algorithm to suit the load requirements under the given constraints. A novel multi-objective flower pollination algorithm is utilized to check the results. The pros and cons of the two optimization techniques are compared and evaluated. An isolated microgrid is modelled using the MATLAB software package; dispatch of active/reactive power and optimal load flow analysis with slack bus selection are carried out to minimize fuel cost and line losses under realistic constraints. The performance of the system is studied and analyzed during both summer and winter conditions and three case studies are presented for each condition. The modified IEEE 15 bus system is used to validate the proposed algorithm. PMID:29466433

  19. Plane-Wave DFT Methods for Chemistry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bylaska, Eric J.

    A detailed description is given of modern plane-wave DFT methods and software (contained in the NWChem package) that allow for both geometry optimization and ab initio molecular dynamics simulations. Significant emphasis is placed on aspects of these methods that are of interest to computational chemists and useful for simulating chemistry, including techniques for calculating charged systems, exact exchange (i.e. hybrid DFT methods), and highly efficient AIMD/MM methods. Sample applications on the structure of the goethite+water interface and the hydrolysis of nitroaromatic molecules are described.

  20. Achieving better cooling of turbine blades using numerical simulation methods

    NASA Astrophysics Data System (ADS)

    Inozemtsev, A. A.; Tikhonov, A. S.; Sendyurev, C. I.; Samokhvalov, N. Yu.

    2013-02-01

    A new design of the first-stage nozzle vane for the turbine of a prospective gas-turbine engine is considered. The blade's thermal state is numerically simulated in a conjugate formulation using the ANSYS CFX 13.0 software package. Critical locations in the blade design are determined from the distribution of heat fluxes, and measures aimed at achieving more efficient cooling are analyzed. A substantially lower (by 50-100°C) maximum metal temperature has been achieved as a result of the performed work.

  1. Burner liner thermal-structural load modeling

    NASA Technical Reports Server (NTRS)

    Maffeo, R.

    1986-01-01

    The software package Transfer Analysis Code to Interface Thermal/Structural Problems (TRANCITS) was developed. The TRANCITS code is used to interface temperature data between thermal and structural analytical models. The use of this transfer module allows the heat transfer analyst to select the thermal mesh density and thermal analysis code best suited to solve the thermal problem and gives the same freedoms to the stress analyst, without the efficiency penalties associated with common meshes and the accuracy penalties associated with the manual transfer of thermal data.
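
    The core operation described above, carrying nodal temperatures from a thermal mesh onto a differently refined structural mesh, amounts to scattered-data interpolation. The sketch below illustrates the idea with SciPy; it is a generic illustration under an assumed analytic temperature field, not the TRANCITS algorithm itself.

```python
# Generic sketch of mapping nodal temperatures from a coarse "thermal" point set onto a
# finer, differently laid-out "structural" point set by scattered-data interpolation.
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(1)

# Thermal-model nodes (coarse) with a known temperature field T(x, y) = 300 + 50x + 20y.
thermal_nodes = rng.random((200, 2))
thermal_temps = 300.0 + 50.0 * thermal_nodes[:, 0] + 20.0 * thermal_nodes[:, 1]

# Structural-model nodes (finer, different layout) that need temperatures assigned.
structural_nodes = rng.random((1000, 2))
mapped = griddata(thermal_nodes, thermal_temps, structural_nodes, method="linear")

# Outside the convex hull, linear interpolation returns NaN; fall back to nearest node.
nearest = griddata(thermal_nodes, thermal_temps, structural_nodes, method="nearest")
mapped = np.where(np.isnan(mapped), nearest, mapped)

exact = 300.0 + 50.0 * structural_nodes[:, 0] + 20.0 * structural_nodes[:, 1]
print("max mapping error:", float(np.abs(mapped - exact).max()))
```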

  2. Numerical simulation of deformation and failure processes of a complex technical object under impact loading

    NASA Astrophysics Data System (ADS)

    Kraus, E. I.; Shabalin, I. I.; Shabalin, T. I.

    2018-04-01

    The main points of development of numerical tools for simulation of deformation and failure of complex technical objects under nonstationary conditions of extreme loading are presented. The possibility of extending the dynamic method for construction of difference grids to the 3D case is shown. A 3D realization of discrete-continuum approach to the deformation and failure of complex technical objects is carried out. The efficiency of the existing software package for 3D modelling is shown.

  3. PB-AM: An open-source, fully analytical linear poisson-boltzmann solver

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Felberg, Lisa E.; Brookes, David H.; Yap, Eng-Hui

    2016-11-02

    We present the open source distributed software package Poisson-Boltzmann Analytical Method (PB-AM), a fully analytical solution to the linearized Poisson Boltzmann equation. The PB-AM software package includes the generation of output files appropriate for visualization using VMD, a Brownian dynamics scheme that uses periodic boundary conditions to simulate dynamics, the ability to specify docking criteria, and offers two different kinetics schemes to evaluate biomolecular association rate constants. Given that PB-AM defines mutual polarization completely and accurately, it can be refactored as a many-body expansion to explore 2- and 3-body polarization. Additionally, the software has been integrated into the Adaptive Poisson-Boltzmann Solver (APBS) software package to make it more accessible to a larger group of scientists, educators and students that are more familiar with the APBS framework.

  4. Microstructural modeling of thermal conductivity of high burn-up mixed oxide fuel

    NASA Astrophysics Data System (ADS)

    Teague, Melissa; Tonks, Michael; Novascone, Stephen; Hayes, Steven

    2014-01-01

    Predicting the thermal conductivity of oxide fuels as a function of burn-up and temperature is fundamental to the efficient and safe operation of nuclear reactors. However, modeling the thermal conductivity of fuel is greatly complicated by the radially inhomogeneous nature of irradiated fuel in both composition and microstructure. In this work, radially and temperature-dependent models for effective thermal conductivity were developed utilizing optical micrographs of high burn-up mixed oxide fuel. The micrographs were employed to create finite element meshes with the OOF2 software. The meshes were then used to calculate the effective thermal conductivity of the microstructures using the BISON [1] fuel performance code. The new thermal conductivity models were used to calculate thermal profiles at end of life for the fuel pellets. These results were compared to thermal conductivity models from the literature, and comparison between the new finite element-based thermal conductivity model and the Duriez-Lucuta model was favorable.
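
    As a rough sanity check on microstructure-based conductivity models like the one above, classical volume-fraction bounds can be computed directly from a segmented micrograph. The sketch below is a generic Python illustration (Wiener series/parallel bounds) with a synthetic two-phase image and made-up phase conductivities; it is not the OOF2/BISON finite element workflow used in the paper.

```python
# Upper and lower (Wiener) bounds on effective conductivity from phase volume fractions
# of a segmented two-phase micrograph. All numeric values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(2)
micrograph = rng.choice([0, 1], size=(256, 256), p=[0.9, 0.1])   # 0 = matrix, 1 = pores

k_phase = {0: 3.0, 1: 0.03}          # W/(m K), assumed phase conductivities
fractions = {p: float(np.mean(micrograph == p)) for p in k_phase}

k_upper = sum(fractions[p] * k_phase[p] for p in k_phase)          # parallel (arithmetic) bound
k_lower = 1.0 / sum(fractions[p] / k_phase[p] for p in k_phase)    # series (harmonic) bound
print(f"volume fractions: {fractions}")
print(f"effective conductivity bounds: {k_lower:.3f} - {k_upper:.3f} W/(m K)")
```

    A full finite element homogenization, as performed with OOF2 and BISON in the study, should fall between these two bounds; the bounds are only a coarse consistency check, not a substitute for the microstructure-resolved calculation.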

  5. Microstructural Modeling of Thermal Conductivity of High Burn-up Mixed Oxide Fuel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Melissa Teague; Michael Tonks; Stephen Novascone

    2014-01-01

    Predicting the thermal conductivity of oxide fuels as a function of burn-up and temperature is fundamental to the efficient and safe operation of nuclear reactors. However, modeling the thermal conductivity of fuel is greatly complicated by the radially inhomogeneous nature of irradiated fuel in both composition and microstructure. In this work, radially and temperature-dependent models for effective thermal conductivity were developed utilizing optical micrographs of high burn-up mixed oxide fuel. The micrographs were employed to create finite element meshes with the OOF2 software. The meshes were then used to calculate the effective thermal conductivity of the microstructures using the BISON fuel performance code. The new thermal conductivity models were used to calculate thermal profiles at end of life for the fuel pellets. These results were compared to thermal conductivity models from the literature, and comparison between the new finite element-based thermal conductivity model and the Duriez–Lucuta model was favorable.

  6. PHAST: Protein-like heteropolymer analysis by statistical thermodynamics

    NASA Astrophysics Data System (ADS)

    Frigori, Rafael B.

    2017-06-01

    PHAST is a software package written in standard Fortran, with MPI and CUDA extensions, able to efficiently perform parallel multicanonical Monte Carlo simulations of single or multiple heteropolymeric chains, as coarse-grained models for proteins. The outcome data can be straightforwardly analyzed within its microcanonical Statistical Thermodynamics module, which allows for computing the entropy, caloric curve, specific heat and free energies. As a case study, we investigate the aggregation of heteropolymers bioinspired on Aβ25-33 fragments and their cross-seeding with IAPP20-29 isoforms. Excellent parallel scaling is observed, even under numerically difficult first-order like phase transitions, which are properly described by the built-in fully reconfigurable force fields. Still, the package is free and open source, this shall motivate users to readily adapt it to specific purposes.

  7. LPmerge: an R package for merging genetic maps by linear programming.

    PubMed

    Endelman, Jeffrey B; Plomion, Christophe

    2014-06-01

    Consensus genetic maps constructed from multiple populations are an important resource for both basic and applied research, including genome-wide association analysis, genome sequence assembly and studies of evolution. The LPmerge software uses linear programming to efficiently minimize the mean absolute error between the consensus map and the linkage maps from each population. This minimization is performed subject to linear inequality constraints that ensure the ordering of the markers in the linkage maps is preserved. When marker order is inconsistent between linkage maps, a minimum set of ordinal constraints is deleted to resolve the conflicts. LPmerge is on CRAN at http://cran.r-project.org/web/packages/LPmerge. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
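
    The linear-programming formulation described above can be sketched in a few lines: introduce auxiliary absolute-error variables, minimize their mean, and impose ordering constraints on the consensus positions. The example below is a hedged Python/SciPy illustration of that idea with two tiny made-up linkage maps; it is not LPmerge itself (an R package), and it omits LPmerge's step of deleting a minimal set of ordinal constraints when marker order conflicts between maps.

```python
# LP sketch: consensus marker positions x minimizing mean absolute error to each input
# map, subject to preserving marker order. Input positions are made-up numbers.
import numpy as np
from scipy.optimize import linprog

# Positions (cM) of 4 markers, in the same order, on two linkage maps.
maps = np.array([[0.0, 10.0, 25.0, 40.0],
                 [0.0, 12.0, 22.0, 43.0]])
n_maps, n = maps.shape

# Variables: consensus positions x (n) followed by absolute-error terms e (n_maps * n).
c = np.concatenate([np.zeros(n), np.ones(n_maps * n) / (n_maps * n)])  # mean |error|

A_ub, b_ub = [], []
for k in range(n_maps):
    for i in range(n):
        row_pos = np.zeros(n + n_maps * n); row_neg = np.zeros(n + n_maps * n)
        e_idx = n + k * n + i
        row_pos[i], row_pos[e_idx] = 1.0, -1.0    #  x_i - e_ki <= m_ki
        row_neg[i], row_neg[e_idx] = -1.0, -1.0   # -x_i - e_ki <= -m_ki
        A_ub.append(row_pos); b_ub.append(maps[k, i])
        A_ub.append(row_neg); b_ub.append(-maps[k, i])
for i in range(n - 1):                            # preserve marker order: x_i <= x_{i+1}
    row = np.zeros(n + n_maps * n)
    row[i], row[i + 1] = 1.0, -1.0
    A_ub.append(row); b_ub.append(0.0)

bounds = [(None, None)] * n + [(0, None)] * (n_maps * n)
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
print("consensus positions (cM):", np.round(res.x[:n], 2))
```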

  8. The AIROPA software package: milestones for testing general relativity in the strong gravity regime with AO

    NASA Astrophysics Data System (ADS)

    Witzel, Gunther; Lu, Jessica R.; Ghez, Andrea M.; Martinez, Gregory D.; Fitzgerald, Michael P.; Britton, Matthew; Sitarski, Breann N.; Do, Tuan; Campbell, Randall D.; Service, Maxwell; Matthews, Keith; Morris, Mark R.; Becklin, E. E.; Wizinowich, Peter L.; Ragland, Sam; Doppmann, Greg; Neyman, Chris; Lyke, James; Kassis, Marc; Rizzi, Luca; Lilley, Scott; Rampy, Rachel

    2016-07-01

    General relativity can be tested in the strong gravity regime by monitoring stars orbiting the supermassive black hole at the Galactic Center with adaptive optics. However, the limiting source of uncertainty is the spatial PSF variability due to atmospheric anisoplanatism and instrumental aberrations. The Galactic Center Group at UCLA has completed a project developing algorithms to predict PSF variability for Keck AO images. We have created a new software package (AIROPA), based on modified versions of StarFinder and Arroyo, that takes atmospheric turbulence profiles, instrumental aberration maps, and images as inputs and delivers improved photometry and astrometry on crowded fields. This software package will be made publicly available soon.

  9. WannierTools: An open-source software package for novel topological materials

    NASA Astrophysics Data System (ADS)

    Wu, QuanSheng; Zhang, ShengNan; Song, Hai-Feng; Troyer, Matthias; Soluyanov, Alexey A.

    2018-03-01

    We present an open-source software package, WannierTools, a tool for the investigation of novel topological materials. The code works in the tight-binding framework, which can be generated by another software package, Wannier90 (Mostofi et al., 2008). It can help classify the topological phase of a given material by calculating the Wilson loop, and can compute the surface-state spectrum that is probed in angle-resolved photoemission (ARPES) and scanning tunneling microscopy (STM) experiments. It also identifies the positions of Weyl/Dirac points and nodal-line structures, and calculates the Berry phase around a closed momentum loop and the Berry curvature in a part of the Brillouin zone (BZ).
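
    As a generic illustration of one of the quantities listed above, and not of WannierTools' own interface, the sketch below evaluates the discretized Berry phase of the lower band of a toy two-band tight-binding model around a closed loop in momentum space; both the model and the loop are arbitrary choices made for the example.

        import numpy as np

        # Discretized Berry phase of the lower band of a toy two-band model
        # h(k) = d(k) . sigma around a closed circular loop in momentum space.
        # The model and the loop are illustrative, not a WannierTools workflow.

        sx = np.array([[0, 1], [1, 0]], dtype=complex)
        sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
        sz = np.array([[1, 0], [0, -1]], dtype=complex)

        def lower_band_state(kx, ky, m=-1.0):
            d = (np.sin(kx), np.sin(ky), m + np.cos(kx) + np.cos(ky))
            h = d[0] * sx + d[1] * sy + d[2] * sz
            _, vecs = np.linalg.eigh(h)      # eigenvalues in ascending order
            return vecs[:, 0]                # lower-band eigenvector

        # Closed loop: a circle of radius 0.5 around the Gamma point
        ts = np.linspace(0.0, 2.0 * np.pi, 200, endpoint=False)
        states = [lower_band_state(0.5 * np.cos(t), 0.5 * np.sin(t)) for t in ts]

        # Berry phase = -Im ln prod_j <u_j | u_{j+1}>, gauge invariant on a closed loop
        prod = 1.0 + 0.0j
        for u1, u2 in zip(states, states[1:] + states[:1]):
            prod *= np.vdot(u1, u2)
        print("Berry phase around the loop:", -np.angle(prod))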

  10. Use of symbolic computation in robotics education

    NASA Technical Reports Server (NTRS)

    Vira, Naren; Tunstel, Edward

    1992-01-01

    An application of symbolic computation in robotics education is described. A software package is presented which combines generality, user interaction, and user-friendliness with the systematic usage of symbolic computation and artificial intelligence techniques. The software utilizes MACSYMA, a LISP-based symbolic algebra language, to automatically generate closed-form expressions representing forward and inverse kinematics solutions, the Jacobian transformation matrices, robot pose error-compensation model equations, and the Lagrange dynamics formulation for N-degree-of-freedom, open-chain robotic manipulators. The goal of such a package is to aid faculty and students in the robotics course by removing burdensome tasks of mathematical manipulation. The software package has been successfully tested for its accuracy using commercially available robots.

  11. ImagePy: an open-source, Python-based and platform-independent software package for bioimage analysis.

    PubMed

    Wang, Anliang; Yan, Xiaolong; Wei, Zhijun

    2018-04-27

    This note presents the design of a scalable software package named ImagePy for analysing biological images. Our contribution is concentrated on facilitating extensibility and interoperability of the software through decoupling the data model from the user interface. Especially with assistance from the Python ecosystem, this software framework makes modern computer algorithms easier to apply in bioimage analysis. ImagePy is free and open source software, with documentation and code available at https://github.com/Image-Py/imagepy under the BSD license. It has been tested on the Windows, Mac and Linux operating systems. wzjdlut@dlut.edu.cn or yxdragon@imagepy.org.

  12. IBS: an illustrator for the presentation and visualization of biological sequences.

    PubMed

    Liu, Wenzhong; Xie, Yubin; Ma, Jiyong; Luo, Xiaotong; Nie, Peng; Zuo, Zhixiang; Lahrmann, Urs; Zhao, Qi; Zheng, Yueyuan; Zhao, Yong; Xue, Yu; Ren, Jian

    2015-10-15

    Biological sequence diagrams are fundamental for visualizing various functional elements in protein or nucleotide sequences that enable a summarization and presentation of existing information as well as means of intuitive new discoveries. Here, we present a software package called illustrator of biological sequences (IBS) that can be used for representing the organization of either protein or nucleotide sequences in a convenient, efficient and precise manner. Multiple options are provided in IBS, and biological sequences can be manipulated, recolored or rescaled in a user-defined mode. Also, the final representational artwork can be directly exported into a publication-quality figure. The standalone package of IBS was implemented in JAVA, while the online service was implemented in HTML5 and JavaScript. Both the standalone package and online service are freely available at http://ibs.biocuckoo.org. renjian.sysu@gmail.com or xueyu@hust.edu.cn Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.

  13. ParallelStructure: A R Package to Distribute Parallel Runs of the Population Genetics Program STRUCTURE on Multi-Core Computers

    PubMed Central

    Besnier, Francois; Glover, Kevin A.

    2013-01-01

    This software package provides an R-based framework to make use of multi-core computers when running analyses in the population genetics program STRUCTURE. It is especially addressed to those users of STRUCTURE dealing with numerous and repeated data analyses, and who could take advantage of an efficient script to automatically distribute STRUCTURE jobs among multiple processors. It also provides additional functions to divide analyses among combinations of populations within a single data set without the need to manually produce multiple projects, as is currently the case in STRUCTURE. The package consists of two main functions, MPI_structure() and parallel_structure(), as well as an example data file. We compared the performance in computing time for this example data on two computer architectures and showed that the use of the present functions can result in several-fold improvements in terms of computation time. ParallelStructure is freely available at https://r-forge.r-project.org/projects/parallstructure/. PMID:23923012

  14. IBS: an illustrator for the presentation and visualization of biological sequences

    PubMed Central

    Liu, Wenzhong; Xie, Yubin; Ma, Jiyong; Luo, Xiaotong; Nie, Peng; Zuo, Zhixiang; Lahrmann, Urs; Zhao, Qi; Zheng, Yueyuan; Zhao, Yong; Xue, Yu; Ren, Jian

    2015-01-01

    Summary: Biological sequence diagrams are fundamental for visualizing various functional elements in protein or nucleotide sequences that enable a summarization and presentation of existing information as well as means of intuitive new discoveries. Here, we present a software package called illustrator of biological sequences (IBS) that can be used for representing the organization of either protein or nucleotide sequences in a convenient, efficient and precise manner. Multiple options are provided in IBS, and biological sequences can be manipulated, recolored or rescaled in a user-defined mode. Also, the final representational artwork can be directly exported into a publication-quality figure. Availability and implementation: The standalone package of IBS was implemented in JAVA, while the online service was implemented in HTML5 and JavaScript. Both the standalone package and online service are freely available at http://ibs.biocuckoo.org. Contact: renjian.sysu@gmail.com or xueyu@hust.edu.cn Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26069263

  15. Extreme Ultraviolet Imaging Telescope (EIT)

    NASA Technical Reports Server (NTRS)

    Lemen, J. R.; Freeland, S. L.

    1997-01-01

    Efforts concentrated on development and implementation of the SolarSoft (SSW) data analysis system. From an EIT analysis perspective, this system was designed to facilitate efficient reuse and conversion of software developed for Yohkoh/SXT and to take advantage of a large existing body of software developed by the SDAC, Yohkoh, and SOHO instrument teams. Another strong motivation for this system was to provide an EIT analysis environment which permits coordinated analysis of EIT data in conjunction with data from important supporting instruments, including Yohkoh/SXT and the other SOHO coronal instruments: CDS, SUMER, and LASCO. In addition, the SSW system will support coordinated EIT/TRACE analysis (by design) when TRACE data are available; TRACE launch is currently planned for March 1998. Working with Jeff Newmark, the Chianti software package (K.P. Dere et al.) and UV/EUV data base were fully integrated into the SSW system to facilitate EIT temperature and emission analysis.

  16. A User-Friendly Software Package for HIFU Simulation

    NASA Astrophysics Data System (ADS)

    Soneson, Joshua E.

    2009-04-01

    A freely-distributed, MATLAB (The Mathworks, Inc., Natick, MA)-based software package for simulating axisymmetric high-intensity focused ultrasound (HIFU) beams and their heating effects is discussed. The package (HIFU_Simulator) consists of a propagation module which solves the Khokhlov-Zabolotskaya-Kuznetsov (KZK) equation and a heating module which solves Pennes' bioheat transfer (BHT) equation. The pressure, intensity, heating rate, temperature, and thermal dose fields are computed and plotted, and the output is released to the MATLAB workspace for further user analysis or postprocessing.
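
    To make the heating module's governing equation concrete, here is a minimal explicit finite-difference sketch of Pennes' bioheat transfer equation in one dimension; all tissue parameters, the heating-rate profile, and the grid are placeholder values for illustration and do not reflect HIFU_Simulator's actual implementation.

        import numpy as np

        # 1-D explicit finite-difference sketch of Pennes' bioheat equation:
        #   rho*c * dT/dt = k * d2T/dx2 - w_b*c_b * (T - T_a) + Q
        # All parameter values are illustrative placeholders.

        rho_c = 3.6e6      # volumetric heat capacity of tissue [J/(m^3 K)]
        k = 0.5            # thermal conductivity [W/(m K)]
        w_b_c_b = 2.0e4    # blood perfusion term [W/(m^3 K)]
        T_a = 37.0         # arterial (baseline) temperature [deg C]

        nx, dx, dt, steps = 201, 1e-3, 0.05, 400
        x = np.linspace(0.0, (nx - 1) * dx, nx)
        Q = 5.0e5 * np.exp(-((x - 0.1) / 0.01) ** 2)   # toy focal heating rate [W/m^3]

        T = np.full(nx, T_a)
        for _ in range(steps):
            lap = np.zeros_like(T)
            lap[1:-1] = (T[2:] - 2.0 * T[1:-1] + T[:-2]) / dx ** 2
            T += dt / rho_c * (k * lap - w_b_c_b * (T - T_a) + Q)
            T[0] = T[-1] = T_a                          # fixed-temperature boundaries

        print(f"peak temperature after {steps * dt:.0f} s: {T.max():.2f} deg C")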

  17. Multiscale Modeling and Uncertainty Quantification for Nuclear Fuel Performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Estep, Donald; El-Azab, Anter; Pernice, Michael

    2017-03-23

    In this project, we will address the challenges associated with constructing high fidelity multiscale models of nuclear fuel performance. We (*) propose a novel approach for coupling mesoscale and macroscale models, (*) devise efficient numerical methods for simulating the coupled system, and (*) devise and analyze effective numerical approaches for error and uncertainty quantification for the coupled multiscale system. As an integral part of the project, we will carry out analysis of the effects of upscaling and downscaling, investigate efficient methods for stochastic sensitivity analysis of the individual macroscale and mesoscale models, and carry out a posteriori error analysis for computed results. We will pursue development and implementation of solutions in software used at Idaho National Laboratories on models of interest to the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program.

  18. Comparison of requirements and capabilities of major multipurpose software packages.

    PubMed

    Igo, Robert P; Schnell, Audrey H

    2012-01-01

    The aim of this chapter is to introduce the reader to commonly used software packages and illustrate their input requirements, analysis options, strengths, and limitations. We focus on packages that perform more than one function and include a program for quality control, linkage, and association analyses. Additional inclusion criteria were (1) programs that are free to academic users and (2) currently supported, maintained, and developed. Using those criteria, we chose to review three programs: Statistical Analysis for Genetic Epidemiology (S.A.G.E.), PLINK, and Merlin. We will describe the required input format and analysis options. We will not go into detail about every possible program in the packages, but we will give an overview of the packages' requirements and capabilities.

  19. StochKit2: software for discrete stochastic simulation of biochemical systems with events.

    PubMed

    Sanft, Kevin R; Wu, Sheng; Roh, Min; Fu, Jin; Lim, Rone Kwei; Petzold, Linda R

    2011-09-01

    StochKit2 is the first major upgrade of the popular StochKit stochastic simulation software package. StochKit2 provides highly efficient implementations of several variants of Gillespie's stochastic simulation algorithm (SSA), and tau-leaping with automatic step size selection. StochKit2 features include automatic selection of the optimal SSA method based on model properties, event handling, and automatic parallelism on multicore architectures. The underlying structure of the code has been completely updated to provide a flexible framework for extending its functionality. StochKit2 runs on Linux/Unix, Mac OS X and Windows. It is freely available under GPL version 3 and can be downloaded from http://sourceforge.net/projects/stochkit/. petzold@engineering.ucsb.edu.
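
    For readers new to the method, a bare-bones sketch of Gillespie's direct-method SSA is shown below for a toy birth-death system; it illustrates the algorithm only and uses none of StochKit2's interfaces or file formats.

        import numpy as np

        # Bare-bones Gillespie direct-method SSA for a toy birth-death system:
        #   0 --k1--> X      (production)
        #   X --k2--> 0      (degradation)
        # Rates and initial state are illustrative placeholders.

        rng = np.random.default_rng(0)
        k1, k2 = 10.0, 0.1
        x, t, t_end = 0, 0.0, 100.0

        times, counts = [t], [x]
        while t < t_end:
            a = np.array([k1, k2 * x])        # reaction propensities
            a0 = a.sum()
            if a0 == 0.0:
                break
            t += rng.exponential(1.0 / a0)    # time to next reaction
            r = rng.choice(2, p=a / a0)       # which reaction fires
            x += 1 if r == 0 else -1
            times.append(t)
            counts.append(x)

        print(f"final copy number at t={times[-1]:.1f}: {counts[-1]} "
              f"(steady-state mean ~ {k1 / k2:.0f})")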

  20. In-ground operation of Geothermic Fuel Cells for unconventional oil and gas recovery

    NASA Astrophysics Data System (ADS)

    Sullivan, Neal; Anyenya, Gladys; Haun, Buddy; Daubenspeck, Mark; Bonadies, Joseph; Kerr, Rick; Fischer, Bernhard; Wright, Adam; Jones, Gerald; Li, Robert; Wall, Mark; Forbes, Alan; Savage, Marshall

    2016-01-01

    This paper presents operating and performance characteristics of a nine-stack solid-oxide fuel cell combined-heat-and-power system. Integrated with a natural-gas fuel processor, air compressor, reactant-gas preheater, and diagnostics and control equipment, the system is designed for use in unconventional oil-and-gas processing. Termed a "Geothermic Fuel Cell" (GFC), the heat liberated by the fuel cell during electricity generation is harnessed to process oil shale into high-quality crude oil and natural gas. The 1.5-kWe SOFC stacks are packaged within three-stack GFC modules. Three GFC modules are mechanically and electrically coupled to a reactant-gas preheater and installed within the earth. During operation, significant heat is conducted from the Geothermic Fuel Cell to the surrounding geology. The complete system was continuously operated on hydrogen and natural-gas fuels for ∼600 h. A quasi-steady operating point was established to favor heat generation (29.1 kWth) over electricity production (4.4 kWe). Thermodynamic analysis reveals a combined-heat-and-power efficiency of 55% at this condition. Heat flux to the geology averaged 3.2 kW/m across the 9-m length of the Geothermic Fuel Cell-preheater assembly. System performance is reviewed; some suggestions for improvement are proposed.
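
    A simple back-of-the-envelope check of the quoted figures, under the assumption that the 55% combined-heat-and-power efficiency is defined as (thermal output + electrical output) divided by fuel energy input:

        # Back-of-the-envelope check of the quoted GFC operating point, assuming
        # CHP efficiency = (thermal output + electrical output) / fuel input.
        q_th = 29.1     # kW thermal
        p_el = 4.4      # kW electric
        eta_chp = 0.55

        fuel_input = (q_th + p_el) / eta_chp
        print(f"implied fuel energy input: {fuel_input:.1f} kW")          # ~60.9 kW
        print(f"implied electrical efficiency: {p_el / fuel_input:.1%}")  # ~7%

        # Consistency of the reported heat flux to the geology:
        print(f"3.2 kW/m over 9 m gives {3.2 * 9:.1f} kW to the surrounding rock")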

  1. Software package for modeling spin-orbit motion in storage rings

    NASA Astrophysics Data System (ADS)

    Zyuzin, D. V.

    2015-12-01

    A software package providing a graphical user interface for computer experiments on the motion of charged particle beams in accelerators, as well as analysis of obtained data, is presented. The software package was tested in the framework of the international project on electric dipole moment measurement JEDI (Jülich Electric Dipole moment Investigations). The specific features of particle spin motion imply the requirement to use a cyclic accelerator (storage ring) consisting of electrostatic elements, which makes it possible to preserve horizontal polarization for a long time. Computer experiments study the dynamics of 10^6-10^9 particles in a beam during 10^9 turns in an accelerator (about 10^12-10^15 integration steps for the equations of motion). For designing an optimal accelerator structure, a large number of computer experiments on polarized beam dynamics are required. The numerical core of the package is COSY Infinity, a program for modeling spin-orbit dynamics.

  2. Reference datasets for bioequivalence trials in a two-group parallel design.

    PubMed

    Fuglsang, Anders; Schütz, Helmut; Labes, Detlew

    2015-03-01

    In order to help companies qualify and validate the software used to evaluate bioequivalence trials with two parallel treatment groups, this work aims to define datasets with known results. This paper puts a total of 11 datasets into the public domain along with a proposed consensus obtained via evaluations from six different software packages (R, SAS, WinNonlin, OpenOffice Calc, Kinetica, EquivTest). Insofar as possible, datasets were evaluated with and without the assumption of equal variances for the construction of a 90% confidence interval. Not all software packages provide functionality for the assumption of unequal variances (EquivTest, Kinetica), and not all packages can handle datasets with more than 1000 subjects per group (WinNonlin). Where results could be obtained across all packages, one showed questionable results when datasets contained unequal group sizes (Kinetica). A proposal is made for the results that should be used as validation targets.
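
    As a hedged illustration of the computation these datasets are meant to validate, and not the official evaluation procedure of any package listed above, the sketch below computes a 90% confidence interval for the ratio of geometric means in a two-group parallel design on log-transformed data, with and without the equal-variance assumption; the input values are made up.

        import numpy as np
        from scipy import stats

        # 90% CI for the test/reference ratio of geometric means in a two-group
        # parallel design, computed on log-transformed PK metrics. Data are made up.

        def parallel_ci(test, ref, alpha=0.10, equal_var=True):
            lt, lr = np.log(test), np.log(ref)
            n1, n2 = len(lt), len(lr)
            diff = lt.mean() - lr.mean()
            v1, v2 = lt.var(ddof=1), lr.var(ddof=1)
            if equal_var:
                sp2 = ((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2)
                se = np.sqrt(sp2 * (1.0 / n1 + 1.0 / n2))
                df = n1 + n2 - 2
            else:  # Welch-Satterthwaite degrees of freedom
                se = np.sqrt(v1 / n1 + v2 / n2)
                df = (v1 / n1 + v2 / n2) ** 2 / (
                    (v1 / n1) ** 2 / (n1 - 1) + (v2 / n2) ** 2 / (n2 - 1))
            half = stats.t.ppf(1.0 - alpha / 2.0, df) * se
            return np.exp(diff - half), np.exp(diff + half)

        rng = np.random.default_rng(1)
        test = rng.lognormal(mean=4.0, sigma=0.3, size=24)
        ref = rng.lognormal(mean=4.05, sigma=0.4, size=24)
        print("equal variances:   %.3f - %.3f" % parallel_ci(test, ref, equal_var=True))
        print("unequal variances: %.3f - %.3f" % parallel_ci(test, ref, equal_var=False))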

  3. Painting a picture across the landscape with ModelMap

    Treesearch

    Brian Cooke; Elizabeth Freeman; Gretchen Moisen; Tracey Frescino

    2017-01-01

    Scientists and statisticians working for the Rocky Mountain Research Station have created a software package that simplifies and automates many of the processes needed for converting models into maps. This software package, called ModelMap, has helped a variety of specialists and land managers to quickly convert data into easily understood graphical images. The...

  4. "FluSpec": A Simulated Experiment in Fluorescence Spectroscopy

    ERIC Educational Resources Information Center

    Bigger, Stephen W.; Bigger, Andrew S.; Ghiggino, Kenneth P.

    2014-01-01

    The "FluSpec" educational software package is a fully contained tutorial on the technique of fluorescence spectroscopy as well as a simulator on which experiments can be performed. The procedure for each of the experiments is also contained within the package along with example analyses of results that are obtained using the software.

  5. Comparative Analyses of MIRT Models and Software (BMIRT and flexMIRT)

    ERIC Educational Resources Information Center

    Yavuz, Guler; Hambleton, Ronald K.

    2017-01-01

    Application of MIRT modeling procedures is dependent on the quality of parameter estimates provided by the estimation software and techniques used. This study investigated model parameter recovery of two popular MIRT packages, BMIRT and flexMIRT, under some common measurement conditions. These packages were specifically selected to investigate the…

  6. Macintosh Computer Classroom and Laboratory Security: Preventing Unwanted Changes to the System.

    ERIC Educational Resources Information Center

    Senn, Gary J.; Smyth, Thomas J. C.

    Because of the graphical interface and "openness" of the operating system, Macintosh computers are susceptible to undesirable changes by the user. This presentation discusses the advantages and disadvantages of software packages that offer protection for the Macintosh system. The two basic forms of software security packages include a…

  7. Improvements to the APBS biomolecular solvation software suite.

    PubMed

    Jurrus, Elizabeth; Engel, Dave; Star, Keith; Monson, Kyle; Brandi, Juan; Felberg, Lisa E; Brookes, David H; Wilson, Leighton; Chen, Jiahui; Liles, Karina; Chun, Minju; Li, Peter; Gohara, David W; Dolinsky, Todd; Konecny, Robert; Koes, David R; Nielsen, Jens Erik; Head-Gordon, Teresa; Geng, Weihua; Krasny, Robert; Wei, Guo-Wei; Holst, Michael J; McCammon, J Andrew; Baker, Nathan A

    2018-01-01

    The Adaptive Poisson-Boltzmann Solver (APBS) software was developed to solve the equations of continuum electrostatics for large biomolecular assemblages that have provided impact in the study of a broad range of chemical, biological, and biomedical applications. APBS addresses the three key technology challenges for understanding solvation and electrostatics in biomedical applications: accurate and efficient models for biomolecular solvation and electrostatics, robust and scalable software for applying those theories to biomolecular systems, and mechanisms for sharing and analyzing biomolecular electrostatics data in the scientific community. To address new research applications and advancing computational capabilities, we have continually updated APBS and its suite of accompanying software since its release in 2001. In this article, we discuss the models and capabilities that have recently been implemented within the APBS software package including a Poisson-Boltzmann analytical and a semi-analytical solver, an optimized boundary element solver, a geometry-based geometric flow solvation model, a graph theory-based algorithm for determining pK a values, and an improved web-based visualization tool for viewing electrostatics. © 2017 The Protein Society.

  8. Simulation and animation of sensor-driven robots

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, C.; Trivedi, M.M.; Bidlack, C.R.

    1994-10-01

    Most simulation and animation systems utilized in robotics are concerned with simulation of the robot and its environment without simulation of sensors. These systems have difficulty in handling robots that utilize sensory feedback in their operation. In this paper, a new design of an environment for simulation, animation, and visualization of sensor-driven robots is presented. As sensor technology advances, increasing numbers of robots are equipped with various types of sophisticated sensors. The main goal of creating the visualization environment is to aid the automatic robot programming and off-line programming capabilities of sensor-driven robots. The software system will help the users visualize the motion and reaction of the sensor-driven robot under their control program. Therefore, the efficiency of the software development is increased, the reliability of the software and the operation safety of the robot are ensured, and the cost of new software development is reduced. Conventional computer-graphics-based robot simulation and animation software packages lack capabilities for robot sensing simulation. This paper describes a system designed to overcome this deficiency.

  9. Virginia Transit Performance Evaluation Package (VATPEP).

    DOT National Transportation Integrated Search

    1987-01-01

    The Virginia Transit Performance Evaluation Package (VATPEP), a computer software package, is documented. This is the computerized version of the methodology used by the Virginia Department of Transportation to evaluate the performance of public tran...

  10. Cooperative Work and Sustainable Scientific Software Practices in R

    NASA Astrophysics Data System (ADS)

    Weber, N.

    2013-12-01

    Most scientific software projects are dependent on the work of many diverse people, institutions and organizations. Incentivizing these actors to cooperatively develop software that is both reliable and sustainable is complicated by the fact that the reward structures of these various actors greatly differ: research scientists want results from a software or model run in order to publish papers, produce new data, or test a hypothesis; software engineers and research centers want compilable, well documented code that is refactorable, reusable and reproducible in future research scenarios. While much research has been done on incentives and motivations for participating in open source software projects or cyberinfrastructure development, little work has been done on what motivates or incentivizes developers to maintain scientific software projects beyond their original application. This poster will present early results of research into the incentives and motivation for cooperative scientific software development. In particular, this work focuses on motivations for the maintenance and repair of libraries on the software platform R. Our work here uses a sample of R packages that were created by research centers, or are specific to earth, environmental and climate science applications. We first mined 'check' logs from the Comprehensive R Archive Network (CRAN) to determine the amount of time a package has existed, the number of versions it has gone through over this time, the number of releases, and finally the contact information for each official package 'maintainer'. We then sent a survey to each official maintainer, asking them questions about what role they played in developing the original package, and what their motivations were for sustaining the project over time. We will present early results from this mining and our survey of R maintainers.

  11. Laser velocimeter data acquisition system for the Langley 14- by 22-foot subsonic tunnel. Software reference guide version 3.3

    NASA Technical Reports Server (NTRS)

    Jumper, Judith K.

    1994-01-01

    The Laser Velocimeter Data Acquisition System (LVDAS) in the Langley 14- by 22-Foot Tunnel is controlled by a comprehensive software package. The software package was designed to control the data acquisition process during wind tunnel tests which employ a laser velocimeter measurement system. This report provides detailed explanations on how to configure and operate the LVDAS system to acquire laser velocimeter and static wind tunnel data.

  12. Space-Shuttle Emulator Software

    NASA Technical Reports Server (NTRS)

    Arnold, Scott; Askew, Bill; Barry, Matthew R.; Leigh, Agnes; Mermelstein, Scott; Owens, James; Payne, Dan; Pemble, Jim; Sollinger, John; Thompson, Hiram

    2007-01-01

    A package of software has been developed to execute a raw binary image of the space shuttle flight software for simulation of the computational effects of operation of space shuttle avionics. This software can be run on inexpensive computer workstations. Heretofore, it was necessary to use real flight computers to perform such tests and simulations. The package includes a program that emulates the space shuttle orbiter general-purpose computer [consisting of a central processing unit (CPU), input/output processor (IOP), master sequence controller, and bus-control elements]; an emulator of the orbiter display electronics unit and models of the associated cathode-ray tubes, keyboards, and switch controls; computational models of the data-bus network; computational models of the multiplexer-demultiplexer components; an emulation of the pulse-code modulation master unit; an emulation of the payload data interleaver; a model of the master timing unit; a model of the mass memory unit; and a software component that ensures compatibility of telemetry and command services between the simulated space shuttle avionics and a mission control center. The software package is portable to several host platforms.

  13. Development of high performance scientific components for interoperability of computing packages

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gulabani, Teena Pratap

    2008-01-01

    Three major high performance quantum chemistry computational packages, NWChem, GAMESS and MPQC, have been developed by different research efforts following different design patterns. The goal is to achieve interoperability among these packages by overcoming the challenges caused by the different communication patterns and software design of each of these packages. A chemistry algorithm is hard to develop and time consuming to implement; integration of large quantum chemistry packages will allow resource sharing and thus avoid reinvention of the wheel. Creating connections between these incompatible packages is the major motivation of the proposed work. This interoperability is achieved by bringing the benefits of Component Based Software Engineering through a plug-and-play component framework called the Common Component Architecture (CCA). In this thesis, I present a strategy and process used for interfacing two widely used and important computational chemistry methodologies: Quantum Mechanics and Molecular Mechanics. To show the feasibility of the proposed approach, the Tuning and Analysis Utility (TAU) has been coupled with the NWChem code and its CCA components. Results show that the overhead is negligible when compared to the ease and potential of organizing and coping with large-scale software applications.

  14. AlgoRun: a Docker-based packaging system for platform-agnostic implemented algorithms.

    PubMed

    Hosny, Abdelrahman; Vera-Licona, Paola; Laubenbacher, Reinhard; Favre, Thibauld

    2016-08-01

    There is a growing need in bioinformatics for easy-to-use software implementations of algorithms that are usable across platforms. At the same time, reproducibility of computational results is critical and often a challenge due to source code changes over time and dependencies. The approach introduced in this paper addresses both of these needs with AlgoRun, a dedicated packaging system for implemented algorithms, using Docker technology. Implemented algorithms, packaged with AlgoRun, can be executed through a user-friendly interface directly from a web browser or via a standardized RESTful web API to allow easy integration into more complex workflows. The packaged algorithm includes the entire software execution environment, thereby eliminating the common problem of software dependencies and the irreproducibility of computations over time. AlgoRun-packaged algorithms can be published on http://algorun.org, a centralized searchable directory to find existing AlgoRun-packaged algorithms. AlgoRun is available at http://algorun.org and the source code under GPL license is available at https://github.com/algorun laubenbacher@uchc.edu Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  15. Predictive Utility of Marketed Volumetric Software Tools in Subjects at Risk for Alzheimer's: Do Regions Outside the Hippocampus Matter?

    PubMed Central

    Tanpitukpongse, Teerath P.; Mazurowski, Maciej A.; Ikhena, John; Petrella, Jeffrey R.

    2016-01-01

    Background and Purpose To assess prognostic efficacy of individual versus combined regional volumetrics in two commercially-available brain volumetric software packages for predicting conversion of patients with mild cognitive impairment to Alzheimer's disease. Materials and Methods Data was obtained through the Alzheimer's Disease Neuroimaging Initiative. 192 subjects (mean age 74.8 years, 39% female) diagnosed with mild cognitive impairment at baseline were studied. All had T1WI MRI sequences at baseline and 3-year clinical follow-up. Analysis was performed with NeuroQuant® and Neuroreader™. Receiver operating characteristic curves assessing the prognostic efficacy of each software package were generated using a univariable approach employing individual regional brain volumes, as well as two multivariable approaches (multiple regression and random forest), combining multiple volumes. Results On univariable analysis of 11 NeuroQuant® and 11 Neuroreader™ regional volumes, hippocampal volume had the highest area under the curve for both software packages (0.69 NeuroQuant®, 0.68 Neuroreader™), and was not significantly different (p > 0.05) between packages. Multivariable analysis did not increase the area under the curve for either package (0.63 logistic regression, 0.60 random forest NeuroQuant®; 0.65 logistic regression, 0.62 random forest Neuroreader™). Conclusion Of the multiple regional volume measures available in FDA-cleared brain volumetric software packages, hippocampal volume remains the best single predictor of conversion of mild cognitive impairment to Alzheimer's disease at 3-year follow-up. Combining volumetrics did not add additional prognostic efficacy. Therefore, future prognostic studies in MCI, combining such tools with demographic and other biomarker measures, are justified in using hippocampal volume as the only volumetric biomarker. PMID:28057634

  16. Comparison of Perfusion CT Software to Predict the Final Infarct Volume After Thrombectomy.

    PubMed

    Austein, Friederike; Riedel, Christian; Kerby, Tina; Meyne, Johannes; Binder, Andreas; Lindner, Thomas; Huhndorf, Monika; Wodarg, Fritz; Jansen, Olav

    2016-09-01

    Computed tomographic perfusion represents an interesting physiological imaging modality to select patients for reperfusion therapy in acute ischemic stroke. The purpose of our study was to determine the accuracy of different commercial perfusion CT software packages (Philips (A), Siemens (B), and RAPID (C)) in predicting the final infarct volume (FIV) after mechanical thrombectomy. Single-institutional computed tomographic perfusion data from 147 mechanically recanalized acute ischemic stroke patients were postprocessed. Ischemic core and FIV were compared with respect to the thrombolysis in cerebral infarction (TICI) score and the time interval to reperfusion. FIV was measured at follow-up imaging between days 1 and 8 after stroke. In 118 successfully recanalized patients (TICI 2b/3), a moderately to strongly positive correlation was observed between ischemic core and FIV. The highest accuracy and best correlation were observed in early and fully recanalized patients (Pearson r for A=0.42, B=0.64, and C=0.83; P<0.001). Bland-Altman plots and boxplots demonstrate smaller ranges in package C than in A and B. Significant differences were found between the packages with respect to over- and underestimation of the ischemic core. Package A, compared with B and C, estimated more than twice as many patients with a malignant stroke profile (P<0.001). Package C best predicted hypoperfusion volume in nonsuccessfully recanalized patients. Our study demonstrates the best accuracy and closest agreement between the results of a fully automated software package (RAPID) and FIV, especially in early and fully recanalized patients. Furthermore, this software package overestimated the FIV to a significantly lower degree and estimated a malignant mismatch profile less often than the other software. © 2016 American Heart Association, Inc.

  17. Genome-wide study of correlations between genomic features and their relationship with the regulation of gene expression.

    PubMed

    Kravatsky, Yuri V; Chechetkin, Vladimir R; Tchurikov, Nikolai A; Kravatskaya, Galina I

    2015-02-01

    The broad class of tasks in genetics and epigenetics can be reduced to the study of various features that are distributed over the genome (genome tracks). The rapid and efficient processing of the huge amount of data stored in the genome-scale databases cannot be achieved without the software packages based on the analytical criteria. However, strong inhomogeneity of genome tracks hampers the development of relevant statistics. We developed the criteria for the assessment of genome track inhomogeneity and correlations between two genome tracks. We also developed a software package, Genome Track Analyzer, based on this theory. The theory and software were tested on simulated data and were applied to the study of correlations between CpG islands and transcription start sites in the Homo sapiens genome, between profiles of protein-binding sites in chromosomes of Drosophila melanogaster, and between DNA double-strand breaks and histone marks in the H. sapiens genome. Significant correlations between transcription start sites on the forward and the reverse strands were observed in genomes of D. melanogaster, Caenorhabditis elegans, Mus musculus, H. sapiens, and Danio rerio. The observed correlations may be related to the regulation of gene expression in eukaryotes. Genome Track Analyzer is freely available at http://ancorr.eimb.ru/. © The Author 2015. Published by Oxford University Press on behalf of Kazusa DNA Research Institute.

  18. Argillite And Crystalline Disposal Research: Accomplishments And Path-Forward.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McMahon, Kevin A.; Jove-Colon, Carlos F.; Wang, Yifeng

    The intention of this document is to provide a path-forward for research and development (R&D) for two host rock media-specific (argillite and crystalline) disposal research work packages within the Used Fuel Disposition Campaign (UFDC). The two work packages, Argillite Disposal R&D and Crystalline Disposal R&D, support the achievement of the overarching mission and objectives of the Department of Energy Office of Nuclear Energy Fuel Cycle Technologies Program. These two work packages cover many of the fundamental technical issues that will have multiple implications to other disposal research work packages by bridging knowledge gaps to support the development of the safety case. The path-forward begins with the assumption of target dates that are set out in the January 2013 DOE Strategy for the Management and Disposal of Used Nuclear Fuel and High-Level Radioactive Waste (http://energy.gov/downloads/strategy-management-and-disposal-used-nuclear-fuel-and-high-levelradioactive-waste). The path-forward will be maintained as a living document and will be updated as needed in response to available funding and the progress of multiple R&D tasks in the Used Fuel Disposition Campaign and the Fuel Cycle Technologies Program. This path-forward is developed based on the report “Used Fuel Disposition Campaign Disposal Research and Development Roadmap (FCR&D-USED-2011-000065 REV0)” (DOE, 2011). This document delineates the goals and objectives of the UFDC R&D program, needs for generic disposal concept design, and summarizes the prioritization of R&D issues.

  19. User's Guide for MapIMG 2: Map Image Re-projection Software Package

    USGS Publications Warehouse

    Finn, Michael P.; Trent, Jason R.; Buehler, Robert A.

    2006-01-01

    BACKGROUND Scientists routinely accomplish small-scale geospatial modeling in the raster domain, using high-resolution datasets for large parts of continents and low-resolution to high-resolution datasets for the entire globe. Direct implementation of point-to-point transformation with appropriate functions yields the variety of projections available in commercial software packages, but implementation with data other than points requires specific adaptation of the transformation equations or prior preparation of the data to allow the transformation to succeed. It seems that some of these packages use the U.S. Geological Survey's (USGS) General Cartographic Transformation Package (GCTP) or similar point transformations without adaptation to the specific characteristics of raster data (Usery and others, 2003a). Usery and others (2003b) compiled and tabulated the accuracy of categorical areas in projected raster datasets of global extent. Based on the shortcomings identified in these studies, geographers and applications programmers at the USGS expanded and evolved a USGS software package, MapIMG, for raster map projection transformation (Finn and Trent, 2004). Daniel R. Steinwand of Science Applications International Corporation, National Center for Earth Resources Observation and Science, originally developed MapIMG for the USGS, basing it on GCTP. Through previous and continuing efforts at the USGS' National Geospatial Technical Operations Center, this program has been transformed from an application based on command line input into a software package based on a graphical user interface for Windows, Linux, and other UNIX machines.

  20. CMIP: a software package capable of reconstructing genome-wide regulatory networks using gene expression data.

    PubMed

    Zheng, Guangyong; Xu, Yaochen; Zhang, Xiujun; Liu, Zhi-Ping; Wang, Zhuo; Chen, Luonan; Zhu, Xin-Guang

    2016-12-23

    A gene regulatory network (GRN) represents interactions of genes inside a cell or tissue, in which vertexes and edges stand for genes and their regulatory interactions respectively. Reconstruction of gene regulatory networks, in particular, genome-scale networks, is essential for comparative exploration of different species and mechanistic investigation of biological processes. Currently, most network inference methods are computationally intensive; they are usually effective for small-scale tasks (e.g., networks with a few hundred genes) but struggle to construct GRNs at genome scale. Here, we present a software package for gene regulatory network reconstruction at a genomic level, in which gene interaction is measured by conditional mutual information within a parallel computing framework (hence the package is named CMIP). The package is a greatly improved implementation of our previous PCA-CMI algorithm. In CMIP, we provide not only an automatic threshold determination method but also an effective parallel computing framework for network inference. Performance tests on benchmark datasets show that the accuracy of CMIP is comparable to most current network inference methods. Moreover, running tests on synthetic datasets demonstrate that CMIP can handle large datasets, especially genome-wide datasets, within an acceptable time period. In addition, successful application on a real genomic dataset confirms the practical applicability of the package. This new software package provides a powerful tool for genomic network reconstruction to the biological community. The software can be accessed at http://www.picb.ac.cn/CMIP/ .
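
    As background for the measure the package is named after, and not a description of CMIP's internal code, conditional mutual information between two genes given a conditioning set is often estimated under a Gaussian assumption from covariance determinants, as in the PCA-CMI family of methods; a minimal sketch with synthetic data:

        import numpy as np

        # Gaussian estimator of conditional mutual information CMI(X; Y | Z),
        #   CMI = 0.5 * ln( |C(X,Z)| * |C(Y,Z)| / ( |C(Z)| * |C(X,Y,Z)| ) ),
        # where |C(.)| is the determinant of the sample covariance of the listed
        # variables. A sketch of the measure only, not CMIP's implementation.

        def gaussian_cmi(x, y, z):
            """x, y: 1-D expression vectors; z: 2-D array (samples x conditioning genes)."""
            def logdet_cov(*cols):
                m = np.column_stack(cols)
                return np.linalg.slogdet(np.cov(m, rowvar=False))[1]
            return 0.5 * (logdet_cov(x, *z.T) + logdet_cov(y, *z.T)
                          - logdet_cov(*z.T) - logdet_cov(x, y, *z.T))

        rng = np.random.default_rng(0)
        z = rng.normal(size=(500, 2))
        x = z[:, 0] + 0.1 * rng.normal(size=500)   # x depends on z only
        y = z[:, 0] + 0.1 * rng.normal(size=500)   # y depends on z only
        print(f"CMI(x; y | z) ~ {gaussian_cmi(x, y, z):.3f}  (near zero: conditionally independent)")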

  1. An Integrated Fuel Depletion Calculator for Fuel Cycle Options Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schneider, Erich; Scopatz, Anthony

    2016-04-25

    Bright-lite is reactor modeling software developed at the University of Texas at Austin to expand upon the work done with the Bright [1] reactor modeling software. Originally, Bright-lite was designed to function as standalone reactor modeling software. However, this aim was refocused to couple Bright-lite with the Cyclus fuel cycle simulator [2], making it a module for the fuel cycle simulator.

  2. Orchid: a novel management, annotation and machine learning framework for analyzing cancer mutations.

    PubMed

    Cario, Clinton L; Witte, John S

    2018-03-15

    As whole-genome tumor sequence and biological annotation datasets grow in size, number and content, there is an increasing basic science and clinical need for efficient and accurate data management and analysis software. With the emergence of increasingly sophisticated data stores, execution environments and machine learning algorithms, there is also a need for the integration of functionality across frameworks. We present orchid, a Python-based software package for the management, annotation and machine learning of cancer mutations. Building on technologies of parallel workflow execution, in-memory database storage and machine learning analytics, orchid efficiently handles millions of mutations and hundreds of features in an easy-to-use manner. We describe the implementation of orchid and demonstrate its ability to distinguish tissue of origin in 12 tumor types based on 339 features using a random forest classifier. Orchid and our annotated tumor mutation database are freely available at https://github.com/wittelab/orchid. Software is implemented in Python 2.7, and makes use of MySQL or MemSQL databases. Groovy 2.4.5 is optionally required for parallel workflow execution. JWitte@ucsf.edu. Supplementary data are available at Bioinformatics online.
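
    As an illustration of the classification step described above, and not of orchid's actual API or feature set, a random forest classifier on a mutation-by-feature matrix might be trained as follows; the data here are synthetic stand-ins.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        # Synthetic stand-in for a (mutations x features) matrix with tissue-of-origin
        # labels; illustrates the classification step only, not orchid's interfaces.
        rng = np.random.default_rng(42)
        n_mutations, n_features, n_tissues = 5000, 50, 12
        X = rng.normal(size=(n_mutations, n_features))
        y = rng.integers(0, n_tissues, size=n_mutations)
        X[np.arange(n_mutations), y % n_features] += 2.0   # inject a weak per-class signal

        clf = RandomForestClassifier(n_estimators=200, n_jobs=-1, random_state=0)
        scores = cross_val_score(clf, X, y, cv=5)
        print(f"5-fold accuracy on synthetic data: {scores.mean():.2f} +/- {scores.std():.2f}")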

  3. Near real-time, on-the-move software PED using VPEF

    NASA Astrophysics Data System (ADS)

    Green, Kevin; Geyer, Chris; Burnette, Chris; Agarwal, Sanjeev; Swett, Bruce; Phan, Chung; Deterline, Diane

    2015-05-01

    The scope of the Micro-Cloud for Operational, Vehicle-Based EO-IR Reconnaissance System (MOVERS) development effort, managed by the Night Vision and Electronic Sensors Directorate (NVESD), is to develop, integrate, and demonstrate new sensor technologies and algorithms that improve improvised device/mine detection using efficient and effective exploitation and fusion of sensor data and target cues from existing and future Route Clearance Package (RCP) sensor systems. Unfortunately, the majority of forward looking Full Motion Video (FMV) and computer vision processing, exploitation, and dissemination (PED) algorithms are often developed using proprietary, incompatible software. This makes the insertion of new algorithms difficult due to the lack of standardized processing chains. In order to overcome these limitations, EOIR developed the Government off-the-shelf (GOTS) Video Processing and Exploitation Framework (VPEF) to be able to provide standardized interfaces (e.g., input/output video formats, sensor metadata, and detected objects) for exploitation software and to rapidly integrate and test computer vision algorithms. EOIR developed a vehicle-based computing framework within the MOVERS and integrated it with VPEF. VPEF was further enhanced for automated processing, detection, and publishing of detections in near real-time, thus improving the efficiency and effectiveness of RCP sensor systems.

  4. The MR-Base platform supports systematic causal inference across the human phenome

    PubMed Central

    Wade, Kaitlin H; Haberland, Valeriia; Baird, Denis; Laurin, Charles; Burgess, Stephen; Bowden, Jack; Langdon, Ryan; Tan, Vanessa Y; Yarmolinsky, James; Shihab, Hashem A; Timpson, Nicholas J; Evans, David M; Relton, Caroline; Martin, Richard M; Davey Smith, George

    2018-01-01

    Results from genome-wide association studies (GWAS) can be used to infer causal relationships between phenotypes, using a strategy known as 2-sample Mendelian randomization (2SMR) and bypassing the need for individual-level data. However, 2SMR methods are evolving rapidly and GWAS results are often insufficiently curated, undermining efficient implementation of the approach. We therefore developed MR-Base (http://www.mrbase.org): a platform that integrates a curated database of complete GWAS results (no restrictions according to statistical significance) with an application programming interface, web app and R packages that automate 2SMR. The software includes several sensitivity analyses for assessing the impact of horizontal pleiotropy and other violations of assumptions. The database currently comprises 11 billion single nucleotide polymorphism-trait associations from 1673 GWAS and is updated on a regular basis. Integrating data with software ensures more rigorous application of hypothesis-driven analyses and allows millions of potential causal relationships to be efficiently evaluated in phenome-wide association studies. PMID:29846171
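
    For orientation, the core 2SMR estimate that such a platform automates can be written in a few lines: an inverse-variance weighted (IVW) combination of per-SNP Wald ratios from summary statistics. The sketch below uses made-up summary data and is not the MR-Base or TwoSampleMR code.

        import numpy as np

        # Inverse-variance weighted (IVW) two-sample MR estimate from GWAS summary
        # statistics: per-SNP Wald ratios beta_Y/beta_X combined with weights
        # beta_X^2 / se_Y^2. Summary data below are made up.

        beta_x = np.array([0.12, 0.08, 0.15, 0.10, 0.09])       # SNP-exposure effects
        beta_y = np.array([0.030, 0.018, 0.040, 0.026, 0.020])  # SNP-outcome effects
        se_y = np.array([0.010, 0.009, 0.012, 0.011, 0.010])    # SEs of outcome effects

        w = beta_x ** 2 / se_y ** 2
        ivw = np.sum(w * (beta_y / beta_x)) / np.sum(w)
        se_ivw = np.sqrt(1.0 / np.sum(w))
        print(f"IVW causal estimate: {ivw:.3f} (SE {se_ivw:.3f})")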

  5. Comparison of four software packages for CT lung volumetry in healthy individuals.

    PubMed

    Nemec, Stefan F; Molinari, Francesco; Dufresne, Valerie; Gosset, Natacha; Silva, Mario; Bankier, Alexander A

    2015-06-01

    To compare CT lung volumetry (CTLV) measurements provided by different software packages, and to provide normative data for lung densitometric measurements in healthy individuals. This retrospective study included 51 chest CTs of 17 volunteers (eight men and nine women; mean age, 30 ± 6 years), who underwent spirometrically monitored CT at total lung capacity (TLC), functional residual capacity (FRC), and mean inspiratory capacity (MIC). Volumetric differences assessed by four commercial software packages were compared with analysis of variance (ANOVA) for repeated measurements and benchmarked against the threshold for acceptable variability between spirometric measurements. Mean lung density (MLD) and parenchymal heterogeneity (MLD-SD) were also compared with ANOVA. Volumetric differences ranged from 12 to 213 ml (0.20 % to 6.45 %). Although 16/18 comparisons (among four software packages at TLC, MIC, and FRC) were statistically significant (P < 0.001 to P = 0.004), only 3/18 comparisons, one at MIC and two at FRC, exceeded the spirometry variability threshold. MLD and MLD-SD significantly increased with decreasing volumes, and were significantly larger in lower compared to upper lobes (P < 0.001). Lung volumetric differences provided by different software packages are small. These differences should not be interpreted based on statistical significance alone, but together with absolute volumetric differences. • Volumetric differences, assessed by different CTLV software, are small but statistically significant. • Volumetric differences are smaller at TLC than at MIC and FRC. • Volumetric differences rarely exceed spirometric repeatability thresholds at MIC and FRC. • Differences between CTLV measurements should be interpreted based on comparison of absolute differences. • MLD increases with decreasing volumes, and is larger in lower compared to upper lobes.

  6. Flight simulation software at NASA Dryden Flight Research Center

    NASA Technical Reports Server (NTRS)

    Norlin, Ken A.

    1995-01-01

    The NASA Dryden Flight Research Center has developed a versatile simulation software package that is applicable to a broad range of fixed-wing aircraft. This package has evolved in support of a variety of flight research programs. The structure is designed to be flexible enough for use in batch-mode, real-time pilot-in-the-loop, and flight hardware-in-the-loop simulation. Current simulations operate on UNIX-based platforms and are coded with a FORTRAN shell and C support routines. This paper discusses the features of the simulation software design and some basic model development techniques. The key capabilities that have been included in the simulation are described. The NASA Dryden simulation software is in use at other NASA centers, within industry, and at several universities. The straightforward but flexible design of this well-validated package makes it especially useful in an engineering environment.

  7. The Ettention software package.

    PubMed

    Dahmen, Tim; Marsalek, Lukas; Marniok, Nico; Turoňová, Beata; Bogachev, Sviatoslav; Trampert, Patrick; Nickels, Stefan; Slusallek, Philipp

    2016-02-01

    We present a novel software package for the problem "reconstruction from projections" in electron microscopy. The Ettention framework consists of a set of modular building-blocks for tomographic reconstruction algorithms. The well-known block iterative reconstruction method based on Kaczmarz algorithm is implemented using these building-blocks, including adaptations specific to electron tomography. Ettention simultaneously features (1) a modular, object-oriented software design, (2) optimized access to high-performance computing (HPC) platforms such as graphic processing units (GPU) or many-core architectures like Xeon Phi, and (3) accessibility to microscopy end-users via integration in the IMOD package and eTomo user interface. We also provide developers with a clean and well-structured application programming interface (API) that allows for extending the software easily and thus makes it an ideal platform for algorithmic research while hiding most of the technical details of high-performance computing. Copyright © 2015 Elsevier B.V. All rights reserved.

  8. PB-AM: An open-source, fully analytical linear poisson-boltzmann solver.

    PubMed

    Felberg, Lisa E; Brookes, David H; Yap, Eng-Hui; Jurrus, Elizabeth; Baker, Nathan A; Head-Gordon, Teresa

    2017-06-05

    We present the open source distributed software package Poisson-Boltzmann Analytical Method (PB-AM), a fully analytical solution to the linearized PB equation, for molecules represented as non-overlapping spherical cavities. The PB-AM software package includes the generation of output files appropriate for visualization using visual molecular dynamics, a Brownian dynamics scheme that uses periodic boundary conditions to simulate dynamics, the ability to specify docking criteria, and offers two different kinetics schemes to evaluate biomolecular association rate constants. Given that PB-AM defines mutual polarization completely and accurately, it can be refactored as a many-body expansion to explore 2- and 3-body polarization. Additionally, the software has been integrated into the Adaptive Poisson-Boltzmann Solver (APBS) software package to make it more accessible to a larger group of scientists, educators, and students that are more familiar with the APBS framework. © 2016 Wiley Periodicals, Inc.
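
    For context, one common statement of the linearized Poisson-Boltzmann equation that PB-AM (and APBS) addresses, written here in Gaussian units with conventions that vary between codes, is

        -\nabla\cdot\bigl[\epsilon(\mathbf{r})\,\nabla\phi(\mathbf{r})\bigr]
        +\bar{\kappa}^{2}(\mathbf{r})\,\phi(\mathbf{r})
        = 4\pi\sum_{i} q_{i}\,\delta(\mathbf{r}-\mathbf{r}_{i}),

    where phi is the electrostatic potential, epsilon(r) the position-dependent dielectric coefficient, kappa-bar^2(r) the modified Debye-Hückel screening term (nonzero only in ion-accessible regions), and the right-hand side the fixed partial charges of the solute. PB-AM solves this equation analytically for molecules represented as non-overlapping spherical cavities, as noted above.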

  9. Geospatial approach towards enumerative analysis of suspended sediment concentration for Ganges-Brahmaputra Bay

    NASA Astrophysics Data System (ADS)

    Pandey, Palak; Kunte, Pravin D.

    2016-10-01

    This study presents an easy, modular, user-friendly, and flexible software package for processing Landsat 7 ETM+ and Landsat 8 OLI-TIRS data to estimate suspended particulate matter concentrations in coastal waters. This package includes 1) an algorithm developed using the freely downloadable SCILAB package, 2) ERDAS models for iterative processing of Landsat images and 3) an ArcMap tool for plotting and map making. Utilizing the SCILAB package, a module is written for geometric corrections, radiometric corrections and obtaining normalized water-leaving reflectance from Landsat 8 OLI-TIRS and Landsat 7 ETM+ data. Using ERDAS models, a sequence of modules is developed for iterative processing of Landsat images and estimating suspended particulate matter concentrations. Processed images are used for preparing suspended sediment concentration maps. The applicability of this software package is demonstrated by estimating and plotting seasonal suspended sediment concentration maps off the Bengal delta. The software is flexible enough to accommodate other remotely sensed data, such as Ocean Colour Monitor (OCM) data, Indian Remote Sensing (IRS) data, MODIS data, etc., by replacing a few parameters in the algorithm, for estimating suspended sediment concentration in coastal waters.
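
    As an example of the radiometric step such a package automates, using the generic rescaling formula from the Landsat 8 product documentation rather than the authors' SCILAB module, digital numbers are converted to top-of-atmosphere reflectance with the band-specific coefficients from the scene metadata and corrected for sun elevation:

        import numpy as np

        # Landsat 8 OLI digital numbers -> top-of-atmosphere (TOA) reflectance,
        # using the standard rescaling rho' = M_rho * Qcal + A_rho, then dividing
        # by sin(sun elevation). Coefficient values below are typical metadata
        # entries, used here only as placeholders.

        def toa_reflectance(dn, mult=2.0e-5, add=-0.1, sun_elev_deg=55.0):
            rho = mult * dn.astype(float) + add
            return rho / np.sin(np.radians(sun_elev_deg))

        dn = np.array([[7500, 8100], [9050, 10200]], dtype=np.uint16)  # toy DN patch
        print(toa_reflectance(dn))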

  10. DOVIS 2.0: an efficient and easy to use parallel virtual screening tool based on AutoDock 4.0.

    PubMed

    Jiang, Xiaohui; Kumar, Kamal; Hu, Xin; Wallqvist, Anders; Reifman, Jaques

    2008-09-08

    Small-molecule docking is an important tool in studying receptor-ligand interactions and in identifying potential drug candidates. Previously, we developed a software tool (DOVIS) to perform large-scale virtual screening of small molecules in parallel on Linux clusters, using AutoDock 3.05 as the docking engine. DOVIS enables the seamless screening of millions of compounds on high-performance computing platforms. In this paper, we report significant advances in the software implementation of DOVIS 2.0, including enhanced screening capability, improved file system efficiency, and extended usability. To keep DOVIS up-to-date, we upgraded the software's docking engine to the more accurate AutoDock 4.0 code. We developed a new parallelization scheme to improve runtime efficiency and modified the AutoDock code to reduce excessive file operations during large-scale virtual screening jobs. We also implemented an algorithm to output docked ligands in an industry standard format, sd-file format, which can be easily interfaced with other modeling programs. Finally, we constructed a wrapper-script interface to enable automatic rescoring of docked ligands by arbitrarily selected third-party scoring programs. The significance of the new DOVIS 2.0 software compared with the previous version lies in its improved performance and usability. The new version makes the computation highly efficient by automating load balancing, significantly reducing excessive file operations by more than 95%, providing outputs that conform to industry standard sd-file format, and providing a general wrapper-script interface for rescoring of docked ligands. The new DOVIS 2.0 package is freely available to the public under the GNU General Public License.

  11. A PC-based computer package for automatic detection and location of earthquakes: Application to a seismic network in eastern Sicily (Italy)

    NASA Astrophysics Data System (ADS)

    Patanè, Domenico; Ferrari, Ferruccio; Giampiccolo, Elisabetta; Gresta, Stefano

    A few automated data acquisition and processing systems operate on mainframes, some run on UNIX-based workstations, and others on personal computers equipped with either DOS/WINDOWS or UNIX-derived operating systems. Several large and complex software packages for automatic and interactive analysis of seismic data have been developed in recent years (mainly for UNIX-based systems). Some of these programs use a variety of artificial intelligence techniques. The first operational version of a new software package, named PC-Seism, for analyzing seismic data from a local network is presented in Patanè et al. (1999). This package, composed of three separate modules, provides an example of a new generation of visual object-oriented programs for interactive and automatic seismic data-processing running on a personal computer. In this work, we mainly discuss the automatic procedures implemented in the ASDP (Automatic Seismic Data-Processing) module and real-time application to data acquired by a seismic network running in eastern Sicily. This software uses a multi-algorithm approach and a new procedure MSA (multi-station-analysis) for signal detection, phase grouping and event identification and location. It is designed for efficient and accurate processing of local earthquake records provided by single-site and array stations. Results from ASDP processing of two different data sets recorded at Mt. Etna volcano by a regional network are analyzed to evaluate its performance. By comparing the ASDP pickings with those revised manually, the detection and subsequently the location capabilities of this software are assessed. The first data set is composed of 330 local earthquakes recorded in the Mt. Etna area during 1997 by the telemetry analog seismic network. The second data set comprises about 970 automatic locations of more than 2600 local events recorded at Mt. Etna during the last eruption (July 2001) by the present network. For the former data set, a comparison of the automatic results with the manual picks indicates that the ASDP module can accurately pick 80% of the P-waves and 65% of the S-waves. The on-line application to the latter data set shows that automatic locations are affected by larger errors, due to the preliminary setting of the configuration parameters in the program. However, both automatic ASDP and manual hypocenter locations are comparable within the estimated error bounds. New improvements of the PC-Seism software for on-line analysis are also discussed.
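
    One widely used building block for this kind of automatic signal detection is the short-term-average/long-term-average (STA/LTA) trigger; the sketch below is a generic illustration of that detector on synthetic data, not the specific multi-algorithm MSA procedure implemented in ASDP. Window lengths and the threshold are illustrative choices.

        import numpy as np

        # Generic STA/LTA trigger on a single-channel waveform: flag samples where
        # the ratio of a trailing short-term average to a trailing long-term average
        # of |amplitude| exceeds a threshold.

        def sta_lta_ratio(trace, sta_len, lta_len):
            env = np.abs(trace)
            csum = np.cumsum(np.insert(env, 0, 0.0))
            sta = (csum[sta_len:] - csum[:-sta_len]) / sta_len   # short windows, one per window end
            lta = (csum[lta_len:] - csum[:-lta_len]) / lta_len   # long windows
            sta = sta[lta_len - sta_len:]                        # align the window end points
            return sta / np.maximum(lta, 1e-12)

        sta_len, lta_len = 50, 500
        rng = np.random.default_rng(3)
        trace = rng.normal(scale=1.0, size=6000)
        trace[3000:3200] += 8.0 * np.sin(np.linspace(0.0, 40.0 * np.pi, 200))  # synthetic arrival

        ratio = sta_lta_ratio(trace, sta_len, lta_len)
        picks = np.flatnonzero(ratio > 3.0) + lta_len - 1        # back to sample indices
        print("first triggered sample:", picks[0] if picks.size else "none")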

  12. Improving Building Energy Simulation Programs Through Diagnostic Testing (Fact Sheet)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    2012-02-01

    New test procedure evaluates quality and accuracy of energy analysis tools for the residential building retrofit market. Reducing the energy use of existing homes in the United States offers significant energy-saving opportunities, which can be identified through building simulation software tools that calculate optimal packages of efficiency measures. To improve the accuracy of energy analysis for residential buildings, the National Renewable Energy Laboratory's (NREL) Buildings Research team developed the Building Energy Simulation Test for Existing Homes (BESTEST-EX), a method for diagnosing and correcting errors in building energy audit software and calibration procedures. BESTEST-EX consists of building physics and utility bill calibration test cases, which software developers can use to compare their tools' simulation findings to reference results generated with state-of-the-art simulation tools. Overall, the BESTEST-EX methodology: (1) tests software predictions of retrofit energy savings in existing homes; (2) ensures building physics calculations and utility bill calibration procedures perform to a minimum standard; and (3) quantifies the impacts of uncertainties in input audit data and occupant behavior. BESTEST-EX is helping software developers identify and correct bugs in their software, as well as develop and test utility bill calibration procedures.
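    At its core, a diagnostic test of this kind compares a tool's predicted retrofit savings against reference ranges produced by state-of-the-art simulations. The sketch below shows only that comparison step; the test-case names and savings ranges are hypothetical placeholders, not published BESTEST-EX values.

```python
# Hypothetical reference ranges (kWh saved per year) for two example test cases.
REFERENCE = {"wall_insulation": (1200.0, 1800.0), "air_sealing": (400.0, 900.0)}

def check_predictions(predicted: dict) -> dict:
    """Flag each test case as pass/fail depending on whether the predicted
    retrofit savings fall inside the reference-tool range."""
    report = {}
    for case, (low, high) in REFERENCE.items():
        value = predicted.get(case)
        report[case] = value is not None and low <= value <= high
    return report

# Example: the second prediction falls outside the assumed reference range.
print(check_predictions({"wall_insulation": 1500.0, "air_sealing": 950.0}))
```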

  13. A Microcomputer-Based Software Package for Eye-Monitoring Research. Technical Report No. 434.

    ERIC Educational Resources Information Center

    McConkie, George W.; And Others

    A software package is described that collects and reduces eye behavior data (eye position and pupil size) using an IBM-PC compatible computer. Written in C language for speed and portability, it includes several features: (1) data can be simultaneously collected from other sources (such as electroencephalography and electromyography); (2)…

  14. Overview of Current Activities in Combustion Instability

    DTIC Science & Technology

    2015-10-02

    and avoid liquid rocket engine combustion stability problems. Approach: (1) develop a state-of-the-art (SOA) combustion stability software package called Stable... Phase II will invest in Multifidelity Tools and Methodologies – CSTD will develop a SOA combustion stability software package called Stable Combustion...

  15. Sigma 2 Graphic Display Software Program Description

    NASA Technical Reports Server (NTRS)

    Johnson, B. T.

    1973-01-01

    A general-purpose, user-oriented graphic support package was implemented. A comprehensive description of the two software components comprising this package is given: the Display Librarian and the Display Controller. These programs have been implemented in FORTRAN on the XDS Sigma 2 Computer Facility. This facility consists of an XDS Sigma 2 general-purpose computer coupled to a Computek Display Terminal.

  16. Physical particularities of nuclear reactors using heavy moderators of neutrons

    NASA Astrophysics Data System (ADS)

    Kulikov, G. G.; Shmelev, A. N.

    2016-12-01

    In nuclear reactors, thermal neutron spectra are formed using moderators with small atomic weights. For fast reactors, inserting such moderators in the core may create problems since they efficiently decelerate the neutrons. In order to form an intermediate neutron spectrum, it is preferable to employ neutron moderators with sufficiently large atomic weights, using 233U as a fissile nuclide and 232Th and 231Pa as fertile ones. The aim of the work is to investigate the properties of heavy neutron moderators and to assess their advantages. The analysis employs the JENDL-4.0 nuclear data library and the SCALE program package for simulating the variation of fuel composition caused by irradiation in the reactor. The following main results are obtained. By using heavy moderators with small neutron moderation steps, one is able to (1) increase the rate of resonance capture, so that the amount of fertile material in the fuel may be reduced while maintaining the breeding factor of the core; (2) use the vacant space for improving the fuel-element properties by adding inert, strong, and thermally conductive materials and by implementing dispersive fuel elements in which the fissile material is self-replenished and neutron multiplication remains stable during the process of fuel burnup; and (3) employ mixtures of different fertile materials with resonance capture cross sections in order to increase the resonance-lattice density and the probability of resonance neutron capture leading to formation of fissile material. The general conclusion is that, by forming an intermediate neutron spectrum with heavy neutron moderators, one can use the fuel more efficiently and improve nuclear safety.

  17. Textbook Software versus Professional Software: Which Is Better for Instructional Purposes?

    ERIC Educational Resources Information Center

    Snell, Meggan; Yatsenko, Olga

    2002-01-01

    Compares textbook software with professional packages such as Peachtree for teaching accounting, in terms of cost, availability, ease of teaching and learning, and applicability. Makes suggestions for choosing accounting software. (SK)

  18. System for real-time generation of georeferenced terrain models

    NASA Astrophysics Data System (ADS)

    Schultz, Howard J.; Hanson, Allen R.; Riseman, Edward M.; Stolle, Frank; Zhu, Zhigang; Hayward, Christopher D.; Slaymaker, Dana

    2001-02-01

    A growing number of law enforcement applications, especially in the areas of border security, drug enforcement and anti- terrorism require high-resolution wide area surveillance from unmanned air vehicles. At the University of Massachusetts we are developing an aerial reconnaissance system capable of generating high resolution, geographically registered terrain models (in the form of a seamless mosaic) in real-time from a single down-looking digital video camera. The efficiency of the processing algorithms, as well as the simplicity of the hardware, will provide the user with the ability to produce and roam through stereoscopic geo-referenced mosaic images in real-time, and to automatically generate highly accurate 3D terrain models offline in a fraction of the time currently required by softcopy conventional photogrammetry systems. The system is organized around a set of integrated sensor and software components. The instrumentation package is comprised of several inexpensive commercial-off-the-shelf components, including a digital video camera, a differential GPS, and a 3-axis heading and reference system. At the heart of the system is a set of software tools for image registration, mosaic generation, geo-location and aircraft state vector recovery. Each process is designed to efficiently handle the data collected by the instrument package. Particular attention is given to minimizing geospatial errors at each stage, as well as modeling propagation of errors through the system. Preliminary results for an urban and forested scene are discussed in detail.

  19. Object-oriented design of medical imaging software.

    PubMed

    Ligier, Y; Ratib, O; Logean, M; Girard, C; Perrier, R; Scherrer, J R

    1994-01-01

    A special software package for interactive display and manipulation of medical images was developed at the University Hospital of Geneva, as part of a hospital wide Picture Archiving and Communication System (PACS). This software package, called Osiris, was especially designed to be easily usable and adaptable to the needs of noncomputer-oriented physicians. The Osiris software has been developed to allow the visualization of medical images obtained from any imaging modality. It provides generic manipulation tools, processing tools, and analysis tools more specific to clinical applications. This software, based on an object-oriented paradigm, is portable and extensible. Osiris is available on two different operating systems: the Unix X-11/OSF-Motif based workstations, and the Macintosh family.

  20. Modelling of diesel engine fuelled with biodiesel using engine simulation software

    NASA Astrophysics Data System (ADS)

    Said, Mohd Farid Muhamad; Said, Mazlan; Aziz, Azhar Abdul

    2012-06-01

    This paper presents the modelling of a diesel engine that operates on biodiesel fuels. The model is used to simulate or predict the performance and combustion of the engine by simplifying the geometry of the engine components in the software. The model is produced using one-dimensional (1D) engine simulation software called GT-Power. The fuel properties library in the software is expanded to include palm-oil-based biodiesel fuels. Experimental work is performed to investigate the effect of biodiesel fuels on the heat release profiles and the engine performance curves. The model is validated with experimental data and good agreement is observed. The simulation results show that combustion characteristics and engine performance differ when biodiesel fuels are used instead of No. 2 diesel fuel.

  1. A data reduction package for multiple object spectroscopy

    NASA Technical Reports Server (NTRS)

    Hill, J. M.; Eisenhamer, J. D.; Silva, D. R.

    1986-01-01

    Experience with fiber-optic spectrometers has demonstrated improvements in observing efficiency for clusters of 30 or more objects that must in turn be matched by data reduction capability increases. The Medusa Automatic Reduction System reduces data generated by multiobject spectrometers in the form of two-dimensional images containing 44 to 66 individual spectra, using both software and hardware improvements to efficiently extract the one-dimensional spectra. Attention is given to the ridge-finding algorithm for automatic location of the spectra in the CCD frame. A simultaneous extraction of calibration frames allows an automatic wavelength calibration routine to determine dispersion curves, and both line measurements and cross-correlation techniques are used to determine galaxy redshifts.

  2. Cleanup Verification Package for the 118-C-1, 105-C Solid Waste Burial Ground

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    M. J. Appel and J. M. Capron

    2007-07-25

    This cleanup verification package documents completion of remedial action for the 118-C-1, 105-C Solid Waste Burial Ground. This waste site was the primary burial ground for general wastes from the operation of the 105-C Reactor and received process tubes, aluminum fuel spacers, control rods, reactor hardware, spent nuclear fuel and soft wastes.

  3. Decon2LS: An open-source software package for automated processing and visualization of high resolution mass spectrometry data.

    PubMed

    Jaitly, Navdeep; Mayampurath, Anoop; Littlefield, Kyle; Adkins, Joshua N; Anderson, Gordon A; Smith, Richard D

    2009-03-17

    Data generated from liquid chromatography coupled to high-resolution mass spectrometry (LC-MS)-based studies of a biological sample can contain large amounts of biologically significant information in the form of proteins, peptides, and metabolites. Interpreting these data involves inferring the masses and abundances of the biomolecules injected into the instrument. Because of the inherent complexity of the mass spectral patterns produced by these biomolecules, the analysis is significantly enhanced by using visualization capabilities to inspect and confirm results. In this paper we describe Decon2LS, an open-source software package for automated processing and visualization of high-resolution MS data. Drawing extensively on algorithms developed over the last ten years for ICR2LS, Decon2LS packages the algorithms as a rich set of modular, reusable processing classes for performing diverse functions such as reading raw data, routine peak finding, theoretical isotope distribution modelling, and deisotoping. Because the source code is openly available, these functionalities can now be used to build derivative applications in a relatively fast manner. In addition, Decon2LS provides an extensive set of visualization tools, such as high-performance chart controls. With a variety of options that include peak processing, deisotoping, and isotope composition, Decon2LS supports processing of multiple raw data formats. Deisotoping can be performed on an individual scan, an individual dataset, or on multiple datasets using batch processing. Other processing options include creating a two-dimensional view of mass and liquid chromatography (LC) elution time features, generating spectrum files for tandem MS data, creating total intensity chromatograms, and visualizing theoretical peptide profiles. Application of Decon2LS to deisotope different datasets obtained across different instruments yielded a high number of features that can be used to identify and quantify peptides in the biological sample. Decon2LS is an efficient software package for discovering and visualizing features in proteomics studies that require automated interpretation of mass spectra. Besides being easy to use, fast, and reliable, Decon2LS is also open source, which allows developers in the proteomics and bioinformatics communities to reuse and refine the algorithms to meet individual needs. Decon2LS source code, installer, and tutorials may be downloaded free of charge at http://ncrr.pnl.gov/software/.
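    One of the routine steps named above, peak finding on a profile spectrum, can be illustrated generically. The sketch uses SciPy's find_peaks on a toy spectrum; it is not the Decon2LS algorithm, and the noise estimate and threshold are simplifying assumptions.

```python
import numpy as np
from scipy.signal import find_peaks

def pick_peaks(mz, intensity, snr=3.0):
    """Generic centroiding step: keep local maxima above a crude noise-based threshold.
    Illustrates only the 'routine peak finding' stage, not Decon2LS's algorithms."""
    noise = np.median(intensity)                      # crude noise/baseline estimate
    idx, _ = find_peaks(intensity, height=snr * noise)
    return mz[idx], intensity[idx]

# Toy spectrum: two Gaussian peaks on a flat unit baseline.
mz = np.linspace(500.0, 510.0, 2000)
signal = (50 * np.exp(-(mz - 502.0) ** 2 / 0.0005)
          + 30 * np.exp(-(mz - 505.0) ** 2 / 0.0005) + 1.0)
print(pick_peaks(mz, signal))
```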

  4. rEHR: An R package for manipulating and analysing Electronic Health Record data.

    PubMed

    Springate, David A; Parisi, Rosa; Olier, Ivan; Reeves, David; Kontopantelis, Evangelos

    2017-01-01

    Research with structured Electronic Health Records (EHRs) is expanding as data become more accessible, analytic methods advance, and the scientific validity of such studies is increasingly accepted. However, data science methodology to enable the rapid searching/extraction, cleaning and analysis of these large, often complex, datasets is less well developed. In addition, commonly used software is inadequate, resulting in bottlenecks in research workflows and in obstacles to increased transparency and reproducibility of the research. Preparing a research-ready dataset from EHRs is a complex and time-consuming task requiring substantial data science skills, even for simple designs. In addition, certain aspects of the workflow are computationally intensive, for example the extraction of longitudinal data and the matching of controls to a large cohort, which may take days or even weeks to run using standard software. The rEHR package simplifies and accelerates the process of extracting ready-for-analysis datasets from EHR databases. It has a simple import function to a database backend that greatly accelerates data access times. A set of generic query functions allows users to extract data efficiently without needing detailed knowledge of SQL queries. Longitudinal data extractions can also be made in a single command, making use of parallel processing. The package also contains functions for cutting data by time-varying covariates, matching controls to cases, unit conversion and construction of clinical code lists. There are also functions to synthesise dummy EHR data. The package has been tested with one of the largest primary care EHRs, the Clinical Practice Research Datalink (CPRD), but allows for a common interface to other EHRs. This simplified and accelerated workflow for EHR data extraction results in simpler, cleaner scripts that are more easily debugged, shared and reproduced.

  5. Highly-optimized TWSM software package for seismic diffraction modeling adapted for GPU-cluster

    NASA Astrophysics Data System (ADS)

    Zyatkov, Nikolay; Ayzenberg, Alena; Aizenberg, Arkady

    2015-04-01

    Oil-producing companies are concerned with increasing the resolution of seismic data for complex oil- and gas-bearing deposits associated with salt domes, basalt traps, reefs, lenses, etc. Known methods of seismic wave theory define the shape of hydrocarbon accumulations with insufficient resolution, since they do not account for multiple diffractions explicitly. We elaborate an alternative seismic wave theory in terms of operators of propagation in layers and reflection-transmission at curved interfaces. An approximation of this theory is realized in the seismic frequency range as the Tip-Wave Superposition Method (TWSM). TWSM, based on the operator theory, allows evaluation of the wavefield in bounded domains/layers with geometrical shadow zones (in nature these can be salt domes, basalt traps, reefs, lenses, etc.), accounting for so-called cascade diffraction. Cascade diffraction includes edge waves from sharp edges, creeping waves near concave parts of interfaces, waves of the whispering galleries near convex parts of interfaces, etc. The basic algorithm of the TWSM package is based on multiplication of large-size matrices (up to hundreds of terabytes in size). We use advanced information technologies for effective realization of the numerical procedures of the TWSM. In particular, we actively use the NVIDIA CUDA technology and GPU accelerators, which significantly improve the performance of the TWSM software package; this is important when using it for direct and inverse problems. The accuracy, stability and efficiency of the algorithm are justified by numerical examples with curved interfaces. The TWSM package and its separate components can be used in different modeling tasks, such as planning of acquisition systems, physical interpretation of laboratory modeling and modeling of individual waves of different types, and in some inverse tasks, such as imaging in the case of a laterally inhomogeneous overburden and AVO inversion.
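    Since the computational core described above reduces to products of very large matrices, the general pattern can be illustrated with a block-wise GPU multiplication. The sketch uses CuPy with a NumPy fallback; it is not the TWSM code, and the matrix sizes and block size are arbitrary placeholders.

```python
# Generic block-wise GPU matrix multiplication, in the spirit of the large-matrix
# core described above; not the TWSM implementation.
import numpy as np

try:
    import cupy as xp          # GPU arrays if a CUDA device and CuPy are available
except ImportError:
    xp = np                    # CPU fallback so the sketch still runs

def blocked_matmul(a, b, block=4096):
    """Multiply a (m x k) by b (k x n) in row blocks to bound device memory use."""
    m, k = a.shape
    k2, n = b.shape
    assert k == k2
    out = xp.empty((m, n), dtype=a.dtype)
    b_dev = xp.asarray(b)                      # keep the smaller factor resident on the device
    for start in range(0, m, block):
        stop = min(start + block, m)
        out[start:stop] = xp.asarray(a[start:stop]) @ b_dev
    return out

a = np.random.rand(8192, 1024).astype(np.float32)
b = np.random.rand(1024, 2048).astype(np.float32)
c = blocked_matmul(a, b)
```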

  6. Optimization of transonic wind tunnel data acquisition and control systems for providing continuous mode tests

    NASA Astrophysics Data System (ADS)

    Petronevich, V. V.

    2016-10-01

    The paper addresses issues related to increasing the efficiency and information content of experimental research in transonic wind tunnels (WT). In particular, questions of optimizing the WT Data Acquisition and Control Systems (DACS) to provide the continuous-mode test method are discussed. The problem of Mach number (M number) stabilization in the test section of large transonic compressor-type wind tunnels at subsonic flow conditions, with continuous change of the aircraft model angle of attack, is considered using the example of the T-128 wind tunnel. To minimize signal distortion in the T-128 DACS measurement channels, the optimal settings of the MGCplus filters in the data acquisition system used in the T-128 wind tunnel to measure loads were determined experimentally. The tests showed good agreement between balance measurements obtained in the pitch/pause and continuous test modes. Balance tests with the pitch/pause and continuous test methods were carried out using the regular data acquisition and control system of the T-128 wind tunnel with the unified software package POTOK. The architecture and functional capabilities of the POTOK software package are described.

  7. Network Meta-Analysis Using R: A Review of Currently Available Automated Packages

    PubMed Central

    Neupane, Binod; Richer, Danielle; Bonner, Ashley Joel; Kibret, Taddele; Beyene, Joseph

    2014-01-01

    Network meta-analysis (NMA) – a statistical technique that allows comparison of multiple treatments in the same meta-analysis simultaneously – has become increasingly popular in the medical literature in recent years. The statistical methodology underpinning this technique and software tools for implementing the methods are evolving. Both commercial and freely available statistical software packages have been developed to facilitate the statistical computations using NMA with varying degrees of functionality and ease of use. This paper aims to introduce the reader to three R packages, namely, gemtc, pcnetmeta, and netmeta, which are freely available software tools implemented in R. Each automates the process of performing NMA so that users can perform the analysis with minimal computational effort. We present, compare and contrast the availability and functionality of different important features of NMA in these three packages so that clinical investigators and researchers can determine which R packages to implement depending on their analysis needs. Four summary tables detailing (i) data input and network plotting, (ii) modeling options, (iii) assumption checking and diagnostic testing, and (iv) inference and reporting tools, are provided, along with an analysis of a previously published dataset to illustrate the outputs available from each package. We demonstrate that each of the three packages provides a useful set of tools, and combined provide users with nearly all functionality that might be desired when conducting a NMA. PMID:25541687

  8. Network meta-analysis using R: a review of currently available automated packages.

    PubMed

    Neupane, Binod; Richer, Danielle; Bonner, Ashley Joel; Kibret, Taddele; Beyene, Joseph

    2014-01-01

    Network meta-analysis (NMA)--a statistical technique that allows comparison of multiple treatments in the same meta-analysis simultaneously--has become increasingly popular in the medical literature in recent years. The statistical methodology underpinning this technique and software tools for implementing the methods are evolving. Both commercial and freely available statistical software packages have been developed to facilitate the statistical computations using NMA with varying degrees of functionality and ease of use. This paper aims to introduce the reader to three R packages, namely, gemtc, pcnetmeta, and netmeta, which are freely available software tools implemented in R. Each automates the process of performing NMA so that users can perform the analysis with minimal computational effort. We present, compare and contrast the availability and functionality of different important features of NMA in these three packages so that clinical investigators and researchers can determine which R packages to implement depending on their analysis needs. Four summary tables detailing (i) data input and network plotting, (ii) modeling options, (iii) assumption checking and diagnostic testing, and (iv) inference and reporting tools, are provided, along with an analysis of a previously published dataset to illustrate the outputs available from each package. We demonstrate that each of the three packages provides a useful set of tools, and combined provide users with nearly all functionality that might be desired when conducting a NMA.

  9. Experimental validation of the DARWIN2.3 package for fuel cycle applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    San-Felice, L.; Eschbach, R.; Bourdot, P.

    2012-07-01

    The DARWIN package, developed by the CEA and its French partners (AREVA and EDF), provides the required parameters for fuel cycle applications: fuel inventory, decay heat, activity, neutron, γ, α and β sources and spectra, and radiotoxicity. This paper presents the DARWIN2.3 experimental validation for fuel inventory and decay heat calculations on Pressurized Water Reactors (PWR). In order to validate this code system for spent fuel inventory, a large program has been undertaken, based on spent fuel chemical assays. This paper deals with the experimental validation of DARWIN2.3 for the Pressurized Water Reactor (PWR) Uranium Oxide (UOX) and Mixed Oxide (MOX) fuel inventory calculation, focused on the isotopes involved in Burn-Up Credit (BUC) applications and decay heat computations. The calculation-to-experiment (C/E-1) discrepancies are calculated with the latest European evaluation file, JEFF-3.1.1, associated with the SHEM energy mesh. An overview of the tendencies is obtained over a complete burn-up range from 10 to 85 GWd/t (10 to 60 GWd/t for MOX fuel). The experimental validation of the DARWIN2.3 package for decay heat calculation is performed using calorimetric measurements carried out at the Swedish Interim Spent Fuel Storage Facility for Pressurized Water Reactor (PWR) assemblies, covering a large burn-up (20 to 50 GWd/t) and cooling time range (10 to 30 years). (authors)

  10. Fast interactive elastic registration of 12-bit multi-spectral images with subvoxel accuracy using display hardware

    NASA Astrophysics Data System (ADS)

    Noordmans, Herke Jan; de Roode, Rowland; Verdaasdonk, Rudolf

    2007-03-01

    Multi-spectral images of human tissue taken in vivo often contain image alignment problems, as patients have difficulty retaining their posture during the acquisition time of 20 seconds. Previous attempts to correct these motion errors used image registration software developed for MR or CT data, but those algorithms have proven too slow and error-prone for practical use with multi-spectral images. A new software package has been developed which allows the user to play a decisive role in the registration process: the user can monitor the progress of the registration continuously and force it in the right direction when it starts to fail. The software efficiently exploits video card hardware to gain speed and to provide a perfect subvoxel correspondence between the registration field and the display. An 8-bit graphics card was used to efficiently register and resample 12-bit images using the hardware interpolation modes present on the card. To show the feasibility of this new registration process, the software was applied in clinical practice to evaluate the dosimetry for psoriasis and KTP laser treatment. The microscopic differences between images of normal skin and skin exposed to UV light proved that an affine registration step, including zooming and slanting, is critical for a subsequent elastic match to succeed. The combination of user-interactive registration software with optimal use of the potential of PC video card hardware greatly improves the speed of multi-spectral image registration.

  11. Fast interactive registration tool for reproducible multi-spectral imaging for wound healing and treatment evaluation

    NASA Astrophysics Data System (ADS)

    Noordmans, Herke J.; de Roode, Rowland; Verdaasdonk, Rudolf

    2007-02-01

    Multi-spectral images of human tissue taken in vivo often contain image alignment problems, as patients have difficulty retaining their posture during the acquisition time of 20 seconds. Previous attempts to correct these motion errors used image registration software developed for MR or CT data, but those algorithms have proven too slow and error-prone for practical use with multi-spectral images. A new software package has been developed which allows the user to play a decisive role in the registration process: the user can monitor the progress of the registration continuously and force it in the right direction when it starts to fail. The software efficiently exploits video card hardware to gain speed and to provide a perfect subvoxel correspondence between the registration field and the display. An 8-bit graphics card was used to efficiently register and resample 12-bit images using the hardware interpolation modes present on the card. To show the feasibility of this new registration process, the software was applied in clinical practice to evaluate the dosimetry for psoriasis and KTP laser treatment. The microscopic differences between images of normal skin and skin exposed to UV light proved that an affine registration step, including zooming and slanting, is critical for a subsequent elastic match to succeed. The combination of user-interactive registration software with optimal use of the potential of PC video card hardware greatly improves the speed of multi-spectral image registration.

  12. The NOD3 software package: A graphical user interface-supported reduction package for single-dish radio continuum and polarisation observations

    NASA Astrophysics Data System (ADS)

    Müller, Peter; Krause, Marita; Beck, Rainer; Schmidt, Philip

    2017-10-01

    Context. The venerable NOD2 data reduction software package for single-dish radio continuum observations, which was developed for use at the 100-m Effelsberg radio telescope, has been successfully applied over many decades. Modern computing facilities, however, call for a new design. Aims: We aim to develop an interactive software tool with a graphical user interface for the reduction of single-dish radio continuum maps. We make a special effort to reduce the distortions along the scanning direction (scanning effects) by combining maps scanned in orthogonal directions or dual- or multiple-horn observations that need to be processed in a restoration procedure. The package should also process polarisation data and offer the possibility to include special tasks written by the individual user. Methods: Based on the ideas of the NOD2 package we developed NOD3, which includes all necessary tasks from the raw maps to the final maps in total intensity and linear polarisation. Furthermore, plot routines and several methods for map analysis are available. The NOD3 package is written in Python, which allows the extension of the package via additional tasks. The required data format for the input maps is FITS. Results: The NOD3 package is a sophisticated tool to process and analyse maps from single-dish observations that are affected by scanning effects from clouds, receiver instabilities, or radio-frequency interference. The "basket-weaving" tool combines orthogonally scanned maps into a final map that is almost free of scanning effects. The new restoration tool for dual-beam observations reduces the noise by a factor of about two compared to the NOD2 version. Combining single-dish with interferometer data in the map plane ensures the full recovery of the total flux density. Conclusions: This software package is available under the open source license GPL for free use at other single-dish radio telescopes of the astronomical community. The NOD3 package is designed to be extendable to multi-channel data represented by data cubes in Stokes I, Q, and U.
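    The Fourier-domain weighting idea behind basket-weaving can be illustrated with a short NumPy sketch: each map is down-weighted at the spatial frequencies where its own scanning stripes concentrate. This is a simplified illustration of the principle only, not the NOD3 implementation; the quadratic weight functions and the equal split of the DC term are assumptions.

```python
import numpy as np

def basket_weave(map_x, map_y):
    """Combine two maps of the same field, scanned along x and along y respectively,
    by down-weighting each map's Fourier components along its own scan direction,
    where striping power concentrates. Simplified illustration only, not NOD3."""
    assert map_x.shape == map_y.shape
    fx = np.fft.fft2(map_x)
    fy = np.fft.fft2(map_y)
    ky, kx = np.meshgrid(np.fft.fftfreq(map_x.shape[0]),
                         np.fft.fftfreq(map_x.shape[1]), indexing="ij")
    k2 = kx ** 2 + ky ** 2
    k2[0, 0] = 1.0                       # avoid division by zero at the mean level
    wx = kx ** 2 / k2                    # suppresses stripes constant along x (power near kx = 0)
    wy = ky ** 2 / k2                    # suppresses stripes constant along y (power near ky = 0)
    wx[0, 0] = wy[0, 0] = 0.5            # share the DC term equally between the two maps
    return np.fft.ifft2(wx * fx + wy * fy).real
```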

  13. Spinoff 2010

    NASA Technical Reports Server (NTRS)

    2010-01-01

    Topics covered include: Burnishing Techniques Strengthen Hip Implants; Signal Processing Methods Monitor Cranial Pressure; Ultraviolet-Blocking Lenses Protect, Enhance Vision; Hyperspectral Systems Increase Imaging Capabilities; Programs Model the Future of Air Traffic Management; Tail Rotor Airfoils Stabilize Helicopters, Reduce Noise; Personal Aircraft Point to the Future of Transportation; Ducted Fan Designs Lead to Potential New Vehicles; Winglets Save Billions of Dollars in Fuel Costs; Sensor Systems Collect Critical Aerodynamics Data; Coatings Extend Life of Engines and Infrastructure; Radiometers Optimize Local Weather Prediction; Energy-Efficient Systems Eliminate Icing Danger for UAVs; Rocket-Powered Parachutes Rescue Entire Planes; Technologies Advance UAVs for Science, Military; Inflatable Antennas Support Emergency Communication; Smart Sensors Assess Structural Health; Hand-Held Devices Detect Explosives and Chemical Agents; Terahertz Tools Advance Imaging for Security, Industry; LED Systems Target Plant Growth; Aerogels Insulate Against Extreme Temperatures; Image Sensors Enhance Camera Technologies; Lightweight Material Patches Allow for Quick Repairs; Nanomaterials Transform Hairstyling Tools; Do-It-Yourself Additives Recharge Auto Air Conditioning; Systems Analyze Water Quality in Real Time; Compact Radiometers Expand Climate Knowledge; Energy Servers Deliver Clean, Affordable Power; Solutions Remediate Contaminated Groundwater; Bacteria Provide Cleanup of Oil Spills, Wastewater; Reflective Coatings Protect People and Animals; Innovative Techniques Simplify Vibration Analysis; Modeling Tools Predict Flow in Fluid Dynamics; Verification Tools Secure Online Shopping, Banking; Toolsets Maintain Health of Complex Systems; Framework Resources Multiply Computing Power; Tools Automate Spacecraft Testing, Operation; GPS Software Packages Deliver Positioning Solutions; Solid-State Recorders Enhance Scientific Data Collection; Computer Models Simulate Fine Particle Dispersion; Composite Sandwich Technologies Lighten Components; Cameras Reveal Elements in the Short Wave Infrared; Deformable Mirrors Correct Optical Distortions; Stitching Techniques Advance Optics Manufacturing; Compact, Robust Chips Integrate Optical Functions; Fuel Cell Stations Automate Processes, Catalyst Testing; Onboard Systems Record Unique Videos of Space Missions; Space Research Results Purify Semiconductor Materials; and Toolkits Control Motion of Complex Robotics.

  14. Evaluation of Open-Source Hard Real Time Software Packages

    NASA Technical Reports Server (NTRS)

    Mattei, Nicholas S.

    2004-01-01

    Reliable software is, at times, hard to find. No piece of software can be guaranteed to work in every situation that may arise during its use here at Glenn Research Center or in space. The job of the Software Assurance (SA) group in the Risk Management Office is to rigorously test the software in an effort to ensure it matches the contract specifications. In some cases the SA team also researches new alternatives for selected software packages. This testing and research is an integral part of the department of Safety and Mission Assurance. Real-time operation, in reference to a computer system, is a particular style of handling the timing and manner in which inputs and outputs are processed. A real-time system executes these commands and the appropriate processing within a defined timing constraint. Within this definition there are two further classifications of real-time systems: hard and soft. A soft real-time system is one in which, if the particular timing constraints are not rigidly met, there will be no critical results. On the other hand, a hard real-time system is one in which, if the timing constraints are not met, the results could be catastrophic. An example of a soft real-time system is a DVD decoder: if a particular piece of data from the input is not decoded and displayed on the screen at exactly the correct moment, nothing critical will come of it; the user may not even notice. However, a hard real-time system is needed to control the timing of fuel injection or steering on the Space Shuttle; a delay of even a fraction of a second could be catastrophic in such a complex system. The current real-time system employed by most NASA projects is Wind River's VxWorks operating system. This is a proprietary operating system that can be configured to work with many of NASA's needs, and it provides very accurate and reliable hard real-time performance. The downside is that, since it is a proprietary operating system, it is also costly to implement. The prospect of replacing this somewhat costly implementation is the focus of one of the SA group's current research projects. The explosion of open source software in the last ten years has led to the development of a multitude of software solutions which were once only produced by major corporations. The benefits of these open projects include faster release and bug-patching cycles as well as inexpensive, if not free, software solutions. The main packages for hard real-time solutions under Linux are the Real Time Application Interface (RTAI) and two varieties of Real Time Linux (RTL), RTLFree and RTLPro. During my time here at NASA I have been testing various hard real-time solutions operating as layers on the Linux operating system. All testing is being run on an Intel SBC 2590, which is a common embedded hardware platform. The test plan was provided to me by the Software Assurance group at the start of my internship, and my job has been to test the systems by developing and executing the test cases on the hardware. These tests are constructed so that the Software Assurance group can get hard test data for a comparison between the open source and proprietary implementations of hard real-time solutions.

  15. Implementation of EAM and FS potentials in HOOMD-blue

    NASA Astrophysics Data System (ADS)

    Yang, Lin; Zhang, Feng; Travesset, Alex; Wang, Caizhuang; Ho, Kaiming

    HOOMD-blue is a general-purpose software package for performing classical molecular dynamics simulations entirely on GPUs. We provide full support for EAM and FS type potentials in HOOMD-blue, and report accuracy and efficiency benchmarks, including comparisons with the LAMMPS GPU package. Two problems were selected to test the accuracy: the determination of the glass transition temperature of a Cu64.5Zr35.5 alloy using an FS potential, and the calculation of pair distribution functions of Ni3Al using an EAM potential. In both cases, the results using HOOMD-blue are indistinguishable from those obtained by the GPU package in LAMMPS within statistical uncertainties. As tests of time efficiency, we benchmark time-steps per second using LAMMPS GPU and HOOMD-blue on one NVIDIA Tesla GPU. Compared to our typical LAMMPS simulations on one CPU cluster node, which has 16 CPUs, LAMMPS GPU can be 3-3.5 times faster, and HOOMD-blue can be 4-5.5 times faster. We acknowledge the support from Laboratory Directed Research and Development (LDRD) of Ames Laboratory.

  16. Using Kepler for Tool Integration in Microarray Analysis Workflows.

    PubMed

    Gan, Zhuohui; Stowe, Jennifer C; Altintas, Ilkay; McCulloch, Andrew D; Zambon, Alexander C

    Increasing numbers of genomic technologies are leading to massive amounts of genomic data, all of which require complex analysis. More and more bioinformatics analysis tools are being developed by scientists to simplify these analyses. However, different pipelines have been developed using different software environments, which makes integration of these diverse bioinformatics tools difficult. Kepler provides an open source environment to integrate these disparate packages. Using Kepler, we integrated several external tools, including Bioconductor packages, AltAnalyze (a Python-based open source tool), and an R-based comparison tool, to build an automated workflow to meta-analyze both online and local microarray data. The automated workflow connects the integrated tools seamlessly, delivers data flow between the tools smoothly, and hence improves the efficiency and accuracy of complex data analyses. Our workflow exemplifies the usage of Kepler as a scientific workflow platform for bioinformatics pipelines.

  17. Gro2mat: a package to efficiently read gromacs output in MATLAB.

    PubMed

    Dien, Hung; Deane, Charlotte M; Knapp, Bernhard

    2014-07-30

    Molecular dynamics (MD) simulations are a state-of-the-art computational method used to investigate molecular interactions at atomic scale. Interaction processes out of experimental reach can be monitored using MD software such as Gromacs. Here, we present the gro2mat package that allows fast and easy access to Gromacs output files from Matlab. Gro2mat enables direct parsing of the most common Gromacs output formats, including the binary xtc format, for which no openly available Matlab parser currently exists. The xtc reader is orders of magnitude faster than other available pdb/ascii workarounds. Gro2mat is especially useful for scientists with an interest in quick prototyping of new mathematical and statistical approaches for Gromacs trajectory analyses. © 2014 Wiley Periodicals, Inc.

  18. JT8D-15/17 High Pressure Turbine Root Discharged Blade Performance Improvement. [engine design

    NASA Technical Reports Server (NTRS)

    Janus, A. S.

    1981-01-01

    The JT8D high pressure turbine blade and seal were modified, using a more efficient blade cooling system, improved airfoil aerodynamics, more effective control of secondary flows, and improved blade tip sealing. Engine testing was conducted to determine the effect of these improvements on performance. The modified turbine package demonstrated significant thrust specific fuel consumption and exhaust gas temperature improvements in sea level and altitude engine tests. Inspection of the improved blade and seal hardware after testing revealed no unusual wear or degradation.

  19. Automated data collection in single particle electron microscopy

    PubMed Central

    Tan, Yong Zi; Cheng, Anchi; Potter, Clinton S.; Carragher, Bridget

    2016-01-01

    Automated data collection is an integral part of modern workflows in single particle electron microscopy (EM) research. This review surveys the software packages available for automated single particle EM data collection. The degree of automation at each stage of data collection is evaluated, and the capabilities of the software packages are described. Finally, future trends in automation are discussed. PMID:26671944

  20. Defense AT and L. Volume 42, Number 1

    DTIC Science & Technology

    2013-02-01

    Agnish. The U.S. Army late last year began equipping brigade combat teams with its first package of radios, satellite systems, software applications... Army's first package of radios, satellite systems, software applications, smartphone-like devices, and other network components that provide integrated... satellite communications, intelligence, mission command applications, and the integration of C4ISR equipment onto various vehicle platforms. This...

  1. Increasing the Number of Replications in Item Response Theory Simulations: Automation through SAS and Disk Operating System

    ERIC Educational Resources Information Center

    Gagne, Phill; Furlow, Carolyn; Ross, Terris

    2009-01-01

    In item response theory (IRT) simulation research, it is often necessary to use one software package for data generation and a second software package to conduct the IRT analysis. Because this can substantially slow down the simulation process, it is sometimes offered as a justification for using very few replications. This article provides…
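    The kind of automation described in the record, looping over replications and driving a data-generation program and an IRT-analysis program in turn, is easy to sketch. The original work used SAS and DOS batch files; the sketch below uses Python instead, and the executable names, flags, and file naming scheme are hypothetical.

```python
# Hypothetical batch driver: generate data and run the IRT analysis for each replication.
import subprocess

N_REPLICATIONS = 1000

for rep in range(1, N_REPLICATIONS + 1):
    data_file = f"rep_{rep:04d}.dat"
    # Call the (assumed) data-generation program with a replication-specific seed.
    subprocess.run(["generate_data", "--seed", str(rep), "--out", data_file], check=True)
    # Feed the generated data to the (assumed) IRT-analysis program.
    subprocess.run(["irt_analysis", "--in", data_file, "--out", f"rep_{rep:04d}.res"], check=True)
```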

  2. Software for Teaching about AIDS & Sex: A Critical Review of Products. A MicroSIFT Report.

    ERIC Educational Resources Information Center

    Weaver, Dave

    This document contains critical reviews of 10 microcomputer software packages and two interactive videodisc products designed for use in teaching about Acquired Immune Deficiency Syndrome (AIDS) and sex at the secondary school level and above. Each package was reviewed by one or two secondary school health teachers and by a staff member from the…

  3. User's manual for the VAX-Gerber link software package. Revision 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Isobe, G.W.

    1985-10-01

    This manual provides the user with the information necessary to run the VAX-Gerber link software package. It is expected that the user already knows how to log in to the VAX and is familiar with the Gerber Photo Plotter. It is also highly desirable that the user be familiar with EDT, the full-screen editor on the VAX.

  4. cit: hypothesis testing software for mediation analysis in genomic applications.

    PubMed

    Millstein, Joshua; Chen, Gary K; Breton, Carrie V

    2016-08-01

    The challenges of successfully applying causal inference methods include: (i) satisfying underlying assumptions, (ii) limitations in data/models accommodated by the software and (iii) low power of common multiple testing approaches. The causal inference test (CIT) is based on hypothesis testing rather than estimation, allowing the testable assumptions to be evaluated in the determination of statistical significance. A user-friendly software package provides P-values and optionally permutation-based FDR estimates (q-values) for potential mediators. It can handle single and multiple binary and continuous instrumental variables, binary or continuous outcome variables and adjustment covariates. Also, the permutation-based FDR option provides a non-parametric implementation. Simulation studies demonstrate the validity of the cit package and show a substantial advantage of permutation-based FDR over other common multiple testing strategies. The cit open-source R package is freely available from the CRAN website (https://cran.r-project.org/web/packages/cit/index.html) with embedded C++ code that utilizes the GNU Scientific Library, also freely available (http://www.gnu.org/software/gsl/). joshua.millstein@usc.edu Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  5. Design Evolution Study - Aging Options

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    P. McDaniel

    The purpose of this study is to identify options and issues for aging commercial spent nuclear fuel received for disposal at the Yucca Mountain Mined Geologic Repository. Some early shipments of commercial spent nuclear fuel to the repository may be received with high-heat-output (younger) fuel assemblies that will need to be managed to meet thermal goals for emplacement. The capability to age as much as 40,000 metric tons of heavy metal of commercial spent nuclear fuel would provide more flexibility in the design to manage this younger fuel and to decouple waste receipt and waste emplacement. The following potential aging location options are evaluated: (1) surface aging at four locations near the North Portal; (2) subsurface aging in the permanent emplacement drifts; and (3) subsurface aging in a new subsurface area. The following aging container options are evaluated: (1) the complete waste package; (2) the stainless steel inner liner of the waste package; (3) dual-purpose canisters; (4) multi-purpose canisters; and (5) a new disposable canister for uncanistered commercial spent nuclear fuel. Each option is compared to a "Base Case," which is the expected normal waste packaging process without aging. A Value Engineering approach is used to score each option against nine technical criteria and rank the options. Open issues with each of the options and suggested future actions are also presented. Costs for aging containers and aging locations are evaluated separately. Capital costs are developed for direct costs and distributable field costs. To the extent practical, unit costs are presented. Indirect costs, operating costs, and total system life cycle costs will be evaluated outside of this study. Three recommendations for aging commercial spent nuclear fuel (subsurface, surface, and combined surface and subsurface) are presented for further review in the overall design re-evaluation effort. Options that were evaluated but not recommended are: subsurface aging in a new subsurface area (high cost); surface aging in the complete waste package (risk to the waste package and impact on the Waste Handling Facility); and aging in the stainless steel liner (impact on the waste package design and new high-risk operations added to the waste packaging process). The selection of a design basis for aging will be made in conjunction with the other design re-evaluation studies.

  6. Impact of flight systems integration on future aircraft design

    NASA Technical Reports Server (NTRS)

    Hood, R. V.; Dollyhigh, S. M.; Newsom, J. R.

    1984-01-01

    Integration trends in aircraft are discussed with an eye to their manifestations in future aircraft designs through interdisciplinary technology integration. Current practices use software changes or small hardware fixes to solve problems late in the design process, e.g., low static stability to upgrade fuel efficiency. A total energy control system has been devised to integrate autopilot and autothrottle functions, thereby eliminating hardware; reducing software, pilot workload, and cost; and improving flight efficiency and performance. Integrated active controls offer reduced weight and larger payloads for transport aircraft. The introduction of vectored thrust may eliminate horizontal and vertical stabilizers, and location of the thrust at the vehicle center of gravity can provide vertical takeoff and landing capabilities. It is suggested that further efforts will open a new discipline, aeroservoelasticity, and that tests will become multidisciplinary, involving controls, aerodynamics, propulsion and structures.

  7. Software engineering and data management for automated payload experiment tool

    NASA Technical Reports Server (NTRS)

    Maddux, Gary A.; Provancha, Anna; Chattam, David

    1994-01-01

    The Microgravity Projects Office identified a need to develop a software package that will lead experiment developers through the development planning process, obtain necessary information, establish an electronic data exchange avenue, and allow easier manipulation/reformatting of the collected information. An MS-DOS compatible software package called the Automated Payload Experiment Tool (APET) has been developed and delivered. The objective of this task is to expand on the results of the APET work previously performed by University of Alabama in Huntsville (UAH) and provide versions of the software in a Macintosh and Windows compatible format. Appendix 1 science requirements document (SRD) Users Manual is attached.

  8. PsyToolkit: a software package for programming psychological experiments using Linux.

    PubMed

    Stoet, Gijsbert

    2010-11-01

    PsyToolkit is a set of software tools for programming psychological experiments on Linux computers. Given that PsyToolkit is freely available under the Gnu Public License, open source, and designed such that it can easily be modified and extended for individual needs, it is suitable not only for technically oriented Linux users, but also for students, researchers on small budgets, and universities in developing countries. The software includes a high-level scripting language, a library for the programming language C, and a questionnaire presenter. The software easily integrates with other open source tools, such as the statistical software package R. PsyToolkit is designed to work with external hardware (including IoLab and Cedrus response keyboards and two common digital input/output boards) and to support millisecond timing precision. Four in-depth examples explain the basic functionality of PsyToolkit. Example 1 demonstrates a stimulus-response compatibility experiment. Example 2 demonstrates a novel mouse-controlled visual search experiment. Example 3 shows how to control light emitting diodes using PsyToolkit, and Example 4 shows how to build a light-detection sensor. The last two examples explain the electronic hardware setup such that they can even be used with other software packages.

  9. Modular modeling system for building distributed hydrologic models with a user-friendly software package

    NASA Astrophysics Data System (ADS)

    Wi, S.; Ray, P. A.; Brown, C.

    2015-12-01

    A software package developed to facilitate building distributed hydrologic models in a modular modeling system is presented. The software package provides a user-friendly graphical user interface that eases its practical use in water resources-related research and practice. The modular modeling system organizes the options available to users when assembling models according to the stages of hydrological cycle, such as potential evapotranspiration, soil moisture accounting, and snow/glacier melting processes. The software is intended to be a comprehensive tool that simplifies the task of developing, calibrating, validating, and using hydrologic models through the inclusion of intelligent automation to minimize user effort, and reduce opportunities for error. Processes so far automated include the definition of system boundaries (i.e., watershed delineation), climate and geographical input generation, and parameter calibration. Built-in post-processing toolkits greatly improve the functionality of the software as a decision support tool for water resources system management and planning. Example post-processing toolkits enable streamflow simulation at ungauged sites with predefined model parameters, and perform climate change risk assessment by means of the decision scaling approach. The software is validated through application to watersheds representing a variety of hydrologic regimes.
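    The modular idea described above, where each stage of the hydrological cycle is a swappable component behind a common interface, can be illustrated with a toy sketch. The component names, equations, and state variables below are simplified placeholders, not the package's actual modules.

```python
# Toy illustration of a modular modeling system: components share a step() interface
# and a model is assembled by chaining them over the forcing series.
from dataclasses import dataclass

@dataclass
class TemperatureIndexPET:
    coefficient: float = 0.55
    def step(self, state, forcing):
        state["pet"] = self.coefficient * max(forcing["temp"], 0.0)  # crude temperature index
        return state

@dataclass
class BucketSoil:
    capacity: float = 150.0
    def step(self, state, forcing):
        storage = state.get("soil", 0.0) + forcing["precip"] - state.get("pet", 0.0)
        storage = max(storage, 0.0)
        state["runoff"] = max(storage - self.capacity, 0.0)
        state["soil"] = min(storage, self.capacity)
        return state

def run_model(components, forcings):
    state, flows = {}, []
    for forcing in forcings:
        for component in components:
            state = component.step(state, forcing)
        flows.append(state.get("runoff", 0.0))
    return flows

print(run_model([TemperatureIndexPET(), BucketSoil()],
                [{"temp": 12.0, "precip": 5.0}, {"temp": 15.0, "precip": 40.0}]))
```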

  10. visCOS: An R-package to evaluate model performance of hydrological models

    NASA Astrophysics Data System (ADS)

    Klotz, Daniel; Herrnegger, Mathew; Wesemann, Johannes; Schulz, Karsten

    2016-04-01

    The evaluation of model performance is a central part of (hydrological) modelling. Much attention has been given to the development of evaluation criteria and diagnostic frameworks (Klemeš, 1986; Gupta et al., 2008; among many others). Nevertheless, many applications exist for which objective functions do not yet provide satisfying summaries. Thus, the necessity to visualize results arises in order to explore a wider range of model capacities, be it strengths or deficiencies. Visualizations are usually devised for specific projects and these efforts are often not distributed to a broader community (e.g. via open source software packages). Hence, the opportunity to explicitly discuss a state-of-the-art presentation technique is often missed. We therefore present a comprehensive R-package for evaluating model performance by visualizing and exploring different aspects of hydrological time-series. The presented package comprises a set of useful plots and visualization methods, which complement existing packages, such as hydroGOF (Zambrano-Bigiarini et al., 2012). It is derived from practical applications of the hydrological models COSERO and COSEROreg (Kling et al., 2014). visCOS, providing an interface in R, represents an easy-to-use software package for visualizing and assessing model performance and can be implemented in the process of model calibration or model development. The package provides functions to load hydrological data into R, clean the data, process, visualize, explore and finally save the results in a consistent way. Together with an interactive zoom function of the time series, an online calculation of the objective functions for variable time-windows is included. Common hydrological objective functions, such as the Nash-Sutcliffe Efficiency and the Kling-Gupta Efficiency, can also be evaluated and visualized in different ways for defined sub-periods like hydrological years or seasonal sections. Many hydrologists use long-term water balances as a pivotal tool in model evaluation. They allow inferences about different systematic model shortcomings and are an efficient way for communicating these in practice (Schulz et al., 2015). The evaluation and construction of such water balances is implemented with the presented package. During the (manual) calibration of a model or in the scope of model development, many model runs and iterations are necessary. Thus, users are often interested in comparing different model results in a visual way in order to learn about the model and to analyse parameter changes on the output. A method to illuminate these differences and the evolution of changes is also included.
    References:
    • Gupta, H.V.; Wagener, T.; Liu, Y. (2008): Reconciling theory with observations: elements of a diagnostic approach to model evaluation, Hydrol. Process. 22, doi: 10.1002/hyp.6989.
    • Klemeš, V. (1986): Operational testing of hydrological simulation models, Hydrolog. Sci. J., doi: 10.1080/02626668609491024.
    • Kling, H.; Stanzel, P.; Fuchs, M.; Nachtnebel, H. P. (2014): Performance of the COSERO precipitation-runoff model under non-stationary conditions in basins with different climates, Hydrolog. Sci. J., doi: 10.1080/02626667.2014.959956.
    • Schulz, K.; Herrnegger, M.; Wesemann, J.; Klotz, D.; Senoner, T. (2015): Kalibrierung COSERO - Mur für Pro Vis, Verbund Trading GmbH (Abteilung STG), final report, Institute of Water Management, Hydrology and Hydraulic Engineering, University of Natural Resources and Applied Life Sciences, Vienna, Austria, 217 pp.
    • Zambrano-Bigiarini, M.; Bellin, A. (2010): Comparing Goodness-of-fit Measures for Calibration of Models Focused on Extreme Events, European Geosciences Union (EGU), Geophysical Research Abstracts 14, EGU2012-11549-1.
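    The two objective functions named in the abstract have compact standard definitions, shown below in a small Python sketch. The package itself is written in R; this is only an illustration of the formulas, not the visCOS API.

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe Efficiency: 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def kge(obs, sim):
    """Kling-Gupta Efficiency: 1 - sqrt((r - 1)^2 + (alpha - 1)^2 + (beta - 1)^2)."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    r = np.corrcoef(obs, sim)[0, 1]      # linear correlation
    alpha = sim.std() / obs.std()        # variability ratio
    beta = sim.mean() / obs.mean()       # bias ratio
    return 1.0 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)

obs = np.array([2.1, 3.4, 8.9, 5.2, 3.3])
sim = np.array([2.4, 3.1, 7.8, 5.9, 3.0])
print(nse(obs, sim), kge(obs, sim))
```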

  11. Fundamental Researches on the High-speed and High-efficiency Steelmaking Reaction

    NASA Astrophysics Data System (ADS)

    Kitamura, Shin-ya; Shibata, Hiroyuki; Maruoka, Nobuhiro

    2012-06-01

    Traditionally, steelmaking reactions have been analyzed by thermodynamics. Recently, software packages that can be used to calculate the equilibrium conditions have improved greatly, and in some cases the information obtained with this software is useful for analyzing steelmaking reactions. On the other hand, in industrial operation, steelmaking reactions, i.e., decarburization, dephosphorization, desulfurization or nitrogen removal, do not reach the equilibrium condition. Therefore, kinetic models are very important for gaining a theoretical understanding of the steelmaking reaction. In this paper, the following recent research activities are presented: (1) mass transfer of impurities between solid oxide and liquid slag; (2) a simulation model of hot metal dephosphorization by multiphase slag; (3) evaluation of the reaction rate at the bath surface in a gas-liquid reaction system; and (4) the conditions for formation of a metal emulsion by bottom bubbling.

  12. Software package for modeling spin–orbit motion in storage rings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zyuzin, D. V., E-mail: d.zyuzin@fz-juelich.de

    2015-12-15

    A software package providing a graphical user interface for computer experiments on the motion of charged particle beams in accelerators, as well as analysis of obtained data, is presented. The software package was tested in the framework of the international project on electric dipole moment measurement JEDI (Jülich Electric Dipole moment Investigations). The specific features of particle spin motion imply the requirement to use a cyclic accelerator (storage ring) consisting of electrostatic elements, which makes it possible to preserve horizontal polarization for a long time. Computer experiments study the dynamics of 10⁶–10⁹ particles in a beam during 10⁹ turns in an accelerator (about 10¹²–10¹⁵ integration steps for the equations of motion). For designing an optimal accelerator structure, a large number of computer experiments on polarized beam dynamics are required. The numerical core of the package is COSY Infinity, a program for modeling spin–orbit dynamics.

  13. SEMModComp: An R Package for Calculating Likelihood Ratio Tests for Mean and Covariance Structure Models

    ERIC Educational Resources Information Center

    Levy, Roy

    2010-01-01

    SEMModComp, a software package for conducting likelihood ratio tests for mean and covariance structure modeling, is described. The package is written in R and is freely available for download or on request.
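
    As a reminder of the statistic that such comparisons rest on, the following minimal sketch (generic Python/SciPy, not the SEMModComp API; the log-likelihoods and parameter counts are invented) performs a chi-square difference test between two nested models:

        from scipy.stats import chi2

        def lr_test(loglik_restricted, loglik_full, npar_restricted, npar_full):
            """Chi-square difference test of a nested (restricted) model against a
            fuller model; npar_* are the numbers of freely estimated parameters."""
            stat = -2.0 * (loglik_restricted - loglik_full)   # LR test statistic
            df = npar_full - npar_restricted                  # difference in parameters
            return stat, df, chi2.sf(stat, df)                # statistic, df, p-value

        # Invented log-likelihoods and parameter counts, for illustration only.
        stat, df, p = lr_test(loglik_restricted=-1523.4, loglik_full=-1517.9,
                              npar_restricted=12, npar_full=15)
        print(f"LR = {stat:.2f} on {df} df, p = {p:.4f}")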

  14. The design, deployment, and testing of kriging models in GEOframe with SIK-0.9.8

    NASA Astrophysics Data System (ADS)

    Bancheri, Marialaura; Serafin, Francesco; Bottazzi, Michele; Abera, Wuletawu; Formetta, Giuseppe; Rigon, Riccardo

    2018-06-01

    This work presents a software package for the interpolation of climatological variables, such as temperature and precipitation, using kriging techniques. The purposes of the paper are (1) to present geostatistical software that is easy to use and easy to plug in to a hydrological model; (2) to provide a practical example of carefully designed software from the perspective of reproducible research; and (3) to demonstrate the quality of the software's results, so that it offers a reliable alternative to other, more traditional tools. A total of 11 types of theoretical semivariograms and four types of kriging were implemented and gathered into Object Modeling System-compliant components. The package provides real-time optimization of semivariogram and kriging parameters. The software was tested using a year of hourly temperature readings and an 11 h rainstorm event, recorded in 2008 and retrieved from 97 meteorological stations in the Isarco River basin, Italy. For both variables, good interpolation results were obtained and compared with results from the R package gstat.
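
    As a rough illustration of the computation such components perform (this is not the Object Modeling System code described in the record, and the exponential semivariogram and its parameters are arbitrary assumptions), an ordinary-kriging prediction at a single target point can be sketched as follows:

        import numpy as np

        def exp_semivariogram(h, nugget=0.0, psill=1.0, range_=10.0):
            """One common exponential semivariogram model; parameter values are arbitrary."""
            return np.where(h > 0.0, nugget + psill * (1.0 - np.exp(-h / range_)), 0.0)

        def ordinary_kriging(stations, values, target, gamma=exp_semivariogram):
            """Ordinary kriging prediction at one target point.
            stations: (n, 2) coordinates, values: (n,) observations, target: (2,)."""
            n = len(values)
            d = np.linalg.norm(stations[:, None, :] - stations[None, :, :], axis=-1)
            A = np.empty((n + 1, n + 1))
            A[:n, :n] = gamma(d)            # semivariances between stations
            A[:n, n] = A[n, :n] = 1.0       # unbiasedness constraint (Lagrange row/column)
            A[n, n] = 0.0
            b = np.empty(n + 1)
            b[:n] = gamma(np.linalg.norm(stations - target, axis=1))
            b[n] = 1.0
            w = np.linalg.solve(A, b)       # kriging weights plus Lagrange multiplier
            return float(w[:n] @ values)

        # Toy data: four temperature stations and one interpolation point.
        stations = np.array([[0.0, 0.0], [0.0, 5.0], [5.0, 0.0], [5.0, 5.0]])
        temps = np.array([10.2, 9.6, 11.1, 10.4])
        print(ordinary_kriging(stations, temps, np.array([2.0, 2.0])))

    The weights sum to one by construction, so the prediction is an unbiased weighted average of the station values.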

  15. Advanced Software Development Workstation Project

    NASA Technical Reports Server (NTRS)

    Lee, Daniel

    1989-01-01

    The Advanced Software Development Workstation Project, funded by Johnson Space Center, is investigating knowledge-based techniques for software reuse in NASA software development projects. Two prototypes have been demonstrated and a third is now in development. The approach is to build a foundation that provides passive reuse support, add a layer that uses domain-independent programming knowledge, add a layer that supports the acquisition of domain-specific programming knowledge to provide active support, and enhance maintainability and modifiability through an object-oriented approach. The development of new application software would use specification-by-reformulation, based on a cognitive theory of retrieval from very long-term memory in humans, and using an Ada code library and an object base. Current tasks include enhancements to the knowledge representation of Ada packages and abstract data types, extensions to support Ada package instantiation knowledge acquisition, integration with Ada compilers and relational databases, enhancements to the graphical user interface, and demonstration of the system with a NASA contractor-developed trajectory simulation package. Future work will focus on investigating issues involving scale-up and integration.

  16. IBM Watson Analytics: Automating Visualization, Descriptive, and Predictive Statistics

    PubMed Central

    2016-01-01

    Background: We live in an era of explosive data generation that will continue to grow and involve all industries. One of the results of this explosion is the need for newer and more efficient data analytics procedures. Traditionally, data analytics required a substantial background in statistics and computer science. In 2015, International Business Machines Corporation (IBM) released the IBM Watson Analytics (IBMWA) software that delivered advanced statistical procedures based on the Statistical Package for the Social Sciences (SPSS). The latest entry of Watson Analytics into the field of analytical software products provides users with enhanced functions that are not available in many existing programs. For example, Watson Analytics automatically analyzes datasets, examines data quality, and determines the optimal statistical approach. Users can request exploratory, predictive, and visual analytics. Using natural language processing (NLP), users are able to submit additional questions for analyses in a quick response format. This analytical package is available free to academic institutions (faculty and students) that plan to use the tools for noncommercial purposes. Objective: To report the features of IBMWA and discuss how this software subjectively and objectively compares to other data mining programs. Methods: The salient features of the IBMWA program were examined and compared with other common analytical platforms, using validated health datasets. Results: Using a validated dataset, IBMWA delivered similar predictions compared with several commercial and open source data mining software applications. The visual analytics generated by IBMWA were similar to results from programs such as Microsoft Excel and Tableau Software. In addition, assistance with data preprocessing and data exploration was an inherent component of the IBMWA application. Sensitivity and specificity were not included in the IBMWA predictive analytics results, nor were odds ratios, confidence intervals, or a confusion matrix. Conclusions: IBMWA is a new alternative for data analytics software that automates descriptive, predictive, and visual analytics. This program is very user-friendly but requires data preprocessing, statistical conceptual understanding, and domain expertise. PMID:27729304
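
    For readers comparing tools, the classification metrics reported as absent from the IBMWA output are straightforward to compute from a 2x2 confusion matrix. The short sketch below uses invented counts purely for illustration and is not part of any of the programs discussed:

        def confusion_metrics(tp, fp, fn, tn):
            """Sensitivity, specificity and diagnostic odds ratio from a 2x2 confusion matrix."""
            sensitivity = tp / (tp + fn)         # true positive rate (recall)
            specificity = tn / (tn + fp)         # true negative rate
            odds_ratio = (tp * tn) / (fp * fn)   # diagnostic odds ratio
            return sensitivity, specificity, odds_ratio

        # Invented counts for a hypothetical binary classifier.
        sens, spec, dor = confusion_metrics(tp=45, fp=10, fn=5, tn=40)
        print(f"sensitivity = {sens:.2f}, specificity = {spec:.2f}, odds ratio = {dor:.1f}")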

  17. IBM Watson Analytics: Automating Visualization, Descriptive, and Predictive Statistics.

    PubMed

    Hoyt, Robert Eugene; Snider, Dallas; Thompson, Carla; Mantravadi, Sarita

    2016-10-11

    We live in an era of explosive data generation that will continue to grow and involve all industries. One of the results of this explosion is the need for newer and more efficient data analytics procedures. Traditionally, data analytics required a substantial background in statistics and computer science. In 2015, International Business Machines Corporation (IBM) released the IBM Watson Analytics (IBMWA) software that delivered advanced statistical procedures based on the Statistical Package for the Social Sciences (SPSS). The latest entry of Watson Analytics into the field of analytical software products provides users with enhanced functions that are not available in many existing programs. For example, Watson Analytics automatically analyzes datasets, examines data quality, and determines the optimal statistical approach. Users can request exploratory, predictive, and visual analytics. Using natural language processing (NLP), users are able to submit additional questions for analyses in a quick response format. This analytical package is available free to academic institutions (faculty and students) that plan to use the tools for noncommercial purposes. The objective was to report the features of IBMWA and to discuss how this software subjectively and objectively compares to other data mining programs. The salient features of the IBMWA program were examined and compared with other common analytical platforms, using validated health datasets. Using a validated dataset, IBMWA delivered similar predictions compared with several commercial and open source data mining software applications. The visual analytics generated by IBMWA were similar to results from programs such as Microsoft Excel and Tableau Software. In addition, assistance with data preprocessing and data exploration was an inherent component of the IBMWA application. Sensitivity and specificity were not included in the IBMWA predictive analytics results, nor were odds ratios, confidence intervals, or a confusion matrix. IBMWA is a new alternative for data analytics software that automates descriptive, predictive, and visual analytics. This program is very user-friendly but requires data preprocessing, statistical conceptual understanding, and domain expertise.

  18. CFD-Modeling of the Multistage Gasifier Capacity of 30 KW

    NASA Astrophysics Data System (ADS)

    Levin, A. A.; Kozlov, A. N.; Svishchev, D. A.; Donskoy, I. G.

    2017-11-01

    Single-stage fuel gasification processes were developed and widely studied in Russia and abroad throughout the 20th century, and they remain fundamental to the creation and design of modern gas-generator equipment. Many studies have shown, however, that single-stage gasification processes have already reached the limit of their refinement, so that significant further improvement of their performance is impossible or unprofitable. Multistage gasification technologies meet modern technical requirements most fully. In the first stage of the process, allothermal pyrolysis of the biomass is organized using the heat of the exhaust gas of the power plant; at this stage the volatile products (gas and tar) are released from the fuel. In the second stage, the tar is decomposed in the fuel layer by the action of hot air and steam, and the resulting steam-gas mixture reacts further with the charcoal in the third stage. The paper presents a model, developed by the authors, of a multistage gasifier for wood chips. The model was built with the CFD software package COMSOL Multiphysics. To describe the kinetics of wood pyrolysis and charcoal gasification, studies were carried out using simultaneous thermal analysis. Original methods for interpreting these measurements were developed, including methods for the technical analysis of fuels and for determining the parameters of the detailed kinetics and mechanism of pyrolysis.
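
    The paper's detailed kinetic scheme is not reproduced in this record. Purely as a hypothetical illustration of how devolatilization kinetics are commonly parameterized from thermal-analysis data, the sketch below simulates a single first-order Arrhenius step at a constant heating rate and recovers its parameters with the classical Coats-Redfern linearization (all values are invented and are not the authors' kinetics):

        import numpy as np
        from scipy.integrate import solve_ivp

        R = 8.314            # gas constant, J/(mol K)
        beta = 10.0 / 60.0   # heating rate, K/s (10 K/min)
        A_true, E_true = 1.0e13, 150.0e3   # assumed pre-exponential factor (1/s) and activation energy (J/mol)

        # Simulate a first-order devolatilization conversion curve alpha(T):
        #   d(alpha)/dT = (A / beta) * exp(-E / (R T)) * (1 - alpha)
        rhs = lambda T, a: (A_true / beta) * np.exp(-E_true / (R * T)) * (1.0 - a)
        T = np.linspace(450.0, 750.0, 300)
        alpha = solve_ivp(rhs, (T[0], T[-1]), [0.0], t_eval=T, method="LSODA", rtol=1e-8).y[0]

        # Coats-Redfern linearization for a first-order model:
        #   ln[-ln(1 - alpha) / T^2]  ~  ln[A R / (beta E)] - E / (R T)
        mask = (alpha > 0.05) & (alpha < 0.95)
        y = np.log(-np.log(1.0 - alpha[mask]) / T[mask] ** 2)
        slope, intercept = np.polyfit(1.0 / T[mask], y, 1)
        E_fit = -slope * R
        A_fit = beta * E_fit * np.exp(intercept) / R
        print(f"recovered E = {E_fit / 1e3:.0f} kJ/mol, A = {A_fit:.1e} 1/s")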

  19. L2CXCV: A Fortran 77 package for least squares convex/concave data smoothing

    NASA Astrophysics Data System (ADS)

    Demetriou, I. C.

    2006-04-01

    Fortran 77 software is given for least squares smoothing to data values contaminated by random errors subject to one sign change in the second divided differences of the smoothed values, where the location of the sign change is also an unknown of the optimization problem. A highly useful description of the constraints is that they follow from the assumption of initially increasing and subsequently decreasing rates of change, or vice versa, of the process considered. The underlying algorithm partitions the data into two disjoint sets of adjacent data and calculates the required fit by solving a strictly convex quadratic programming problem for each set. The piecewise linear interpolant to the fit is convex on the first set and concave on the other one. The partition into suitable sets is achieved by a finite iterative algorithm, which is made quite efficient because of the interactions of the quadratic programming problems on consecutive data. The algorithm obtains the solution by employing no more quadratic programming calculations over subranges of data than twice the number of the divided-difference constraints. The quadratic programming technique makes use of active sets and takes advantage of a B-spline representation of the smoothed values that allows some efficient updating procedures. The entire code required to implement the method is 2920 Fortran lines. The package has been tested on a variety of data sets and has performed very efficiently, terminating in an overall number of active set changes over subranges of data that is only proportional to the number of data. The results suggest that the package can be used for very large numbers of data values. Some examples with output are provided to help new users and to exhibit certain features of the software. Important applications of the smoothing technique may be found in calculating sigmoid approximations, a common topic in applications in disciplines such as physics, economics, biology and engineering. Distribution material that includes single and double precision versions of the code, driver programs, technical details of the implementation of the software package and test examples that demonstrate the use of the software is available in an accompanying ASCII file.
    Program summary
    Title of program: L2CXCV
    Catalogue identifier: ADXM_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXM_v1_0
    Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland
    Computer: PC Intel Pentium, Sun Sparc Ultra 5, Hewlett-Packard HP UX 11.0
    Operating system: WINDOWS 98, 2000, Unix/Solaris 7, Unix/HP UX 11.0
    Programming language used: FORTRAN 77
    Memory required to execute with typical data: O(n), where n is the number of data
    No. of bits in a byte: 8
    No. of lines in distributed program, including test data, etc.: 29 349
    No. of bytes in distributed program, including test data, etc.: 1 276 663
    No. of processors used: 1
    Has the code been vectorized or parallelized?: no
    Distribution format: default tar.gz
    Separate documentation available: Yes
    Nature of physical problem: Analysis of processes that show initially increasing and then decreasing rates of change (sigmoid shape), as, for example, in heat curves, reactor stability conditions, evolution curves, photoemission yields, growth models, utility functions, etc.; identifying an unknown convex/concave (sigmoid) function from measurements of its values that contain random errors; also, identifying the inflection point of this sigmoid function.
    Method of solution: Univariate data smoothing by minimizing the sum of the squares of the residuals (least squares approximation) subject to the condition that the second-order divided differences of the smoothed values change sign at most once. Ideally, this is the number of sign changes in the second derivative of the underlying function. The remarkable property of the smoothed values is that they consist of one separate section of optimal components that give nonnegative second divided differences (convexity) and one separate section of optimal components that give nonpositive second divided differences (concavity). The solution process finds the joint of the sections (that is, the inflection-point estimate of the underlying function) automatically. The underlying method is iterative, each iteration solving a structured strictly convex quadratic programming problem in order to obtain a convex or a concave section over a subrange of data.
    Restrictions on the complexity of the problem: The number of data, n, is not limited in the software package, but is limited to 2000 in the main driver. The total work of the method requires 2n-2 structured quadratic programming calculations over subranges of data, which in practice does not exceed O(n) computer operations.
    Typical running times: CPU time on a PC with an Intel 733 MHz processor operating in Windows 98: about 2 s to smooth n=1000 noisy measurements that follow the shape of the sine function over one period.
    Summary: L2CXCV is a package of Fortran 77 subroutines for least squares smoothing to n univariate data values contaminated by random errors subject to one sign change in the second divided differences of the smoothed values, where the location of the sign change is unknown. The piecewise linear interpolant to the smoothed values gives a convex/concave fit to the data. The underlying algorithm is based on the property that in this best convex/concave fit, the convex and the concave sections are both optimal and separate. The algorithm is iterative, each iteration solving a strictly convex quadratic programming problem for the best convex fit to the first k data, starting from the best convex fit to the first k-1 data. By reversing the order and sign of the data, the algorithm obtains the best concave fit to the last n-k data. It then chooses that k as the optimal position of the required sign change (which defines the inflection point of the fit) if the convex and the concave components to the first k and the last n-k data, respectively, form a convex/concave vector that gives the least sum of squares of residuals. In effect the algorithm requires at most 2n-2 quadratic programming calculations over subranges of data. The package employs a quadratic programming technique that takes advantage of a B-spline representation of the smoothed values and makes use of some efficient O(k) updating procedures, where k is the number of data of a subrange. The package has been tested on a variety of data sets and has performed very efficiently, terminating in an overall number of active set changes that is about n, thus exhibiting quadratic performance in n. The Fortran codes have been designed to minimize the use of computing resources. Attention has been given to the details of computer rounding errors, which are essential to the robustness of the software package. Numerical examples with output are provided to help the use of the software and to exhibit certain features of the method.
Distribution material that includes driver programs, technical details of the installation of the package and test examples that demonstrate the use of the software is available in an ASCII file that accompanies this work.
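
    The distributed package is Fortran 77 and relies on the efficient B-spline/active-set quadratic programming scheme summarized above. Purely to illustrate the constraint being enforced, the following brute-force sketch (hypothetical Python using SciPy's generic SLSQP solver, not the package's algorithm) fits a least-squares approximation whose second divided differences are first nonnegative and then nonpositive, searching all candidate positions of the sign change:

        import numpy as np
        from scipy.optimize import minimize

        def second_divided_differences(z, x):
            """Second divided differences z[x_{i-1}, x_i, x_{i+1}] at the interior points."""
            s = np.diff(z) / np.diff(x)            # first divided differences
            return np.diff(s) / (x[2:] - x[:-2])   # second divided differences

        def convex_concave_fit(x, y):
            """Least-squares fit whose second divided differences are nonnegative up to a
            position k and nonpositive afterwards; k is found by exhaustive search."""
            n = len(y)
            best = None
            for k in range(1, n - 1):              # candidate position of the sign change
                signs = np.where(np.arange(1, n - 1) < k, 1.0, -1.0)
                cons = {"type": "ineq",
                        "fun": lambda z, s=signs: s * second_divided_differences(z, x)}
                res = minimize(lambda z: np.sum((z - y) ** 2), y,
                               jac=lambda z: 2.0 * (z - y),
                               constraints=[cons], method="SLSQP")
                if res.success and (best is None or res.fun < best[0]):
                    best = (res.fun, k, res.x)
            return best                            # (sum of squared residuals, k, smoothed values)

        # Noisy sigmoid test data, similar in spirit to the package's test examples.
        x = np.linspace(-3.0, 3.0, 25)
        y = 1.0 / (1.0 + np.exp(-2.0 * x)) + np.random.default_rng(0).normal(0.0, 0.05, x.size)
        ssr, k, z = convex_concave_fit(x, y)
        print(f"estimated inflection near x = {x[k]:.2f}, sum of squares = {ssr:.4f}")

    The exhaustive search over k costs one quadratic program per candidate, which is exactly the overhead the package avoids with its incremental active-set strategy.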

  20. Linear programming computational experience with onyx

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Atrek, E.

    1994-12-31

    ONYX is a linear programming software package based on an efficient variation of the gradient projection method. When fully configured, it is intended for application to industrial size problems. While the computational experience is limited at the time of this abstract, the technique is found to be robust and competitive with existing methodology in terms of both accuracy and speed. An overview of the approach is presented together with a description of program capabilities, followed by a discussion of up-to-date computational experience with the program. Conclusions include advantages of the approach and envisioned future developments.
