NASA Technical Reports Server (NTRS)
Dominick, Wayne D. (Editor); Bassari, Jinous; Triantafyllopoulos, Spiros
1984-01-01
The University of Southwestern Louisiana (USL) NASA PC R and D statistical analysis support package is designed to be a three-level package to allow statistical analysis for a variety of applications within the USL Data Base Management System (DBMS) contract work. The design addresses usage of the statistical facilities as a library package, as an interactive statistical analysis system, and as a batch processing package.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Astaf'ev, S. B., E-mail: bard@ns.crys.ras.ru; Shchedrin, B. M.; Yanusova, L. G.
2012-01-15
The main principles of developing the Basic Analysis of Reflectometry Data (BARD) software package, which is aimed at obtaining a unified (standardized) tool for analyzing the structure of thin multilayer films and nanostructures of different nature based on reflectometry data, are considered. This software package contains both traditionally used procedures for processing reflectometry data and the authors' original developments on the basis of new methods for carrying out and analyzing reflectometry experiments. The structure of the package, its functional possibilities, examples of application, and prospects of development are reviewed.
PLATSIM: A Simulation and Analysis Package for Large-Order Flexible Systems. Version 2.0
NASA Technical Reports Server (NTRS)
Maghami, Peiman G.; Kenny, Sean P.; Giesy, Daniel P.
1997-01-01
The software package PLATSIM provides efficient time- and frequency-domain analysis of large-order generic space platforms. PLATSIM can perform open-loop analysis or closed-loop analysis with linear or nonlinear control system models. PLATSIM exploits the particular sparsity structure of the plant matrices for very efficient linear and nonlinear time domain analysis, as well as frequency domain analysis. An original algorithm for the efficient computation of open-loop and closed-loop frequency response functions for large-order systems has been developed and is implemented within the package. Furthermore, a novel jitter analysis routine, which determines jitter and stability values from time simulations in a very efficient manner, has been developed and is incorporated in the PLATSIM package. In the time domain analysis, PLATSIM simulates the response of the space platform to disturbances and calculates the jitter and stability values from the response time histories. In the frequency domain analysis, PLATSIM calculates frequency response function matrices and provides the corresponding Bode plots. The PLATSIM software package is written in the MATLAB script language. A graphical user interface is included in the package to provide convenient access to its various features.
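To make the frequency-domain step concrete, the following is a minimal Python sketch (PLATSIM itself is written in MATLAB, so this is not its interface) of computing a frequency response function matrix H(jw) = C (jwI - A)^(-1) B + D for a linear state-space plant; the matrices and frequency grid are illustrative assumptions, and a large-order implementation would exploit sparsity rather than dense solves.

    import numpy as np

    def frequency_response(A, B, C, D, omegas):
        """Return H(j*omega) for each omega; result has shape (len(omegas), n_out, n_in)."""
        n = A.shape[0]
        I = np.eye(n)
        return np.array([C @ np.linalg.solve(1j * w * I - A, B) + D for w in omegas])

    # Illustrative two-mode structural plant (assumed values, not PLATSIM data)
    A = np.array([[0.0,  1.0,   0.0,  0.0],
                  [-1.0, -0.02, 0.0,  0.0],
                  [0.0,  0.0,   0.0,  1.0],
                  [0.0,  0.0, -25.0, -0.1]])
    B = np.array([[0.0], [1.0], [0.0], [1.0]])
    C = np.array([[1.0, 0.0, 1.0, 0.0]])
    D = np.zeros((1, 1))
    H = frequency_response(A, B, C, D, omegas=np.logspace(-1, 2, 200))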
XAPiir: A recursive digital filtering package
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harris, D.
1990-09-21
XAPiir is a basic recursive digital filtering package, containing both design and implementation subroutines. XAPiir was developed for the experimental array processor (XAP) software package, and is written in FORTRAN. However, it is intended to be incorporated into any general- or special-purpose signal analysis program. It replaces the older package RECFIL, offering several enhancements. RECFIL is used in several large analysis programs developed at LLNL, including the seismic analysis package SAC, several expert systems (NORSEA and NETSEA), and two general-purpose signal analysis packages (SIG and VIEW). This report is divided into two sections: the first describes the use of the subroutine package, and the second, its internal organization. In the first section, the filter design problem is briefly reviewed, along with the definitions of the filter design parameters and their relationship to the subroutine input parameters. In the second section, the internal organization is documented to simplify maintenance and extensions to the package. 5 refs., 9 figs.
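The design-plus-implementation split described above can be illustrated with a short SciPy sketch (XAPiir itself is a FORTRAN package; this is not its interface): design a recursive (IIR) Butterworth bandpass filter, then apply it to a trace. The passband, order, sampling rate, and synthetic data are assumed values.

    import numpy as np
    from scipy import signal

    fs = 100.0                                                           # sampling rate in Hz (assumed)
    b, a = signal.butter(N=4, Wn=[1.0, 10.0], btype="bandpass", fs=fs)   # design step
    t = np.arange(0, 10, 1 / fs)
    trace = np.sin(2 * np.pi * 5 * t) + 0.5 * np.random.randn(t.size)    # synthetic data
    filtered = signal.lfilter(b, a, trace)                               # recursive (IIR) implementation step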
NASA Astrophysics Data System (ADS)
Lestari Widaningrum, Dyah
2014-03-01
This research aims to investigate the importance of take-out food packaging attributes, using conjoint analysis and a QFD approach among consumers of take-out food products in Jakarta, Indonesia. The conjoint results indicate that perception of the packaging material (such as paper, plastic, or polystyrene foam) plays the most important role overall in consumer perception. The clustering results show strong segmentation in which take-out food packaging attributes consumers consider most important: some consumers are oriented mainly toward the colour of the packaging, while other segments focus on packaging shape and packaging information. Segmentation variables based on packaging response can provide very useful information for maximizing the image of products through the package's impact. The House of Quality development showed that combining conjoint analysis with QFD is useful for product development, market segmentation, and trading off customers' requirements in the early stages of the HOQ process.
Differential maneuvering simulator data reduction and analysis software
NASA Technical Reports Server (NTRS)
Beasley, G. P.; Sigman, R. S.
1972-01-01
A multielement data reduction and analysis software package has been developed for use with the Langley differential maneuvering simulator (DMS). This package, which has several independent elements, was developed to support all phases of DMS aircraft simulation studies with a variety of both graphical and tabular information. The overall software package is considered unique because of the number, diversity, and sophistication of the element programs available for use in a single study. The purpose of this paper is to discuss the overall DMS data reduction and analysis package by reviewing the development of the various elements of the software, showing typical results that can be obtained, and discussing how each element can be used.
Neural Network Prototyping Package Within IRAF
NASA Technical Reports Server (NTRS)
Bazell, David
1997-01-01
The purpose of this contract was to develop a neural network package within the IRAF environment to allow users to easily understand and use different neural network algorithms for the analysis of astronomical data. The package was developed for use within IRAF to allow portability to different computing environments and to provide a familiar and easy-to-use interface to the routines. In addition to developing the software and supporting documentation, we planned to use the system for the analysis of several sample problems to prove its viability and usefulness.
RSEIS and RFOC: Seismic Analysis in R
NASA Astrophysics Data System (ADS)
Lees, J. M.
2015-12-01
Open software is essential for reproducible scientific exchange. R packages provide a platform for the development of seismological investigation software that can be properly documented and traced for data processing. A suite of R packages designed for a wide range of seismic analysis is currently available in the free software platform called R. R is a software platform based on the S language developed at Bell Labs decades ago. Routines in R can be run as standalone function calls or developed in object-oriented mode. R comes with a base set of routines and thousands of user-developed packages. The packages developed at UNC include subroutines and interactive codes for processing seismic data, analyzing geographic information (GIS), and inverting data involved in a variety of geophysical applications. Packages related to seismic analysis currently available on CRAN (Comprehensive R Archive Network, http://www.r-project.org/) include RSEIS, Rquake, GEOmap, RFOC, zoeppritz, RTOMO, geophys, Rwave, PEIP, hht, and rFDSN. These cover signal processing, data management, mapping, earthquake location, deconvolution, focal mechanisms, wavelet transforms, Hilbert-Huang transforms, tomographic inversion, and Mogi deformation, among other useful functionality. All software in R packages is required to have detailed documentation, making the exchange and modification of existing software easy. In this presentation, I will focus on the packages RSEIS and RFOC, showing examples from a variety of seismic analyses. The R approach has similarities to the popular (and expensive) MATLAB platform, although R is open source and free to download.
Learn by Yourself: The Self-Learning Tools for Qualitative Analysis Software Packages
ERIC Educational Resources Information Center
Freitas, Fábio; Ribeiro, Jaime; Brandão, Catarina; Reis, Luís Paulo; de Souza, Francislê Neri; Costa, António Pedro
2017-01-01
Computer Assisted Qualitative Data Analysis Software (CAQDAS) packages are tools that help researchers develop qualitative research projects. These software packages help the users with tasks such as transcription analysis, coding and text interpretation, writing and annotation, content search and analysis, recursive abstraction, grounded theory…
PyPathway: Python Package for Biological Network Analysis and Visualization.
Xu, Yang; Luo, Xiao-Chun
2018-05-01
Life science studies represent one of the biggest generators of large data sets, mainly because of rapid advances in sequencing technology. Biological networks, including interaction networks and human-curated pathways, are essential to understand these high-throughput data sets. Biological network analysis offers a method to explore systematically not only the molecular complexity of a particular disease but also the molecular relationships among apparently distinct phenotypes. Currently, several packages for the Python community have been developed, such as BioPython and Goatools. However, tools to perform comprehensive network analysis and visualization are still needed. Here, we have developed PyPathway, an extensible, free, and open source Python package for functional enrichment analysis, network modeling, and network visualization. The network process module supports various interaction network and pathway databases such as Reactome, WikiPathway, STRING, and BioGRID. The network analysis module implements overrepresentation analysis, gene set enrichment analysis, network-based enrichment, and de novo network modeling. Finally, the visualization and data publishing modules enable users to share their analysis by using an easy web application. For package availability, see the first Reference.
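As one small illustration of the overrepresentation analysis mentioned above, the following sketch computes a hypergeometric enrichment p-value for a single gene set with SciPy; it is a conceptual example, not PyPathway's API, and all of the counts are assumed.

    from scipy.stats import hypergeom

    M = 20000   # genes in the background universe (assumed)
    K = 150     # genes annotated to the pathway of interest (assumed)
    N = 500     # genes in the user's query list (assumed)
    k = 12      # query genes that fall in the pathway (assumed)

    # P(X >= k): probability of observing at least k pathway genes by chance
    p_value = hypergeom.sf(k - 1, M, K, N)
    print(f"overrepresentation p-value: {p_value:.3g}")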
Comparison of requirements and capabilities of major multipurpose software packages.
Igo, Robert P; Schnell, Audrey H
2012-01-01
The aim of this chapter is to introduce the reader to commonly used software packages and illustrate their input requirements, analysis options, strengths, and limitations. We focus on packages that perform more than one function and include a program for quality control, linkage, and association analyses. Additional inclusion criteria were (1) programs that are free to academic users and (2) programs that are currently supported, maintained, and developed. Using those criteria, we chose to review three programs: Statistical Analysis for Genetic Epidemiology (S.A.G.E.), PLINK, and Merlin. We will describe the required input format and analysis options. We will not go into detail about every possible program in the packages, but we will give an overview of the packages' requirements and capabilities.
DESIGN ANALYSIS FOR THE DEFENSE HIGH-LEVEL WASTE DISPOSAL CONTAINER
DOE Office of Scientific and Technical Information (OSTI.GOV)
G. Radulesscu; J.S. Tang
The purpose of the ''Design Analysis for the Defense High-Level Waste Disposal Container'' analysis is to technically define the defense high-level waste (DHLW) disposal container/waste package using the Waste Package Department's (WPD) design methods, as documented in the ''Waste Package Design Methodology Report'' (CRWMS M&O [Civilian Radioactive Waste Management System Management and Operating Contractor] 2000a). The DHLW disposal container is intended for disposal of commercial high-level waste (HLW) and DHLW (including immobilized plutonium waste forms), placed within disposable canisters. The U.S. Department of Energy (DOE)-managed spent nuclear fuel (SNF) in disposable canisters may also be placed in a DHLW disposal container along with HLW forms. The objective of this analysis is to demonstrate that the DHLW disposal container/waste package satisfies the project requirements, as embodied in the Defense High Level Waste Disposal Container System Description Document (SDD) (CRWMS M&O 1999a), and additional criteria, as identified in the Waste Package Design Sensitivity Report (CRWMS M&O 2000b, Table 4). The analysis briefly describes the analytical methods appropriate for the design of the DHLW disposal container/waste package, and summarizes the results of the calculations that illustrate the analytical methods. However, the analysis is limited to the calculations selected for the DHLW disposal container in support of the Site Recommendation (SR) (CRWMS M&O 2000b, Section 7). The scope of this analysis is restricted to the design of the codisposal waste package containing the Savannah River Site (SRS) DHLW glass canisters and the Training, Research, Isotopes General Atomics (TRIGA) SNF loaded in a short, 18-in.-outer-diameter (OD) DOE standardized SNF canister. This waste package is representative of the waste packages that consist of the DHLW disposal container, the DHLW/HLW glass canisters, and the DOE-managed SNF in disposable canisters. The intended use of this analysis is to support Site Recommendation reports and to assist in the development of WPD drawings. Activities described in this analysis were conducted in accordance with the Development Plan ''Design Analysis for the Defense High-Level Waste Disposal Container'' (CRWMS M&O 2000c), with no deviations from the plan.
A Review of Meta-Analysis Packages in R
ERIC Educational Resources Information Center
Polanin, Joshua R.; Hennessy, Emily A.; Tanner-Smith, Emily E.
2017-01-01
Meta-analysis is a statistical technique that allows an analyst to synthesize effect sizes from multiple primary studies. To estimate meta-analysis models, the open-source statistical environment R is quickly becoming a popular choice. The meta-analytic community has contributed to this growth by developing numerous packages specific to…
Drought: A comprehensive R package for drought monitoring, prediction and analysis
NASA Astrophysics Data System (ADS)
Hao, Zengchao; Hao, Fanghua; Singh, Vijay P.; Cheng, Hongguang
2015-04-01
Drought may impose serious challenges on human societies and ecosystems. Owing to its complicated causes and wide-ranging impacts, a universally accepted definition of drought does not exist. Drought indicators are commonly used to characterize drought properties such as duration or severity. Various drought indicators have been developed in the past few decades for monitoring particular aspects of drought conditions, along with multivariate drought indices for drought characterization from multiple sources or hydro-climatic variables. Reliable drought prediction with suitable drought indicators is critical to drought preparedness planning to reduce potential drought impacts. In addition, drought analysis to quantify the risk of drought properties provides useful information for operational drought management. Drought monitoring, prediction, and risk analysis are important components of drought modeling and assessment. In this study, a comprehensive R package, "drought", is developed to aid drought monitoring, prediction, and risk analysis (available from R-Forge and CRAN soon). The drought monitoring component of the package computes a suite of univariate and multivariate drought indices that integrate drought information from various sources such as precipitation, temperature, soil moisture, and runoff. The drought prediction/forecasting component consists of statistical drought predictions to enhance drought early warning for decision making. Analysis of drought properties such as duration and severity is also provided in this package for drought risk assessment. Based on this package, a drought monitoring and prediction/forecasting system is under development as a decision support tool. The package will be provided freely to the public to aid drought modeling and assessment by researchers and practitioners.
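For readers unfamiliar with standardized drought indices, the sketch below computes an SPI-like index in Python (the package described above is written in R, so this is a conceptual illustration, not its API): fit a gamma distribution to precipitation totals and map the cumulative probabilities to standard-normal quantiles. Zero-precipitation handling, monthly aggregation, and seasonality are omitted for brevity, and the synthetic data are assumed.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    precip = rng.gamma(shape=2.0, scale=30.0, size=360)    # synthetic monthly precipitation (assumed)

    # Fit a gamma distribution, then transform to the standard normal to get an SPI-like index
    shape, loc, scale = stats.gamma.fit(precip, floc=0)
    cdf = stats.gamma.cdf(precip, shape, loc=loc, scale=scale)
    spi_like = stats.norm.ppf(cdf)
    print(spi_like[:12])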
Instrument Packages for the Cold, Dark, High Radiation Environments
NASA Technical Reports Server (NTRS)
Clark, P. E.; Millar, P. S.; Yeh, P. S.; Beamna, B.; Brigham, D.; Feng, S.
2011-01-01
We are developing a small cold-temperature instrument package concept that integrates a cold-temperature power system and rad-hard, ultra-low-temperature, ultra-low-power electronics components and power supplies now under development into a cold-temperature, surface-operational version of a planetary surface instrument package. We are already in the process of developing a lower-power, lower-temperature version of an instrument of mutual interest to SMD and ESMD to support the search for volatiles (the mass spectrometer VAPoR, Volatile Analysis by Pyrolysis of Regolith), both as a stand-alone instrument and as part of an environmental monitoring package.
Generic Degraded Configuration Probability Analysis for DOE Codisposal Waste Package
DOE Office of Scientific and Technical Information (OSTI.GOV)
S.F.A. Deng; M. Saglam; L.J. Gratton
2001-05-23
In accordance with the technical work plan, ''Technical Work Plan For: Department of Energy Spent Nuclear Fuel Work Packages'' (CRWMS M&O 2000c), this Analysis/Model Report (AMR) is developed for the purpose of screening out degraded configurations for U.S. Department of Energy (DOE) spent nuclear fuel (SNF) types. It performs the degraded configuration parameter and probability evaluations of the overall methodology specified in the ''Disposal Criticality Analysis Methodology Topical Report'' (YMP 2000, Section 3) to qualifying configurations. Degradation analyses are performed to assess realizable parameter ranges and physical regimes for configurations. Probability calculations are then performed for configurations characterized by k{sub eff} in excess of the Critical Limit (CL). The scope of this document is to develop a generic set of screening criteria or models to screen out degraded configurations having potential for exceeding a criticality limit. The developed screening criteria include arguments based on physical/chemical processes and probability calculations, and apply to DOE SNF types when codisposed with the high-level waste (HLW) glass inside a waste package. The degradation takes place inside the waste package and occurs long after repository licensing has expired. The emphasis of this AMR is on degraded configuration screening, and the probability analysis is one of the approaches used for screening. The intended use of the model is to apply the developed screening criteria to each DOE SNF type following the completion of the degraded-mode criticality analysis internal to the waste package.
DESIGN ANALYSIS FOR THE NAVAL SNF WASTE PACKAGE
DOE Office of Scientific and Technical Information (OSTI.GOV)
T.L. Mitchell
2000-05-31
The purpose of this analysis is to demonstrate the design of the naval spent nuclear fuel (SNF) waste package (WP) using the Waste Package Department's (WPD) design methodologies and processes described in the ''Waste Package Design Methodology Report'' (CRWMS M&O [Civilian Radioactive Waste Management System Management and Operating Contractor] 2000b). The calculations that support the design of the naval SNF WP will be discussed; however, only a subset of such analyses will be presented and shall be limited to those identified in the ''Waste Package Design Sensitivity Report'' (CRWMS M&O 2000c). The objective of this analysis is to describe the naval SNF WP design method and to show that the design of the naval SNF WP complies with the ''Naval Spent Nuclear Fuel Disposal Container System Description Document'' (CRWMS M&O 1999a) and Interface Control Document (ICD) criteria for Site Recommendation. Additional criteria for the design of the naval SNF WP have been outlined in Section 6.2 of the ''Waste Package Design Sensitivity Report'' (CRWMS M&O 2000c). The scope of this analysis is restricted to the design of the naval long WP containing one naval long SNF canister. This WP is representative of the WPs that will contain both naval short SNF and naval long SNF canisters. The following items are included in the scope of this analysis: (1) providing a general description of the applicable design criteria; (2) describing the design methodology to be used; (3) presenting the design of the naval SNF waste package; and (4) showing compliance with all applicable design criteria. The intended use of this analysis is to support Site Recommendation reports and assist in the development of WPD drawings. Activities described in this analysis were conducted in accordance with the technical product development plan (TPDP) ''Design Analysis for the Naval SNF Waste Package'' (CRWMS M&O 2000a).
Versatile Software Package For Near Real-Time Analysis of Experimental Data
NASA Technical Reports Server (NTRS)
Wieseman, Carol D.; Hoadley, Sherwood T.
1998-01-01
This paper provides an overview of a versatile software package developed for time- and frequency-domain analyses of experimental wind-tunnel data. This package, originally developed for analyzing data in the NASA Langley Transonic Dynamics Tunnel (TDT), is applicable for analyzing any time-domain data. A Matlab-based software package, TDT-analyzer, provides a compendium of commonly-required dynamic analysis functions in a user-friendly interactive and batch processing environment. TDT-analyzer has been used extensively to provide on-line near real-time and post-test examination and reduction of measured data acquired during wind tunnel tests of aeroelastically-scaled models of aircraft and rotorcraft as well as a flight test of the NASA High Alpha Research Vehicle (HARV) F-18. The package provides near real-time results in an informative and timely manner far exceeding prior methods of data reduction at the TDT.
SIMA: Python software for analysis of dynamic fluorescence imaging data.
Kaifosh, Patrick; Zaremba, Jeffrey D; Danielson, Nathan B; Losonczy, Attila
2014-01-01
Fluorescence imaging is a powerful method for monitoring dynamic signals in the nervous system. However, analysis of dynamic fluorescence imaging data remains burdensome, in part due to the shortage of available software tools. To address this need, we have developed SIMA, an open source Python package that facilitates common analysis tasks related to fluorescence imaging. Functionality of this package includes correction of motion artifacts occurring during in vivo imaging with laser-scanning microscopy, segmentation of imaged fields into regions of interest (ROIs), and extraction of signals from the segmented ROIs. We have also developed a graphical user interface (GUI) for manual editing of the automatically segmented ROIs and automated registration of ROIs across multiple imaging datasets. This software has been designed with flexibility in mind to allow for future extension with different analysis methods and potential integration with other packages. Software, documentation, and source code for the SIMA package and ROI Buddy GUI are freely available at http://www.losonczylab.org/sima/.
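To illustrate the final step described above (extracting signals from segmented ROIs), here is a minimal NumPy sketch that averages pixel values inside boolean ROI masks for each frame of a motion-corrected movie; it is a conceptual example with made-up array shapes and ROIs, not SIMA's API.

    import numpy as np

    def extract_roi_signals(movie, roi_masks):
        """movie: (frames, rows, cols); roi_masks: list of boolean (rows, cols) arrays.
        Returns an array of shape (n_rois, frames) with the mean fluorescence per ROI."""
        return np.array([movie[:, mask].mean(axis=1) for mask in roi_masks])

    frames = np.random.rand(1000, 128, 128)              # synthetic imaging data (assumed)
    masks = [np.zeros((128, 128), bool) for _ in range(2)]
    masks[0][30:40, 30:40] = True                         # illustrative ROI 1
    masks[1][80:90, 60:70] = True                         # illustrative ROI 2
    signals = extract_roi_signals(frames, masks)          # shape (2, 1000)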
Analysis of reference transactions using packaged computer programs.
Calabretta, N; Ross, R
1984-01-01
Motivated by a continuing education class attended by the authors on the measurement of reference desk activities, the reference department at Scott Memorial Library initiated a project to gather data on reference desk transactions and to analyze the data by using packaged computer programs. The programs utilized for the project were SPSS (Statistical Package for the Social Sciences) and SAS (Statistical Analysis System). The planning, implementation and development of the project are described.
NORTICA—a new code for cyclotron analysis
NASA Astrophysics Data System (ADS)
Gorelov, D.; Johnson, D.; Marti, F.
2001-12-01
The new package NORTICA (Numerical ORbit Tracking In Cyclotrons with Analysis) of computer codes for beam dynamics simulations is under development at NSCL. The package was started as a replacement for the code MONSTER [1], developed in the laboratory in the past. The new codes are capable of beam dynamics simulations in both CCF (Coupled Cyclotron Facility) accelerators, the K500 and K1200 superconducting cyclotrons. The general purpose of this package is to assist in setting up and tuning the cyclotrons, taking into account main-field and extraction-channel imperfections. The computing platform for the package is an Alpha Station with the UNIX operating system and an X-Windows graphical interface. A multiple-programming-language approach was used in order to combine the reliability of the numerical algorithms developed over a long period of time in the laboratory with the friendliness of a modern-style user interface. This paper describes the capabilities and features of the codes in their present state.
The Geomorphic Road Analysis and Inventory Package (GRAIP) Volume 2: Office Procedures
Richard M. Cissel; Thomas A. Black; Kimberly A. T. Schreuders; Ajay Prasad; Charles H. Luce; David G. Tarboton; Nathan A. Nelson
2012-01-01
An important first step in managing forest roads for improved water quality and aquatic habitat is the performance of an inventory. The Geomorphic Roads Analysis and Inventory Package (GRAIP) was developed as a tool for making a comprehensive inventory and analysis of the effects of forest roads on watersheds. This manual describes the data analysis and process of a...
Liu, Bin; Wu, Hao; Zhang, Deyuan; Wang, Xiaolong; Chou, Kuo-Chen
2017-02-21
To expedite the pace of genome/proteome analysis, we have developed a Python package called Pse-Analysis. The powerful package can automatically complete the following five procedures: (1) sample feature extraction, (2) optimal parameter selection, (3) model training, (4) cross validation, and (5) evaluation of prediction quality. All the work a user needs to do is to input a benchmark dataset along with the query biological sequences concerned. Based on the benchmark dataset, Pse-Analysis will automatically construct an ideal predictor and then yield the predicted results for the submitted query samples. All the aforementioned tedious jobs can be done automatically by the computer. Moreover, a multiprocessing technique was adopted to enhance computational speed by about six-fold. The Pse-Analysis Python package is freely accessible to the public at http://bioinformatics.hitsz.edu.cn/Pse-Analysis/, and can be run directly on Windows, Linux, and Unix.
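The five automated steps listed above correspond to a standard supervised-learning workflow. A rough scikit-learn sketch of that workflow is shown below; it is conceptual only, not Pse-Analysis's interface, and the toy sequences, k-mer featurization, and parameter grid are all assumptions.

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.model_selection import GridSearchCV
    from sklearn.pipeline import make_pipeline
    from sklearn.svm import SVC

    sequences = ["ACGTACGT", "TTGACCAA", "ACGTTTGA", "GGCATCCA"]   # benchmark sequences (assumed)
    labels = [1, 0, 1, 0]                                          # benchmark labels (assumed)

    # (1) feature extraction: 3-mer counts; (2)-(3) parameter selection and model training
    pipeline = make_pipeline(CountVectorizer(analyzer="char", ngram_range=(3, 3)), SVC())
    search = GridSearchCV(pipeline, {"svc__C": [0.1, 1, 10]}, cv=2)
    search.fit(sequences, labels)

    # (4)-(5) cross validation and evaluation of prediction quality, then prediction for a query
    print(search.best_params_, search.best_score_)
    print(search.predict(["ACGTACGA"]))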
AOP: An R Package For Sufficient Causal Analysis in Pathway ...
Summary: How can I quickly find the key events in a pathway that I need to monitor to predict that a beneficial/adverse event/outcome will occur? This is a key question when using signaling pathways for drug/chemical screening in pharmacology, toxicology, and risk assessment. By identifying these sufficient causal key events, we have fewer events to monitor for a pathway, thereby decreasing assay costs and time while maximizing the value of the information. I have developed the "aop" package, which uses backdoor analysis of causal networks to identify these minimal sets of key events that are sufficient for making causal predictions. Availability and Implementation: The source and binary are available online through the Bioconductor project (http://www.bioconductor.org/) as an R package titled "aop". The R/Bioconductor package runs within the R statistical environment. The package has functions that can take pathways (as directed graphs) formatted as a Cytoscape JSON file as input, or pathways can be represented as directed graphs using the R/Bioconductor "graph" package. The "aop" package has functions that can perform backdoor analysis to identify the minimal set of key events for making causal predictions. Contact: burgoon.lyle@epa.gov This paper describes an R/Bioconductor package that was developed to facilitate the identification of key events within an AOP that are the minimal set of sufficient key events that need to be tested/monitored.
Haider, Kamran; Cruz, Anthony; Ramsey, Steven; Gilson, Michael K; Kurtzman, Tom
2018-01-09
We have developed SSTMap, a software package for mapping structural and thermodynamic water properties in molecular dynamics trajectories. The package introduces automated analysis and mapping of local measures of frustration and enhancement of water structure. The thermodynamic calculations are based on Inhomogeneous Fluid Solvation Theory (IST), which is implemented using both site-based and grid-based approaches. The package also extends the applicability of solvation analysis calculations to multiple molecular dynamics (MD) simulation programs by using existing cross-platform tools for parsing MD parameter and trajectory files. SSTMap is implemented in Python and contains both command-line tools and a Python module to facilitate flexibility in setting up calculations and for automated generation of large data sets involving analysis of multiple solutes. Output is generated in formats compatible with popular Python data science packages. This tool will be used by the molecular modeling community for computational analysis of water in problems of biophysical interest such as ligand binding and protein function.
NASA Technical Reports Server (NTRS)
Buntine, Wray
1993-01-01
This paper introduces the IND Tree Package to prospective users. IND does supervised learning using classification trees. This learning task is a basic tool used in the development of diagnosis, monitoring and expert systems. The IND Tree Package was developed as part of a NASA project to semi-automate the development of data analysis and modelling algorithms using artificial intelligence techniques. The IND Tree Package integrates features from CART and C4 with newer Bayesian and minimum encoding methods for growing classification trees and graphs. The IND Tree Package also provides an experimental control suite on top. The newer features give improved probability estimates often required in diagnostic and screening tasks. The package comes with a manual, Unix 'man' entries, and a guide to tree methods and research. The IND Tree Package is implemented in C under Unix and was beta-tested at university and commercial research laboratories in the United States.
Scripting MODFLOW model development using Python and FloPy
Bakker, Mark; Post, Vincent E. A.; Langevin, Christian D.; Hughes, Joseph D.; White, Jeremy; Starn, Jeffrey; Fienen, Michael N.
2016-01-01
Graphical user interfaces (GUIs) are commonly used to construct and postprocess numerical groundwater flow and transport models. Scripting model development with the programming language Python is presented here as an alternative approach. One advantage of Python is that there are many packages available to facilitate the model development process, including packages for plotting, array manipulation, optimization, and data analysis. For MODFLOW-based models, the FloPy package was developed by the authors to construct model input files, run the model, and read and plot simulation results. Use of Python with the available scientific packages and FloPy facilitates data exploration, alternative model evaluations, and model analyses that can be difficult to perform with GUIs. Furthermore, Python scripts are a complete, transparent, and repeatable record of the modeling process. The approach is introduced with a simple FloPy example to create and postprocess a MODFLOW model. A more complicated capture-fraction analysis with a real-world model is presented to demonstrate the types of analyses that can be performed using Python and FloPy.
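The FloPy workflow described above (construct input files, run the model, read results) looks roughly like the following for the classic MODFLOW-2005 interface; the grid dimensions, hydraulic properties, and well flux are arbitrary placeholder values, and actually running the model requires a MODFLOW executable on the path.

    import flopy

    m = flopy.modflow.Modflow(modelname="demo", exe_name="mf2005")
    dis = flopy.modflow.ModflowDis(m, nlay=1, nrow=10, ncol=10,
                                   delr=100.0, delc=100.0, top=10.0, botm=0.0)
    bas = flopy.modflow.ModflowBas(m, ibound=1, strt=10.0)
    lpf = flopy.modflow.ModflowLpf(m, hk=10.0)                                   # hydraulic conductivity
    wel = flopy.modflow.ModflowWel(m, stress_period_data={0: [[0, 5, 5, -100.0]]})  # pumping well
    oc = flopy.modflow.ModflowOc(m)
    pcg = flopy.modflow.ModflowPcg(m)

    m.write_input()                        # construct the MODFLOW input files
    # success, output = m.run_model()      # uncomment if the mf2005 executable is installed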
A streamlined Python framework for AT-TPC data analysis
NASA Astrophysics Data System (ADS)
Taylor, J. Z.; Bradt, J.; Bazin, D.; Kuchera, M. P.
2017-09-01
User-friendly data analysis software has been developed for the Active-Target Time Projection Chamber (AT-TPC) experiment at the National Superconducting Cyclotron Laboratory at Michigan State University. The AT-TPC, commissioned in 2014, is a gas-filled detector that acts as both the detector and target for high-efficiency detection of low-intensity, exotic nuclear reactions. The pytpc framework is a Python package for analyzing AT-TPC data. The package was developed for the analysis of 46Ar(p, p) data. The existing software was used to analyze data produced by the 40Ar(p, p) experiment that ran in August, 2015. Usage of the package was documented in an analysis manual both to improve analysis steps and aid in the work of future AT-TPC users. Software features and analysis methods in the pytpc framework will be presented along with the 40Ar results.
Powerlaw: a Python package for analysis of heavy-tailed distributions.
Alstott, Jeff; Bullmore, Ed; Plenz, Dietmar
2014-01-01
Power laws are theoretically interesting probability distributions that are also frequently used to describe empirical data. In recent years, effective statistical methods for fitting power laws have been developed, but appropriate use of these techniques requires significant programming and statistical insight. In order to greatly decrease the barriers to using good statistical methods for fitting power law distributions, we developed the powerlaw Python package. This software package provides easy commands for basic fitting and statistical analysis of distributions. Notably, it also seeks to support a variety of user needs by being exhaustive in the options available to the user. The source code is publicly available and easily extensible.
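Typical use of the powerlaw package as described above is only a few lines: fit a distribution (the package estimates the lower cutoff xmin and the exponent alpha) and compare the power-law fit against an alternative. The synthetic data below are an assumption for illustration.

    import numpy as np
    import powerlaw

    data = np.random.default_rng(1).pareto(2.5, size=5000) + 1.0    # synthetic heavy-tailed data (assumed)

    fit = powerlaw.Fit(data)                                        # estimates xmin and alpha automatically
    print(fit.power_law.alpha, fit.power_law.xmin)
    R, p = fit.distribution_compare("power_law", "lognormal")       # loglikelihood-ratio comparison
    print(R, p)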
Radio Astronomy Tools in Python: Spectral-cube, pvextractor, and more
NASA Astrophysics Data System (ADS)
Ginsburg, A.; Robitaille, T.; Beaumont, C.; Rosolowsky, E.; Leroy, A.; Brogan, C.; Hunter, T.; Teuben, P.; Brisbin, D.
2015-12-01
The radio-astro-tools organization has been established to facilitate development of radio and millimeter analysis tools by the scientific community. The first packages developed under its umbrella are: • the spectral-cube package, for reading, writing, and analyzing spectral data cubes; • the pvextractor package, for extracting position-velocity slices from position-position-velocity cubes along arbitrary paths; • the radio-beam package, to handle Gaussian beams in the context of the astropy quantity and unit framework; • casa-python, to enable installation of these packages - and any others - into users' CASA environments without conflicting with the underlying CASA package. Community input in the form of code contributions, suggestions, questions, and comments is welcome on all of these tools. They can all be found at http://radio-astro-tools.github.io.
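A minimal example of the spectral-cube usage described above: read a FITS cube, select a velocity slab, and compute a moment map. The file names and velocity range are placeholders.

    from spectral_cube import SpectralCube
    import astropy.units as u

    cube = SpectralCube.read("example_cube.fits")                    # placeholder file name
    slab = cube.spectral_slab(-50 * u.km / u.s, 50 * u.km / u.s)     # select a velocity range
    mom0 = slab.moment(order=0)                                      # integrated-intensity map
    mom0.write("example_moment0.fits", overwrite=True)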
Network Meta-Analysis Using R: A Review of Currently Available Automated Packages
Neupane, Binod; Richer, Danielle; Bonner, Ashley Joel; Kibret, Taddele; Beyene, Joseph
2014-01-01
Network meta-analysis (NMA) – a statistical technique that allows comparison of multiple treatments in the same meta-analysis simultaneously – has become increasingly popular in the medical literature in recent years. The statistical methodology underpinning this technique and software tools for implementing the methods are evolving. Both commercial and freely available statistical software packages have been developed to facilitate the statistical computations using NMA with varying degrees of functionality and ease of use. This paper aims to introduce the reader to three R packages, namely, gemtc, pcnetmeta, and netmeta, which are freely available software tools implemented in R. Each automates the process of performing NMA so that users can perform the analysis with minimal computational effort. We present, compare and contrast the availability and functionality of different important features of NMA in these three packages so that clinical investigators and researchers can determine which R packages to implement depending on their analysis needs. Four summary tables detailing (i) data input and network plotting, (ii) modeling options, (iii) assumption checking and diagnostic testing, and (iv) inference and reporting tools, are provided, along with an analysis of a previously published dataset to illustrate the outputs available from each package. We demonstrate that each of the three packages provides a useful set of tools, and combined provide users with nearly all functionality that might be desired when conducting a NMA. PMID:25541687
Network meta-analysis using R: a review of currently available automated packages.
Neupane, Binod; Richer, Danielle; Bonner, Ashley Joel; Kibret, Taddele; Beyene, Joseph
2014-01-01
Network meta-analysis (NMA)--a statistical technique that allows comparison of multiple treatments in the same meta-analysis simultaneously--has become increasingly popular in the medical literature in recent years. The statistical methodology underpinning this technique and software tools for implementing the methods are evolving. Both commercial and freely available statistical software packages have been developed to facilitate the statistical computations using NMA with varying degrees of functionality and ease of use. This paper aims to introduce the reader to three R packages, namely, gemtc, pcnetmeta, and netmeta, which are freely available software tools implemented in R. Each automates the process of performing NMA so that users can perform the analysis with minimal computational effort. We present, compare and contrast the availability and functionality of different important features of NMA in these three packages so that clinical investigators and researchers can determine which R packages to implement depending on their analysis needs. Four summary tables detailing (i) data input and network plotting, (ii) modeling options, (iii) assumption checking and diagnostic testing, and (iv) inference and reporting tools, are provided, along with an analysis of a previously published dataset to illustrate the outputs available from each package. We demonstrate that each of the three packages provides a useful set of tools, and combined provide users with nearly all functionality that might be desired when conducting a NMA.
PharmacoGx: an R package for analysis of large pharmacogenomic datasets.
Smirnov, Petr; Safikhani, Zhaleh; El-Hachem, Nehme; Wang, Dong; She, Adrian; Olsen, Catharina; Freeman, Mark; Selby, Heather; Gendoo, Deena M A; Grossmann, Patrick; Beck, Andrew H; Aerts, Hugo J W L; Lupien, Mathieu; Goldenberg, Anna; Haibe-Kains, Benjamin
2016-04-15
Pharmacogenomics holds great promise for the development of biomarkers of drug response and the design of new therapeutic options, which are key challenges in precision medicine. However, such data are scattered and lack standards for efficient access and analysis, consequently preventing the realization of the full potential of pharmacogenomics. To address these issues, we implemented PharmacoGx, an easy-to-use, open source package for integrative analysis of multiple pharmacogenomic datasets. We demonstrate the utility of our package in comparing large drug sensitivity datasets, such as the Genomics of Drug Sensitivity in Cancer and the Cancer Cell Line Encyclopedia. Moreover, we show how to use our package to easily perform Connectivity Map analysis. With increasing availability of drug-related data, our package will open new avenues of research for meta-analysis of pharmacogenomic data. PharmacoGx is implemented in R and can be easily installed on any system. The package is available from CRAN and its source code is available from GitHub. Contact: bhaibeka@uhnresearch.ca or benjamin.haibe.kains@utoronto.ca. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Structural constraints in the packaging of bluetongue virus genomic segments
Burkhardt, Christiane; Sung, Po-Yu; Celma, Cristina C.
2014-01-01
The mechanism used by bluetongue virus (BTV) to ensure the sorting and packaging of its 10 genomic segments is still poorly understood. In this study, we investigated the packaging constraints for two BTV genomic segments from two different serotypes. Segment 4 (S4) of BTV serotype 9 was mutated sequentially and packaging of mutant ssRNAs was investigated by two newly developed RNA packaging assay systems, one in vivo and the other in vitro. Modelling of the mutated ssRNA followed by biochemical data analysis suggested that a conformational motif formed by interaction of the 5′ and 3′ ends of the molecule was necessary and sufficient for packaging. A similar structural signal was also identified in S8 of BTV serotype 1. Furthermore, the same conformational analysis of secondary structures for positive-sense ssRNAs was used to generate a chimeric segment that maintained the putative packaging motif but contained unrelated internal sequences. This chimeric segment was packaged successfully, confirming that the motif identified directs the correct packaging of the segment. PMID:24980574
Language Analysis Package (L.A.P.) Version I System Design.
ERIC Educational Resources Information Center
Porch, Ann
To permit researchers to use the speed and versatility of the computer to process natural language text as well as numerical data without undergoing special training in programing or computer operations, a language analysis package has been developed partially based on several existing programs. An overview of the design is provided and system…
AMModels: An R package for storing models, data, and metadata to facilitate adaptive management
Katz, Jonathan E.
2018-01-01
Agencies are increasingly called upon to implement their natural resource management programs within an adaptive management (AM) framework. This article provides the background and motivation for the R package, AMModels. AMModels was developed under R version 3.2.2. The overall goal of AMModels is simple: To codify knowledge in the form of models and to store it, along with models generated from numerous analyses and datasets that may come our way, so that it can be used or recalled in the future. AMModels facilitates this process by storing all models and datasets in a single object that can be saved to an .RData file and routinely augmented to track changes in knowledge through time. Through this process, AMModels allows the capture, development, sharing, and use of knowledge that may help organizations achieve their mission. While AMModels was designed to facilitate adaptive management, its utility is far more general. Many R packages exist for creating and summarizing models, but to our knowledge, AMModels is the only package dedicated not to the mechanics of analysis but to organizing analysis inputs, analysis outputs, and preserving descriptive metadata. We anticipate that this package will assist users hoping to preserve the key elements of an analysis so they may be more confidently revisited at a later date. PMID:29489825
AMModels: An R package for storing models, data, and metadata to facilitate adaptive management.
Donovan, Therese M; Katz, Jonathan E
2018-01-01
Agencies are increasingly called upon to implement their natural resource management programs within an adaptive management (AM) framework. This article provides the background and motivation for the R package, AMModels. AMModels was developed under R version 3.2.2. The overall goal of AMModels is simple: To codify knowledge in the form of models and to store it, along with models generated from numerous analyses and datasets that may come our way, so that it can be used or recalled in the future. AMModels facilitates this process by storing all models and datasets in a single object that can be saved to an .RData file and routinely augmented to track changes in knowledge through time. Through this process, AMModels allows the capture, development, sharing, and use of knowledge that may help organizations achieve their mission. While AMModels was designed to facilitate adaptive management, its utility is far more general. Many R packages exist for creating and summarizing models, but to our knowledge, AMModels is the only package dedicated not to the mechanics of analysis but to organizing analysis inputs, analysis outputs, and preserving descriptive metadata. We anticipate that this package will assist users hoping to preserve the key elements of an analysis so they may be more confidently revisited at a later date.
NASA Technical Reports Server (NTRS)
1984-01-01
Boeing Commercial Airplane Company's Flight Control Department engineers relied on a Langley-developed software package known as ORACLS to develop an advanced control synthesis package for both continuous and discrete control systems. The package was used by Boeing for computerized analysis of new system designs. Resulting applications include a multiple-input/output control system for the terrain-following navigation equipment of the Air Force's B-1 bomber, and another for controlling in-flight changes of wing camber on an experimental airplane. ORACLS is one of 1,300 computer programs available from COSMIC.
The Geomorphic Road Analysis and Inventory Package (GRAIP) Volume 1: Data Collection Method
Thomas A. Black; Richard M. Cissel; Charles H. Luce
2012-01-01
An important first step in managing forest roads for improved water quality and aquatic habitat is the performance of an inventory. The Geomorphic Roads Analysis and Inventory Package (GRAIP) was developed as a tool for making a comprehensive inventory and analysis of the effects of forest roads on watersheds. This manual describes the data collection and process of a...
Lin4Neuro: a customized Linux distribution ready for neuroimaging analysis
2011-01-01
Background: A variety of neuroimaging software packages have been released from various laboratories worldwide, and many researchers use these packages in combination. Though most of these software packages are freely available, some people find them difficult to install and configure because they are mostly based on UNIX-like operating systems. We developed a live USB-bootable Linux package named "Lin4Neuro." This system includes popular neuroimaging analysis tools. The user interface is customized so that even Windows users can use it intuitively. Results: The boot time of this system was only around 40 seconds. We performed a benchmark test of inhomogeneity correction on 10 subjects of three-dimensional T1-weighted MRI scans. The processing speed of USB-booted Lin4Neuro was as fast as that of the package installed on the hard disk drive. We also installed Lin4Neuro on a virtualization software package that emulates the Linux environment on a Windows-based operating system. Although the processing speed was slower than that under other conditions, it remained comparable. Conclusions: With Lin4Neuro in one's hand, one can access neuroimaging software packages easily, and immediately focus on analyzing data. Lin4Neuro can be a good primer for beginners of neuroimaging analysis or students who are interested in neuroimaging analysis. It also provides a practical means of sharing analysis environments across sites. PMID:21266047
Lin4Neuro: a customized Linux distribution ready for neuroimaging analysis.
Nemoto, Kiyotaka; Dan, Ippeita; Rorden, Christopher; Ohnishi, Takashi; Tsuzuki, Daisuke; Okamoto, Masako; Yamashita, Fumio; Asada, Takashi
2011-01-25
A variety of neuroimaging software packages have been released from various laboratories worldwide, and many researchers use these packages in combination. Though most of these software packages are freely available, some people find them difficult to install and configure because they are mostly based on UNIX-like operating systems. We developed a live USB-bootable Linux package named "Lin4Neuro." This system includes popular neuroimaging analysis tools. The user interface is customized so that even Windows users can use it intuitively. The boot time of this system was only around 40 seconds. We performed a benchmark test of inhomogeneity correction on 10 subjects of three-dimensional T1-weighted MRI scans. The processing speed of USB-booted Lin4Neuro was as fast as that of the package installed on the hard disk drive. We also installed Lin4Neuro on a virtualization software package that emulates the Linux environment on a Windows-based operating system. Although the processing speed was slower than that under other conditions, it remained comparable. With Lin4Neuro in one's hand, one can access neuroimaging software packages easily, and immediately focus on analyzing data. Lin4Neuro can be a good primer for beginners of neuroimaging analysis or students who are interested in neuroimaging analysis. It also provides a practical means of sharing analysis environments across sites.
Scripting MODFLOW Model Development Using Python and FloPy.
Bakker, M; Post, V; Langevin, C D; Hughes, J D; White, J T; Starn, J J; Fienen, M N
2016-09-01
Graphical user interfaces (GUIs) are commonly used to construct and postprocess numerical groundwater flow and transport models. Scripting model development with the programming language Python is presented here as an alternative approach. One advantage of Python is that there are many packages available to facilitate the model development process, including packages for plotting, array manipulation, optimization, and data analysis. For MODFLOW-based models, the FloPy package was developed by the authors to construct model input files, run the model, and read and plot simulation results. Use of Python with the available scientific packages and FloPy facilitates data exploration, alternative model evaluations, and model analyses that can be difficult to perform with GUIs. Furthermore, Python scripts are a complete, transparent, and repeatable record of the modeling process. The approach is introduced with a simple FloPy example to create and postprocess a MODFLOW model. A more complicated capture-fraction analysis with a real-world model is presented to demonstrate the types of analyses that can be performed using Python and FloPy. © 2016, National Ground Water Association.
Analysis of counting data: Development of the SATLAS Python package
NASA Astrophysics Data System (ADS)
Gins, W.; de Groote, R. P.; Bissell, M. L.; Granados Buitrago, C.; Ferrer, R.; Lynch, K. M.; Neyens, G.; Sels, S.
2018-01-01
For the analysis of low-statistics counting experiments, a traditional nonlinear least-squares minimization routine may not always provide correct parameter and uncertainty estimates, due to the assumptions inherent in the algorithm(s). In response to this, a user-friendly Python package (SATLAS) was written to provide an easy interface between the data and a variety of minimization algorithms that are suited for analyzing low-, as well as high-, statistics data. The advantage of this package is that it allows the user to define their own model function and then compare different minimization routines to determine the optimal parameter values and their respective (correlated) errors. Experimental validation of the different approaches in the package is done through analysis of hyperfine structure data of 203Fr gathered by the CRIS experiment at ISOLDE, CERN.
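The statistical point made above (least squares versus a Poisson likelihood for low counts) can be illustrated with a short, generic SciPy sketch that fits the amplitude of a known peak shape both ways; this is a conceptual comparison, not the SATLAS interface, and the model, data, and starting values are assumed.

    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import poisson

    rng = np.random.default_rng(2)
    x = np.linspace(-10, 10, 81)

    def model(amplitude):                            # fixed peak shape, unknown amplitude
        return amplitude * np.exp(-0.5 * (x / 2.0) ** 2) + 0.5

    counts = rng.poisson(model(3.0))                 # low-statistics counting data (true amplitude 3.0)

    # Least-squares estimate (implicitly assumes Gaussian errors)
    lsq = minimize(lambda a: np.sum((counts - model(a[0])) ** 2), x0=[1.0])

    # Poisson maximum-likelihood estimate (appropriate for counting data)
    mle = minimize(lambda a: -np.sum(poisson.logpmf(counts, model(a[0]))), x0=[1.0])

    print(lsq.x[0], mle.x[0])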
Community-based benchmarking of the CMIP DECK experiments
NASA Astrophysics Data System (ADS)
Gleckler, P. J.
2015-12-01
A diversity of community-based efforts are independently developing "diagnostic packages" with little or no coordination between them. A short list of examples includes NCAR's Climate Variability Diagnostics Package (CVDP), ORNL's International Land Model Benchmarking (ILAMB), LBNL's Toolkit for Extreme Climate Analysis (TECA), PCMDI's Metrics Package (PMP), the EU EMBRACE ESMValTool, the WGNE MJO diagnostics package, and CFMIP diagnostics. The full value of these efforts cannot be realized without some coordination. As a first step, a WCRP effort has initiated a catalog to document candidate packages that could potentially be applied in a "repeat-use" fashion to all simulations contributed to the CMIP DECK (Diagnostic, Evaluation and Characterization of Klima) experiments. Some coordination of community-based diagnostics also has the potential to improve how CMIP modeling groups analyze their simulations during model development. The fact that most modeling groups now maintain a "CMIP compliant" data stream means that in principle, without much effort, they could readily adopt a set of well-organized diagnostic capabilities specifically designed to operate on CMIP DECK experiments. Ultimately, a detailed listing of, and access to, analysis codes that are demonstrated to work "out of the box" with CMIP data could enable model developers (and others) to select those codes they wish to implement in-house, potentially enabling more systematic evaluation during the model development process.
Wang, Xiuquan; Huang, Guohe; Zhao, Shan; Guo, Junhong
2015-09-01
This paper presents an open-source software package, rSCA, which is developed based upon a stepwise cluster analysis method and serves as a statistical tool for modeling the relationships between multiple dependent and independent variables. The rSCA package is efficient in dealing with both continuous and discrete variables, as well as nonlinear relationships between the variables. It divides the sample sets of dependent variables into different subsets (or subclusters) through a series of cutting and merging operations based upon the theory of multivariate analysis of variance (MANOVA). The modeling results are given by a cluster tree, which includes both intermediate and leaf subclusters as well as the flow paths from the root of the tree to each leaf subcluster specified by a series of cutting and merging actions. The rSCA package is a handy and easy-to-use tool and is freely available at http://cran.r-project.org/package=rSCA . By applying the developed package to air quality management in an urban environment, we demonstrate its effectiveness in dealing with the complicated relationships among multiple variables in real-world problems.
Image analysis to evaluate the browning degree of banana (Musa spp.) peel.
Cho, Jeong-Seok; Lee, Hyeon-Jeong; Park, Jung-Hoon; Sung, Jun-Hyung; Choi, Ji-Young; Moon, Kwang-Deog
2016-03-01
Image analysis was applied to examine banana peel browning. The banana samples were divided into three treatment groups: no treatment and normal packaging (Cont); CO2 gas exchange packaging (CO); and normal packaging with an ethylene generator (ET). We confirmed that browning of the banana peels developed more quickly in the CO group than in the other groups, based on a sensory test and enzyme assays. The G (green) and CIE L*, a*, and b* values obtained from the image analysis increased or decreased sharply in the CO group, and these colour values showed high correlation coefficients (>0.9) with the sensory test results. CIE L*a*b* values measured with a colorimeter also showed high correlation coefficients, but comparatively lower than those from the image analysis. Based on this analysis, browning of the banana occurred more quickly with CO2 gas exchange packaging, and image analysis can be used to evaluate the browning of banana peels. Copyright © 2015 Elsevier Ltd. All rights reserved.
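The colour-feature step described above can be reproduced with a few lines of Python using scikit-image: convert an RGB image of the peel to CIE L*a*b* and average each channel over the region of interest. This is a generic illustration with an assumed file name, not the authors' exact pipeline.

    from skimage import io, color

    rgb = io.imread("banana_peel.jpg")              # placeholder image path
    rgb = rgb[..., :3] / 255.0                      # drop any alpha channel, scale to [0, 1]
    lab = color.rgb2lab(rgb)                        # CIE L*, a*, b* channels

    mean_G = (rgb[..., 1] * 255.0).mean()           # mean green-channel value
    mean_L, mean_a, mean_b = lab.reshape(-1, 3).mean(axis=0)
    print(mean_G, mean_L, mean_a, mean_b)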
PIVOT: platform for interactive analysis and visualization of transcriptomics data.
Zhu, Qin; Fisher, Stephen A; Dueck, Hannah; Middleton, Sarah; Khaladkar, Mugdha; Kim, Junhyong
2018-01-05
Many R packages have been developed for transcriptome analysis but their use often requires familiarity with R and integrating results of different packages requires scripts to wrangle the datatypes. Furthermore, exploratory data analyses often generate multiple derived datasets such as data subsets or data transformations, which can be difficult to track. Here we present PIVOT, an R-based platform that wraps open source transcriptome analysis packages with a uniform user interface and graphical data management that allows non-programmers to interactively explore transcriptomics data. PIVOT supports more than 40 popular open source packages for transcriptome analysis and provides an extensive set of tools for statistical data manipulations. A graph-based visual interface is used to represent the links between derived datasets, allowing easy tracking of data versions. PIVOT further supports automatic report generation, publication-quality plots, and program/data state saving, such that all analysis can be saved, shared and reproduced. PIVOT will allow researchers with broad background to easily access sophisticated transcriptome analysis tools and interactively explore transcriptome datasets.
NASA Astrophysics Data System (ADS)
Petry, Dirk
2018-03-01
CASA is the standard science data analysis package for ALMA and VLA but it can also be used for the analysis of data from other observatories. In this talk, I will give an overview of the structure and features of CASA, who develops it, and the present status and plans, and then show typical analysis workflows for ALMA data with special emphasis on the handling of single dish data and its combination with interferometric data.
Appel, R D; Palagi, P M; Walther, D; Vargas, J R; Sanchez, J C; Ravier, F; Pasquali, C; Hochstrasser, D F
1997-12-01
Although two-dimensional electrophoresis (2-DE) computer analysis software packages have existed ever since 2-DE technology was developed, it is only now that the hardware and software technology allows large-scale studies to be performed on low-cost personal computers or workstations, and that setting up a 2-DE computer analysis system in a small laboratory is no longer considered a luxury. After a first attempt in the seventies and early eighties to develop 2-DE analysis software systems on hardware that had poor or even no graphical capabilities, followed in the late eighties by a wave of innovative software developments that were possible thanks to new graphical interface standards such as XWindows, a third generation of 2-DE analysis software packages has now come to maturity. It can be run on a variety of low-cost, general-purpose personal computers, thus making the purchase of a 2-DE analysis system easily attainable for even the smallest laboratory that is involved in proteome research. Melanie II 2-D PAGE, developed at the University Hospital of Geneva, is such a third-generation software system for 2-DE analysis. Based on unique image processing algorithms, this user-friendly object-oriented software package runs on multiple platforms, including Unix, MS-Windows 95 and NT, and Power Macintosh. It provides efficient spot detection and quantitation, state-of-the-art image comparison, statistical data analysis facilities, and is Internet-ready. Linked to proteome databases such as those available on the World Wide Web, it represents a valuable tool for the "Virtual Lab" of the post-genome era.
PlasmaPy: beginning a community developed Python package for plasma physics
NASA Astrophysics Data System (ADS)
Murphy, Nicholas A.; Huang, Yi-Min; PlasmaPy Collaboration
2016-10-01
In recent years, researchers in several disciplines have collaborated on community-developed open source Python packages such as Astropy, SunPy, and SpacePy. These packages provide core functionality, common frameworks for data analysis and visualization, and educational tools. We propose that our community begins the development of PlasmaPy: a new open source core Python package for plasma physics. PlasmaPy could include commonly used functions in plasma physics, easy-to-use plasma simulation codes, Grad-Shafranov solvers, eigenmode solvers, and tools to analyze both simulations and experiments. The development will include modern programming practices such as version control, embedding documentation in the code, unit tests, and avoiding premature optimization. We will describe early code development on PlasmaPy, and discuss plans moving forward. The success of PlasmaPy depends on active community involvement and a welcoming and inclusive environment, so anyone interested in joining this collaboration should contact the authors.
An R package for the integrated analysis of metabolomics and spectral data.
Costa, Christopher; Maraschin, Marcelo; Rocha, Miguel
2016-06-01
Recently, there has been a growing interest in the field of metabolomics, materialized by a remarkable growth in experimental techniques, available data and related biological applications. Indeed, techniques such as nuclear magnetic resonance, gas or liquid chromatography, mass spectrometry, infrared and UV-visible spectroscopies have provided extensive datasets that can help in tasks such as biological and biomedical discovery, biotechnology and drug development. However, as it happens with other omics data, the analysis of metabolomics datasets provides multiple challenges, both in terms of methodologies and in the development of appropriate computational tools. Indeed, from the available software tools, none addresses the multiplicity of existing techniques and data analysis tasks. In this work, we make available a novel R package, named specmine, which provides a set of methods for metabolomics data analysis, including data loading in different formats, pre-processing, metabolite identification, univariate and multivariate data analysis, machine learning, and feature selection. Importantly, the implemented methods provide adequate support for the analysis of data from diverse experimental techniques, integrating a large set of functions from several R packages in a powerful, yet simple to use environment. The package, already available in CRAN, is accompanied by a web site where users can deposit datasets, scripts and analysis reports to be shared with the community, promoting the efficient sharing of metabolomics data analysis pipelines. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
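As an illustration of the kind of workflow described, a short R sketch follows; the file names are placeholders and the function names are assumptions for illustration rather than calls confirmed from the package documentation:

    # Hypothetical specmine-style session (check the package manual for actual names)
    library(specmine)
    ds  <- read_dataset_csv("metabolite_concentrations.csv",
                            metadata_file = "sample_metadata.csv")   # data loading
    ds  <- missingvalues_imputation(ds, method = "mean")             # pre-processing
    uni <- kruskalTest_dataset(ds, metadata.var = "class")           # univariate analysis
    pca <- pca_analysis_dataset(ds)                                  # multivariate analysis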
Recent developments with the ORSER system
NASA Technical Reports Server (NTRS)
Baumer, G. M.; Turner, B. J.; Myers, W. L.
1981-01-01
Additions to the ORSER remote sensing data processing package are described. The ORSER package consists of about 35 individual programs that are grouped into preprocessing, data analysis, and display subsystems. Additional data formats and data management, data transformation, and geometric correlation programs were added to the preprocessing subsystem. Enhancements to the data analysis techniques include a maximum likelihood classifier (MAXCLASS) and a new version of the STATS program which makes delineation of training areas easier and allows for detection of outlier points. Ongoing developments are also described.
Orchestrating high-throughput genomic analysis with Bioconductor
Huber, Wolfgang; Carey, Vincent J.; Gentleman, Robert; Anders, Simon; Carlson, Marc; Carvalho, Benilton S.; Bravo, Hector Corrada; Davis, Sean; Gatto, Laurent; Girke, Thomas; Gottardo, Raphael; Hahne, Florian; Hansen, Kasper D.; Irizarry, Rafael A.; Lawrence, Michael; Love, Michael I.; MacDonald, James; Obenchain, Valerie; Oleś, Andrzej K.; Pagès, Hervé; Reyes, Alejandro; Shannon, Paul; Smyth, Gordon K.; Tenenbaum, Dan; Waldron, Levi; Morgan, Martin
2015-01-01
Bioconductor is an open-source, open-development software project for the analysis and comprehension of high-throughput data in genomics and molecular biology. The project aims to enable interdisciplinary research, collaboration and rapid development of scientific software. Based on the statistical programming language R, Bioconductor comprises 934 interoperable packages contributed by a large, diverse community of scientists. Packages cover a range of bioinformatic and statistical applications. They undergo formal initial review and continuous automated testing. We present an overview for prospective users and contributors. PMID:25633503
A CAD approach to magnetic bearing design
NASA Technical Reports Server (NTRS)
Jeyaseelan, M.; Anand, D. K.; Kirk, J. A.
1988-01-01
A design methodology has been developed at the Magnetic Bearing Research Laboratory for designing magnetic bearings using a CAD approach. This is used in the algorithm of an interactive design software package. The package is a design tool developed to enable the designer to simulate the entire process of design and analysis of the system. Its capabilities include interactive input/modification of geometry, finding any possible saturation at critical sections of the system, and the design and analysis of a control system that stabilizes and maintains magnetic suspension.
AMModels: An R package for storing models, data, and metadata to facilitate adaptive management
Donovan, Therese M.; Katz, Jonathan
2018-01-01
Agencies are increasingly called upon to implement their natural resource management programs within an adaptive management (AM) framework. This article provides the background and motivation for the R package, AMModels. AMModels was developed under R version 3.2.2. The overall goal of AMModels is simple: To codify knowledge in the form of models and to store it, along with models generated from numerous analyses and datasets that may come our way, so that it can be used or recalled in the future. AMModels facilitates this process by storing all models and datasets in a single object that can be saved to an .RData file and routinely augmented to track changes in knowledge through time. Through this process, AMModels allows the capture, development, sharing, and use of knowledge that may help organizations achieve their mission. While AMModels was designed to facilitate adaptive management, its utility is far more general. Many R packages exist for creating and summarizing models, but to our knowledge, AMModels is the only package dedicated not to the mechanics of analysis but to organizing analysis inputs, analysis outputs, and preserving descriptive metadata. We anticipate that this package will assist users hoping to preserve the key elements of an analysis so they may be more confidently revisited at a later date.
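A brief R sketch of the intended pattern follows; the object and function names (amModel, amData, insertAMModelLib) are assumed from the package description and should be verified against its documentation:

    library(AMModels)
    # toy data and model standing in for a real monitoring analysis
    birds <- data.frame(year = 2000:2009,
                        count = c(12, 15, 11, 18, 20, 17, 22, 25, 23, 27))
    fit   <- lm(count ~ year, data = birds)
    m     <- amModel(model = fit, comment = "linear trend in annual counts")  # model plus metadata
    d     <- amData(data = birds, source = "annual survey")                   # data plus metadata
    amlib <- insertAMModelLib(models = list(trend = m), data = list(birds = d))
    save(amlib, file = "am_library.RData")  # one object, re-saved as knowledge accumulates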
Gruber, Bernd; Unmack, Peter J; Berry, Oliver F; Georges, Arthur
2018-05-01
Although vast technological advances have been made and genetic software packages are growing in number, it is not a trivial task to analyse SNP data. We announce a new r package, dartr, enabling the analysis of single nucleotide polymorphism data for population genomic and phylogenomic applications. dartr provides user-friendly functions for data quality control and marker selection, and permits rigorous evaluations of conformation to Hardy-Weinberg equilibrium, gametic-phase disequilibrium and neutrality. The package reports standard descriptive statistics, permits exploration of patterns in the data through principal components analysis and conducts standard F-statistics, as well as basic phylogenetic analyses, population assignment, isolation by distance and exports data to a variety of commonly used downstream applications (e.g., newhybrids, faststructure and phylogeny applications) outside of the r environment. The package serves two main purposes: first, a user-friendly approach to lower the hurdle to analyse such data; therefore, the package comes with a detailed tutorial targeted to the r beginner to allow data analysis without requiring deep knowledge of r. Second, we use a single, well-established format, genlight from the adegenet package, as input for all our functions to avoid data reformatting. By strictly using the genlight format, we hope to facilitate this format as the de facto standard of future software developments and hence reduce the format jungle of genetic data sets. The dartr package is available via the r CRAN network and GitHub. © 2017 John Wiley & Sons Ltd.
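A compact R sketch of such a workflow is shown below; the CRAN package is distributed as dartR and its functions follow the gl.* convention, but the specific calls and arguments here are illustrative assumptions rather than a verified pipeline:

    library(dartR)
    gl <- gl.read.dart(filename = "SNP_report.csv",
                       ind.metafile = "ind_metrics.csv")             # import into a genlight object
    gl <- gl.filter.callrate(gl, method = "loc", threshold = 0.95)   # quality control / marker selection
    pc <- gl.pcoa(gl)                                                # ordination of genetic distances
    gl.pcoa.plot(pc, gl)                                             # explore population structure
    gl2faststructure(gl, outfile = "snps.fstr")                      # export to a downstream application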
Preliminary design package for Sunair SEC-601 solar collector
NASA Technical Reports Server (NTRS)
1978-01-01
The preliminary design of the Owens-Illinois model Sunair SEC-601 tubular air solar collector is presented. Information in this package includes the subsystem design and development approaches, hazard analysis, and detailed drawings available as the preliminary design review.
ATTIRE (analytical tools for thermal infrared engineering): A sensor simulation and modeling package
NASA Astrophysics Data System (ADS)
Jaggi, S.
1993-02-01
The Advanced Sensor Development Laboratory (ASDL) at the Stennis Space Center develops, maintains and calibrates remote sensing instruments for the National Aeronautics & Space Administration (NASA). To perform system design trade-offs, analysis, and establish system parameters, ASDL has developed a software package for analytical simulation of sensor systems. This package called 'Analytical Tools for Thermal InfraRed Engineering' - ATTIRE, simulates the various components of a sensor system. The software allows each subsystem of the sensor to be analyzed independently for its performance. These performance parameters are then integrated to obtain system level information such as Signal-to-Noise Ratio (SNR), Noise Equivalent Radiance (NER), Noise Equivalent Temperature Difference (NETD) etc. This paper describes the uses of the package and the physics that were used to derive the performance parameters.
ERIC Educational Resources Information Center
Lavonen, Jari; Juuti, Kalle; Meisalo, Veijo
2003-01-01
In this study we analyse how the experiences of chemistry teachers on the use of a Microcomputer-Based Laboratory (MBL), gathered by a Likert-scale instrument, can be utilized to develop the new package "Empirica 2000." We used exploratory factor analysis to identify the essential features in a large set of questionnaire data to see how…
Space shuttle/food system study. Volume 2, Appendix F: Flight food and primary packaging
NASA Technical Reports Server (NTRS)
1974-01-01
The analysis and selection of food items and primary packaging, the development of menus, the nutritional analysis of diet, and the analyses of alternate food mixes and contingency foods is reported in terms of the overall food system design for space shuttle flight. Stowage weights and cubic volumes associated with each alternate mix were also evaluated.
Cyrface: An interface from Cytoscape to R that provides a user interface to R packages.
Gonçalves, Emanuel; Mirlach, Franz; Saez-Rodriguez, Julio
2013-01-01
There is an increasing number of software packages to analyse biological experimental data in the R environment. In particular, Bioconductor, a repository of curated R packages, is one of the most comprehensive resources for bioinformatics and biostatistics. The use of these packages is increasing, but it requires a basic understanding of the R language, as well as the syntax of the specific package used. The availability of user graphical interfaces for these packages would decrease the learning curve and broaden their application. Here, we present a Cytoscape app termed Cyrface that allows Cytoscape apps to connect to any function and package developed in R. Cyrface can be used to run R packages from within the Cytoscape environment making use of a graphical user interface. Moreover, it can link R packages with the capabilities of Cytoscape and its apps, in particular network visualization and analysis. Cyrface's utility has been demonstrated for two Bioconductor packages ( CellNOptR and DrugVsDisease), and here we further illustrate its usage by implementing a workflow of data analysis and visualization. Download links, installation instructions and user guides can be accessed from the Cyrface's homepage ( http://www.ebi.ac.uk/saezrodriguez/cyrface/) and from the Cytoscape app store ( http://apps.cytoscape.org/apps/cyrface).
Dualities in the analysis of phage DNA packaging motors
Serwer, Philip; Jiang, Wen
2012-01-01
The DNA packaging motors of double-stranded DNA phages are models for analysis of all multi-molecular motors and for analysis of several fundamental aspects of biology, including early evolution, relationship of in vivo to in vitro biochemistry and targets for anti-virals. Work on phage DNA packaging motors both has produced and is producing dualities in the interpretation of data obtained by use of both traditional techniques and the more recently developed procedures of single-molecule analysis. The dualities include (1) reductive vs. accretive evolution, (2) rotation vs. stasis of sub-assemblies of the motor, (3) thermal ratcheting vs. power stroking in generating force, (4) complete motor vs. spark plug role for the packaging ATPase, (5) use of previously isolated vs. new intermediates for analysis of the intermediate states of the motor and (6) a motor with one cycle vs. a motor with two cycles. We provide background for these dualities, some of which are under-emphasized in the literature. We suggest directions for future research. PMID:23532204
NASA Astrophysics Data System (ADS)
Daniell, James; Simpson, Alanna; Gunasekara, Rashmin; Baca, Abigail; Schaefer, Andreas; Ishizawa, Oscar; Murnane, Rick; Tijssen, Annegien; Deparday, Vivien; Forni, Marc; Himmelfarb, Anne; Leder, Jan
2015-04-01
Over the past few decades, a plethora of open access software packages for the calculation of earthquake, volcanic, tsunami, storm surge, wind and flood risk have been produced globally. As part of the World Bank GFDRR Review released at the Understanding Risk 2014 Conference, over 80 such open access risk assessment software packages were examined. Commercial software was not considered in the evaluation. A preliminary analysis was used to determine whether the 80 models were currently supported and if they were open access. This process was used to select a subset of 31 models that include 8 earthquake models, 4 cyclone models, 11 flood models, and 8 storm surge/tsunami models for more detailed analysis. By using multi-criteria decision analysis (MCDA) and simple descriptions of the software uses, the review allows users to select a few relevant software packages for their own testing and development. The detailed analysis evaluated the models on the basis of over 100 criteria and provides a synopsis of available open access natural hazard risk modelling tools. In addition, volcano software packages have since been added, making the compendium of risk software tools in excess of 100. There has been a huge increase in the quality and availability of open access/source software over the past few years. For example, private entities such as Deltares now have an open source policy regarding some flood models (NGHS). In addition, leaders in developing risk models in the public sector, such as Geoscience Australia (EQRM, TCRM, TsuDAT, AnuGA) or CAPRA (ERN-Flood, Hurricane, CRISIS2007 etc.), are launching and/or helping many other initiatives. As we achieve greater interoperability between modelling tools, we will also achieve a future wherein different open source and open access modelling tools will be increasingly connected and adapted towards unified multi-risk model platforms and highly customised solutions. It was seen that many software tools could be improved by enabling user-defined exposure and vulnerability. Without this function, many tools can only be used regionally and not at global or continental scale. It is becoming increasingly easy to use multiple packages for a single region and/or hazard to characterize the uncertainty in the risk, or use as checks for the sensitivities in the analysis. There is a potential for valuable synergy between existing software. A number of open source software packages could be combined to generate a multi-risk model with multiple views of a hazard. This extensive review has simply attempted to provide a platform for dialogue between all open source and open access software packages and to hopefully inspire collaboration between developers, given the great work done by all open access and open source developers.
iGC-an integrated analysis package of gene expression and copy number alteration.
Lai, Yi-Pin; Wang, Liang-Bo; Wang, Wei-An; Lai, Liang-Chuan; Tsai, Mong-Hsun; Lu, Tzu-Pin; Chuang, Eric Y
2017-01-14
With the advancement in high-throughput technologies, researchers can simultaneously investigate gene expression and copy number alteration (CNA) data from individual patients at a lower cost. Traditional analysis methods analyze each type of data individually and integrate their results using Venn diagrams. Challenges arise, however, when the results are irreproducible and inconsistent across multiple platforms. To address these issues, one possible approach is to concurrently analyze both gene expression profiling and CNAs in the same individual. We have developed an open-source R/Bioconductor package (iGC). Multiple input formats are supported and users can define their own criteria for identifying differentially expressed genes driven by CNAs. The analysis of two real microarray datasets demonstrated that the CNA-driven genes identified by the iGC package showed significantly higher Pearson correlation coefficients with their gene expression levels and copy numbers than those genes located in a genomic region with CNA. Compared with the Venn diagram approach, the iGC package showed better performance. The iGC package is effective and useful for identifying CNA-driven genes. By simultaneously considering both comparative genomic and transcriptomic data, it can provide better understanding of biological and medical questions. The iGC package's source code and manual are freely available at https://www.bioconductor.org/packages/release/bioc/html/iGC.html .
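The following R fragment sketches the workflow implied by the abstract; the function names (create_gene_cna, create_gene_exp, find_cna_driven_gene) are assumptions for illustration and should be checked against the Bioconductor manual:

    library(iGC)
    gene_cna <- create_gene_cna("cna_segment_files/")        # map CNA segments onto genes
    gene_exp <- create_gene_exp("expression_files/")         # build a gene-level expression table
    cna_driven <- find_cna_driven_gene(gene_cna, gene_exp)   # user-defined criteria also possible
    head(cna_driven$gain_driven)                             # genes whose expression tracks copy gain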
An initial investigation into methods of computing transonic aerodynamic sensitivity coefficients
NASA Technical Reports Server (NTRS)
Carlson, Leland A.
1991-01-01
The three-dimensional quasi-analytical sensitivity analysis and the ancillary driver programs needed to carry out the studies and perform comparisons were developed. The code is essentially contained in one unified package which includes the following: (1) a three dimensional transonic wing analysis program (ZEBRA); (2) a quasi-analytical portion which determines the matrix elements in the quasi-analytical equations; (3) a method for computing the sensitivity coefficients from the resulting quasi-analytical equations; (4) a package to determine, for comparison purposes, sensitivity coefficients via the finite difference approach; and (5) a graphics package.
Analysis pipelines and packages for Infinium HumanMethylation450 BeadChip (450k) data
Morris, Tiffany J.; Beck, Stephan
2015-01-01
The Illumina HumanMethylation450 BeadChip has become a popular platform for interrogating DNA methylation in epigenome-wide association studies (EWAS) and related projects as well as resource efforts such as the International Cancer Genome Consortium (ICGC) and the International Human Epigenome Consortium (IHEC). This has resulted in an exponential increase of 450k data in recent years and triggered the development of numerous integrated analysis pipelines and stand-alone packages. This review will introduce and discuss the currently most popular pipelines and packages and is particularly aimed at new 450k users. PMID:25233806
TYPE A FISSILE PACKAGING FOR AIR TRANSPORT PROJECT OVERVIEW
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eberl, K.; Blanton, P.
2013-10-11
This paper presents the project status of the Model 9980, a new Type A fissile packaging for use in air transport. The Savannah River National Laboratory (SRNL) developed this new packaging to be a light weight (<150-lb), drum-style package and prepared a Safety Analysis for Packaging (SARP) for submission to the DOE/EM. The package design incorporates unique features and engineered materials specifically designed to minimize packaging weight and to be in compliance with 10CFR71 requirements. Prototypes were fabricated and tested to evaluate the design when subjected to Normal Conditions of Transport (NCT) and Hypothetical Accident Conditions (HAC). An overview of the design details, results of the regulatory testing, and lessons learned from the prototype fabrication for the 9980 will be presented.
Material flow analysis for an industry - A case study in packaging
Amey, E.B.; Sandgren, K.
1996-01-01
The basic materials used in packaging are glass, metals (primarily aluminum and steel), an ever-growing range of plastics, paper and paperboard, wood, textiles for bags, and miscellaneous other materials (such as glues, inks, and other supplies). They are fabricated into rigid, semi-rigid, or flexible containers. The most common forms of these containers include cans, drums, bottles, cartons, boxes, bags, pouches, and wraps. Packaging products are, for the most part, low cost, bulky products that are manufactured close to their customers. There is virtually no import or export of packaging products. A material flow analysis can be developed that looks at all inputs to an industrial sector, inventories the losses in processing, and tracks the fate of the material after its useful life. An example is presented that identifies the material inputs to the packaging industry, and addresses the ultimate fate of the materials used. © 1996 International Association for Mathematical Geology.
NASA Technical Reports Server (NTRS)
1974-01-01
The relative penalties associated with various techniques for providing an onboard cold environment for storage of perishable food items, and for the development of packaging and vehicle stowage parameters were investigated in terms of the overall food system design analysis of space shuttle. The degrees of capability for maintaining both a 40 F to 45 F refrigerated temperature and a 0 F and 20 F frozen environment were assessed for the following cooling techniques: (1) phase change (heat sink) concept; (2) thermoelectric concept; (3) vapor cycle concept; and (4) expendable ammonia concept. The parameters considered in the analysis were weight, volume, and spacecraft power restrictions. Data were also produced for packaging and vehicle stowage parameters which are compatible with vehicle weight and volume specifications. Certain assumptions were made for food packaging sizes based on previously generated space shuttle menus. The results of the study are shown, along with the range of meal choices considered.
Model-based gene set analysis for Bioconductor.
Bauer, Sebastian; Robinson, Peter N; Gagneur, Julien
2011-07-01
Gene Ontology and other forms of gene-category analysis play a major role in the evaluation of high-throughput experiments in molecular biology. Single-category enrichment analysis procedures such as Fisher's exact test tend to flag large numbers of redundant categories as significant, which can complicate interpretation. We have recently developed an approach called model-based gene set analysis (MGSA), that substantially reduces the number of redundant categories returned by the gene-category analysis. In this work, we present the Bioconductor package mgsa, which makes the MGSA algorithm available to users of the R language. Our package provides a simple and flexible application programming interface for applying the approach. The mgsa package has been made available as part of Bioconductor 2.8. It is released under the conditions of the Artistic license 2.0. peter.robinson@charite.de; julien.gagneur@embl.de.
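A minimal R example of the call pattern follows; the mgsa() entry point is the package's main function, while the toy gene sets are invented purely for illustration:

    library(mgsa)
    # toy categories (e.g. GO terms) and a set of genes flagged by an experiment
    sets <- list(catA = c("g1", "g2", "g3"),
                 catB = c("g3", "g4", "g5"),
                 catC = c("g6", "g7", "g8"))
    observed <- c("g1", "g2", "g3")
    fit <- mgsa(observed, sets)   # fit the model-based gene set analysis
    fit                           # posterior probability that each category is active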
psygenet2r: a R/Bioconductor package for the analysis of psychiatric disease genes.
Gutiérrez-Sacristán, Alba; Hernández-Ferrer, Carles; González, Juan R; Furlong, Laura I
2017-12-15
Psychiatric disorders have a great impact on morbidity and mortality. Genotype-phenotype resources for psychiatric diseases are key to enable the translation of research findings to a better care of patients. PsyGeNET is a knowledge resource on psychiatric diseases and their genes, developed by text mining and curated by domain experts. We present psygenet2r, an R package that contains a variety of functions for leveraging PsyGeNET database and facilitating its analysis and interpretation. The package offers different types of queries to the database along with variety of analysis and visualization tools, including the study of the anatomical structures in which the genes are expressed and gaining insight of gene's molecular function. Psygenet2r is especially suited for network medicine analysis of psychiatric disorders. The package is implemented in R and is available under MIT license from Bioconductor (http://bioconductor.org/packages/release/bioc/html/psygenet2r.html). juanr.gonzalez@isglobal.org or laura.furlong@upf.edu. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
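The query-then-visualize pattern described above might look like the following R sketch; psygenetGene appears to be the main gene-query function, but the exact arguments used here are assumptions:

    library(psygenet2r)
    genes <- c("COMT", "SLC6A4", "BDNF")                 # example gene symbols
    qr <- psygenetGene(gene = genes, database = "ALL")   # gene-disease associations from PsyGeNET
    plot(qr)                                             # default association visualization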
Li, Ruidong; Qu, Han; Wang, Shibo; Wei, Julong; Zhang, Le; Ma, Renyuan; Lu, Jianming; Zhu, Jianguo; Zhong, Wei-De; Jia, Zhenyu
2018-03-02
The large-scale multidimensional omics data in the Genomic Data Commons (GDC) provides opportunities to investigate the crosstalk among different RNA species and their regulatory mechanisms in cancers. Easy-to-use bioinformatics pipelines are needed to facilitate such studies. We have developed a user-friendly R/Bioconductor package, named GDCRNATools, for downloading, organizing, and analyzing RNA data in GDC with an emphasis on deciphering the lncRNA-mRNA related competing endogenous RNAs (ceRNAs) regulatory network in cancers. Many widely used bioinformatics tools and databases are utilized in our package. Users can easily pack preferred downstream analysis pipelines or integrate their own pipelines into the workflow. Interactive shiny web apps built in GDCRNATools greatly improve visualization of results from the analysis. GDCRNATools is an R/Bioconductor package that is freely available at Bioconductor (http://bioconductor.org/packages/devel/bioc/html/GDCRNATools.html). Detailed instructions, manual and example code are also available in Github (https://github.com/Jialab-UCR/GDCRNATools). arthur.jia@ucr.edu or zhongwd2009@live.cn or doctorzhujianguo@163.com.
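A condensed R sketch of the download-organize-analyze flow is given below; the gdc* function names are modeled on the package description and should be treated as assumptions to be checked against the vignette, and the project and paths are placeholders:

    library(GDCRNATools)
    meta <- gdcParseMetadata(project.id = "TCGA-PRAD", data.type = "RNAseq")   # assumed call
    expr <- gdcRNAMerge(metadata = meta, path = "RNAseq/", data.type = "RNAseq")
    deg  <- gdcDEAnalysis(counts = expr, group = meta$sample_type, method = "limma")
    # ceRNA (lncRNA-mRNA) network step, per the package's stated emphasis
    ce   <- gdcCEAnalysis(lnc = rownames(deg), pc = rownames(deg), rna.expr = expr)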
Putting the 1991 census sample of anonymised records on your Unix workstation.
Turton, I; Openshaw, S
1995-03-01
"The authors describe the development of a customised computer software package for easing the analysis of the U.K. 1991 Sample of Anonymised Records. The resulting USAR [Unix Sample of Anonymised Records] package is designed to be portable within the Unix environment. It offers a number of features such as interactive table design, intelligent data interpretation, and fuzzy query. An example of SAR analysis is provided." excerpt
HydroApps: An R package for statistical simulation to use in regional analysis
NASA Astrophysics Data System (ADS)
Ganora, D.
2013-12-01
The HydroApps package is a newborn R extension initially developed to support the use of a recent model for flood frequency estimation developed for applications in Northwestern Italy; it also contains some general tools for regional analyses and can be easily extended to include other statistical models. The package is currently at an experimental level of development. HydroApps is a corollary of the SSEM project for regional flood frequency analysis, although it was developed independently to support various instances of regional analyses. Its aim is to provide a basis for interplay between statistical simulation and practical operational use. In particular, the main module of the package deals with the building of the confidence bands of flood frequency curves expressed by means of their L-moments. Other functions include pre-processing and visualization of hydrologic time series and analysis of the optimal design flood under uncertainty, as well as tools useful in water resources management for the estimation of flow duration curves and their sensitivity to water withdrawals. Particular attention is devoted to code granularity, i.e. the level of detail and aggregation of the code: greater detail means more low-level functions, which entails more flexibility but reduces ease of use in practice. A balance between detail and simplicity is necessary and can be achieved with appropriate wrapping functions and specific help pages for each working block. From a more general viewpoint, the package does not yet have a truly user-friendly interface, but it runs on multiple operating systems and is easy to update, like many other open-source projects. The HydroApps functions and their features are reported in order to share ideas and materials and to improve the technological and information transfer between scientific communities and final users such as policy makers.
The R package 'Luminescence': a history of unexpected complexity and concepts to deal with it
NASA Astrophysics Data System (ADS)
Kreutzer, Sebastian; Burow, Christoph; Dietze, Michael; Fuchs, Margret C.; Friedrich, Johannes; Fischer, Manfred; Schmidt, Christoph
2017-04-01
Overcoming limitations in the standard software used so far, developing an efficient, lightweight solution for a very specific task, or creating graphs of high quality: the reasons that may have initially led a scientist to work with R are manifold. As long as the developed solutions, e.g., R scripts, are needed for personal use only, code can remain unstructured and documentation is not compulsory. However, this changes with the first friendly request for help after the code has been reused by others. In contrast to single scripts, written without any intention of ever being published, for R packages the CRAN policy demands a more structured and elaborated approach including a minimum of documentation. Nevertheless, growing projects with thousands of lines of code that need to be maintained can become overwhelming, in particular as researchers are not by definition experts on managing software projects. The R package 'Luminescence' (Kreutzer et al., 2017), a collection of tools dealing with the analysis of luminescence data in a geoscientific, geochronological context, started as one single R script, but quickly evolved into a comprehensive solution connected with various other R packages. We present (1) a very brief development history of the package 'Luminescence', before we (2) sketch technical challenges encountered over time and the solutions that have been found to deal with them by using various open source tools. Our presentation is intended as a collection of concepts and approaches for setting up R projects in geosciences. References: Kreutzer, S., Dietze, M., Burow, C., Fuchs, M. C., Schmidt, C., Fischer, M., Friedrich, J., 2017. Luminescence: Comprehensive Luminescence Dating Data Analysis. R package version 0.6.4. https://CRAN.R-project.org/package=Luminescence
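For orientation, a typical session with the package might look like the sketch below; read_BIN2R and analyse_SAR.CWOSL are documented functions of the package, but the file name, position and integration limits are placeholders that depend on the measurement:

    library(Luminescence)
    bin <- read_BIN2R("measurement.bin")                      # import luminescence reader output
    obj <- Risoe.BINfileData2RLum.Analysis(bin, pos = 1)      # convert to an RLum analysis object
    de  <- analyse_SAR.CWOSL(obj,
                             signal.integral.min = 1,  signal.integral.max = 5,
                             background.integral.min = 900, background.integral.max = 1000)
    # 'de' holds equivalent-dose estimates and diagnostic plots for the SAR protocol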
Zackay, Arie; Steinhoff, Christine
2010-12-15
Exploration of DNA methylation and its impact on various regulatory mechanisms has become a very active field of research. Simultaneously there is an arising need for tools to process and analyse the data together with statistical investigation and visualisation. MethVisual is a new application that enables exploratory analysis and intuitive visualization of DNA methylation data as is typically generated by bisulfite sequencing. The package allows the import of DNA methylation sequences, aligns them and performs quality control comparison. It comprises basic analysis steps such as lollipop visualization, co-occurrence display of methylation of neighbouring and distant CpG sites, summary statistics on methylation status, clustering and correspondence analysis. The package has been developed for methylation data but can also be used for other data types for which binary coding can be inferred. The application of the package, as well as a comparison to existing DNA methylation analysis tools and its workflow based on two datasets, is presented in this paper. The R package MethVisual offers various analysis procedures for data that can be binarized, in particular for bisulfite sequenced methylation data. R/Bioconductor has become one of the most important environments for statistical analysis of various types of biological and medical data. Therefore, any data analysis within R that allows the integration of various data types as provided from different technological platforms is convenient. It is the first and so far the only specific package for DNA methylation analysis, in particular for bisulfite sequenced data, available in the R/Bioconductor environment. The package is available for free at http://methvisual.molgen.mpg.de/ and from the Bioconductor Consortium http://www.bioconductor.org.
NASA Astrophysics Data System (ADS)
Hayrapetyan, David B.; Hovhannisyan, Levon; Mantashyan, Paytsar A.
2013-04-01
The analysis of complex spectra is a topical problem for modern science. The work is devoted to the creation of a software package which analyzes spectra in different formats, possesses a dynamic knowledge database and a self-learning mechanism, and performs automated analysis of spectral composition based on the knowledge database by application of certain algorithms. Hyper-spherical random search algorithms, gradient algorithms and genetic search algorithms are used as the search engines in the software package. The analysis of Raman and IR spectra of diamond-like carbon (DLC) samples was performed with the developed program. After processing the data, the program immediately displays all the calculated parameters of the DLC.
Nipype: a flexible, lightweight and extensible neuroimaging data processing framework in python.
Gorgolewski, Krzysztof; Burns, Christopher D; Madison, Cindee; Clark, Dav; Halchenko, Yaroslav O; Waskom, Michael L; Ghosh, Satrajit S
2011-01-01
Current neuroimaging software offer users an incredible opportunity to analyze their data in different ways, with different underlying assumptions. Several sophisticated software packages (e.g., AFNI, BrainVoyager, FSL, FreeSurfer, Nipy, R, SPM) are used to process and analyze large and often diverse (highly multi-dimensional) data. However, this heterogeneous collection of specialized applications creates several issues that hinder replicable, efficient, and optimal use of neuroimaging analysis approaches: (1) No uniform access to neuroimaging analysis software and usage information; (2) No framework for comparative algorithm development and dissemination; (3) Personnel turnover in laboratories often limits methodological continuity and training new personnel takes time; (4) Neuroimaging software packages do not address computational efficiency; and (5) Methods sections in journal articles are inadequate for reproducing results. To address these issues, we present Nipype (Neuroimaging in Python: Pipelines and Interfaces; http://nipy.org/nipype), an open-source, community-developed, software package, and scriptable library. Nipype solves the issues by providing Interfaces to existing neuroimaging software with uniform usage semantics and by facilitating interaction between these packages using Workflows. Nipype provides an environment that encourages interactive exploration of algorithms, eases the design of Workflows within and between packages, allows rapid comparative development of algorithms and reduces the learning curve necessary to use different packages. Nipype supports both local and remote execution on multi-core machines and clusters, without additional scripting. Nipype is Berkeley Software Distribution licensed, allowing anyone unrestricted usage. An open, community-driven development philosophy allows the software to quickly adapt and address the varied needs of the evolving neuroimaging community, especially in the context of increasing demand for reproducible research.
NASA Astrophysics Data System (ADS)
Kriswintari, D.; Yuanita, L.; Widodo, W.
2018-04-01
The aim of this study was to develop a chemistry learning package using the Student Teams Achievement Division (STAD) cooperative learning technique to foster students' thinking skills and social attitudes. The chemistry learning package, consisting of a lesson plan, handout, students' worksheet, thinking skill test, and observation sheet of social attitude, was developed using the Dick and Carey model. The research subject of this study was the chemistry learning package using STAD, which was tried out on tenth grade students of SMA Trimurti Surabaya. The tryout was conducted using the one-group pre-test post-test design. Data were collected through observation, test, and questionnaire. The obtained data were analyzed using descriptive qualitative analysis. The findings of this study revealed that the developed chemistry learning package using the STAD cooperative learning technique was categorized as valid, practical, and effective to be implemented in the classroom to foster students' thinking skills and social attitudes.
NASA Technical Reports Server (NTRS)
Capo, M. A.; Disney, R. K.
1971-01-01
The work performed in the following areas is summarized: (1) A realistic nuclear-propelled vehicle was analyzed using the Marshall Space Flight Center computer code package. This code package includes one and two dimensional discrete ordinate transport, point kernel, and single scatter techniques, as well as cross section preparation and data processing codes. (2) Techniques were developed to improve the automated data transfer in the coupled computation method of the computer code package and to improve the utilization of this code package on the Univac-1108 computer system. (3) The MSFC master data libraries were updated.
Development of high performance scientific components for interoperability of computing packages
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gulabani, Teena Pratap
2008-01-01
Three major high performance quantum chemistry computational packages, NWChem, GAMESS and MPQC, have been developed by different research efforts following different design patterns. The goal is to achieve interoperability among these packages by overcoming the challenges caused by the different communication patterns and software design of each of these packages. A chemistry algorithm is hard and time consuming to develop; integration of large quantum chemistry packages will allow resource sharing and thus avoid reinvention of the wheel. Creating connections between these incompatible packages is the major motivation of the proposed work. This interoperability is achieved by bringing the benefits of Component Based Software Engineering through a plug-and-play component framework called Common Component Architecture (CCA). In this thesis, I present a strategy and process used for interfacing two widely used and important computational chemistry methodologies: Quantum Mechanics and Molecular Mechanics. To show the feasibility of the proposed approach the Tuning and Analysis Utility (TAU) has been coupled with the NWChem code and its CCA components. Results show that the overhead is negligible when compared to the ease and potential of organizing and coping with large-scale software applications.
Perkins, James R; Dawes, John M; McMahon, Steve B; Bennett, David L H; Orengo, Christine; Kohl, Matthias
2012-07-02
Measuring gene transcription using real-time reverse transcription polymerase chain reaction (RT-qPCR) technology is a mainstay of molecular biology. Technologies now exist to measure the abundance of many transcripts in parallel. The selection of the optimal reference gene for the normalisation of this data is a recurring problem, and several algorithms have been developed in order to solve it. So far nothing in R exists to unite these methods, together with other functions to read in and normalise the data using the chosen reference gene(s). We have developed two R/Bioconductor packages, ReadqPCR and NormqPCR, intended for a user with some experience with high-throughput data analysis using R, who wishes to use R to analyse RT-qPCR data. We illustrate their potential use in a workflow analysing a generic RT-qPCR experiment, and apply this to a real dataset. Packages are available from http://www.bioconductor.org/packages/release/bioc/html/ReadqPCR.html and http://www.bioconductor.org/packages/release/bioc/html/NormqPCR.html. These packages increase the repertoire of RT-qPCR analysis tools available to the R user and allow them to (amongst other things) read their data into R, hold it in an ExpressionSet compatible R object, choose appropriate reference genes, normalise the data and look for differential expression between samples.
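A short R sketch of the intended workflow follows; read.qPCR and selectHKs are described in the package vignettes, although the file name and argument values here are placeholders:

    library(ReadqPCR)
    library(NormqPCR)
    qpcr <- read.qPCR("cq_values.txt")              # raw Cq data into a qPCRBatch object
    hk <- selectHKs(qpcr, method = "geNorm",
                    Symbols = featureNames(qpcr))   # rank candidate reference genes
    hk$ranking                                      # best-ranked reference genes first
    # deltaCq()/deltaDeltaCq() can then normalise against the chosen reference gene(s)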
Test Plan Development for Plastic Ammunition Containers. Volume 1
1989-03-15
Packaging Division (SMCAR-AEP), Picatinny Arsenal, New Jersey 07806-5000. ... packaging containers. The report is presented in two separate volumes. Volume I contains the Final Technical Report and includes the analysis of ... Division of the U.S. Army Armament Research, Development and Engineering Center. Mr. Jasper C. Griggs and Mr. D. E. Jones served as technical consultants.
Bird impact analysis package for turbine engine fan blades
NASA Technical Reports Server (NTRS)
Hirschbein, M. S.
1982-01-01
A computer program has been developed to analyze the gross structural response of turbine engine fan blades subjected to bird strikes. The program couples a NASTRAN finite element model and modal analysis of a fan blade with a multi-mode bird impact analysis computer program. The impact analysis uses the NASTRAN blade model and a fluid jet model of the bird to interactively calculate blade loading during a bird strike event. The analysis package is computationally efficient, easy to use and provides a comprehensive history of the gross structural blade response. Example cases are presented for a representative fan blade.
Nonstationary Extreme Value Analysis in a Changing Climate: A Software Package
NASA Astrophysics Data System (ADS)
Cheng, L.; AghaKouchak, A.; Gilleland, E.
2013-12-01
Numerous studies show that climatic extremes have increased substantially in the second half of the 20th century. For this reason, analysis of extremes under a nonstationary assumption has received a great deal of attention. This paper presents a software package developed for estimation of return levels, return periods, and risks of climatic extremes in a changing climate. This MATLAB software package offers tools for analysis of climate extremes under both stationary and non-stationary assumptions. The Nonstationary Extreme Value Analysis (hereafter, NEVA) provides an efficient and generalized framework for analyzing extremes using Bayesian inference. NEVA estimates the extreme value parameters using a Differential Evolution Markov Chain (DE-MC), which utilizes the genetic algorithm Differential Evolution (DE) for global optimization over the real parameter space together with the Markov Chain Monte Carlo (MCMC) approach, and has the advantage of simplicity, speed of calculation and convergence over conventional MCMC. NEVA also offers the confidence interval and uncertainty bounds of estimated return levels based on the sampled parameters. NEVA integrates extreme value design concepts, data analysis tools, optimization and visualization, explicitly designed to facilitate the analysis of extremes in geosciences. The generalized input and output files of this software package make it attractive for users from across different fields. Both stationary and nonstationary components of the package are validated for a number of case studies using empirical return levels. The results show that NEVA reliably describes extremes and their return levels.
GUIDEseq: a bioconductor package to analyze GUIDE-Seq datasets for CRISPR-Cas nucleases.
Zhu, Lihua Julie; Lawrence, Michael; Gupta, Ankit; Pagès, Hervé; Kucukural, Alper; Garber, Manuel; Wolfe, Scot A
2017-05-15
Genome editing technologies developed around the CRISPR-Cas9 nuclease system have facilitated the investigation of a broad range of biological questions. These nucleases also hold tremendous promise for treating a variety of genetic disorders. In the context of their therapeutic application, it is important to identify the spectrum of genomic sequences that are cleaved by a candidate nuclease when programmed with a particular guide RNA, as well as the cleavage efficiency of these sites. Powerful new experimental approaches, such as GUIDE-seq, facilitate the sensitive, unbiased genome-wide detection of nuclease cleavage sites within the genome. Flexible bioinformatics analysis tools for processing GUIDE-seq data are needed. Here, we describe an open source, open development software suite, GUIDEseq, for GUIDE-seq data analysis and annotation as a Bioconductor package in R. The GUIDEseq package provides a flexible platform with more than 60 adjustable parameters for the analysis of datasets associated with custom nuclease applications. These parameters allow data analysis to be tailored to different nuclease platforms with different length and complexity in their guide and PAM recognition sequences or their DNA cleavage position. They also enable users to customize sequence aggregation criteria, and vary peak calling thresholds that can influence the number of potential off-target sites recovered. GUIDEseq also annotates potential off-target sites that overlap with genes based on genome annotation information, as these may be the most important off-target sites for further characterization. In addition, GUIDEseq enables the comparison and visualization of off-target site overlap between different datasets for a rapid comparison of different nuclease configurations or experimental conditions. For each identified off-target, the GUIDEseq package outputs mapped GUIDE-Seq read count as well as cleavage score from a user specified off-target cleavage score prediction algorithm permitting the identification of genomic sequences with unexpected cleavage activity. The GUIDEseq package enables analysis of GUIDE-data from various nuclease platforms for any species with a defined genomic sequence. This software package has been used successfully to analyze several GUIDE-seq datasets. The software, source code and documentation are freely available at http://www.bioconductor.org/packages/release/bioc/html/GUIDEseq.html .
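To give a sense of the call pattern, a minimal R invocation might look like the sketch below; GUIDEseqAnalysis is the package's top-level analysis function, while the file names, genome choice and output directory are placeholders and most of the adjustable parameters are left at their defaults:

    library(GUIDEseq)
    library(BSgenome.Hsapiens.UCSC.hg19)
    res <- GUIDEseqAnalysis(alignment.inputfile = c("plus.bam", "minus.bam"),
                            umi.inputfile       = c("plus_umi.txt", "minus_umi.txt"),
                            gRNA.file           = "gRNA.fa",
                            BSgenomeName        = Hsapiens,
                            outputDir           = "guideseq_output")
    head(res$offTargets)   # candidate off-target sites with read counts and cleavage scores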
MetaboAnalystR: an R package for flexible and reproducible analysis of metabolomics data.
Chong, Jasmine; Xia, Jianguo
2018-06-28
The MetaboAnalyst web application has been widely used for metabolomics data analysis and interpretation. Despite its user-friendliness, the web interface has presented its inherent limitations (especially for advanced users) with regard to flexibility in creating customized workflow, support for reproducible analysis, and capacity in dealing with large data. To address these limitations, we have developed a companion R package (MetaboAnalystR) based on the R code base of the web server. The package has been thoroughly tested to ensure that the same R commands will produce identical results from both interfaces. MetaboAnalystR complements the MetaboAnalyst web server to facilitate transparent, flexible and reproducible analysis of metabolomics data. MetaboAnalystR is freely available from https://github.com/xia-lab/MetaboAnalystR. Supplementary data are available at Bioinformatics online.
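Because the package mirrors the web workflow, an analysis script reads like the sequence of web steps; the sketch below follows that pattern with assumed argument values (the mSet-based calls are modeled on the package tutorials, and the file name is a placeholder):

    library(MetaboAnalystR)
    mSet <- InitDataObjects("conc", "stat", FALSE)           # concentration table, statistics module
    mSet <- Read.TextData(mSet, "metabolite_concentrations.csv", "rowu", "disc")
    mSet <- SanityCheckData(mSet)
    mSet <- ReplaceMin(mSet)                                 # simple missing-value handling
    mSet <- Normalization(mSet, "NULL", "LogNorm", "AutoNorm", ratio = FALSE)
    mSet <- PCA.Anal(mSet)                                   # same call the web UI would generate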
JGromacs: a Java package for analyzing protein simulations.
Münz, Márton; Biggin, Philip C
2012-01-23
In this paper, we introduce JGromacs, a Java API (Application Programming Interface) that facilitates the development of cross-platform data analysis applications for Molecular Dynamics (MD) simulations. The API supports parsing and writing file formats applied by GROMACS (GROningen MAchine for Chemical Simulations), one of the most widely used MD simulation packages. JGromacs builds on the strengths of object-oriented programming in Java by providing a multilevel object-oriented representation of simulation data to integrate and interconvert sequence, structure, and dynamics information. The easy-to-learn, easy-to-use, and easy-to-extend framework is intended to simplify and accelerate the implementation and development of complex data analysis algorithms. Furthermore, a basic analysis toolkit is included in the package. The programmer is also provided with simple tools (e.g., XML-based configuration) to create applications with a user interface resembling the command-line interface of GROMACS applications. JGromacs and detailed documentation is freely available from http://sbcb.bioch.ox.ac.uk/jgromacs under a GPLv3 license .
Applications of the Coastal Zone Color Scanner in oceanography
NASA Technical Reports Server (NTRS)
Mcclain, C. R.
1988-01-01
Research activity has continued to be focused on the applications of Coastal Zone Color Scanner (CZCS) imagery in oceanography. A number of regional studies were completed, including investigations of the temporal and spatial variability of phytoplankton populations in the South Atlantic Bight, Northwest Spain, the Weddell Sea, the Bering Sea, the Caribbean Sea, and the tropical Atlantic Ocean. In addition to the regional studies, much work was dedicated to developing ancillary global-scale meteorological and hydrographic data sets to complement the global CZCS processing products. To accomplish this, SEAPAK's image analysis capability was complemented with an interface to GEMPAK (the Severe Storm Branch's meteorological analysis software package) for the analysis and graphical display of gridded data fields. Plans are being made to develop a similar interface to SEAPAK for hydrographic data using EPIC (a hydrographic data analysis package developed by NOAA/PMEL).
NASA Astrophysics Data System (ADS)
Smirnov, A. V.; Chobenko, V. M.; Shcherbakov, O. M.; Ushakov, S. M.; Parafiynyk, V. P.; Sereda, R. M.
2017-08-01
The article summarizes the results of an analysis of operating data for turbocompressor packages at compressor stations of the natural gas transmission system of Ukraine. The basic requirements for gas turbine compressor packages used for the modernization and reconstruction of compressor stations are considered. Using the 16 MW gas turbine package GPA-C-16S/76-1,44M1 as an example, the results of pre-design studies are given, along with some technical solutions that improve the energy efficiency, reliability, and environmental performance of gas turbine compressor packages. In particular, the article deals with the matching of the performance characteristics of the centrifugal compressor (hereinafter, compressor) and the gas turbine drive to reduce fuel gas consumption; the application of energy-efficient technologies, in particular exhaust gas heat recovery units and gas-oil heat exchangers in the turbocompressor package oil system; and the reduction of carbon monoxide emissions into the atmosphere using a catalytic exhaust system. The described technical solutions can be used for the development of other types of gas turbine compressor packages.
WGCNA: an R package for weighted correlation network analysis.
Langfelder, Peter; Horvath, Steve
2008-12-29
Correlation networks are increasingly being used in bioinformatics applications. For example, weighted gene co-expression network analysis is a systems biology method for describing the correlation patterns among genes across microarray samples. Weighted correlation network analysis (WGCNA) can be used for finding clusters (modules) of highly correlated genes, for summarizing such clusters using the module eigengene or an intramodular hub gene, for relating modules to one another and to external sample traits (using eigengene network methodology), and for calculating module membership measures. Correlation networks facilitate network based gene screening methods that can be used to identify candidate biomarkers or therapeutic targets. These methods have been successfully applied in various biological contexts, e.g. cancer, mouse genetics, yeast genetics, and analysis of brain imaging data. While parts of the correlation network methodology have been described in separate publications, there is a need to provide a user-friendly, comprehensive, and consistent software implementation and an accompanying tutorial. The WGCNA R software package is a comprehensive collection of R functions for performing various aspects of weighted correlation network analysis. The package includes functions for network construction, module detection, gene selection, calculations of topological properties, data simulation, visualization, and interfacing with external software. Along with the R package we also present R software tutorials. While the methods development was motivated by gene expression data, the underlying data mining approach can be applied to a variety of different settings. The WGCNA package provides R functions for weighted correlation network analysis, e.g. co-expression network analysis of gene expression data. The R package along with its source code and additional material are freely available at http://www.genetics.ucla.edu/labs/horvath/CoexpressionNetwork/Rpackages/WGCNA.
WGCNA: an R package for weighted correlation network analysis
Langfelder, Peter; Horvath, Steve
2008-01-01
Background Correlation networks are increasingly being used in bioinformatics applications. For example, weighted gene co-expression network analysis is a systems biology method for describing the correlation patterns among genes across microarray samples. Weighted correlation network analysis (WGCNA) can be used for finding clusters (modules) of highly correlated genes, for summarizing such clusters using the module eigengene or an intramodular hub gene, for relating modules to one another and to external sample traits (using eigengene network methodology), and for calculating module membership measures. Correlation networks facilitate network based gene screening methods that can be used to identify candidate biomarkers or therapeutic targets. These methods have been successfully applied in various biological contexts, e.g. cancer, mouse genetics, yeast genetics, and analysis of brain imaging data. While parts of the correlation network methodology have been described in separate publications, there is a need to provide a user-friendly, comprehensive, and consistent software implementation and an accompanying tutorial. Results The WGCNA R software package is a comprehensive collection of R functions for performing various aspects of weighted correlation network analysis. The package includes functions for network construction, module detection, gene selection, calculations of topological properties, data simulation, visualization, and interfacing with external software. Along with the R package we also present R software tutorials. While the methods development was motivated by gene expression data, the underlying data mining approach can be applied to a variety of different settings. Conclusion The WGCNA package provides R functions for weighted correlation network analysis, e.g. co-expression network analysis of gene expression data. The R package along with its source code and additional material are freely available at http://www.genetics.ucla.edu/labs/horvath/CoexpressionNetwork/Rpackages/WGCNA. PMID:19114008
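As a rough illustration of the soft-thresholding idea described in the WGCNA abstracts above, the following Python/numpy sketch builds a weighted correlation network and cuts a hierarchical clustering tree into modules. It is a conceptual example, not the WGCNA R API: the power beta, the toy data, and the fixed module count are assumptions, and WGCNA itself additionally uses topological overlap and dynamic tree cutting.

    # Conceptual sketch (not the WGCNA R API): build a weighted correlation
    # network from an expression matrix and cut a hierarchical tree into modules.
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import squareform

    def correlation_modules(expr, beta=6, n_modules=4):
        """expr: samples x genes matrix; beta: soft-thresholding power (assumed)."""
        corr = np.corrcoef(expr, rowvar=False)          # gene-gene correlations
        adjacency = np.abs(corr) ** beta                # soft thresholding
        dissimilarity = 1.0 - adjacency                 # turn similarity into distance
        np.fill_diagonal(dissimilarity, 0.0)
        condensed = squareform(dissimilarity, checks=False)
        tree = linkage(condensed, method="average")     # hierarchical clustering
        return fcluster(tree, t=n_modules, criterion="maxclust")

    rng = np.random.default_rng(0)
    expr = rng.normal(size=(30, 200))                   # 30 samples, 200 genes (toy data)
    modules = correlation_modules(expr)
    print(np.bincount(modules))                         # genes per module

In the WGCNA package itself, the corresponding steps are provided by R functions such as adjacency() and blockwiseModules().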
NASA Astrophysics Data System (ADS)
Pollard, Thomas B
Recent advances in microbiology, computational capabilities, and microelectromechanical-system fabrication techniques permit modeling, design, and fabrication of low-cost, miniature, sensitive and selective liquid-phase sensors and lab-on-a-chip systems. Such devices are expected to replace expensive, time-consuming, and bulky laboratory-based testing equipment. Potential applications for devices include: fluid characterization for material science and industry; chemical analysis in medicine and pharmacology; study of biological processes; food analysis; chemical kinetics analysis; and environmental monitoring. When combined with liquid-phase packaging, sensors based on surface-acoustic-wave (SAW) technology are considered strong candidates. For this reason such devices are focused on in this work; emphasis placed on device modeling and packaging for liquid-phase operation. Regarding modeling, topics considered include mode excitation efficiency of transducers; mode sensitivity based on guiding structure materials/geometries; and use of new piezoelectric materials. On packaging, topics considered include package interfacing with SAW devices, and minimization of packaging effects on device performance. In this work novel numerical models are theoretically developed and implemented to study propagation and transduction characteristics of sensor designs using wave/constitutive equations, Green's functions, and boundary/finite element methods. Using developed simulation tools that consider finite-thickness of all device electrodes, transduction efficiency for SAW transducers with neighboring uniform or periodic guiding electrodes is reported for the first time. Results indicate finite electrode thickness strongly affects efficiency. Using dense electrodes, efficiency is shown to approach 92% and 100% for uniform and periodic electrode guiding, respectively; yielding improved sensor detection limits. A numerical sensitivity analysis is presented targeting viscosity using uniform-electrode and shear-horizontal mode configurations on potassium-niobate, langasite, and quartz substrates. Optimum configurations are determined yielding maximum sensitivity. Results show mode propagation-loss and sensitivity to viscosity are correlated by a factor independent of substrate material. The analysis is useful for designing devices meeting sensitivity and signal level requirements. A novel, rapid and precise microfluidic chamber alignment/bonding method was developed for SAW platforms. The package is shown to have little effect on device performance and permits simple macrofluidic interfacing. Lastly, prototypes were designed, fabricated, and tested for viscosity and biosensor applications; results show ability to detect as low as 1% glycerol in water and surface-bound DNA crosslinking.
NASA Astrophysics Data System (ADS)
Vaishali, S.; Narendranath, S.; Sreekumar, P.
An IDL (Interactive Data Language) based widget application developed for the calibration of the C1XS instrument (Narendranath et al., 2010) on Chandrayaan-1 has been modified to provide a generic package for the analysis of data from x-ray detectors. The package supports files in ASCII as well as FITS format. Data can be fitted with a list of inbuilt functions to derive the spectral redistribution function (SRF). We have incorporated functions such as `HYPERMET' (Phillips & Marlow 1976), including non-Gaussian components in the SRF such as a low-energy tail, a low-energy shelf, and an escape peak. In addition, users can incorporate additional models which may be required to model detector-specific features. Spectral fits use the routine `mpfit', which uses the Levenberg-Marquardt least-squares fitting method. The SRF derived from this tool can be fed into an accompanying program to generate a redistribution matrix file (RMF) compatible with the X-ray spectral analysis package XSPEC. The tool provides a user-friendly interface helpful to beginners and also provides transparency and advanced features for experts.
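The fitting step described above is generic enough to sketch outside IDL. The Python example below is an illustration only, not the C1XS tool: the peak shape is a simplified HYPERMET-style Gaussian plus low-energy tail, and the energies, amplitudes, and data are synthetic. It fits the model with Levenberg-Marquardt least squares via scipy.optimize.curve_fit.

    # Minimal sketch of a HYPERMET-like fit (Gaussian photopeak + low-energy tail)
    # using Levenberg-Marquardt least squares; synthetic data, illustrative only.
    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.special import erfc

    def hypermet_like(e, amp, mu, sigma, tail_amp, tail_slope):
        gauss = amp * np.exp(-0.5 * ((e - mu) / sigma) ** 2)
        # exponential tail on the low-energy side, smoothed by the detector resolution
        tail = tail_amp * np.exp((e - mu) / tail_slope) * erfc(
            (e - mu) / (np.sqrt(2) * sigma) + sigma / (np.sqrt(2) * tail_slope))
        return gauss + tail

    energy = np.linspace(5.0, 7.0, 400)                        # keV axis (synthetic)
    true = hypermet_like(energy, 1000, 5.9, 0.06, 80, 0.15)
    counts = np.random.default_rng(1).poisson(true).astype(float)

    p0 = [900, 5.9, 0.05, 50, 0.1]                             # initial guesses
    popt, pcov = curve_fit(hypermet_like, energy, counts, p0=p0)   # LM by default
    print("fitted centroid, FWHM:", popt[1], 2.3548 * abs(popt[2]))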
Review and analysis of dense linear system solver package for distributed memory machines
NASA Technical Reports Server (NTRS)
Narang, H. N.
1993-01-01
A dense linear system solver package recently developed at the University of Texas at Austin for distributed memory machines (e.g., the Intel Paragon) has been reviewed and analyzed. The package contains about 45 software routines, some written in FORTRAN and some in C, and forms the basis for parallel/distributed solutions of systems of linear equations encountered in many problems of a scientific and engineering nature. The package, being studied by the Computer Applications Branch of the Analysis and Computation Division, may provide a significant computational resource for NASA scientists and engineers in parallel/distributed computing. Since the package is new and not well tested or documented, many of its underlying concepts and implementations were unclear; our task was to review, analyze, and critique the package as a step in the process that will enable scientists and engineers to apply it to the solution of their problems. All routines in the package were reviewed and analyzed. Underlying theory or concepts that exist in the form of published papers, technical reports, or memos were either obtained from the author or from the scientific literature; and general algorithms, explanations, examples, and critiques have been provided to explain the workings of these programs. Wherever things were still unclear, communications were made with the developer (author), either by telephone or by electronic mail, to understand the workings of the routines. Whenever possible, tests were made to verify the concepts and logic employed in their implementations. A detailed report is being separately documented to explain the workings of these routines.
Keuffel, Eric; Jaskiewicz, Wanda; Paphassarang, Chanthakhath; Tulenko, Kate
2013-11-01
Many developing countries are examining whether to institute incentive packages that increase the share of health workers who opt to locate in rural settings; however, uncertainty exists with respect to the expected net cost (or benefit) from these packages. We utilize the findings from the discrete choice experiment surveys applied to students training to be health professionals and costing analyses in Lao People's Democratic Republic to model the anticipated effect of incentive packages on new worker location decisions and direct costs. Incorporating evidence on health worker density and health outcomes, we then estimate the expected 5-year net cost (or benefit) of each incentive package for 3 health worker cadres--physicians, nurses/midwives, and medical assistants. Under base case assumptions, the optimal incentive package for each cadre produced a 5-year net benefit (maximum net benefit for physicians: US$ 44,000; nurses/midwives: US$ 5.6 million; medical assistants: US$ 485,000). After accounting for health effects, the expected net cost of select incentive packages would be substantially less than the original estimate of direct costs. In the case of Lao People's Democratic Republic, incentive packages that do not invest in capital-intensive components generally should produce larger net benefits. Combining discrete choice experiment surveys, costing surveys and cost-benefit analysis methods may be replicated by other developing countries to calculate whether health worker incentive packages are viable policy options.
NASA Technical Reports Server (NTRS)
1997-01-01
DARcorporation developed a General Aviation CAD package through a Small Business Innovation Research contract from Langley Research Center. This affordable, user-friendly preliminary design system for General Aviation aircraft runs on the popular 486 IBM-compatible personal computers. Individuals taking the home-built approach, small manufacturers of General Aviation airplanes, as well as students and others interested in the analysis and design of aircraft are possible users of the package. The software can cut design and development time in half.
Analysis pipelines and packages for Infinium HumanMethylation450 BeadChip (450k) data.
Morris, Tiffany J; Beck, Stephan
2015-01-15
The Illumina HumanMethylation450 BeadChip has become a popular platform for interrogating DNA methylation in epigenome-wide association studies (EWAS) and related projects as well as resource efforts such as the International Cancer Genome Consortium (ICGC) and the International Human Epigenome Consortium (IHEC). This has resulted in an exponential increase of 450k data in recent years and triggered the development of numerous integrated analysis pipelines and stand-alone packages. This review will introduce and discuss the currently most popular pipelines and packages and is particularly aimed at new 450k users. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
PCIPS 2.0: Powerful multiprofile image processing implemented on PCs
NASA Technical Reports Server (NTRS)
Smirnov, O. M.; Piskunov, N. E.
1992-01-01
Over the years, the processing power of personal computers has steadily increased. Now, 386- and 486-based PCs are fast enough for many image processing applications, and inexpensive enough even for amateur astronomers. PCIPS is an image processing system based on these platforms that was designed to satisfy a broad range of data analysis needs, while requiring minimum hardware and providing maximum expandability. It will run (albeit at a slow pace) even on an 80286 with 640K memory, but will take full advantage of bigger memory and faster CPUs. Because the actual image processing is performed by external modules, the system can be easily upgraded by the user for all sorts of scientific data analysis. PCIPS supports large-format 1D and 2D images in any numeric type from 8-bit integer to 64-bit floating point. The images can be displayed, overlaid, printed and any part of the data examined via an intuitive graphical user interface that employs buttons, pop-up menus, and a mouse. PCIPS automatically converts images between different types and sizes to satisfy the requirements of various applications. PCIPS features an API that lets users develop custom applications in C or FORTRAN. While doing so, a programmer can concentrate on the actual data processing, because PCIPS assumes responsibility for accessing images and interacting with the user. This also ensures that all applications, even custom ones, have a consistent and user-friendly interface. The API is compatible with factory programming, a metaphor for constructing image processing procedures that will be implemented in future versions of the system. Several application packages were created under PCIPS. The basic package includes elementary arithmetic and statistics, geometric transformations and import/export in various formats (FITS, binary, ASCII, and GIF). The CCD processing package and the spectral analysis package were successfully used to reduce spectra from the Nordic Telescope at La Palma. A photometry package is also available, and other packages are being developed. A multitasking version of PCIPS that utilizes the factory programming concept is currently under development. This version will remain compatible (on the source code level) with existing application packages and custom applications.
Cantera Integration with the Toolbox for Modeling and Analysis of Thermodynamic Systems (T-MATS)
NASA Technical Reports Server (NTRS)
Lavelle, Thomas M.; Chapman, Jeffryes W.; May, Ryan D.; Litt, Jonathan S.; Guo, Ten-Huei
2014-01-01
NASA Glenn Research Center (GRC) has recently developed a software package for modeling generic thermodynamic systems called the Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS). T-MATS is a library of building blocks that can be assembled to represent any thermodynamic system in the Simulink® (The MathWorks, Inc.) environment. These elements, along with a Newton-Raphson solver (also provided as part of the T-MATS package), enable users to create models of a wide variety of systems. The current version of T-MATS (v1.0.1) uses tabular data for providing information about a specific mixture of air, water (humidity), and hydrocarbon fuel in calculations of thermodynamic properties. The capabilities of T-MATS can be expanded by integrating it with the Cantera thermodynamic package. Cantera is an object-oriented analysis package that calculates thermodynamic solutions for any mixture defined by the user. Integration of Cantera with T-MATS extends the range of systems that may be modeled using the toolbox. In addition, the library of elements released with Cantera was developed using native MATLAB M-files, allowing for quicker prototyping of elements. This paper discusses how the new Cantera-based elements are created and provides examples for using T-MATS integrated with Cantera.
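T-MATS itself is a MATLAB/Simulink library, but the kind of mixture calculation it delegates to Cantera can be sketched with Cantera's Python interface. The mechanism file, state, and composition below are illustrative assumptions, not values from the paper.

    # Sketch of the kind of mixture calculation Cantera performs (Python interface;
    # T-MATS calls Cantera from MATLAB/Simulink). Mechanism and composition are
    # illustrative assumptions.
    import cantera as ct

    gas = ct.Solution("gri30.yaml")                   # bundled natural-gas mechanism
    gas.TPX = 800.0, 5 * ct.one_atm, "CH4:1, O2:2, N2:7.52"   # T [K], P [Pa], mole fractions

    print("h  [J/kg]  :", gas.enthalpy_mass)
    print("cp [J/kg/K]:", gas.cp_mass)
    print("gamma      :", gas.cp_mass / gas.cv_mass)

    gas.equilibrate("HP")                             # burn at constant enthalpy and pressure
    print("adiabatic flame temperature [K]:", gas.T)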
Design and analysis study of a spacecraft optical transceiver package
NASA Technical Reports Server (NTRS)
Lambert, S. G.
1985-01-01
A detailed system level design of an Optical Transceiver Package (OPTRANSPAC) for a deep space vehicle whose mission is outer planet exploration is developed. In addition to the terminal design, this study provides estimates of the dynamic environments to be encountered by the transceiver throughout its mission life. Optical communication link analysis, optical thin lens design, electronic functional design and mechanical layout and packaging are employed in the terminal design. Results of the study describe an Optical Transceiver Package capable of communicating to an Earth Orbiting Relay Station at a distance of 10 Astronomical Units (AU) and data rates up to 100 KBPS. The transceiver is also capable of receiving 1 KBPS of command data from the Earth Relay. The physical dimensions of the terminal are contained within a 3.5' x 1.5' x 2.0' envelope and the transceiver weight and power are estimated at 52.2 Kg (115 pounds) and 57 watts, respectively.
2012-01-01
Background Measuring gene transcription using real-time reverse transcription polymerase chain reaction (RT-qPCR) technology is a mainstay of molecular biology. Technologies now exist to measure the abundance of many transcripts in parallel. The selection of the optimal reference gene for the normalisation of this data is a recurring problem, and several algorithms have been developed in order to solve it. So far nothing in R exists to unite these methods, together with other functions to read in and normalise the data using the chosen reference gene(s). Results We have developed two R/Bioconductor packages, ReadqPCR and NormqPCR, intended for a user with some experience with high-throughput data analysis using R, who wishes to use R to analyse RT-qPCR data. We illustrate their potential use in a workflow analysing a generic RT-qPCR experiment, and apply this to a real dataset. Packages are available from http://www.bioconductor.org/packages/release/bioc/html/ReadqPCR.html and http://www.bioconductor.org/packages/release/bioc/html/NormqPCR.html Conclusions These packages increase the repertoire of RT-qPCR analysis tools available to the R user and allow them to (amongst other things) read their data into R, hold it in an ExpressionSet compatible R object, choose appropriate reference genes, normalise the data and look for differential expression between samples. PMID:22748112
NASA Astrophysics Data System (ADS)
Rogiers, Bart
2015-04-01
In recent years, an increasing number of contributed R packages has become available in the field of hydrology. Hydrological time series analysis packages, lumped conceptual rainfall-runoff models, distributed hydrological models, weather generators, and different calibration and uncertainty estimation methods are all available. Also a few packages are available for solving partial differential equations. Subsurface hydrological modelling is, however, still seldom performed in R, or with codes interfaced with R, despite the fact that excellent geostatistical packages, model calibration/inversion options and state-of-the-art visualization libraries are available. Moreover, other popular scientific programming languages like MATLAB and Python have packages for pre- and post-processing files of MODFLOW (Harbaugh 2005) and MT3DMS (Zheng 2010) models. To fill this gap, we present here the development versions of the RMODFLOW and RMT3DMS packages, which allow pre- and post-processing of MODFLOW and MT3DMS input and output files from within R. File reading and writing functions are currently available for different packages, and plotting functions are foreseen that make use of the ggplot2 package (a plotting system based on the grammar of graphics; Wickham 2009). The S3 generic-function object-oriented programming style is used for this. An example is provided in which modifications are made to an existing model and the model output is visualized. References Harbaugh, A. (2005). MODFLOW-2005: The US Geological Survey Modular Ground-water Model--the Ground-water Flow Process, U.S. Geological Survey Techniques and Methods 6-A16 (p. 253). Wickham, H. (2009). ggplot2: elegant graphics for data analysis. Springer New York, 2009. Zheng, C. (2010). MT3DMS v5.3, a modular three-dimensional multispecies transport model for simulation of advection, dispersion and chemical reactions of contaminants in groundwater systems. Supplemental User's Guide. (p. 56).
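As a point of comparison for the Python tooling mentioned in the abstract, a minimal sketch with the widely used FloPy package is shown below. FloPy is assumed to be installed; the model name, grid, and parameter values are invented, and actually running the model also requires a MODFLOW executable such as mf2005 on the path.

    # Minimal FloPy sketch of MODFLOW pre-processing from Python (the kind of
    # functionality RMODFLOW targets for R). Names, grid, and parameters are
    # illustrative; running the model requires a MODFLOW executable.
    import flopy

    m = flopy.modflow.Modflow(modelname="demo", exe_name="mf2005")
    dis = flopy.modflow.ModflowDis(m, nlay=1, nrow=10, ncol=10,
                                   delr=100.0, delc=100.0, top=10.0, botm=0.0)
    bas = flopy.modflow.ModflowBas(m, ibound=1, strt=10.0)
    lpf = flopy.modflow.ModflowLpf(m, hk=5.0)          # hydraulic conductivity
    pcg = flopy.modflow.ModflowPcg(m)                  # solver
    oc = flopy.modflow.ModflowOc(m)                    # output control

    m.write_input()                                    # writes the MODFLOW input files
    # success, output = m.run_model()                  # uncomment if mf2005 is installed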
Paintdakhi, Ahmad; Parry, Bradley; Campos, Manuel; Irnov, Irnov; Elf, Johan; Surovtsev, Ivan; Jacobs-Wagner, Christine
2016-01-01
Summary With the realization that bacteria display phenotypic variability among cells and exhibit complex subcellular organization critical for cellular function and behavior, microscopy has re-emerged as a primary tool in bacterial research during the last decade. However, the bottleneck in today’s single-cell studies is quantitative image analysis of cells and fluorescent signals. Here, we address current limitations through the development of Oufti, a stand-alone, open-source software package for automated measurements of microbial cells and fluorescence signals from microscopy images. Oufti provides computational solutions for tracking touching cells in confluent samples, handles various cell morphologies, offers algorithms for quantitative analysis of both diffraction and non-diffraction-limited fluorescence signals, and is scalable for high-throughput analysis of massive datasets, all with subpixel precision. All functionalities are integrated in a single package. The graphical user interface, which includes interactive modules for segmentation, image analysis, and post-processing analysis, makes the software broadly accessible to users irrespective of their computational skills. PMID:26538279
NASA Astrophysics Data System (ADS)
Pagliarone, C. E.; Uttaro, S.; Cappelli, L.; Fallone, M.; Kartal, S.
2017-02-01
CAT (Cryogenic Analysis Tools) is a software package developed using the LabVIEW and ROOT environments to analyze the performance of large-size cryostats, where many parameters, inputs, and control variables need to be acquired and studied at the same time. The present paper describes how CAT works and the main improvements achieved in the new version, CAT 2. New graphical user interfaces have been developed to make the use of the full package more user-friendly, and a process of resource optimization has been carried out. The offline analysis of the full cryostat performance is available both through the ROOT command-line interface and by using the new graphical interfaces.
Anker, Thomas Boysen
2016-01-01
This article analyses the paternalistic justification of the world’s first mandatory tobacco plain packaging policy, which came into force in Australia in 2012. The policy is setting an international precedent, with a range of developed and developing countries planning and implementing similar policies. Understanding the paternalistic dimension of the policy is therefore of imminent international importance. The policy meets important ethical benchmarks such as respect for citizens’ self-interests and protection of others against harm. However, plain packaging faces a number of ethical challenges: the policy is a controversial type of paternalism; it runs partially against the harm principle; and it fails to meet key operational criteria. PMID:27551306
Oostenveld, Robert; Fries, Pascal; Maris, Eric; Schoffelen, Jan-Mathijs
2011-01-01
This paper describes FieldTrip, an open source software package that we developed for the analysis of MEG, EEG, and other electrophysiological data. The software is implemented as a MATLAB toolbox and includes a complete set of consistent and user-friendly high-level functions that allow experimental neuroscientists to analyze experimental data. It includes algorithms for simple and advanced analysis, such as time-frequency analysis using multitapers, source reconstruction using dipoles, distributed sources and beamformers, connectivity analysis, and nonparametric statistical permutation tests at the channel and source level. The implementation as toolbox allows the user to perform elaborate and structured analyses of large data sets using the MATLAB command line and batch scripting. Furthermore, users and developers can easily extend the functionality and implement new algorithms. The modular design facilitates the reuse in other software packages.
Analysis of microgravity space experiments Space Shuttle programmatic safety requirements
NASA Technical Reports Server (NTRS)
Terlep, Judith A.
1996-01-01
This report documents the results of an analysis of Space Shuttle programmatic safety requirements for microgravity space experiments and recommends the creation of a Safety Compliance Data Package (SCDP) template for both flight and ground processes. These templates detail the programmatic requirements necessary to produce a complete SCDP. The templates were developed from various NASA centers' requirements documents, from previously written guidelines on safety data packages, and from personal experience. The templates are included at the back of this report.
Kiryukhin, Maxim V; Lau, Hooi Hong; Goh, Seok Hong; Teh, Cathleen; Korzh, Vladimir; Sadovoy, Anton
2018-05-15
A new Membrane Film Sensor (MFS) has been developed to measure the pH of fluids. The MFS comprises a polyelectrolyte multilayer film with uniformly distributed compartments (microchambers) in which a fluorescent sensing dye is encapsulated. The fabricated film is sealed onto a polyethylene film for future use. The MFS was applied to report changes in golden pomfret fillet during storage at 5 °C. MFS pH readings were correlated with bacteriological analysis of the fish samples. A rise in the pH of the fish juices occurs after 10 days of storage, signaling bacterial spoilage of the fish. The design of the developed MFS allows easy integration with transparent packaging materials for the future development of "SMART" packaging sensing food freshness. Copyright © 2018 Elsevier B.V. All rights reserved.
Mezouari, S; Liu, W Yun; Pace, G; Hartman, T G
2015-01-01
The objective of this study was to develop an improved analytical method for the determination of 3-chloro-1,2-propanediol (3-MCPD) and 1,3-dichloropropanol (1,3-DCP) in paper-type food packaging. The established method includes aqueous extraction, matrix spiking of a deuterated surrogate internal standard (3-MCPD-d₅), clean-up using Extrelut solid-phase extraction, derivatisation using a silylation reagent, and GC-MS analysis of the chloropropanols as their corresponding trimethylsilyl ethers. The new method is applicable to food-grade packaging samples using European Commission standard aqueous extraction and aqueous food simulant migration tests. In this improved method, the derivatisation procedure was optimised; the cost and time of the analysis were reduced by using 10 times less sample, solvents and reagents than in previously described methods. Overall, the validation data demonstrate that the method is precise and reliable. The limit of detection (LOD) of the aqueous extract was 0.010 mg kg(-1) (w/w) for both 3-MCPD and 1,3-DCP. Analytical precision had a relative standard deviation (RSD) of 3.36% for 3-MCPD and an RSD of 7.65% for 1,3-DCP. The new method was satisfactorily applied to the analysis of over 100 commercial paperboard packaging samples. The data are being used to guide the product development of a next generation of wet-strength resins with reduced chloropropanol content, and also for risk assessments to calculate the virtual safe dose (VSD).
[Analysis of phthalates in plastic food-packaging bags by thin layer chromatography].
Chen, Hui; Wang, Yuan; Zhu, Ruohua
2006-01-01
A method for the simultaneous determination of four phthalates, namely dimethyl phthalate (DMP), diethyl phthalate (DEP), di-n-butyl phthalate (DBP) and di(2-ethylhexyl) phthalate (DEHP), in plastic food-packaging bags by thin layer chromatography (TLC) was developed. The plastic food-packaging bags were extracted with ethanol by ultrasonication, and the mixture was then filtered through a membrane (0.45 microm). A mixture of ethyl acetate-anhydrous ether-isooctane (1:4:15, v/v) was used as the developing agent on the TLC silica gel plate. The filtrate was spotted on the TLC plate treated with acetone, and detection was performed at a scanning wavelength of 275 nm and a reference wavelength of 340 nm. The qualitative analysis of the phthalates was performed using the R(f) values of the chromatogram. The quantitative analysis was performed with the external standard method. Good linearities were obtained for DMP, DEP, DBP and DEHP. The detection limits were 2.1 ng for DMP, 2.4 ng for DEP, 3.4 ng for DBP and 4.0 ng for DEHP. The relative standard deviations (RSDs) of the four phthalates were 2.8% - 3.5%. The recoveries of the four phthalate standards in a real sample were 78.58% - 111.04%. The method presented has the advantages of high precision, high sensitivity, small sample size, and simple pretreatment. The method was used to detect the four phthalates in the food-packaging bags. The contents found in real samples were close to the results obtained by gas chromatography.
Applied Cognitive Task Analysis (ACTA) Methodology
1997-11-01
experience-based cognitive skills. The primary goal of this project was to develop streamlined methods of Cognitive Task Analysis that would fill this need... We have made important progress in this direction. We have developed streamlined methods of Cognitive Task Analysis. Our evaluation study indicates... developed a CD-based stand-alone instructional package, which will make the Applied Cognitive Task Analysis (ACTA) tools widely accessible. A survey of the
Computational methods for evaluation of cell-based data assessment--Bioconductor.
Le Meur, Nolwenn
2013-02-01
Recent advances in miniaturization and automation of technologies have enabled high-throughput screening of cell-based assays, bringing along new challenges in data analysis. Automation, standardization, and reproducibility have become requirements for qualitative research. The Bioconductor community has worked in that direction, proposing several R packages to handle high-throughput data, including flow cytometry (FCM) experiments. Altogether, these packages cover the main steps of an FCM analysis workflow, that is, data management, quality assessment, normalization, outlier detection, automated gating, cluster labeling, and feature extraction. Additionally, the open-source philosophy of R and Bioconductor, which offers room for new development, continuously drives research and improvement of these analysis methods, especially in the field of clustering and data mining. This review presents the principal FCM packages currently available in R and Bioconductor, their advantages and their limits. Copyright © 2012 Elsevier Ltd. All rights reserved.
González-Beltrán, Alejandra; Neumann, Steffen; Maguire, Eamonn; Sansone, Susanna-Assunta; Rocca-Serra, Philippe
2014-01-01
The ISA-Tab format and software suite have been developed to break the silo effect induced by technology-specific formats for a variety of data types and to better support experimental metadata tracking. Experimentalists seldom use a single technique to monitor biological signals. Providing a multi-purpose, pragmatic and accessible format that abstracts away common constructs for describing Investigations, Studies and Assays, ISA is increasingly popular. To attract further interest towards the format and extend support to ensure reproducible research and reusable data, we present the Risa package, which delivers a central component to support the ISA format by enabling effortless integration with R, the popular, open source data crunching environment. The Risa package bridges the gap between the metadata collection and curation in an ISA-compliant way and the data analysis using the widely used statistical computing environment R. The package offers functionality for: i) parsing ISA-Tab datasets into R objects, ii) augmenting annotation with extra metadata not explicitly stated in the ISA syntax; iii) interfacing with domain specific R packages iv) suggesting potentially useful R packages available in Bioconductor for subsequent processing of the experimental data described in the ISA format; and finally v) saving back to ISA-Tab files augmented with analysis specific metadata from R. We demonstrate these features by presenting use cases for mass spectrometry data and DNA microarray data. The Risa package is open source (with LGPL license) and freely available through Bioconductor. By making Risa available, we aim to facilitate the task of processing experimental data, encouraging a uniform representation of experimental information and results while delivering tools for ensuring traceability and provenance tracking. The Risa package is available since Bioconductor 2.11 (version 1.0.0) and version 1.2.1 appeared in Bioconductor 2.12, both along with documentation and examples. The latest version of the code is at the development branch in Bioconductor and can also be accessed from GitHub https://github.com/ISA-tools/Risa, where the issue tracker allows users to report bugs or feature requests.
The Risa R/Bioconductor package: integrative data analysis from experimental metadata and back again
2014-01-01
Background The ISA-Tab format and software suite have been developed to break the silo effect induced by technology-specific formats for a variety of data types and to better support experimental metadata tracking. Experimentalists seldom use a single technique to monitor biological signals. Providing a multi-purpose, pragmatic and accessible format that abstracts away common constructs for describing Investigations, Studies and Assays, ISA is increasingly popular. To attract further interest towards the format and extend support to ensure reproducible research and reusable data, we present the Risa package, which delivers a central component to support the ISA format by enabling effortless integration with R, the popular, open source data crunching environment. Results The Risa package bridges the gap between the metadata collection and curation in an ISA-compliant way and the data analysis using the widely used statistical computing environment R. The package offers functionality for: i) parsing ISA-Tab datasets into R objects, ii) augmenting annotation with extra metadata not explicitly stated in the ISA syntax; iii) interfacing with domain specific R packages iv) suggesting potentially useful R packages available in Bioconductor for subsequent processing of the experimental data described in the ISA format; and finally v) saving back to ISA-Tab files augmented with analysis specific metadata from R. We demonstrate these features by presenting use cases for mass spectrometry data and DNA microarray data. Conclusions The Risa package is open source (with LGPL license) and freely available through Bioconductor. By making Risa available, we aim to facilitate the task of processing experimental data, encouraging a uniform representation of experimental information and results while delivering tools for ensuring traceability and provenance tracking. Software availability The Risa package is available since Bioconductor 2.11 (version 1.0.0) and version 1.2.1 appeared in Bioconductor 2.12, both along with documentation and examples. The latest version of the code is at the development branch in Bioconductor and can also be accessed from GitHub https://github.com/ISA-tools/Risa, where the issue tracker allows users to report bugs or feature requests. PMID:24564732
Bodzon-Kulakowska, Anna; Marszalek-Grabska, Marta; Antolak, Anna; Drabik, Anna; Kotlinska, Jolanta H; Suder, Piotr
Data analysis for mass spectrometry imaging (MSI) experiments is a very complex task. Most of the software packages devoted to this purpose are designed by the mass spectrometer manufacturers and, thus, are not freely available. Laboratories developing their own MS-imaging sources usually do not have access to the commercial software, and they must rely on the freely available programs. The most recognized ones are BioMap, developed by Novartis under Interactive Data Language (IDL), and Datacube, developed by the Dutch Foundation for Fundamental Research of Matter (FOM-Amolf). These two systems were used here to analyze images obtained from rat brain tissues subjected to morphine exposure, and their capabilities were compared in terms of ease of use and the quality of the results obtained.
Ostrovnaya, Irina; Seshan, Venkatraman E; Olshen, Adam B; Begg, Colin B
2011-06-15
If a cancer patient develops multiple tumors, it is sometimes impossible to determine whether these tumors are independent or clonal based solely on pathological characteristics. Investigators have studied how to address this diagnostic challenge by comparing the presence of loss of heterozygosity (LOH) at selected genetic locations of tumor samples, or by comparing genomewide copy number array profiles. We have previously developed statistical methodology to compare such genomic profiles for evidence of clonality. We assembled the software for these tests in a new R package called 'Clonality'. For LOH profiles, the package contains significance tests. The analysis of copy number profiles includes a likelihood ratio statistic and reference distribution, as well as an option to produce various plots that summarize the results. Bioconductor (http://bioconductor.org/packages/release/bioc/html/Clonality.html) and http://www.mskcc.org/mskcc/html/13287.cfm.
MDTraj: A Modern Open Library for the Analysis of Molecular Dynamics Trajectories.
McGibbon, Robert T; Beauchamp, Kyle A; Harrigan, Matthew P; Klein, Christoph; Swails, Jason M; Hernández, Carlos X; Schwantes, Christian R; Wang, Lee-Ping; Lane, Thomas J; Pande, Vijay S
2015-10-20
As molecular dynamics (MD) simulations continue to evolve into powerful computational tools for studying complex biomolecular systems, the necessity of flexible and easy-to-use software tools for the analysis of these simulations is growing. We have developed MDTraj, a modern, lightweight, and fast software package for analyzing MD simulations. MDTraj reads and writes trajectory data in a wide variety of commonly used formats. It provides a large number of trajectory analysis capabilities including minimal root-mean-square-deviation calculations, secondary structure assignment, and the extraction of common order parameters. The package has a strong focus on interoperability with the wider scientific Python ecosystem, bridging the gap between MD data and the rapidly growing collection of industry-standard statistical analysis and visualization tools in Python. MDTraj is a powerful and user-friendly software package that simplifies the analysis of MD data and connects these datasets with the modern interactive data science software ecosystem in Python. Copyright © 2015 Biophysical Society. Published by Elsevier Inc. All rights reserved.
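A minimal sketch of the kind of analysis the MDTraj abstract describes is shown below; the file names are placeholders, and any trajectory/topology pair in a supported format would do.

    # Sketch of typical MDTraj usage; file names are placeholders.
    import mdtraj as md

    traj = md.load("trajectory.xtc", top="topology.pdb")   # any supported format
    traj.superpose(traj, frame=0)                          # align frames to frame 0

    rmsd = md.rmsd(traj, traj, frame=0)                    # RMSD to the first frame, in nm
    dssp = md.compute_dssp(traj)                           # secondary structure per residue
    rg = md.compute_rg(traj)                               # radius of gyration per frame

    print(traj)                                            # frames, atoms, residues
    print("mean RMSD (nm):", rmsd.mean())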
SNPassoc: an R package to perform whole genome association studies.
González, Juan R; Armengol, Lluís; Solé, Xavier; Guinó, Elisabet; Mercader, Josep M; Estivill, Xavier; Moreno, Víctor
2007-03-01
The popularization of large-scale genotyping projects has led to the widespread adoption of genetic association studies as the tool of choice in the search for single nucleotide polymorphisms (SNPs) underlying susceptibility to complex diseases. Although the analysis of individual SNPs is a relatively trivial task, when the number is large and multiple genetic models need to be explored, a tool to automate the analyses becomes necessary. In order to address this issue, we developed SNPassoc, an R package to carry out the most common analyses in whole genome association studies. These analyses include descriptive statistics and exploratory analysis of missing values, calculation of Hardy-Weinberg equilibrium, analysis of association based on generalized linear models (either for quantitative or binary traits), and analysis of multiple SNPs (haplotype and epistasis analysis). The SNPassoc package is available at CRAN from http://cran.r-project.org. A tutorial is available on Bioinformatics online and at http://davinci.crg.es/estivill_lab/snpassoc.
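SNPassoc is an R package, but one of the per-SNP computations it lists, the test of Hardy-Weinberg equilibrium, can be illustrated with a short, self-contained Python sketch. The genotype counts below are made up; this is a conceptual example, not the SNPassoc API.

    # Conceptual sketch (not the SNPassoc R API): chi-square test of
    # Hardy-Weinberg equilibrium from genotype counts at one SNP.
    from scipy.stats import chi2

    def hwe_chi2(n_aa, n_ab, n_bb):
        n = n_aa + n_ab + n_bb
        p = (2 * n_aa + n_ab) / (2.0 * n)            # allele A frequency
        expected = [n * p**2, 2 * n * p * (1 - p), n * (1 - p)**2]
        observed = [n_aa, n_ab, n_bb]
        stat = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
        return stat, chi2.sf(stat, df=1)             # 1 df: 3 classes - 1 - 1 estimated frequency

    stat, pval = hwe_chi2(500, 360, 140)             # made-up genotype counts
    print(f"chi2 = {stat:.2f}, p = {pval:.3g}")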
MDTraj: A Modern Open Library for the Analysis of Molecular Dynamics Trajectories
McGibbon, Robert T.; Beauchamp, Kyle A.; Harrigan, Matthew P.; Klein, Christoph; Swails, Jason M.; Hernández, Carlos X.; Schwantes, Christian R.; Wang, Lee-Ping; Lane, Thomas J.; Pande, Vijay S.
2015-01-01
As molecular dynamics (MD) simulations continue to evolve into powerful computational tools for studying complex biomolecular systems, the necessity of flexible and easy-to-use software tools for the analysis of these simulations is growing. We have developed MDTraj, a modern, lightweight, and fast software package for analyzing MD simulations. MDTraj reads and writes trajectory data in a wide variety of commonly used formats. It provides a large number of trajectory analysis capabilities including minimal root-mean-square-deviation calculations, secondary structure assignment, and the extraction of common order parameters. The package has a strong focus on interoperability with the wider scientific Python ecosystem, bridging the gap between MD data and the rapidly growing collection of industry-standard statistical analysis and visualization tools in Python. MDTraj is a powerful and user-friendly software package that simplifies the analysis of MD data and connects these datasets with the modern interactive data science software ecosystem in Python. PMID:26488642
Castro-López, María del Mar; López-Vilariño, José Manuel; González-Rodríguez, María Victoria
2014-05-01
Several HPLC and UHPLC methods were developed and compared to analyse the natural antioxidants catechins and quercetin used in active packaging and functional foods. A photodiode array detector coupled with a fluorescence detector was used and compared with LTQ-Orbitrap-MS. UHPLC was investigated as a quick alternative without compromising the separation, shortening analysis time up to 6-fold. The feasibility of the four developed methods was compared. Linearity up to 0.9995, low detection limits (between 0.02 and 0.7 for HPLC-PDA, 2- to 7-fold lower for HPLC-LTQ-Orbitrap-MS and from 0.2 to 2 mg L(-1) for UHPLC-PDA) and good precision parameters (RSD lower than 0.06%) were obtained. All methods were successfully applied to natural samples. LTQ-Orbitrap-MS also allowed the identification of other analytes of interest. Good feasibility of the methods was also concluded from the analysis of catechin and quercetin release from new active packaging materials based on polypropylene with added catechins and green tea. Copyright © 2013 Elsevier Ltd. All rights reserved.
Digital PIV (DPIV) Software Analysis System
NASA Technical Reports Server (NTRS)
Blackshire, James L.
1997-01-01
A software package was developed to provide a Digital PIV (DPIV) capability for NASA LaRC. The system provides an automated image capture, test correlation, and autocorrelation analysis capability for the Kodak Megaplus 1.4 digital camera system for PIV measurements. The package includes three separate programs that, when used together with the PIV data validation algorithm, constitute a complete DPIV analysis capability. The programs are run on an IBM PC/AT host computer running either Microsoft Windows 3.1 or Windows 95 using a 'quickwin' format that allows simple user interface and output capabilities to the Windows environment.
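The correlation analysis at the core of such a DPIV package can be sketched generically. This is not the NASA LaRC code; the window size, synthetic images, and simple integer peak search below are illustrative.

    # Generic sketch of the core PIV operation: FFT-based cross-correlation of two
    # interrogation windows; the correlation peak gives the particle displacement.
    import numpy as np

    def displacement(window_a, window_b):
        a = window_a - window_a.mean()
        b = window_b - window_b.mean()
        corr = np.fft.irfft2(np.fft.rfft2(a) * np.conj(np.fft.rfft2(b)), s=a.shape)
        corr = np.fft.fftshift(corr)
        peak = np.unravel_index(np.argmax(corr), corr.shape)
        center = np.array(corr.shape) // 2
        return center - np.array(peak)                     # (dy, dx) in pixels

    rng = np.random.default_rng(2)
    frame1 = rng.random((32, 32))
    frame2 = np.roll(frame1, shift=(3, -2), axis=(0, 1))   # shift particles by (3, -2)
    print(displacement(frame1, frame2))                    # expect approximately [3, -2]

A production code would refine the integer peak to sub-pixel precision and validate vectors against their neighbours, which is the role of the PIV data validation algorithm mentioned above.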
Political Analysis through the Prince System. Learning Packages in the Policy Sciences, PS-23.
ERIC Educational Resources Information Center
Coplin, William D.; O'Leary, Michael K.
This package introduces college students to the elements of the Prince System, a widely used system for making political forecasts and developing political strategies. Designed to be completed in two to three weeks, the two exercises enable students to (1) identify political issues that the Prince System can help them understand, (2) determine the…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rakowski, Cynthia L.; Guensch, Gregory R.; Patton, Gregory W.
Beginning in fiscal year 2003, the DOE Richland Operations Office initiated activities, including the development of data packages, to support the 2004 Composite Analysis. The river data package provides calculations of flow and transport in the Columbia River system. This document presents the data assembled to run the river module components for the section of the Columbia River from Vernita Bridge to the confluence with the Yakima River.
NASA Technical Reports Server (NTRS)
Ghaffarian, Reza; Evans, John W.
2014-01-01
For five decades, the semiconductor industry has distinguished itself by the rapid pace of improvement in the miniaturization of electronics products, known as Moore's Law. Now, scaling is hitting a brick wall, forcing a paradigm shift. The industry roadmaps recognize this scaling limitation and project that packaging technologies will meet further miniaturization needs, a trend known as "More than Moore". This paper presents packaging technology trends and accelerated reliability testing methods currently being practiced. Then, it presents industry status on key advanced electronic packages, factors affecting accelerated solder joint reliability of area array packages, and IPC/JEDEC/Mil specifications for characterization of assemblies under accelerated thermal and mechanical loading. Finally, it presents an example demonstrating how Accelerated Testing and Analysis have been effectively employed in the development of complex spacecraft, thereby reducing risk. Quantitative assessments necessarily involve the mathematics of probability and statistics. In addition, accelerated tests need to be designed which consider the desired risk posture and schedule for a particular project. Such assessments relieve risk without imposing additional costs and constraints that are not value added for a particular mission. Furthermore, in the course of development of complex systems, variances and defects will inevitably present themselves and require a decision concerning their disposition, necessitating quantitative assessments. In summary, this paper presents a comprehensive viewpoint, from technology to systems, including the benefits and impact of accelerated testing in offsetting risk.
The U. S. Department of Energy SARP review training program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mauck, C.J.
1988-01-01
In support of its radioactive material packaging certification program, the U.S. Department of Energy (DOE) has established a special training workshop. The purpose of the two-week workshop is to develop skills in reviewing Safety Analysis Reports for Packagings (SARPs) and performing confirmatory analyses. The workshop, conducted by the Lawrence Livermore National Laboratory (LLNL) for DOE, is divided into two parts: methods of review and methods of analysis. The sessions covering methods of review are based on the DOE document, ''Packaging Review Guide for Reviewing Safety Analysis Reports for Packagings'' (PRG). The sessions cover relevant DOE Orders and all areas of review in the applicable Nuclear Regulatory Commission (NRC) Regulatory Guides. The technical areas addressed include structural and thermal behavior, materials, shielding, criticality, and containment. The course sessions on methods of analysis provide hands-on experience in the use of calculational methods and codes for reviewing SARPs. Analytical techniques and computer codes are discussed and sample problems are worked. Homework is assigned each night and over the included weekend; at the conclusion, a comprehensive take-home examination is given requiring six to ten hours to complete.
Nmrglue: an open source Python package for the analysis of multidimensional NMR data.
Helmus, Jonathan J; Jaroniec, Christopher P
2013-04-01
Nmrglue, an open source Python package for working with multidimensional NMR data, is described. When used in combination with other Python scientific libraries, nmrglue provides a highly flexible and robust environment for spectral processing, analysis and visualization and includes a number of common utilities such as linear prediction, peak picking and lineshape fitting. The package also enables existing NMR software programs to be readily tied together, currently facilitating the reading, writing and conversion of data stored in Bruker, Agilent/Varian, NMRPipe, Sparky, SIMPSON, and Rowland NMR Toolkit file formats. In addition to standard applications, the versatility offered by nmrglue makes the package particularly suitable for tasks that include manipulating raw spectrometer data files, automated quantitative analysis of multidimensional NMR spectra with irregular lineshapes such as those frequently encountered in the context of biomacromolecular solid-state NMR, and rapid implementation and development of unconventional data processing methods such as covariance NMR and other non-Fourier approaches. Detailed documentation, install files and source code for nmrglue are freely available at http://nmrglue.com. The source code can be redistributed and modified under the New BSD license.
Nmrglue: An Open Source Python Package for the Analysis of Multidimensional NMR Data
Helmus, Jonathan J.; Jaroniec, Christopher P.
2013-01-01
Nmrglue, an open source Python package for working with multidimensional NMR data, is described. When used in combination with other Python scientific libraries, nmrglue provides a highly flexible and robust environment for spectral processing, analysis and visualization and includes a number of common utilities such as linear prediction, peak picking and lineshape fitting. The package also enables existing NMR software programs to be readily tied together, currently facilitating the reading, writing and conversion of data stored in Bruker, Agilent/Varian, NMRPipe, Sparky, SIMPSON, and Rowland NMR Toolkit file formats. In addition to standard applications, the versatility offered by nmrglue makes the package particularly suitable for tasks that include manipulating raw spectrometer data files, automated quantitative analysis of multidimensional NMR spectra with irregular lineshapes such as those frequently encountered in the context of biomacromolecular solid-state NMR, and rapid implementation and development of unconventional data processing methods such as covariance NMR and other non-Fourier approaches. Detailed documentation, install files and source code for nmrglue are freely available at http://nmrglue.com. The source code can be redistributed and modified under the New BSD license. PMID:23456039
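A minimal sketch of typical nmrglue usage along the lines described above follows; the Bruker directory and output file names are placeholders, and digital-filter correction and proper phasing are omitted for brevity.

    # Sketch of typical nmrglue usage: read Bruker data, convert to NMRPipe format,
    # and apply basic processing. Directory and file names are placeholders.
    import nmrglue as ng

    dic, data = ng.bruker.read("expt/1")              # raw Bruker acquisition directory

    C = ng.convert.converter()                        # translate between file formats
    C.from_bruker(dic, data)
    pdic, pdata = C.to_pipe()
    ng.pipe.write("test.fid", pdic, pdata, overwrite=True)

    spec = ng.proc_base.fft(pdata)                    # Fourier transform
    spec = ng.proc_base.ps(spec, p0=0.0, p1=0.0)      # zero-/first-order phasing
    spec = ng.proc_base.di(spec)                      # discard imaginaries
    print(spec.shape)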
SWMPr: An R Package for Retrieving, Organizing, and ...
The System-Wide Monitoring Program (SWMP) was implemented in 1995 by the US National Estuarine Research Reserve System. This program has provided two decades of continuous monitoring data at over 140 fixed stations in 28 estuaries. However, the increasing quantity of data provided by the monitoring network has complicated broad-scale comparisons between systems and, in some cases, prevented simple trend analysis of water quality parameters at individual sites. This article describes the SWMPr package that provides several functions that facilitate data retrieval, organization, and analysis of time series data in the reserve estuaries. Previously unavailable functions for estuaries are also provided to estimate rates of ecosystem metabolism using the open-water method. The SWMPr package has facilitated a cross-reserve comparison of water quality trends and links quantitative information with analysis tools that have use for more generic applications to environmental time series. The manuscript describes a software package that was recently developed to retrieve, organize, and analyze monitoring data from the National Estuarine Research Reserve System. Functions are explained in detail, including recent applications for trend analysis of ecosystem metabolism.
Real Time Metrology Using Heterodyne Interferometry
NASA Astrophysics Data System (ADS)
Evans, Joseph T., Jr.
1983-11-01
The Air Force Weapons Laboratory (AFWL) located at Albuquerque, NM has developed a digital heterodyne interferometer capable of real-time, closed loop analysis and control of adaptive optics. The device uses independent phase modulation of two orthogonal polarizations of an argon ion laser to produce a temporally phase modulated interferogram of the test object in a Twyman-Green interferometer. Differential phase detection under the control of a Data General minicomputer helps reconstruct the phase front without noise effects from amplitude modulation in the optical train. The system consists of the interferometer optics, phase detection circuitry, and the minicomputer, allowing for complete software control of the process. The software has been unified into a powerful package that performs automatic data acquisition, OPD reconstruction, and Zernike analysis of the resulting wavefront. The minicomputer has the capability to control external devices so that closed loop analysis and control is possible. New software under development will provide a framework of data acquisition, display, and storage packages which can be integrated with analysis and control packages customized to the user's needs. Preliminary measurements with the system show that it is noise limited by laser beam phase quality and vibration of the optics. Active measures are necessary to reduce the impact of these noise sources.
DOE Office of Scientific and Technical Information (OSTI.GOV)
West, M
2009-03-06
This Technical Review Report (TRR) documents the review, performed by Lawrence Livermore National Laboratory (LLNL) Staff, at the request of the Department of Energy (DOE), on the 'Safety Analysis Report for Packaging (SARP), Model 9978 B(M)F-96', Revision 1, March 2009 (S-SARP-G-00002). The Model 9978 Package complies with 10 CFR 71, and with 'Regulations for the Safe Transport of Radioactive Material-1996 Edition (As Amended, 2000)-Safety Requirements', International Atomic Energy Agency (IAEA) Safety Standards Series No. TS-R-1. The Model 9978 Packaging is designed, analyzed, fabricated, and tested in accordance with Section III of the American Society of Mechanical Engineers Boiler and Pressure Vessel Code (ASME B&PVC). The review presented in this TRR was performed using the methods outlined in Revision 3 of the DOE's 'Packaging Review Guide (PRG) for Reviewing Safety Analysis Reports for Packages'. The format of the SARP follows that specified in Revision 2 of the Nuclear Regulatory Commission's Regulatory Guide 7.9, i.e., 'Standard Format and Content of Part 71 Applications for Approval of Packages for Radioactive Material'. Although the two documents are similar in their content, they are not identical. Formatting differences have been noted in this TRR, where appropriate. The Model 9978 Packaging is a single containment package, using a 5-inch containment vessel (5CV). It uses a nominal 35-gallon drum package design. In comparison, the Model 9977 Packaging uses a 6-inch containment vessel (6CV). The Model 9977 and Model 9978 Packagings were developed concurrently, and they were referred to as the General Purpose Fissile Material Package, Version 1 (GPFP). Both packagings use General Plastics FR-3716 polyurethane foam as insulation and as impact limiters. The 5CV is used as the Primary Containment Vessel (PCV) in the Model 9975-96 Packaging. The Model 9975-96 Packaging also has the 6CV as its Secondary Containment Vessel (SCV). In comparison, the Model 9975 Packagings use Celotex™ for insulation and as impact limiters. To provide a historical perspective, it is noted that the Model 9975-96 Packaging is a 35-gallon drum package design that has evolved from a family of packages designed by DOE contractors at the Savannah River Site. Earlier package designs, i.e., the Model 9965, the Model 9966, the Model 9967, and the Model 9968 Packagings, were originally designed and certified in the early 1980s. In the 1990s, updated package designs that incorporated design features consistent with the then-newer safety requirements were proposed. The updated package designs at the time were the Model 9972, the Model 9973, the Model 9974, and the Model 9975 Packagings, respectively. The Model 9975 Package was certified by the Packaging Certification Program, under the Office of Safety Management and Operations. The Model 9978 Package has six Content Envelopes: C.1 (²³⁸Pu Heat Sources), C.2 (Pu/U Metals), C.3 (Pu/U Oxides, Reserved), C.4 (U Metal or Alloy), C.5 (U Compounds), and C.6 (Samples and Sources). Per 10 CFR 71.59 (Code of Federal Regulations), the value of N is 50 for the Model 9978 Package, leading to a Criticality Safety Index (CSI) of 1.0. The Transport Index (TI), based on dose rate, is calculated to be a maximum of 4.1.
Oostenveld, Robert; Fries, Pascal; Maris, Eric; Schoffelen, Jan-Mathijs
2011-01-01
This paper describes FieldTrip, an open source software package that we developed for the analysis of MEG, EEG, and other electrophysiological data. The software is implemented as a MATLAB toolbox and includes a complete set of consistent and user-friendly high-level functions that allow experimental neuroscientists to analyze experimental data. It includes algorithms for simple and advanced analysis, such as time-frequency analysis using multitapers, source reconstruction using dipoles, distributed sources and beamformers, connectivity analysis, and nonparametric statistical permutation tests at the channel and source level. The implementation as a toolbox allows the user to perform elaborate and structured analyses of large data sets using the MATLAB command line and batch scripting. Furthermore, users and developers can easily extend the functionality and implement new algorithms. The modular design facilitates reuse in other software packages. PMID:21253357
NASA Astrophysics Data System (ADS)
Sawicka, K.; Breuer, L.; Houska, T.; Santabarbara Ruiz, I.; Heuvelink, G. B. M.
2016-12-01
Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Advances in uncertainty propagation analysis and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition for universal applicability, including case studies with spatial models and spatial model inputs. Due to the growing popularity and applicability of the open source R programming language, we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining the uncertainty propagation starting from input data and model parameters, via the environmental model onto model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo techniques, as well as several uncertainty visualization functions. Here we will demonstrate that the 'spup' package is an effective and easy-to-use tool that can be applied even in a very complex case study, and that it can be used in multi-disciplinary research and model-based decision support. As an example, we use the ecological LandscapeDNDC model to analyse the propagation of uncertainties associated with spatial variability of the model driving forces such as rainfall, nitrogen deposition and fertilizer inputs. The uncertainty propagation is analysed for the prediction of emissions of N2O and CO2 for a German low mountainous, agriculturally developed catchment. The study tests the effect of spatial correlations on spatially aggregated model outputs, and could serve as advice for developing best management practices and model improvement strategies.
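The Monte Carlo propagation idea at the heart of the abstract can be summarized in a few lines. The sketch below is a generic Python illustration, not the spup R API; the model, its parameters, and the input distributions are invented for demonstration.

```python
# Generic Monte Carlo uncertainty propagation sketch; the model and inputs are hypothetical.
import numpy as np

rng = np.random.default_rng(42)

def emission_model(rainfall_mm, fertilizer_kg_ha):
    # Placeholder response function standing in for an environmental model such as LandscapeDNDC.
    return 0.002 * rainfall_mm + 0.01 * fertilizer_kg_ha

n_sim = 10_000
# Uncertainty models for the inputs (assumed normal and lognormal for illustration).
rainfall = rng.normal(loc=800.0, scale=80.0, size=n_sim)
fertilizer = rng.lognormal(mean=np.log(120.0), sigma=0.2, size=n_sim)

# Push every sampled input set through the model to obtain a predictive distribution.
predictions = emission_model(rainfall, fertilizer)
print(f"mean = {predictions.mean():.2f}, "
      f"95% interval = {np.percentile(predictions, [2.5, 97.5]).round(2)}")
```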
Product Support Manager Guidebook
2011-04-01
package is being developed using supportability analysis concepts such as Failure Mode, Effects and Criticality Analysis (FMECA), Fault Tree Analysis (FTA) ... Analysis (LORA), Condition Based Maintenance + (CBM+), Fault Tree Analysis (FTA), Failure Mode, Effects, and Criticality Analysis (FMECA), Maintenance Task ... Reporting and Corrective Action System (FRACAS), Fault Tree Analysis (FTA), Level of Repair Analysis (LORA), Maintenance Task Analysis (MTA ...
NASA Astrophysics Data System (ADS)
Pandey, Palak; Kunte, Pravin D.
2016-10-01
This study presents an easy, modular, user-friendly, and flexible software package for processing Landsat 7 ETM+ and Landsat 8 OLI-TIRS data to estimate suspended particulate matter concentrations in coastal waters. This package includes 1) an algorithm developed using the freely downloadable SCILAB package, 2) ERDAS models for iterative processing of Landsat images, and 3) an ArcMap tool for plotting and map making. Utilizing the SCILAB package, a module is written for geometric corrections, radiometric corrections and obtaining normalized water-leaving reflectance by incorporating Landsat 8 OLI-TIRS and Landsat 7 ETM+ data. Using ERDAS models, a sequence of modules is developed for iterative processing of Landsat images and estimating suspended particulate matter concentrations. Processed images are used for preparing suspended sediment concentration maps. The applicability of this software package is demonstrated by estimating and plotting seasonal suspended sediment concentration maps off the Bengal delta. The software is flexible enough to accommodate other remotely sensed data, such as Ocean Colour Monitor (OCM) data, Indian Remote Sensing (IRS) data, and MODIS data, by replacing a few parameters in the algorithm for estimating suspended sediment concentration in coastal waters.
X based interactive computer graphics applications for aerodynamic design and education
NASA Technical Reports Server (NTRS)
Benson, Thomas J.; Higgs, C. Fred, III
1995-01-01
Six computer applications packages have been developed to solve a variety of aerodynamic problems in an interactive environment on a single workstation. The packages perform classical one-dimensional analysis under the control of a graphical user interface and can be used for preliminary design or educational purposes. The programs were originally developed on a Silicon Graphics workstation and used the GL version of the FORMS library as the graphical user interface. These programs have recently been converted to the XFORMS library of X based graphics widgets and have been tested on SGI, IBM, Sun, HP and PC-Linux computers. The paper will show results from the new VU-DUCT program as a prime example. VU-DUCT has been developed as an educational package for the study of subsonic open and closed loop wind tunnels.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weeratunga, S K
Ares and Kull are mature code frameworks that support ALE hydrodynamics for a variety of HEDP applications at LLNL, using two widely different meshing approaches. While Ares is based on a 2-D/3-D block-structured mesh data base, Kull is designed to support unstructured, arbitrary polygonal/polyhedral meshes. In addition, both frameworks are capable of running applications on large, distributed-memory parallel machines. Currently, both these frameworks separately support assorted collections of physics packages related to HEDP, including one for the energy deposition by laser/ion-beam ray tracing. This study analyzes the options available for developing a common laser/ion-beam ray tracing package that can be easily shared between these two code frameworks and concludes with a set of recommendations for its development.
EXP-PAC: providing comparative analysis and storage of next generation gene expression data.
Church, Philip C; Goscinski, Andrzej; Lefèvre, Christophe
2012-07-01
Microarrays and, more recently, RNA sequencing have led to an increase in available gene expression data. How to manage and store these data is becoming a key issue. In response we have developed EXP-PAC, a web-based software package for storage, management and analysis of gene expression and sequence data. Unique to this package is SQL-based querying of gene expression data sets, distributed normalization of raw gene expression data and analysis of gene expression data across experiments and species. This package has been populated with lactation data in the international milk genomic consortium web portal (http://milkgenomics.org/). Source code is also available which can be hosted on a Windows, Linux or Mac APACHE server connected to a private or public network (http://mamsap.it.deakin.edu.au/~pcc/Release/EXP_PAC.html). Copyright © 2012 Elsevier Inc. All rights reserved.
PLACE: an open-source python package for laboratory automation, control, and experimentation.
Johnson, Jami L; Tom Wörden, Henrik; van Wijk, Kasper
2015-02-01
In modern laboratories, software can drive the full experimental process from data acquisition to storage, processing, and analysis. The automation of laboratory data acquisition is an important consideration for every laboratory. When implementing a laboratory automation scheme, important parameters include its reliability, time to implement, adaptability, and compatibility with software used at other stages of experimentation. In this article, we present an open-source, flexible, and extensible Python package for Laboratory Automation, Control, and Experimentation (PLACE). The package uses modular organization and clear design principles; therefore, it can be easily customized or expanded to meet the needs of diverse laboratories. We discuss the organization of PLACE, data-handling considerations, and then present an example using PLACE for laser-ultrasound experiments. Finally, we demonstrate the seamless transition to post-processing and analysis with Python through the development of an analysis module for data produced by PLACE automation. © 2014 Society for Laboratory Automation and Screening.
MPTinR: analysis of multinomial processing tree models in R.
Singmann, Henrik; Kellen, David
2013-06-01
We introduce MPTinR, a software package developed for the analysis of multinomial processing tree (MPT) models. MPT models represent a prominent class of cognitive measurement models for categorical data with applications in a wide variety of fields. MPTinR is the first software for the analysis of MPT models in the statistical programming language R, providing a modeling framework that is more flexible than standalone software packages. MPTinR also introduces important features such as (1) the ability to calculate the Fisher information approximation measure of model complexity for MPT models, (2) the ability to fit models for categorical data outside the MPT model class, such as signal detection models, (3) a function for model selection across a set of nested and nonnested candidate models (using several model selection indices), and (4) multicore fitting. MPTinR is available from the Comprehensive R Archive Network at http://cran.r-project.org/web/packages/MPTinR/ .
Pérez-Esteve, Edgar; Bernardos, Andrea; Martínez-Máñez, Ramón; Barat, José M
2013-04-01
In recent years nanotechnology has become a significant component of the food industry. It is present in all steps of the food chain, from the design of new ingredients or additives to the most modern food quality assessment methods or packaging systems, demonstrating the great potential of this new technology in a sector as traditional as food. However, as industry interest in nanotechnology increases, so does rejection by consumers concerned about the potential risks. The aim of this review is to evaluate the development of food nanotechnology by means of a patent analysis, highlighting current applications of nanotechnology along the whole food chain and contextualizing this evolution in the social scene.
Preliminary Dynamic Feasibility and Analysis of a Spherical, Wind-Driven (Tumbleweed), Martian Rover
NASA Technical Reports Server (NTRS)
Flick, John J.; Toniolo, Matthew D.
2005-01-01
The process and findings are presented from a preliminary feasibility study examining the dynamic characteristics of a spherical, wind-driven (or Tumbleweed) rover, which is intended for exploration of the Martian surface. The results of an initial feasibility study involving several worst-case mobility situations that a Tumbleweed rover might encounter on the surface of Mars are discussed. Additional topics include the evaluation of several commercially available analysis software packages that were examined as possible platforms for the development of a Monte Carlo Tumbleweed mission simulation tool. This evaluation led to the development of the Mars Tumbleweed Monte Carlo Simulator (or Tumbleweed Simulator) using the Vortex physics software package from CM-Labs, Inc. Discussions regarding the development and evaluation of the Tumbleweed Simulator, as well as the results of a preliminary analysis using the tool, are also presented. Finally, a brief conclusions section is presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
M. Gross
2004-09-01
The purpose of this scientific analysis is to define the sampled values of stochastic (random) input parameters for (1) rockfall calculations in the lithophysal and nonlithophysal zones under vibratory ground motions, and (2) structural response calculations for the drip shield and waste package under vibratory ground motions. This analysis supplies: (1) Sampled values of ground motion time history and synthetic fracture pattern for analysis of rockfall in emplacement drifts in nonlithophysal rock (Section 6.3 of ''Drift Degradation Analysis'', BSC 2004 [DIRS 166107]); (2) Sampled values of ground motion time history and rock mechanical properties category for analysis of rockfall in emplacement drifts in lithophysal rock (Section 6.4 of ''Drift Degradation Analysis'', BSC 2004 [DIRS 166107]); (3) Sampled values of ground motion time history and metal to metal and metal to rock friction coefficient for analysis of waste package and drip shield damage to vibratory motion in ''Structural Calculations of Waste Package Exposed to Vibratory Ground Motion'' (BSC 2004 [DIRS 167083]) and in ''Structural Calculations of Drip Shield Exposed to Vibratory Ground Motion'' (BSC 2003 [DIRS 163425]). The sampled values are indices representing the number of ground motion time histories, number of fracture patterns and rock mass properties categories. These indices are translated into actual values within the respective analysis and model reports or calculations. This report identifies the uncertain parameters and documents the sampled values for these parameters. The sampled values are determined by GoldSim V6.04.007 [DIRS 151202] calculations using appropriate distribution types and parameter ranges. No software development or model development was required for these calculations. The calculation of the sampled values allows parameter uncertainty to be incorporated into the rockfall and structural response calculations that support development of the seismic scenario for the Total System Performance Assessment for the License Application (TSPA-LA). The results from this scientific analysis also address project requirements related to parameter uncertainty, as specified in the acceptance criteria in ''Yucca Mountain Review Plan, Final Report'' (NRC 2003 [DIRS 163274]). This document was prepared under the direction of ''Technical Work Plan for: Regulatory Integration Modeling of Drift Degradation, Waste Package and Drip Shield Vibratory Motion and Seismic Consequences'' (BSC 2004 [DIRS 170528]) which directed the work identified in work package ARTM05. This document was prepared under procedure AP-SIII.9Q, ''Scientific Analyses''. There are no specific known limitations to this analysis.
The Statistical Package for the Social Sciences (SPSS) as an adjunct to pharmacokinetic analysis.
Mather, L E; Austin, K L
1983-01-01
Computer techniques for numerical analysis are well known to pharmacokineticists. Powerful techniques for data file management have been developed by social scientists but have, in general, been ignored by pharmacokineticists because of their apparent lack of ability to interface with pharmacokinetic programs. Extensive use has been made of the Statistical Package for the Social Sciences (SPSS) for its data handling capabilities, but at the same time, techniques have been developed within SPSS to interface with pharmacokinetic programs of the users' choice and to carry out a variety of user-defined pharmacokinetic tasks within SPSS commands, apart from the expected variety of statistical tasks. Because it is based on a ubiquitous package, this methodology has all of the benefits of excellent documentation, interchangeability between different types and sizes of machines and true portability of techniques and data files. An example is given of the total management of a pharmacokinetic study previously reported in the literature by the authors.
Liu, Yijin; Meirer, Florian; Williams, Phillip A.; Wang, Junyue; Andrews, Joy C.; Pianetta, Piero
2012-01-01
Transmission X-ray microscopy (TXM) has been well recognized as a powerful tool for non-destructive investigation of the three-dimensional inner structure of a sample with spatial resolution down to a few tens of nanometers, especially when combined with synchrotron radiation sources. Recent developments of this technique have presented a need for new tools for both system control and data analysis. Here a software package developed in MATLAB for script command generation and analysis of TXM data is presented. The first toolkit, the script generator, allows automating complex experimental tasks which involve up to several thousand motor movements. The second package was designed to accomplish computationally intense tasks such as data processing of mosaic and mosaic tomography datasets; dual-energy contrast imaging, where data are recorded above and below a specific X-ray absorption edge; and TXM X-ray absorption near-edge structure imaging datasets. Furthermore, analytical and iterative tomography reconstruction algorithms were implemented. The compiled software package is freely available. PMID:22338691
The IRAF Fabry-Perot analysis package: Ring fitting
NASA Technical Reports Server (NTRS)
Shopbell, P. L.; Bland-Hawthorn, J.; Cecil, G.
1992-01-01
As introduced at ADASS I, a Fabry-Perot analysis package for IRAF is currently under development as a joint effort of ourselves and Frank Valdes of the IRAF group. Although additional portions of the package were also implemented, we report primarily on the development of a robust ring fitting task, useful for fitting the calibration rings obtained in Fabry-Perot observations. The general equation of an ellipse is fit to the shape of the rings, providing information on ring center, ellipticity, and position angle. Such parameters provide valuable information on the wavelength response of the etalon and the geometric stability of the system. Appropriate statistical weighting is applied to the pixels to account for the increasing number of pixels with radius, the Lorentzian cross-section, and uneven illumination. The major problems of incomplete, non-uniform, and multiple rings are addressed, with the final task capable of fitting rings regardless of center, cross-section, or completion. The task requires only minimal user intervention, allowing large numbers of rings to be fit in an extremely automated manner.
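As a generic illustration of fitting the general equation of an ellipse to ring pixel coordinates (a plain least-squares sketch in Python, not the IRAF task and without the statistical weighting described above), the conic x^2 + Bxy + Cy^2 + Dx + Ey + F = 0 can be solved linearly and the ring center recovered from it.

```python
# Least-squares fit of a general conic (ellipse) to ring pixel coordinates.
import numpy as np

def fit_ring(x, y):
    """Fit x^2 + B*x*y + C*y^2 + D*x + E*y + F = 0 and return the ring center."""
    design = np.column_stack([x * y, y**2, x, y, np.ones_like(x)])
    B, C, D, E, F = np.linalg.lstsq(design, -x**2, rcond=None)[0]
    # The conic's center is where the gradient of the quadratic form vanishes.
    x0, y0 = np.linalg.solve([[2.0, B], [B, 2.0 * C]], [-D, -E])
    return (x0, y0), (B, C, D, E, F)

# Synthetic test: noisy points on an ellipse centered at (255, 260).
t = np.linspace(0.0, 2.0 * np.pi, 200)
x = 255.0 + 120.0 * np.cos(t) + np.random.normal(0.0, 0.5, t.size)
y = 260.0 + 100.0 * np.sin(t) + np.random.normal(0.0, 0.5, t.size)
print(fit_ring(x, y)[0])   # approximately (255.0, 260.0)
```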
Adeeb A. Rahman; Thomas J. Urbanik; Mustafa Mahamid
2003-01-01
Collapse of fiberboard packaging boxes in the shipping industry due to a rise in humidity is common and very costly. A 3D FE nonlinear model is developed to predict the moisture flow throughout a corrugated packaging fiberboard sandwich structure. The model predicts how the moisture diffusion will permeate through the layers of a fiberboard (medium and...
Kinematics Simulation Analysis of Packaging Robot with Joint Clearance
NASA Astrophysics Data System (ADS)
Zhang, Y. W.; Meng, W. J.; Wang, L. Q.; Cui, G. H.
2018-03-01
Considering the influence of joint clearance on motion error, repeated positioning accuracy, and the overall position of the machine, this paper presents a simulation analysis of a packaging robot, a 2-degree-of-freedom (DOF) planar parallel robot, based on the high precision and fast speed required of packaging equipment. The motion constraint equation of the mechanism is established, and the analysis and simulation of the motion error are carried out for the case of clearance at the revolute joints. The simulation results show that the size of the joint clearance affects the movement accuracy and packaging efficiency of the packaging robot. The analysis provides a reference point of view for packaging equipment design and selection criteria and has great significance for packaging industry automation.
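A minimal sketch of the kind of clearance-to-error mapping the paper studies: assume a symmetric five-bar (2-DOF planar parallel) linkage and treat the revolute clearance as a small random radial offset of each pin centre. The geometry, clearance value, and joint angles below are hypothetical, not taken from the paper.

```python
# Hypothetical joint-clearance error sketch for a 2-DOF planar parallel (five-bar) linkage.
import numpy as np

def circle_intersection(c1, c2, r, upper=True):
    """Intersection of two circles of equal radius r centred at the 2-D points c1 and c2."""
    delta = c2 - c1
    d = np.linalg.norm(delta)
    mid = 0.5 * (c1 + c2)
    h = np.sqrt(r**2 - (0.5 * d) ** 2)             # half-chord height
    normal = np.array([-delta[1], delta[0]]) / d   # unit normal to the line c1 -> c2
    return mid + (h if upper else -h) * normal

def end_effector(theta1, theta2, clearance=0.0, rng=None):
    # Assumed geometry (metres): base joint positions, proximal and distal link lengths.
    a1, a2 = np.array([-0.10, 0.0]), np.array([0.10, 0.0])
    l1, l2 = 0.20, 0.25
    b1 = a1 + l1 * np.array([np.cos(theta1), np.sin(theta1)])
    b2 = a2 + l1 * np.array([np.cos(theta2), np.sin(theta2)])
    if clearance > 0.0:
        # Model each revolute clearance as a random radial displacement of the pin centre.
        phi = rng.uniform(0.0, 2.0 * np.pi, size=2)
        b1 = b1 + clearance * np.array([np.cos(phi[0]), np.sin(phi[0])])
        b2 = b2 + clearance * np.array([np.cos(phi[1]), np.sin(phi[1])])
    return circle_intersection(b1, b2, l2)

rng = np.random.default_rng(0)
nominal = end_effector(np.deg2rad(110.0), np.deg2rad(70.0))
errors = [np.linalg.norm(end_effector(np.deg2rad(110.0), np.deg2rad(70.0),
                                      clearance=0.0002, rng=rng) - nominal)
          for _ in range(1000)]
print(f"maximum end-effector position error: {max(errors) * 1e3:.3f} mm")
```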
MAVTgsa: An R Package for Gene Set (Enrichment) Analysis
Chien, Chih-Yi; Chang, Ching-Wei; Tsai, Chen-An; ...
2014-01-01
Gene set analysis methods aim to determine whether an a priori defined set of genes shows a statistically significant difference in expression for either categorical or continuous outcomes. Although many methods for gene set analysis have been proposed, a systematic analysis tool for identification of different types of gene set significance modules has not been developed previously. This work presents an R package, called MAVTgsa, which includes three different methods for integrated gene set enrichment analysis. (1) The one-sided OLS (ordinary least squares) test detects coordinated changes of genes in a gene set in one direction, either up- or downregulation. (2) The two-sided MANOVA (multivariate analysis of variance) detects changes in both up- and downregulation for studying two or more experimental conditions. (3) A random forests-based procedure identifies gene sets that can accurately predict samples from different experimental conditions or that are associated with continuous phenotypes. MAVTgsa computes the P values and FDR (false discovery rate) q-values for all gene sets in the study. Furthermore, MAVTgsa provides several visualization outputs to support and interpret the enrichment results. This package is available online.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mcwilliams, A. J.; Daugherty, W. L.; Skidmore, T. E.
The 9975 Type B shipping package is used within the DOE complex for shipping special nuclear materials. This package is re-certified annually in accordance with Safety Analysis Report for Packaging (SARP) requirements. The package is also used at the Savannah River Site as part of the long-term storage configuration of special nuclear materials. As such, the packages do not undergo annual recertification during storage, with uncertainty as to how long some of the package components will meet their functional requirements in the storage environment. The packages are currently approved for up to 15 years storage, and work continues to provide a technical basis to extend that period. This report describes efforts by the Savannah River National Laboratory (SRNL) to extend the service life estimate of Viton® GLT and GLT-S fluoroelastomer O-rings used in the 9975 shipping package. O-rings of both GLT and GLT-S compositions are undergoing accelerated aging at elevated temperature, and are periodically tested for compression stress relaxation (CSR) behavior. The CSR behavior of O-rings was evaluated at temperatures from 175 to 400 °F. These collective data were used to develop predictive models for extrapolation of CSR behavior to relevant service temperatures (< 156 °F). The predictive model developed from the CSR data conservatively indicates a service life of approximately 37 years for Viton GLT O-rings at the maximum effective service temperature of 156 °F. The estimated service life for Viton GLT-S O-rings is significantly longer.
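The report above does not state the extrapolation model, but accelerated-aging data of this kind are often extrapolated with an Arrhenius time-temperature relationship; the short Python sketch below illustrates that calculation with an assumed activation energy and temperatures, and is not the SRNL predictive model.

```python
# Hypothetical Arrhenius extrapolation; the activation energy is an illustrative assumption.
import numpy as np

R = 8.314      # gas constant, J/(mol K)
EA = 90.0e3    # assumed activation energy, J/mol

def to_kelvin(temp_f):
    return (temp_f - 32.0) * 5.0 / 9.0 + 273.15

t_aging_f = 300.0    # accelerated-aging temperature (deg F)
t_service_f = 156.0  # maximum effective service temperature (deg F)

# Acceleration factor: how much faster the degradation proceeds at the aging temperature.
accel = np.exp(EA / R * (1.0 / to_kelvin(t_service_f) - 1.0 / to_kelvin(t_aging_f)))
print(f"acceleration factor (300 F relative to 156 F) = {accel:.0f}")
# A service-life estimate then follows by multiplying the measured time-to-criterion
# at the aging temperature by this factor.
```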
New generation of exploration tools: interactive modeling software and microcomputers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krajewski, S.A.
1986-08-01
Software packages offering interactive modeling techniques are now available for use on microcomputer hardware systems. These packages are reasonably priced for both company and independent explorationists; they do not require users to have high levels of computer literacy; they are capable of rapidly completing complex ranges of sophisticated geologic and geophysical modeling tasks; and they can produce presentation-quality output for comparison with real-world data. For example, interactive packages are available for mapping, log analysis, seismic modeling, reservoir studies, and financial projects as well as for applying a variety of statistical and geostatistical techniques to analysis of exploration data. More importantly, these packages enable explorationists to directly apply their geologic expertise when developing and fine-tuning models for identifying new prospects and for extending producing fields. As a result of these features, microcomputers and interactive modeling software are becoming common tools in many exploration offices. Gravity and magnetics software programs illustrate some of the capabilities of such exploration tools.
Hydrological analysis in R: Topmodel and beyond
NASA Astrophysics Data System (ADS)
Buytaert, W.; Reusser, D.
2011-12-01
R is quickly gaining popularity in the hydrological sciences community. The wide range of statistical and mathematical functionality makes it an excellent tool for data analysis, modelling and uncertainty analysis. Topmodel was one of the first hydrological models to be implemented as an R package and distributed through R's own distribution network, CRAN. This facilitated pre- and postprocessing of data, such as parameter sampling, calculation of prediction bounds, and advanced visualisation. However, apart from these basic functionalities, the package did not use many of the more advanced features of the R environment, especially R's object-oriented functionality. With R's increasing expansion into arenas such as high performance computing, big data analysis, and cloud services, we revisit the topmodel package and use it as an example of how to build and deploy the next generation of hydrological models. R provides a convenient environment and attractive features to build and couple hydrological - and in extension other environmental - models, to develop flexible and effective data assimilation strategies, and to take the model beyond the individual computer by linking into cloud services for both data provision and computing. However, in order to maximise the benefit of these approaches, it will be necessary to adopt standards and ontologies for model interaction and information exchange. Some of these are currently being developed, such as the OGC web processing standards, while others will need to be developed.
Sensitivity analysis of a wing aeroelastic response
NASA Technical Reports Server (NTRS)
Kapania, Rakesh K.; Eldred, Lloyd B.; Barthelemy, Jean-Francois M.
1991-01-01
A variation of Sobieski's Global Sensitivity Equations (GSE) approach is implemented to obtain the sensitivity of the static aeroelastic response of a three-dimensional wing model. The formulation is quite general and accepts any aerodynamics and structural analysis capability. An interface code is written to convert one analysis's output to the other's input, and vice versa. Local sensitivity derivatives are calculated by either analytic methods or finite difference techniques. A program to combine the local sensitivities, such as the sensitivity of the stiffness matrix or the aerodynamic kernel matrix, into global sensitivity derivatives is developed. The aerodynamic analysis package FAST, using a lifting surface theory, and a structural package, ELAPS, implementing Giles' equivalent plate model, are used.
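For reference, the two-discipline form of the Global Sensitivity Equations is a linear system in the total (global) derivatives; with Y_1 and Y_2 denoting the aerodynamic and structural outputs and X a design variable, the standard statement of the GSE (a textbook form, not an equation reproduced from the paper) is:

```latex
\begin{bmatrix}
 I & -\dfrac{\partial Y_1}{\partial Y_2} \\[2mm]
 -\dfrac{\partial Y_2}{\partial Y_1} & I
\end{bmatrix}
\begin{bmatrix}
 \dfrac{\mathrm{d} Y_1}{\mathrm{d} X} \\[2mm]
 \dfrac{\mathrm{d} Y_2}{\mathrm{d} X}
\end{bmatrix}
=
\begin{bmatrix}
 \dfrac{\partial Y_1}{\partial X} \\[2mm]
 \dfrac{\partial Y_2}{\partial X}
\end{bmatrix}
```

The local partial derivatives on the left- and right-hand sides are exactly the quantities produced by the individual aerodynamic and structural analyses; solving the system yields the global sensitivity derivatives.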
FTOOLS: A FITS Data Processing and Analysis Software Package
NASA Astrophysics Data System (ADS)
Blackburn, J. K.
FTOOLS, a highly modular collection of over 110 utilities for processing and analyzing data in the FITS (Flexible Image Transport System) format, has been developed in support of the HEASARC (High Energy Astrophysics Science Archive Research Center) at NASA's Goddard Space Flight Center. Each utility performs a single simple task such as presentation of file contents, extraction of specific rows or columns, appending or merging tables, binning values in a column or selecting subsets of rows based on a boolean expression. Individual utilities can easily be chained together in scripts to achieve more complex operations such as the generation and displaying of spectra or light curves. The collection of utilities provides both generic processing and analysis utilities and utilities specific to high energy astrophysics data sets used for the ASCA, ROSAT, GRO, and XTE missions. A core set of FTOOLS providing support for generic FITS data processing, FITS image analysis and timing analysis can easily be split out of the full software package for users not needing the high energy astrophysics mission utilities. The FTOOLS software package is designed to be both compatible with IRAF and completely stand alone in a UNIX or VMS environment. The user interface is controlled by standard IRAF parameter files. The package is self-documenting through the IRAF help facility and a stand alone help task. Software is written in ANSI C and FORTRAN to provide portability across most computer systems. The data format dependencies between hardware platforms are isolated through the FITSIO library package.
Advances in the REDCAT software package
2013-01-01
Background Residual Dipolar Couplings (RDCs) have emerged in the past two decades as an informative source of experimental restraints for the study of structure and dynamics of biological macromolecules and complexes. The REDCAT software package was previously introduced for the analysis of molecular structures using RDC data. Here we report additional features that have been included in this software package in order to expand the scope of its analyses. We first discuss the features that enhance REDCAT's user-friendly nature, such as the integration of a number of analyses into one single operation and enabling convenient examination of a structural ensemble in order to identify the most suitable structure. We then describe the new features which expand the scope of RDC analyses, performing exercises that utilize both synthetic and experimental data to illustrate and evaluate different features with regard to structure refinement and structure validation. Results We establish the seamless interaction that takes place between REDCAT, VMD, and Xplor-NIH in demonstrations that utilize our newly developed REDCAT-VMD and XplorGUI interfaces. These modules enable visualization of RDC analysis results on the molecular structure displayed in VMD and refinement of structures with Xplor-NIH, respectively. We also highlight REDCAT's Error-Analysis feature in reporting the localized fitness of a structure to RDC data, which provides a more effective means of recognizing local structural anomalies. This allows for structurally sound regions of a molecule to be identified, and for any refinement efforts to be focused solely on locally distorted regions. Conclusions The newly engineered REDCAT software package, which is available for download via the WWW from http://ifestos.cse.sc.edu, has been developed in the Object Oriented C++ environment. Our most recent enhancements to REDCAT serve to provide a more complete RDC analysis suite, while also accommodating a more user-friendly experience, and will be of great interest to the community of researchers and developers since it hides the complications of software development. PMID:24098943
Leelaphiwat, Pattarin; Harte, Janice B; Auras, Rafael A; Ong, Peter Kc; Chonhenchob, Vanee
2017-04-01
Changes in the aroma characteristics of Thai 'tom yam' seasoning powder, containing lemongrass, galangal and kaffir lime leaf, as affected by different packaging materials were assessed using quantitative descriptive analysis (QDA) and gas chromatography-mass spectrometry (GC-MS). The descriptive aroma attributes for lemongrass, galangal and kaffir lime leaf powders were developed by the QDA panel. The mixed herb and spice seasoning powder was kept in glass jars closed with different packaging materials (Nylon 6, polyethylene terephthalate (PET) and polylactic acid (PLA)), stored at 38 °C (accelerated storage condition), and evaluated by the trained QDA panel during storage for 49 days. The descriptive words for Thai 'tom yam' seasoning powder developed by the trained panelists were lemongrass, vinegary and leafy for the lemongrass, galangal and kaffir lime leaf dried powders, respectively. The aroma intensities significantly (P ≤ 0.05) decreased with increased storage time. However, the intensity scores for aroma attributes were not significantly (P > 0.05) different among the packaging materials studied. The major components in Thai 'tom yam' seasoning powder, quantified by GC-MS, were estragole, bicyclo[3.1.1]heptane, β-bisabolene, benzoic acid and 2-ethylhexyl salicylate. The concentrations of major aroma compounds significantly (P ≤ 0.05) decreased with storage time. Aroma stability of Thai 'tom yam' powder can be determined by descriptive sensory evaluation and GC-MS analysis. Nylon, PET and PLA exhibited similar aroma barrier properties against key aroma compounds in Thai 'tom yam'. This information can be used for prediction of aroma loss through packaging materials during storage of Thai 'tom yam'. © 2016 Society of Chemical Industry.
NASA Astrophysics Data System (ADS)
Nordin, N. H.; Hara, H.; Kaida, N.
2017-05-01
Food safety is an important issue related to public safety, aimed at preventing toxicity threats from food. Management through a legal approach has been one of the predominant approaches used in Malaysia to manage the environment. In this regard, the Food Regulation 1985 has been one of the mechanisms of environmental management through a legal approach for controlling the safety of packaged food in the food packaging industry in Malaysia. The present study aims to analyse and explain the implementation of the Food Regulation 1985 in controlling the safety of packaged food in Malaysia and to integrate the concept of environmental management into the food safety issue. Qualitative analysis of the regulation document revealed two main themes, general and specific, with seven sub-themes: harmful packages, safety packages, reuse packages, polyvinyl chloride (PVC), alcoholic bottles, toys, money and others, and iron powder. The implementation of the Food Regulation 1985 in controlling the safety of packaged food should not be regarded solely as a regulatory exercise but should be further developed toward a broader sense of food safety that goes beyond preventing food poisoning.
SimHap GUI: an intuitive graphical user interface for genetic association analysis.
Carter, Kim W; McCaskie, Pamela A; Palmer, Lyle J
2008-12-25
Researchers wishing to conduct genetic association analysis involving single nucleotide polymorphisms (SNPs) or haplotypes are often confronted with the lack of user-friendly graphical analysis tools, requiring sophisticated statistical and informatics expertise to perform relatively straightforward tasks. Tools, such as the SimHap package for the R statistics language, provide the necessary statistical operations to conduct sophisticated genetic analysis, but lack a graphical user interface that allows anyone but a professional statistician to effectively utilise the tool. We have developed SimHap GUI, a cross-platform integrated graphical analysis tool for conducting epidemiological, single SNP and haplotype-based association analysis. SimHap GUI features a novel workflow interface that guides the user through each logical step of the analysis process, making it accessible to both novice and advanced users. This tool provides a seamless interface to the SimHap R package, while providing enhanced functionality such as sophisticated data checking, automated data conversion, and real-time estimations of haplotype simulation progress. SimHap GUI provides a novel, easy-to-use, cross-platform solution for conducting a range of genetic and non-genetic association analyses. This provides a free alternative to commercial statistics packages that is specifically designed for genetic association analysis.
NASA Astrophysics Data System (ADS)
Bianchi, R. M.; Boudreau, J.; Konstantinidis, N.; Martyniuk, A. C.; Moyse, E.; Thomas, J.; Waugh, B. M.; Yallup, D. P.; ATLAS Collaboration
2017-10-01
In the early days, HEP experiments made use of photographic images both to record and store experimental data and to illustrate their findings. Then the experiments evolved and needed to find ways to visualize their data. With the availability of computer graphics, software packages to display event data and the detector geometry started to be developed. Here, an overview of the usage of event display tools in HEP is presented. Then the case of the ATLAS experiment is considered in more detail and two widely used event display packages are presented, Atlantis and VP1, focusing on the software technologies they employ, as well as their strengths, differences and their usage in the experiment: from physics analysis to detector development, and from online monitoring to outreach and communication. Towards the end, the other ATLAS visualization tools will be briefly presented as well. Future development plans and improvements in the ATLAS event display packages will also be discussed.
JP-8+100: The development of high-thermal-stability jet fuel
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heneghan, S.P.; Zabarnick, S.; Ballal, D.R.
1996-09-01
Jet fuel requirements have evolved over the years as a balance of the demands placed by advanced aircraft performance (technological need), fuel cost (economic factors), and fuel availability (strategic factors). In a modern aircraft, the jet fuel not only provides the propulsive energy for flight, but also is the primary coolant for aircraft and engine subsystems. To meet the evolving challenge of improving the cooling potential of jet fuel while maintaining the current availability at a minimal price increase, the US Air Force, industry, and academia have teamed to develop an additive package for JP-8 fuels. This paper describes the development of an additive package for JP-8, to produce JP-8+100. This new fuel offers a 55 C increase in the bulk maximum temperature (from 325 F to 425 F) and improves the heat sink capability by 50%. Major advances made during the development of JP-8+100 fuel include the development of several new quantitative fuel analysis tests, a free radical theory of autooxidation, adaptation of new chemistry models to computational fluid dynamics programs, and a nonparametric statistical analysis to evaluate thermal stability. Hundreds of additives were tested for effectiveness, and a package of additives was then formulated for JP-8 fuel. This package has been tested for fuel system materials compatibility and general fuel applicability. To date, the flight testing has shown an improvement in thermal stability of JP-8 fuel. This improvement has resulted in a significant reduction in fuel-related maintenance costs and a threefold increase in mean time between fuel-related failures. In this manner, a novel high-thermal-stability jet fuel for the 21st century has been successfully developed.
HEART: an automated beat-to-beat cardiovascular analysis package using Matlab.
Schroeder, Mark J.; Perreault, Bill; Ewert, Daniel L.; Koenig, Steven C.
2004-07-01
A computer program is described for beat-to-beat analysis of cardiovascular parameters from high-fidelity pressure and flow waveforms. The Hemodynamic Estimation and Analysis Research Tool (HEART) is a post-processing analysis software package developed in Matlab that enables scientists and clinicians to document, load, view, calibrate, and analyze experimental data that have been digitally saved in ascii or binary format. Analysis routines include traditional hemodynamic parameter estimates as well as more sophisticated analyses such as lumped arterial model parameter estimation and vascular impedance frequency spectra. Cardiovascular parameter values of all analyzed beats can be viewed and statistically analyzed. An attractive feature of the HEART program is the ability to analyze data with visual quality assurance throughout the process, thus establishing a framework toward which Good Laboratory Practice (GLP) compliance can be obtained. Additionally, the development of HEART on the Matlab platform provides users with the flexibility to adapt or create study specific analysis files according to their specific needs. Copyright 2003 Elsevier Ltd.
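As a generic illustration of beat-to-beat parameter estimation (plain Python with SciPy, not the HEART/Matlab code), the sketch below segments a synthetic high-fidelity pressure waveform into beats at the diastolic minima and reports per-beat systolic, diastolic, and mean pressures; the waveform and sampling rate are assumptions.

```python
# Generic beat-to-beat pressure analysis sketch using SciPy's peak finder.
import numpy as np
from scipy.signal import find_peaks

fs = 1000.0                                   # assumed sampling rate, Hz
t = np.arange(0.0, 10.0, 1.0 / fs)
# Synthetic "aortic pressure" waveform: roughly 80-120 mmHg at 72 beats/min plus noise.
pressure = 100.0 + 20.0 * np.sin(2.0 * np.pi * 1.2 * t) + np.random.normal(0.0, 0.5, t.size)

# Detect diastolic minima to delimit individual beats.
troughs, _ = find_peaks(-pressure, distance=int(0.5 * fs))

for beat, (start, stop) in enumerate(zip(troughs[:-1], troughs[1:]), start=1):
    segment = pressure[start:stop]
    print(f"beat {beat:2d}: systolic={segment.max():6.1f}  "
          f"diastolic={segment.min():6.1f}  mean={segment.mean():6.1f} mmHg")
```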
Multiscale analysis of river networks using the R package linbin
Welty, Ethan Z.; Torgersen, Christian E.; Brenkman, Samuel J.; Duda, Jeffrey J.; Armstrong, Jonathan B.
2015-01-01
Analytical tools are needed in riverine science and management to bridge the gap between GIS and statistical packages that were not designed for the directional and dendritic structure of streams. We introduce linbin, an R package developed for the analysis of riverscapes at multiple scales. With this software, riverine data on aquatic habitat and species distribution can be scaled and plotted automatically with respect to their position in the stream network or—in the case of temporal data—their position in time. The linbin package aggregates data into bins of different sizes as specified by the user. We provide case studies illustrating the use of the software for (1) exploring patterns at different scales by aggregating variables at a range of bin sizes, (2) comparing repeat observations by aggregating surveys into bins of common coverage, and (3) tailoring analysis to data with custom bin designs. Furthermore, we demonstrate the utility of linbin for summarizing patterns throughout an entire stream network, and we analyze the diel and seasonal movements of tagged fish past a stationary receiver to illustrate how linbin can be used with temporal data. In short, linbin enables more rapid analysis of complex data sets by fisheries managers and stream ecologists and can reveal underlying spatial and temporal patterns of fish distribution and habitat throughout a riverscape.
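A generic illustration of the binning idea described above, written in Python with pandas rather than the linbin R API; the survey data and bin sizes are invented for demonstration.

```python
# Generic multiscale binning sketch (not the linbin API): aggregate fish counts
# recorded by distance along a river (km) into 1-km and 5-km bins.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
surveys = pd.DataFrame({
    "distance_km": np.sort(rng.uniform(0.0, 50.0, 500)),   # position along the main stem
    "fish_count": rng.poisson(3.0, 500),
})

for bin_km in (1.0, 5.0):
    edges = np.arange(0.0, 50.0 + bin_km, bin_km)
    binned = (surveys
              .groupby(pd.cut(surveys["distance_km"], edges), observed=False)["fish_count"]
              .sum())
    print(f"--- {bin_km:g}-km bins ---")
    print(binned.head())
```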
snpGeneSets: An R Package for Genome-Wide Study Annotation
Mei, Hao; Li, Lianna; Jiang, Fan; Simino, Jeannette; Griswold, Michael; Mosley, Thomas; Liu, Shijian
2016-01-01
Genome-wide studies (GWS) of SNP associations and differential gene expressions have generated abundant results; next-generation sequencing technology has further boosted the number of variants and genes identified. Effective interpretation requires massive annotation and downstream analysis of these genome-wide results, a computationally challenging task. We developed the snpGeneSets package to simplify annotation and analysis of GWS results. Our package integrates local copies of knowledge bases for SNPs, genes, and gene sets, and implements wrapper functions in the R language to enable transparent access to low-level databases for efficient annotation of large genomic data. The package contains functions that execute three types of annotations: (1) genomic mapping annotation for SNPs and genes and functional annotation for gene sets; (2) bidirectional mapping between SNPs and genes, and genes and gene sets; and (3) calculation of gene effect measures from SNP associations and performance of gene set enrichment analyses to identify functional pathways. We applied snpGeneSets to type 2 diabetes (T2D) results from the NHGRI genome-wide association study (GWAS) catalog, a Finnish GWAS, and a genome-wide expression study (GWES). These studies demonstrate the usefulness of snpGeneSets for annotating and performing enrichment analysis of GWS results. The package is open-source, free, and can be downloaded at: https://www.umc.edu/biostats_software/. PMID:27807048
PyXRF: Python-based X-ray fluorescence analysis package
NASA Astrophysics Data System (ADS)
Li, Li; Yan, Hanfei; Xu, Wei; Yu, Dantong; Heroux, Annie; Lee, Wah-Keat; Campbell, Stuart I.; Chu, Yong S.
2017-09-01
We developed a Python-based fluorescence analysis package (PyXRF) at the National Synchrotron Light Source II (NSLS-II) for the X-ray fluorescence-microscopy beamlines, including Hard X-ray Nanoprobe (HXN) and Submicron Resolution X-ray Spectroscopy (SRX). This package contains a high-level fitting engine, a comprehensive command-line/GUI design, rigorous physics calculations, and a visualization interface. PyXRF offers a method of automatically finding elements, so that users do not need to spend extra time selecting elements manually. Moreover, PyXRF provides a convenient and interactive way of adjusting fitting parameters with physical constraints. This will help us perform quantitative analysis and find an appropriate initial guess for fitting. Furthermore, we also create an advanced mode for expert users to construct their own fitting strategies with full control of each fitting parameter. PyXRF runs single-pixel fitting at a fast speed, which opens up the possibility of viewing fitting results in real time during experiments. A convenient I/O interface was designed to obtain data directly from NSLS-II's experimental database. PyXRF is under open-source development and designed to be an integral part of NSLS-II's scientific computation library.
NASA Astrophysics Data System (ADS)
Gray, Bonnie L.
2012-04-01
Microfluidics is revolutionizing laboratory methods and biomedical devices, offering new capabilities and instrumentation in multiple areas such as DNA analysis, proteomics, enzymatic analysis, single cell analysis, immunology, point-of-care medicine, personalized medicine, drug delivery, and environmental toxin and pathogen detection. For many applications (e.g., wearable and implantable health monitors, drug delivery devices, and prosthetics) mechanically flexible polymer devices and systems that can conform to the body offer benefits that cannot be achieved using systems based on conventional rigid substrate materials. However, difficulties in implementing active devices and reliable packaging technologies have limited the success of flexible microfluidics. Employing highly compliant materials such as PDMS that are typically employed for prototyping, we review mechanically flexible polymer microfluidic technologies based on free-standing polymer substrates and novel electronic and microfluidic interconnection schemes. Central to these new technologies are hybrid microfabrication methods employing novel nanocomposite polymer materials and devices. We review microfabrication methods using these materials, along with demonstrations of example devices and packaging schemes that employ them. We review these recent developments and place them in the context of the fields of flexible microfluidics and conformable systems, and discuss cross-over applications to conventional rigid-substrate microfluidics.
Using Kepler for Tool Integration in Microarray Analysis Workflows.
Gan, Zhuohui; Stowe, Jennifer C; Altintas, Ilkay; McCulloch, Andrew D; Zambon, Alexander C
Increasing numbers of genomic technologies are leading to massive amounts of genomic data, all of which requires complex analysis. More and more bioinformatics analysis tools are being developed by scientists to simplify these analyses. However, different pipelines have been developed using different software environments. This makes integration of these diverse bioinformatics tools difficult. Kepler provides an open source environment to integrate these disparate packages. Using Kepler, we integrated several external tools including Bioconductor packages, AltAnalyze, a Python-based open source tool, and an R-based comparison tool to build an automated workflow to meta-analyze both online and local microarray data. The automated workflow connects the integrated tools seamlessly, delivers data flow between the tools smoothly, and hence improves the efficiency and accuracy of complex data analyses. Our workflow exemplifies the usage of Kepler as a scientific workflow platform for bioinformatics pipelines.
pcr: an R package for quality assessment, analysis and testing of qPCR data
Ahmed, Mahmoud
2018-01-01
Background Real-time quantitative PCR (qPCR) is a broadly used technique in biomedical research. Currently, a few different analysis models are used to determine the quality of data and to quantify the mRNA level across the experimental conditions. Methods We developed an R package to implement methods for quality assessment, analysis and testing of qPCR data for statistical significance. Double Delta CT and standard curve models were implemented to quantify the relative expression of target genes from CT values in standard qPCR control-group experiments. In addition, calculation of amplification efficiencies and curves from serial dilution qPCR experiments is used to assess the quality of the data. Finally, two-group tests and linear models were used to test for significance of the difference in expression between control groups and conditions of interest. Results Using two datasets from qPCR experiments, we applied different quality assessment, analysis and statistical testing in the pcr package and compared the results to the original published articles. The final relative expression values from the different models, as well as the intermediary outputs, were checked against the expected results in the original papers and were found to be accurate and reliable. Conclusion The pcr package provides an intuitive and unified interface for its main functions to allow biologists to perform all necessary steps of qPCR analysis and produce graphs in a uniform way. PMID:29576953
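A minimal worked example of the double delta CT calculation that the package implements, written here as generic Python rather than through the pcr R interface; the CT values are invented for illustration and roughly 100% amplification efficiency is assumed.

```python
# Generic double delta CT illustration (not the pcr R package API); CT values are invented.
# Mean CT values for a target gene and a reference gene in control and treated groups.
ct = {
    "control": {"target": 24.0, "reference": 18.0},
    "treated": {"target": 22.5, "reference": 18.1},
}

# Normalize the target to the reference gene within each group.
delta_ct = {group: vals["target"] - vals["reference"] for group, vals in ct.items()}
# Compare the treated group with the control group.
delta_delta_ct = delta_ct["treated"] - delta_ct["control"]

# Relative expression assuming the product doubles every cycle (100% efficiency).
relative_expression = 2.0 ** (-delta_delta_ct)
print(f"delta delta CT = {delta_delta_ct:.2f}, fold change = {relative_expression:.2f}")
```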
missMethyl: an R package for analyzing data from Illumina's HumanMethylation450 platform.
Phipson, Belinda; Maksimovic, Jovana; Oshlack, Alicia
2016-01-15
DNA methylation is one of the most commonly studied epigenetic modifications due to its role in both disease and development. The Illumina HumanMethylation450 BeadChip is a cost-effective way to profile >450 000 CpGs across the human genome, making it a popular platform for profiling DNA methylation. Here we introduce missMethyl, an R package with a suite of tools for performing normalization, removal of unwanted variation in differential methylation analysis, differential variability testing and gene set analysis for the 450K array. missMethyl is an R package available from the Bioconductor project at www.bioconductor.org. alicia.oshlack@mcri.edu.au Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Hilgers, Ralf-Dieter; Bogdan, Malgorzata; Burman, Carl-Fredrik; Dette, Holger; Karlsson, Mats; König, Franz; Male, Christoph; Mentré, France; Molenberghs, Geert; Senn, Stephen
2018-05-11
IDeAl (Integrated designs and analysis of small population clinical trials) is an EU-funded project developing new statistical design and analysis methodologies for clinical trials in small population groups. Here we provide an overview of IDeAl findings and give recommendations to applied researchers. The description of the findings is broken down by the nine scientific IDeAl work packages and summarizes results from the project's more than 60 publications to date in peer-reviewed journals. In addition, we applied text mining to evaluate the publications and the IDeAl work packages' output in relation to the design and analysis terms derived from the IRDiRC task force report on small population clinical trials. The results are summarized, describing the developments from an applied viewpoint. The main result presented here is a set of 33 practical recommendations drawn from the work, giving researchers comprehensive guidance to the improved methodology. In particular, the findings will help design and analyse efficient clinical trials in rare diseases with a limited number of patients available. We developed a network representation relating the hot topics developed by the IRDiRC task force on small population clinical trials to IDeAl's work, as well as relating the important methodologies, by IDeAl's definition, necessary to consider in the design and analysis of small-population clinical trials. These network representations establish a new perspective on the design and analysis of small-population clinical trials. IDeAl has provided a large number of options to refine the statistical methodology for small-population clinical trials from various perspectives. A total of 33 recommendations developed and related to the work packages help the researcher design small-population clinical trials. The route to improvement is displayed in the IDeAl network, which represents important statistical methodological skills necessary for the design and analysis of small-population clinical trials. The methods are ready for use.
The Cooperative VAS Program with the Marshall Space Flight Center
NASA Technical Reports Server (NTRS)
Diak, George R.; Menzel, W. Paul
1988-01-01
Work was divided between the analysis/forecast model development and evaluation of the impact of satellite data in mesoscale numerical weather prediction (NWP), development of the Multispectral Atmospheric Mapping Sensor (MAMS), and other related research. The Cooperative Institute for Meteorological Satellite Studies (CIMSS) Synoptic Scale Model (SSM) has progressed from a relatively basic analysis/forecast system to a package which includes such features as nonlinear vertical mode initialization, comprehensive Planetary Boundary Layer (PBL) physics, and the core of a fully four-dimensional data assimilation package. The MAMS effort has produced a calibrated visible and infrared sensor that produces imagery at high spatial resolution. The MAMS was developed in order to study small scale atmospheric moisture variability, to monitor and classify clouds, and to investigate the role of surface characteristics in the production of clouds, precipitation, and severe storms.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Salama, A.; Mikhail, M.
Comprehensive software packages have been developed at the Western Research Centre as tools to help coal preparation engineers analyze, evaluate, and control coal cleaning processes. The COal Preparation Software package (COPS) performs three functions: (1) data handling and manipulation; (2) data analysis, including the generation of washability data, performance evaluation and prediction, density and size modeling, and evaluation of density and size partition characteristics and attrition curves; and (3) generation of graphics output. The Separation ChARacteristics Estimation (SCARE) software packages were developed to balance raw density or size separation data. The cases of density and size separation data are considered. The generated balanced data can take the balanced or normalized forms. The scaled form is desirable for direct determination of the partition functions (curves). The raw and generated separation data are displayed in tabular and/or graphical forms. The computer software described in this paper provides valuable tools for coal preparation plant engineers and operators for evaluating process performance, adjusting plant parameters, and balancing raw density or size separation data. These packages have been applied very successfully in many projects carried out by WRC for the Canadian coal preparation industry. The software packages are designed to run on a personal computer (PC).
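As a hedged sketch of the kind of partition-curve evaluation mentioned above (not code from COPS or SCARE), the following Python snippet evaluates a logistic (Tromp-type) density partition curve; the separation density d50 and ecart probable ep are illustrative parameters, not values from the packages.

    import numpy as np

    def partition_number(density, d50, ep):
        """Fraction of feed at a given relative density reporting to sinks.

        Logistic partition model; k = ln(3)/ep makes (rho75 - rho25)/2 equal ep.
        """
        k = np.log(3.0) / ep
        return 1.0 / (1.0 + np.exp(-k * (density - d50)))

    densities = np.arange(1.3, 2.1, 0.1)       # relative density fractions
    print(np.round(partition_number(densities, d50=1.6, ep=0.05), 3))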
Faramarzi, Salar; Moradi, Mohammadreza; Abedi, Ahmad
2018-06-01
The present study aimed to develop the thinking maps training package and compare its training effect with the thinking maps method on the reading performance of second and fifth grade of elementary school male dyslexic students. For this mixed method exploratory study, from among the above mentioned grades' students in Isfahan, 90 students who met the inclusion criteria were selected by multistage sampling and randomly assigned into six experimental and control groups. The data were collected by reading and dyslexia test and Wechsler Intelligence Scale for Children-fourth edition. The results of covariance analysis indicated a significant difference between the reading performance of the experimental (thinking maps training package and thinking maps method groups) and control groups ([Formula: see text]). Moreover, there were significant differences between the thinking maps training package group and thinking maps method group in some of the subtests ([Formula: see text]). It can be concluded that thinking maps training package and the thinking maps method exert a positive influence on the reading performance of dyslexic students; therefore, thinking maps can be used as an effective training and treatment method.
NASA Technical Reports Server (NTRS)
Woodworth, Andrew; Chen, Liangyu
2017-01-01
Testing high voltage (HV) electronic parts (greater than 300 V) for single event effects (SEE) caused by cosmic rays in the space environment, consisting of energetic heavy ions, and by neutron radiation in the upper atmosphere is a crucial step towards using these parts in spacecraft and aircraft. Because of the nature of cosmic radiation and neutrons, electronic parts are tested for SEE without any packaging and/or shielding over the top of the device. In the case of commercial HV parts, the top of the packaging is etched off and a thin dielectric coating is then placed over the part to avoid electrical arcing between the device surface, wire bonds, and other components. Even though the effects of the thin dielectric layer on SEE testing can be accounted for, the dielectric layer significantly hinders post-test failure analysis. The goal is to replicate the test capability of state-of-the-art packaging while eliminating the need for post-radiation-test processing of the die surface that obscures failure analysis. To that end, a new packaging concept for HV parts has been developed that requires no dielectric coating over the part. Testing of prototype packages used with Schottky diodes (rated at 1200 V) has shown no electrical arcing, and leakage currents during reverse bias testing are within the manufacturer's specifications.
Probabilistic Design and Analysis Framework
NASA Technical Reports Server (NTRS)
Strack, William C.; Nagpal, Vinod K.
2010-01-01
PRODAF is a software package designed to aid analysts and designers in conducting probabilistic analysis of components and systems. PRODAF can integrate multiple analysis programs to ease the tedious process of conducting a complex analysis that requires the use of multiple software packages. The work uses a commercial finite element analysis (FEA) program with modules from NESSUS to conduct a probabilistic analysis of a hypothetical turbine blade, disk, and shaft model. PRODAF applies the response surface method at the component level and extrapolates the component-level responses to the system level. Hypothetical components of a gas turbine engine are first deterministically modeled using FEA. Variations in selected geometrical dimensions and loading conditions are analyzed to determine their effects on the stress state within each component. Geometric variations include the chord length and height for the blade, and the inner radius, outer radius, and thickness for the disk. Probabilistic analysis is carried out using developing software packages such as System Uncertainty Analysis (SUA) and PRODAF. PRODAF was used with a commercial deterministic FEA program in conjunction with modules from the probabilistic analysis program NESTEM to perturb loads and geometries and provide a reliability and sensitivity analysis. PRODAF simplified the handling of data among the various programs involved, and will work with many commercial and open-source deterministic programs, probabilistic programs, or modules.
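The following Python sketch illustrates the general response surface idea described above: fit a cheap quadratic surrogate to a handful of deterministic "FEA" runs, then propagate input scatter through the surrogate by Monte Carlo. It is an assumption-laden illustration only; the stress function, variable names, and distributions are made up and are not PRODAF or NESSUS code.

    import numpy as np

    rng = np.random.default_rng(0)

    # Stand-in for expensive deterministic FEA runs: peak stress vs. chord and load
    def fea_stress(chord, load):
        return 120.0 + 8.0 * load - 15.0 * chord + 2.0 * load * chord

    doe = np.array([(c, p) for c in (0.9, 1.0, 1.1) for p in (0.9, 1.0, 1.1)])
    y = np.array([fea_stress(c, p) for c, p in doe])

    # Quadratic response surface fitted by least squares
    def basis(x):
        c, p = x[:, 0], x[:, 1]
        return np.column_stack([np.ones_like(c), c, p, c * p, c**2, p**2])

    coef, *_ = np.linalg.lstsq(basis(doe), y, rcond=None)

    # Monte Carlo on the surrogate: propagate scatter in geometry and load
    samples = np.column_stack([rng.normal(1.0, 0.02, 100_000),   # chord variation
                               rng.normal(1.0, 0.05, 100_000)])  # load variation
    stress = basis(samples) @ coef
    print("P(stress > 118) =", np.mean(stress > 118.0))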
Klein, Johannes; Leupold, Stefan; Biegler, Ilona; Biedendieck, Rebekka; Münch, Richard; Jahn, Dieter
2012-09-01
Time-lapse imaging in combination with fluorescence microscopy techniques enables the investigation of gene regulatory circuits and has uncovered phenomena such as culture heterogeneity. In this context, computational image processing for the analysis of single-cell behaviour plays an increasing role in systems biology and mathematical modelling approaches. Consequently, we developed a software package with a graphical user interface for the analysis of single bacterial cell behaviour. The new software, TLM-Tracker, allows flexible and user-friendly segmentation, tracking, and lineage analysis of microbial cells in time-lapse movies. The software package, including manual, tutorial video and examples, is available as Matlab code or executable binaries at http://www.tlmtracker.tu-bs.de.
IQM: An Extensible and Portable Open Source Application for Image and Signal Analysis in Java
Kainz, Philipp; Mayrhofer-Reinhartshuber, Michael; Ahammer, Helmut
2015-01-01
Image and signal analysis applications are substantial in scientific research. Both open source and commercial packages provide a wide range of functions for image and signal analysis, which are sometimes supported very well by the communities in the corresponding fields. Commercial software packages have the major drawback of being expensive and having undisclosed source code, which hampers extending the functionality if there is no plugin interface or similar option available. However, both variants cannot cover all possible use cases and sometimes custom developments are unavoidable, requiring open source applications. In this paper we describe IQM, a completely free, portable and open source (GNU GPLv3) image and signal analysis application written in pure Java. IQM does not depend on any natively installed libraries and is therefore runnable out-of-the-box. Currently, a continuously growing repertoire of 50 image and 16 signal analysis algorithms is provided. The modular functional architecture based on the three-tier model is described along the most important functionality. Extensibility is achieved using operator plugins, and the development of more complex workflows is provided by a Groovy script interface to the JVM. We demonstrate IQM’s image and signal processing capabilities in a proof-of-principle analysis and provide example implementations to illustrate the plugin framework and the scripting interface. IQM integrates with the popular ImageJ image processing software and is aiming at complementing functionality rather than competing with existing open source software. Machine learning can be integrated into more complex algorithms via the WEKA software package as well, enabling the development of transparent and robust methods for image and signal analysis. PMID:25612319
NASA Astrophysics Data System (ADS)
Müller, Peter; Krause, Marita; Beck, Rainer; Schmidt, Philip
2017-10-01
Context. The venerable NOD2 data reduction software package for single-dish radio continuum observations, which was developed for use at the 100-m Effelsberg radio telescope, has been successfully applied over many decades. Modern computing facilities, however, call for a new design. Aims: We aim to develop an interactive software tool with a graphical user interface for the reduction of single-dish radio continuum maps. We make a special effort to reduce the distortions along the scanning direction (scanning effects) by combining maps scanned in orthogonal directions or dual- or multiple-horn observations that need to be processed in a restoration procedure. The package should also process polarisation data and offer the possibility to include special tasks written by the individual user. Methods: Based on the ideas of the NOD2 package we developed NOD3, which includes all necessary tasks from the raw maps to the final maps in total intensity and linear polarisation. Furthermore, plot routines and several methods for map analysis are available. The NOD3 package is written in Python, which allows the extension of the package via additional tasks. The required data format for the input maps is FITS. Results: The NOD3 package is a sophisticated tool to process and analyse maps from single-dish observations that are affected by scanning effects from clouds, receiver instabilities, or radio-frequency interference. The "basket-weaving" tool combines orthogonally scanned maps into a final map that is almost free of scanning effects. The new restoration tool for dual-beam observations reduces the noise by a factor of about two compared to the NOD2 version. Combining single-dish with interferometer data in the map plane ensures the full recovery of the total flux density. Conclusions: This software package is available under the open source license GPL for free use at other single-dish radio telescopes of the astronomical community. The NOD3 package is designed to be extendable to multi-channel data represented by data cubes in Stokes I, Q, and U.
Spooled packaging of shape memory alloy actuators
NASA Astrophysics Data System (ADS)
Redmond, John A.
A vast cross-section of transportation, manufacturing, consumer product, and medical technologies rely heavily on actuation. Accordingly, progress in these industries is often strongly coupled to the advancement of actuation technologies. As the field of actuation continues to evolve, smart materials show significant promise for satisfying the growing needs of industry. In particular, shape memory alloy (SMA) wire actuators present an opportunity for low-cost, high performance actuation, but until now, they have been limited or restricted from use in many otherwise suitable applications by the difficulty in packaging the SMA wires within tight or unusually shaped form constraints. To address this packaging problem, SMA wires can be spool-packaged by wrapping around mandrels to make the actuator more compact or by redirecting around multiple mandrels to customize SMA wire pathways to unusual form factors. The goal of this dissertation is to develop the scientific knowledge base for spooled packaging of low-cost SMA wire actuators that enables high, predictable performance within compact, customizable form factors. In developing the scientific knowledge base, this dissertation defines a systematic general representation of single and multiple mandrel spool-packaged SMA actuators and provides tools for their analysis, understanding, and synthesis. A quasi-static analytical model distills the underlying mechanics down to the three effects of friction, bending, and binding, which enables prediction of the behavior of generic spool-packaged SMA actuators with specifiable geometric, loading, frictional, and SMA material parameters. An extensive experimental and simulation-based parameter study establishes the necessary understanding of how primary design tradeoffs between performance, packaging, and cost are governed by the underlying mechanics of spooled actuators. A design methodology outlines a systematic approach to synthesizing high performance SMA wire actuators with mitigated material, power, and packaging costs and compact, customizable form factors. By examining the multi-faceted connections between performance, packaging, and cost, this dissertation builds a knowledge base that goes beyond implementing SMA actuators for particular applications. Rather, it provides a well-developed strategy for realizing the advantages of SMA actuation for a broadened range of applications, thereby enabling opportunities for new functionality and capabilities in industry.
Safety analysis report for packaging (onsite) steel drum
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCormick, W.A.
This Safety Analysis Report for Packaging (SARP) provides the analyses and evaluations necessary to demonstrate that the steel drum packaging system meets the transportation safety requirements of HNF-PRO-154, Responsibilities and Procedures for all Hazardous Material Shipments, for an onsite packaging containing Type B quantities of solid and liquid radioactive materials. The basic component of the steel drum packaging system is the 208 L (55-gal) steel drum.
Around and about an application of the GAMLSS package to non-stationary flood frequency analysis
NASA Astrophysics Data System (ADS)
Debele, S. E.; Bogdanowicz, E.; Strupczewski, W. G.
2017-08-01
The non-stationarity of hydrologic processes due to climate change or human activities is challenging for the researchers and practitioners. However, the practical requirements for taking into account non-stationarity as a support in decision-making procedures exceed the up-to-date development of the theory and the of software. Currently, the most popular and freely available software package that allows for non-stationary statistical analysis is the GAMLSS (generalized additive models for location, scale and shape) package. GAMLSS has been used in a variety of fields. There are also several papers recommending GAMLSS in hydrological problems; however, there are still important issues which have not previously been discussed concerning mainly GAMLSS applicability not only for research and academic purposes, but also in a design practice. In this paper, we present a summary of our experiences in the implementation of GAMLSS to non-stationary flood frequency analysis, highlighting its advantages and pointing out weaknesses with regard to methodological and practical topics.
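As a hedged Python sketch of the general kind of non-stationary flood frequency model GAMLSS supports (a Gumbel distribution whose location parameter varies linearly with time), the snippet below fits such a model by direct maximum likelihood; the data are synthetic and the parameterization is illustrative, not the GAMLSS R implementation.

    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import gumbel_r

    rng = np.random.default_rng(1)
    years = np.arange(1960, 2020)
    t = years - years[0]
    # Synthetic annual maximum flows with a trend in the location parameter
    flows = gumbel_r.rvs(loc=100 + 0.8 * t, scale=25, random_state=rng)

    def neg_log_lik(params):
        a, b, log_scale = params
        loc = a + b * t                     # location varies linearly with time
        return -np.sum(gumbel_r.logpdf(flows, loc=loc, scale=np.exp(log_scale)))

    fit = minimize(neg_log_lik, x0=[np.mean(flows), 0.0, np.log(np.std(flows))],
                   method="Nelder-Mead")
    a, b, log_scale = fit.x
    print(f"location trend: {b:.2f} units/year, scale: {np.exp(log_scale):.1f}")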
NASA Astrophysics Data System (ADS)
Nelson, Andrew
2010-11-01
The efficient use of complex neutron scattering instruments is often hindered by the complex nature of their operating software. This complexity exists at each experimental step: data acquisition, reduction and analysis, with each step being as important as the previous. For example, whilst command line interfaces are powerful for automated acquisition, they often reduce accessibility for novice users and sometimes reduce the efficiency of advanced users. One solution to this is the development of a graphical user interface which allows the user to operate the instrument by a simple and intuitive "push button" approach. This approach was taken by the Motofit software package for analysis of multiple-contrast reflectometry data. Here we describe the extension of this package to cover the data acquisition and reduction steps for the Platypus time-of-flight neutron reflectometer. Consequently, the complete operation of an instrument is integrated into a single, easy-to-use program, leading to efficient instrument usage.
BAT - The Bayesian analysis toolkit
NASA Astrophysics Data System (ADS)
Caldwell, Allen; Kollár, Daniel; Kröninger, Kevin
2009-11-01
We describe the development of a new toolkit for data analysis. The analysis package is based on Bayes' Theorem, and is realized with the use of Markov Chain Monte Carlo. This gives access to the full posterior probability distribution. Parameter estimation, limit setting and uncertainty propagation are implemented in a straightforward manner.
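The toolkit described above is built on Bayes' Theorem realized with Markov Chain Monte Carlo. The Python sketch below is a minimal random-walk Metropolis sampler for a toy one-parameter problem, included only to illustrate the underlying idea; it is not the toolkit's own (C++) interface, and the data, prior, and tuning values are assumptions.

    import numpy as np

    rng = np.random.default_rng(42)

    # Toy problem: infer the mean of Gaussian data (sigma = 1) with a flat prior,
    # so the log-posterior is proportional to the log-likelihood.
    data = rng.normal(loc=2.0, scale=1.0, size=50)

    def log_posterior(mu):
        return -0.5 * np.sum((data - mu) ** 2)

    # Random-walk Metropolis sampling of the posterior
    chain = np.empty(20_000)
    mu = 0.0
    for i in range(chain.size):
        proposal = mu + rng.normal(scale=0.3)
        if np.log(rng.uniform()) < log_posterior(proposal) - log_posterior(mu):
            mu = proposal                   # accept the move
        chain[i] = mu

    burned = chain[5_000:]                  # discard burn-in
    print(f"posterior mean ~ {burned.mean():.2f}, 68% interval ~ "
          f"({np.percentile(burned, 16):.2f}, {np.percentile(burned, 84):.2f})")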
WebArray: an online platform for microarray data analysis
Xia, Xiaoqin; McClelland, Michael; Wang, Yipeng
2005-01-01
Background Many cutting-edge microarray analysis tools and algorithms, including the commonly used limma and affy packages in Bioconductor, require sophisticated knowledge of mathematics, statistics and computer skills for implementation. Commercially available software can provide a user-friendly interface at considerable cost. To facilitate the use of these tools for microarray data analysis on an open platform, we developed an online microarray data analysis platform, WebArray, for bench biologists to use to explore data from single/dual color microarray experiments. Results The currently implemented functions are based on the limma and affy packages from Bioconductor, the spacings LOESS histogram (SPLOSH) method, a PCA-assisted normalization method and a genome mapping method. WebArray incorporates these packages and provides a user-friendly interface for accessing a wide range of key functions of limma and others, such as spot quality weighting, background correction, graphical plotting, normalization, linear modeling, empirical Bayes statistical analysis, false discovery rate (FDR) estimation, and chromosomal mapping for genome comparison. Conclusion WebArray offers a convenient platform for bench biologists to access several cutting-edge microarray data analysis tools. The website is freely available at . It runs on a Linux server with Apache and MySQL. PMID:16371165
Aaron's Solution, Instructor's Problem: Teaching Surface Analysis Using GIS
ERIC Educational Resources Information Center
Koch, Tom; Denike, Ken
2007-01-01
Teaching GIS is relatively simple, a matter of helping students develop familiarity with the software. Mapping as an aid to thinking is harder to instruct. This article presents a laboratory and lecture package developed to teach the utility of mapping in a course on spatial data analysis. Following a historical review of the use of surface…
PredictABEL: an R package for the assessment of risk prediction models.
Kundu, Suman; Aulchenko, Yurii S; van Duijn, Cornelia M; Janssens, A Cecile J W
2011-04-01
The rapid identification of genetic markers for multifactorial diseases from genome-wide association studies is fuelling interest in investigating the predictive ability and health care utility of genetic risk models. Various measures are available for the assessment of risk prediction models, each addressing a different aspect of performance and utility. We developed PredictABEL, a package in R that covers descriptive tables, measures and figures used in the analysis of risk prediction studies, such as measures of model fit, predictive ability and clinical utility, risk distributions, calibration plots and receiver operating characteristic (ROC) plots. Tables and figures are saved as separate files in a user-specified format, which includes publication-quality EPS and TIFF formats. All figures are available in a ready-made layout, but they can be customized to the preferences of the user. The package has been developed for the analysis of genetic risk prediction studies, but can also be used for studies that include only non-genetic risk factors. PredictABEL is freely available at the websites of GenABEL ( http://www.genabel.org ) and CRAN ( http://cran.r-project.org/).
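Two of the performance measures named above, discrimination (ROC/AUC) and calibration, can be sketched in Python with scikit-learn as follows. This is an illustration under assumed, simulated risk factors and outcomes, not the PredictABEL R interface.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.calibration import calibration_curve

    rng = np.random.default_rng(0)
    # Hypothetical risk factors and binary outcomes, for illustration only
    X = rng.normal(size=(2_000, 3))
    p_true = 1.0 / (1.0 + np.exp(-(0.8 * X[:, 0] - 0.5 * X[:, 1])))
    y = rng.binomial(1, p_true)

    model = LogisticRegression().fit(X, y)
    risk = model.predict_proba(X)[:, 1]

    print("AUC (discrimination):", round(roc_auc_score(y, risk), 3))
    obs, pred = calibration_curve(y, risk, n_bins=10)
    for o, p in zip(obs, pred):
        print(f"predicted {p:.2f}  observed {o:.2f}")   # points of a calibration plot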
McCarthy, Davis J; Campbell, Kieran R; Lun, Aaron T L; Wills, Quin F
2017-04-15
Single-cell RNA sequencing (scRNA-seq) is increasingly used to study gene expression at the level of individual cells. However, preparing raw sequence data for further analysis is not a straightforward process. Biases, artifacts and other sources of unwanted variation are present in the data, requiring substantial time and effort to be spent on pre-processing, quality control (QC) and normalization. We have developed the R/Bioconductor package scater to facilitate rigorous pre-processing, quality control, normalization and visualization of scRNA-seq data. The package provides a convenient, flexible workflow to process raw sequencing reads into a high-quality expression dataset ready for downstream analysis. scater provides a rich suite of plotting tools for single-cell data and a flexible data structure that is compatible with existing tools and can be used as infrastructure for future software development. The open-source code, along with installation instructions, vignettes and case studies, is available through Bioconductor at http://bioconductor.org/packages/scater . Contact: davis@ebi.ac.uk. Supplementary data are available at Bioinformatics online.
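Typical cell-level QC metrics of the kind computed in such workflows (library size, genes detected, mitochondrial fraction) can be sketched in plain Python with pandas, as below. The counts matrix, gene names, and thresholds are invented for illustration; this is not the scater/Bioconductor implementation.

    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(3)
    # Hypothetical counts matrix: rows = genes, columns = cells
    genes = [f"GENE{i}" for i in range(1, 501)] + ["MT-CO1", "MT-ND1"]
    counts = pd.DataFrame(rng.poisson(1.0, size=(len(genes), 300)), index=genes)

    qc = pd.DataFrame({
        "library_size": counts.sum(axis=0),                 # total counts per cell
        "genes_detected": (counts > 0).sum(axis=0),         # expressed genes per cell
        "pct_mito": 100 * counts.loc[counts.index.str.startswith("MT-")].sum(axis=0)
                     / counts.sum(axis=0),
    })

    # Keep cells passing simple, illustrative thresholds
    keep = (qc.library_size > 200) & (qc.genes_detected > 100) & (qc.pct_mito < 10)
    filtered = counts.loc[:, keep]
    print(f"{keep.sum()} of {counts.shape[1]} cells pass QC")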
Skills Analysis. Workshop Package on Skills Analysis, Skills Audit and Training Needs Analysis.
ERIC Educational Resources Information Center
Hayton, Geoff; And Others
This four-part package is designed to assist Australian workshop leaders running 2-day workshops on skills analysis, skills audit, and training needs analysis. Part A contains information on how to use the package and a list of workshop aims. Parts B, C, and D consist, respectively, of the workshop leader's guide; overhead transparency sheets and…
R classes and methods for SNP array data.
Scharpf, Robert B; Ruczinski, Ingo
2010-01-01
The Bioconductor project is an "open source and open development software project for the analysis and comprehension of genomic data" (1), primarily based on the R programming language. Infrastructure packages, such as Biobase, are maintained by Bioconductor core developers and serve several key roles to the broader community of Bioconductor software developers and users. In particular, Biobase introduces an S4 class, the eSet, for high-dimensional assay data. Encapsulating the assay data as well as meta-data on the samples, features, and experiment in the eSet class definition ensures propagation of the relevant sample and feature meta-data throughout an analysis. Extending the eSet class promotes code reuse through inheritance as well as interoperability with other R packages and is less error-prone. Recently proposed class definitions for high-throughput SNP arrays extend the eSet class. This chapter highlights the advantages of adopting and extending Biobase class definitions through a working example of one implementation of classes for the analysis of high-throughput SNP arrays.
Ogiwara, Yoshiko; Roman, Maxine J; Decker, Eric A; Goddard, Julie M
2016-04-01
Many packaged foods utilize synthetic chelators (e.g. ethylenediaminetetraacetic acid, EDTA) to inhibit iron-promoted oxidation or microbial growth which would result in quality loss. To address consumer demands for all natural products, we have previously developed a non-migratory iron chelating active packaging material by covalent immobilization of polyhydroxamate and demonstrated its efficacy in delaying lipid oxidation. Herein, we demonstrate the ability of this hydroxamate-functionalized iron chelating active packaging to retain iron chelating capacity; even in the presence of competing ions common in food. Both immobilized and soluble hydroxamate chelators retained iron chelating capacity in the presence of calcium, magnesium, and sodium competing ions, although at pH 5.0 the presence of calcium reduced immobilized hydroxamate iron chelation. A strong correlation was found between colorimetric and mass spectral analysis of iron chelation by the chelating packaging material. Such chelating active packaging may support reducing additive use in product formulations, while retaining quality and shelf life.
NASA Technical Reports Server (NTRS)
Adams, M. L.; Padovan, J.; Fertis, D. G.
1980-01-01
A general purpose squeeze-film damper interactive force element was developed, coded into a software package (module), and debugged. This software package was applied to nonlinear dynamic analyses of some simple rotor systems. Results for pressure distributions show that the long bearing (end sealed) is a stronger bearing than the short bearing, as expected. Results of the nonlinear dynamic analysis, using a four-degree-of-freedom simulation model, showed that the orbit of the rotating shaft grows nonlinearly to fill the bearing clearance as the unbalanced weight increases.
FTOOLS: A general package of software to manipulate FITS files
NASA Astrophysics Data System (ADS)
Blackburn, J. K.; Shaw, R. A.; Payne, H. E.; Hayes, J. J. E.; Heasarc
1999-12-01
FTOOLS, a highly modular collection of utilities for processing and analyzing data in the FITS (Flexible Image Transport System) format, has been developed in support of the HEASARC (High Energy Astrophysics Research Archive Center) at NASA's Goddard Space Flight Center. The FTOOLS package contains many utility programs which perform modular tasks on any FITS image or table, as well as higher-level analysis programs designed specifically for data from current and past high energy astrophysics missions. The utility programs for FITS tables are especially rich and powerful, and provide functions for presentation of file contents, extraction of specific rows or columns, appending or merging tables, binning values in a column or selecting subsets of rows based on a boolean expression. Individual FTOOLS programs can easily be chained together in scripts to achieve more complex operations such as the generation and displaying of spectra or light curves. FTOOLS development began in 1991 and has produced the main set of data analysis software for the current ASCA and RXTE space missions and for other archival sets of X-ray and gamma-ray data. The FTOOLS software package is supported on most UNIX platforms and on Windows machines. The user interface is controlled by standard parameter files that are very similar to those used by IRAF. The package is self documenting through a stand alone help task called fhelp. Software is written in ANSI C and FORTRAN to provide portability across most computer systems. The data format dependencies between hardware platforms are isolated through the FITSIO library package.
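The row and column operations on FITS tables described above can be sketched in Python using astropy (a different library, shown here purely for illustration; it is not FTOOLS). The file and column names are hypothetical.

    from astropy.table import Table

    # Read a FITS binary table (hypothetical file and column names)
    events = Table.read("events.fits", hdu=1)

    # Select a subset of rows, analogous to filtering on a boolean expression
    bright = events[events["PHA"] > 50]

    # Keep only two columns and write the result to a new FITS file
    bright[["TIME", "PHA"]].write("bright_events.fits", overwrite=True)
    print(f"kept {len(bright)} of {len(events)} rows")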
chipPCR: an R package to pre-process raw data of amplification curves.
Rödiger, Stefan; Burdukiewicz, Michał; Schierack, Peter
2015-09-01
Both the quantitative real-time polymerase chain reaction (qPCR) and quantitative isothermal amplification (qIA) are standard methods for nucleic acid quantification. Numerous real-time read-out technologies have been developed. Despite the continuous interest in amplification-based techniques, there are only a few tools for pre-processing of amplification data. However, a transparent tool for precise control of raw data is indispensable in several scenarios, for example, during the development of new instruments. chipPCR is an R package for the pre-processing and quality analysis of raw data of amplification curves. The package takes advantage of R's S4 object model and offers an extensible environment. chipPCR contains tools for raw data exploration: normalization, baselining, imputation of missing values, a powerful wrapper for amplification curve smoothing and a function to detect the start and end of an amplification curve. The capabilities of the software are enhanced by the implementation of algorithms unavailable in R, such as a 5-point stencil for derivative interpolation. Simulation tools, statistical tests, plots for data quality management, amplification efficiency/quantification cycle calculation, and datasets from qPCR and qIA experiments are part of the package. Core functionalities are integrated in GUIs (web-based and standalone shiny applications), thus streamlining analysis and report generation. http://cran.r-project.org/web/packages/chipPCR. Source code: https://github.com/michbur/chipPCR. Contact: stefan.roediger@b-tu.de. Supplementary data are available at Bioinformatics online.
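The 5-point stencil for derivative interpolation mentioned above can be written out as a short Python sketch applied to a synthetic amplification curve. The sigmoid curve and cycle numbers are invented for illustration; this is not the chipPCR R code.

    import numpy as np

    def five_point_derivative(y, h=1.0):
        """First derivative by the 5-point central stencil:
        f'(x) ~ (-f(x+2h) + 8 f(x+h) - 8 f(x-h) + f(x-2h)) / (12 h).
        The two points at each end are left as NaN.
        """
        d = np.full_like(y, np.nan, dtype=float)
        d[2:-2] = (-y[4:] + 8 * y[3:-1] - 8 * y[1:-3] + y[:-4]) / (12 * h)
        return d

    # Synthetic sigmoid amplification curve over 40 cycles
    cycles = np.arange(1, 41)
    fluor = 1.0 / (1.0 + np.exp(-(cycles - 22) / 1.5))
    slope = five_point_derivative(fluor)
    print("cycle of maximum slope:", cycles[np.nanargmax(slope)])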
RNA-Seq-Based Transcript Structure Analysis with TrBorderExt.
Wang, Yejun; Sun, Ming-An; White, Aaron P
2018-01-01
RNA-Seq has become a routine strategy for genome-wide gene expression comparisons in bacteria. Despite lower resolution in transcript border parsing compared with dRNA-Seq, TSS-EMOTE, Cappable-seq, Term-seq, and others, directional RNA-Seq still illustrates its advantages: low cost, quantification and transcript border analysis with a medium resolution (±10-20 nt). To facilitate mining of directional RNA-Seq datasets especially with respect to transcript structure analysis, we developed a tool, TrBorderExt, which can parse transcript start sites and termination sites accurately in bacteria. A detailed protocol is described in this chapter for how to use the software package step by step to identify bacterial transcript borders from raw RNA-Seq data. The package was developed with Perl and R programming languages, and is accessible freely through the website: http://www.szu-bioinf.org/TrBorderExt .
Analytical Design Package (ADP2): A computer aided engineering tool for aircraft transparency design
NASA Technical Reports Server (NTRS)
Wuerer, J. E.; Gran, M.; Held, T. W.
1994-01-01
The Analytical Design Package (ADP2) is being developed as a part of the Air Force Frameless Transparency Program (FTP). ADP2 is an integrated design tool consisting of existing analysis codes and Computer Aided Engineering (CAE) software. The objective of the ADP2 is to develop and confirm an integrated design methodology for frameless transparencies, related aircraft interfaces, and their corresponding tooling. The application of this methodology will generate high confidence for achieving a qualified part prior to mold fabrication. ADP2 is a customized integration of analysis codes, CAE software, and material databases. The primary CAE integration tool for the ADP2 is P3/PATRAN, a commercial-off-the-shelf (COTS) software tool. The open architecture of P3/PATRAN allows customized installations with different applications modules for specific site requirements. Integration of material databases allows the engineer to select a material, and those material properties are automatically called into the relevant analysis code. The ADP2 materials database will be composed of four independent schemas: CAE Design, Processing, Testing, and Logistics Support. The design of ADP2 places major emphasis on the seamless integration of CAE and analysis modules with a single intuitive graphical interface. This tool is being designed to serve and be used by an entire project team, i.e., analysts, designers, materials experts, and managers. The final version of the software will be delivered to the Air Force in Jan. 1994. The Analytical Design Package (ADP2) will then be ready for transfer to industry. The package will be capable of a wide range of design and manufacturing applications.
Bayesian Hierarchical Random Effects Models in Forensic Science.
Aitken, Colin G G
2018-01-01
Statistical modeling of the evaluation of evidence with the use of the likelihood ratio has a long history. It dates from the Dreyfus case at the end of the nineteenth century through the work at Bletchley Park in the Second World War to the present day. The development received a significant boost in 1977 with a seminal work by Dennis Lindley which introduced a Bayesian hierarchical random effects model for the evaluation of evidence with an example of refractive index measurements on fragments of glass. Many models have been developed since then. The methods have now been sufficiently well-developed and have become so widespread that it is timely to try and provide a software package to assist in their implementation. With that in mind, a project (SAILR: Software for the Analysis and Implementation of Likelihood Ratios) was funded by the European Network of Forensic Science Institutes through their Monopoly programme to develop a software package for use by forensic scientists world-wide that would assist in the statistical analysis and implementation of the approach based on likelihood ratios. It is the purpose of this document to provide a short review of a small part of this history. The review also provides a background, or landscape, for the development of some of the models within the SAILR package, and references to SAILR are made as appropriate.
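As a deliberately simplified numeric illustration of a likelihood ratio (not the SAILR package and not Lindley's full two-level random effects model), the Python snippet below compares the density of a measured glass refractive index under two competing normal models; all means, standard deviations, and the measurement are hypothetical.

    from scipy.stats import norm

    # Measured refractive index of a recovered glass fragment (hypothetical)
    x = 1.51840

    # Proposition 1: fragment comes from the broken window at the scene
    mean_source, sd_source = 1.51844, 0.00004
    # Proposition 2: fragment comes from the general glass population
    mean_pop, sd_pop = 1.51800, 0.00040

    lr = norm.pdf(x, mean_source, sd_source) / norm.pdf(x, mean_pop, sd_pop)
    print(f"likelihood ratio ~ {lr:.1f} in favour of the first proposition")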
ATACseqQC: a Bioconductor package for post-alignment quality assessment of ATAC-seq data.
Ou, Jianhong; Liu, Haibo; Yu, Jun; Kelliher, Michelle A; Castilla, Lucio H; Lawson, Nathan D; Zhu, Lihua Julie
2018-03-01
ATAC-seq (Assays for Transposase-Accessible Chromatin using sequencing) is a recently developed technique for genome-wide analysis of chromatin accessibility. Compared to earlier methods for assaying chromatin accessibility, ATAC-seq is faster and easier to perform, does not require cross-linking, has higher signal to noise ratio, and can be performed on small cell numbers. However, to ensure a successful ATAC-seq experiment, step-by-step quality assurance processes, including both wet lab quality control and in silico quality assessment, are essential. While several tools have been developed or adopted for assessing read quality, identifying nucleosome occupancy and accessible regions from ATAC-seq data, none of the tools provide a comprehensive set of functionalities for preprocessing and quality assessment of aligned ATAC-seq datasets. We have developed a Bioconductor package, ATACseqQC, for easily generating various diagnostic plots to help researchers quickly assess the quality of their ATAC-seq data. In addition, this package contains functions to preprocess aligned ATAC-seq data for subsequent peak calling. Here we demonstrate the utilities of our package using 25 publicly available ATAC-seq datasets from four studies. We also provide guidelines on what the diagnostic plots should look like for an ideal ATAC-seq dataset. This software package has been used successfully for preprocessing and assessing several in-house and public ATAC-seq datasets. Diagnostic plots generated by this package will facilitate the quality assessment of ATAC-seq data, and help researchers to evaluate their own ATAC-seq experiments as well as select high-quality ATAC-seq datasets from public repositories such as GEO to avoid generating hypotheses or drawing conclusions from low-quality ATAC-seq experiments. The software, source code, and documentation are freely available as a Bioconductor package at https://bioconductor.org/packages/release/bioc/html/ATACseqQC.html .
High-performance packaging for monolithic microwave and millimeter-wave integrated circuits
NASA Technical Reports Server (NTRS)
Shalkhauser, K. A.; Li, K.; Shih, Y. C.
1992-01-01
Packaging schemes were developed that provide low-loss, hermetic enclosure for advanced monolithic microwave and millimeter-wave integrated circuits (MMICs). The package designs are based on a fused quartz substrate material that offers improved radio frequency (RF) performance through 44 gigahertz (GHz). The small size and weight of the packages make them appropriate for a variety of applications, including phased array antenna systems. Packages were designed in two forms: one for housing a single MMIC chip, and the second in the form of a multi-chip phased array module. The single-chip package was developed in three separate sizes, for chips of different geometry and frequency requirements. The phased array module was developed to address packaging directly for antenna applications, and includes transmission line and interconnect structures to support multi-element operation. All packages are fabricated using fused quartz substrate materials. As part of the packaging effort, a test fixture was developed to interface the single-chip packages to conventional laboratory instrumentation for characterization of the packaged devices. The package and test fixture designs were both developed in a generic sense, optimizing performance for a wide range of possible applications and devices.
Software and package applicating for network meta-analysis: A usage-based comparative study.
Xu, Chang; Niu, Yuming; Wu, Junyi; Gu, Huiyun; Zhang, Chao
2017-12-21
To compare and analyze the characteristics and functions of software applications for network meta-analysis (NMA), PubMed, EMbase, The Cochrane Library, the official websites of Bayesian inference Using Gibbs Sampling (BUGS), Stata and R, and Google were searched to collect software and packages for performing NMA; software and packages published up to March 2016 were included. After collecting the software, packages, and their user guides, we used the software and packages to calculate a typical example. All characteristics, functions, and computed results were compared and analyzed. Ten types of software were included, covering both programming and non-programming software. They were developed mainly on the basis of Bayesian or frequentist theory. Most types of software have the characteristics of easy operation, easy mastery, exact calculation, or excellent graphing. However, no single software performed accurate calculations with superior graphing; this could only be achieved through the combination of two or more types of software. This study suggests that the user should choose the appropriate software according to personal programming background, operational habits, and financial ability, and then consider combining BUGS and R (or Stata) software to perform the NMA.
SimHap GUI: An intuitive graphical user interface for genetic association analysis
Carter, Kim W; McCaskie, Pamela A; Palmer, Lyle J
2008-01-01
Background Researchers wishing to conduct genetic association analysis involving single nucleotide polymorphisms (SNPs) or haplotypes are often confronted with the lack of user-friendly graphical analysis tools, requiring sophisticated statistical and informatics expertise to perform relatively straightforward tasks. Tools such as the SimHap package for the R statistics language provide the necessary statistical operations to conduct sophisticated genetic analysis, but lack a graphical user interface that allows anyone but a professional statistician to use the tool effectively. Results We have developed SimHap GUI, a cross-platform integrated graphical analysis tool for conducting epidemiological, single SNP and haplotype-based association analysis. SimHap GUI features a novel workflow interface that guides the user through each logical step of the analysis process, making it accessible to both novice and advanced users. This tool provides a seamless interface to the SimHap R package, while providing enhanced functionality such as sophisticated data checking, automated data conversion, and real-time estimation of haplotype simulation progress. Conclusion SimHap GUI provides a novel, easy-to-use, cross-platform solution for conducting a range of genetic and non-genetic association analyses. It provides a free alternative to commercial statistics packages that is specifically designed for genetic association analysis. PMID:19109877
Accelerator mass spectrometry analysis of aroma compound absorption in plastic packaging materials
NASA Astrophysics Data System (ADS)
Stenström, Kristina; Erlandsson, Bengt; Hellborg, Ragnar; Wiebert, Anders; Skog, Göran; Nielsen, Tim
1994-05-01
Absorption of aroma compounds in plastic packaging materials may affect the taste of the packaged food and may also change the quality of the packaging material. A method to determine aroma compound absorption in polymers by accelerator mass spectrometry (AMS) is being developed at the Lund Pelletron AMS facility. The high sensitivity of the AMS method makes it possible to study these phenomena under realistic conditions. As a first test, low-density polyethylene exposed to 14C-doped ethyl acetate is examined. After converting the polymer samples with the absorbed aroma compounds to graphite, the 14C/13C ratio of the samples is measured by the AMS system and the degree of aroma compound absorption is established. The results are compared with those obtained by supercritical fluid extraction coupled to gas chromatography (SFE-GC).
NASA Astrophysics Data System (ADS)
Wang, Yu; Liu, Qun
2013-01-01
Surplus-production models are widely used in fish stock assessment and fisheries management due to their simplicity and lower data demands than age-structured models such as Virtual Population Analysis. The CEDA (catch-effort data analysis) and ASPIC (a surplus-production model incorporating covariates) computer packages are data-fitting or parameter estimation tools that have been developed to analyze catch-and-effort data using non-equilibrium surplus production models. We applied CEDA and ASPIC to the hairtail (Trichiurus japonicus) fishery in the East China Sea. Both packages produced robust results and yielded similar estimates. In CEDA, the Schaefer surplus production model with log-normal error assumption produced results close to those of ASPIC. CEDA is sensitive to the choice of initial proportion, while ASPIC is not. However, CEDA produced higher R² values than ASPIC.
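The Schaefer surplus production dynamics that such packages fit can be written out as a short Python sketch. The catch series, starting biomass, and parameters below are invented for illustration and are not the East China Sea hairtail assessment or the CEDA/ASPIC estimation code.

    import numpy as np

    def schaefer_biomass(catches, b0, r, k):
        """Project biomass with the Schaefer surplus production model:
        B[t+1] = B[t] + r*B[t]*(1 - B[t]/K) - C[t]."""
        b = np.empty(len(catches) + 1)
        b[0] = b0
        for t, c in enumerate(catches):
            b[t + 1] = max(b[t] + r * b[t] * (1.0 - b[t] / k) - c, 1e-6)
        return b

    catches = np.array([80, 90, 110, 120, 100, 95], dtype=float)   # illustrative only
    biomass = schaefer_biomass(catches, b0=1000.0, r=0.35, k=1200.0)
    msy = 0.35 * 1200.0 / 4.0        # maximum sustainable yield, rK/4
    print(np.round(biomass, 1), "MSY ~", msy)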
Multiple-Group Analysis Using the sem Package in the R System
ERIC Educational Resources Information Center
Evermann, Joerg
2010-01-01
Multiple-group analysis in covariance-based structural equation modeling (SEM) is an important technique to ensure the invariance of latent construct measurements and the validity of theoretical models across different subpopulations. However, not all SEM software packages provide multiple-group analysis capabilities. The sem package for the R…
Sustainable Library Development Training Package
ERIC Educational Resources Information Center
Peace Corps, 2012
2012-01-01
This Sustainable Library Development Training Package supports Peace Corps' Focus In/Train Up strategy, which was implemented following the 2010 Comprehensive Agency Assessment. Sustainable Library Development is a technical training package in Peace Corps programming within the Education sector. The training package addresses the Volunteer…
Orbiter Flying Qualities (OFQ) Workstation user's guide
NASA Technical Reports Server (NTRS)
Myers, Thomas T.; Parseghian, Zareh; Hogue, Jeffrey R.
1988-01-01
This project was devoted to the development of a software package, called the Orbiter Flying Qualities (OFQ) Workstation, for working with the OFQ Archives which are specially selected sets of space shuttle entry flight data relevant to flight control and flying qualities. The basic approach to creation of the workstation software was to federate and extend commercial software products to create a low cost package that operates on personal computers. Provision was made to link the workstation to large computers, but the OFQ Archive files were also converted to personal computer diskettes and can be stored on workstation hard disk drives. The primary element of the workstation developed in the project is the Interactive Data Handler (IDH) which allows the user to select data subsets from the archives and pass them to specialized analysis programs. The IDH was developed as an application in a relational database management system product. The specialized analysis programs linked to the workstation include a spreadsheet program, FREDA for spectral analysis, MFP for frequency domain system identification, and NIPIP for pilot-vehicle system parameter identification. The workstation also includes capability for ensemble analysis over groups of missions.
DOE-EM-45 PACKAGING OPERATIONS AND MAINTENANCE COURSE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Watkins, R.; England, J.
2010-05-28
Savannah River National Laboratory - Savannah River Packaging Technology (SRNL-SRPT) delivered the inaugural offering of the Packaging Operations and Maintenance Course for DOE-EM-45's Packaging Certification Program (PCP) at the University of South Carolina Aiken on September 1 and 2, 2009. Twenty-nine students registered, attended, and completed this training. The DOE-EM-45 Packaging Certification Program (PCP) sponsored the presentation of a new training course, Packaging Maintenance and Operations, on September 1-2, 2009 at the University of South Carolina Aiken (USC-Aiken) campus in Aiken, SC. The premier offering of the course was developed and presented by the Savannah River National Laboratory, and attended by twenty-nine students across the DOE, NNSA and private industry. This training informed package users of the requirements associated with handling shipping containers at a facility (user) level and provided a basic overview of the requirements typically outlined in Safety Analysis Report for Packaging (SARP) Chapters 1, 7, and 8. The course taught packaging personnel about the regulatory nature of SARPs to help reduce associated and often costly packaging errors. Some of the topics covered were package contents, loading, unloading, storage, torque requirements, maintaining records, how to handle abnormal conditions, lessons learned, leakage testing (including demonstration), and replacement parts. The target audience for this course was facility operations personnel, facility maintenance personnel, and field quality assurance personnel who are directly involved in the handling of shipping containers. The training also aimed at writers of SARP Chapters 1, 7, and 8, package designers, and anyone else involved in radioactive material packaging and transportation safety. Student feedback and critiques of the training were very positive. SRNL will offer the course again at USC Aiken in September 2010.
Nørgaard, Birgitte; Mogensen, Christian Backer; Teglbjærg, Lars Stubbe; Brabrand, Mikkel; Lassen, Annmarie Touborg
2016-06-01
In the Region of Southern Denmark, the emergency departments categorise patients based on presenting symptoms and a proposed diagnostic package (n = 40) within each category. The diagnostic packages describe relevant clinical information and standard laboratory and other investigations to be performed. Allocation to the right diagnostic package is assumed to be associated with a higher quality. The aim of this study was to describe to which degree the assigned symptom-based diagnostic packages are related to relevant discharge diagnoses. This was a descriptive cohort study. The analysis was based on data on assigned diagnostic package, patient discharge diagnosis, hospital, gender, age, time of admission and discharge, length of stay, diagnostic package assigned, discharge diagnosis and co-morbidity. An acceptable standard for what would be an appropriate primarily diagnostic package was developed using a modified Delphi method. A total of 16,543 patient contacts were identified. Women constituted 52.2% (n = 8,925) of the patients. The median age was 64 years and the median length of stay was one day. All diagnostic packages were represented. A total of 68% of the included patients had been assigned an acceptable diagnostic package (95% confidence interval: 67.2-68.7). We found an appropriate use of one of 30 diagnostic packages in more than 50% of the cases. We found that 68% of the included patients were assigned an acceptable diagnostic package and that about 80% of all acute pathways were covered by 14 diagnostic packages. The study was funded by Region of Southern Denmark. The study was registered with the Danish Data Protection Agency (No. 2008-58-0035). No further approval was required.
Martínez-Moral, María Pilar; Tena, María Teresa
2012-11-15
The development and characterisation of a method based on reverse-phase ultra-performance liquid chromatography (UPLC) coupled to a quadrupole-time of flight mass spectrometer (Q-TOF-MS) with negative electrospray ionisation (ESI) to determine perfluorinated compounds (PFCs) in packaging is presented in this paper. Analytes were quantitatively recovered from packaging with methanol in only one PLE cycle of 6 min at 100 °C. The UPLC allowed the successful separation of the studied PFCs in less than 4 min. The whole method presented good precision, with RSDs below 8%, LODs from 0.6 to 16 ng g(-1); and excellent recovery values, around 100% in all cases, were achieved. The PLE-UPLC-MS method was applied to the analysis of popcorn packaging for microwave cooking. Besides the most commonly studied PFCs: PFOA and PFOS, the presence of other perfluorocarboxylic acids (PFCAs) in popcorn packaging is evidenced in this work.
spMC: an R-package for 3D lithological reconstructions based on spatial Markov chains
NASA Astrophysics Data System (ADS)
Sartore, Luca; Fabbri, Paolo; Gaetan, Carlo
2016-09-01
The paper presents the spatial Markov Chains (spMC) R package and a case study of subsoil simulation/prediction at a plain site in Northeastern Italy. spMC is a fairly complete collection of advanced methods for data inspection; in addition, spMC implements Markov chain models to estimate experimental transition probabilities of categorical lithological data. Furthermore, simulation methods based on well-known prediction methods (such as indicator kriging and cokriging) are implemented in the spMC package. Other, more advanced methods are also available for simulations, e.g. path methods and Bayesian procedures that exploit maximum entropy. Since the spMC package was developed for intensive geostatistical computations, part of the code is implemented for parallel computation via OpenMP constructs. A final analysis of computational efficiency compares the simulation/prediction algorithms using different numbers of CPU cores, considering the example data set of the case study included in the package.
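A one-dimensional simplification of the transition probability estimation mentioned above (unit-lag transitions along a single borehole rather than spMC's multidirectional spatial lags) can be sketched in Python as follows; the lithofacies log is invented for illustration.

    import numpy as np
    import pandas as pd

    # Hypothetical lithofacies logged at regular depth intervals down one borehole
    log = list("SSSCCCGSSCCSSSGGCCSS")        # S = sand, C = clay, G = gravel
    states = sorted(set(log))

    counts = pd.DataFrame(0, index=states, columns=states, dtype=float)
    for a, b in zip(log[:-1], log[1:]):
        counts.loc[a, b] += 1                 # count one-step (unit-lag) transitions

    trans_prob = counts.div(counts.sum(axis=1), axis=0)   # row-normalise to probabilities
    print(trans_prob.round(2))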
Teaching preschool children to report suspicious packages to adults.
May, Michael E; Shayter, Ashley M; Schmick, Ayla; Barron, Becky; Doherty, Meghan; Johnson, Matthew
2018-05-16
Law enforcement agencies stress that public reporting of terror-related crime is the predominant means for disrupting these actions. However, schools may be unprepared because the majority of the populace may not understand the threat of suspicious materials or what to do when they are found on school grounds. The purpose of this study was to systematically teach preschool children to identify and report suspicious packages across three experiments. In the first experiment, we used multiple exemplar training to teach children to identify the characteristics of safe and unsafe packages. In the second experiment, we taught participants to identify the locations where packages should be considered unsafe. Finally, in the third experiment, we used behavioral skills training to teach participants to avoid touching unsafe packages, leave the area where they were located, and report their discovery to an adult. Results suggest the participants quickly developed these skills. Implications for safety skills in young school children are discussed.
User's manual for the coupled rotor/airframe vibration analysis graphic package
NASA Technical Reports Server (NTRS)
Studwell, R. E.
1982-01-01
User instructions for a graphics package for coupled rotor/airframe vibration analysis are presented. Responses to plot package messages which the user must make to activate plot package operations and options are described. Installation instructions required to set up the program on the CDC system are included. The plot package overlay structure and subroutines which have to be modified for the CDC system are also described. Operating instructions for CDC applications are included.
Direct Method Transcription for a Human-Class Translunar Injection Trajectory Optimization
NASA Technical Reports Server (NTRS)
Witzberger, Kevin E.; Zeiler, Tom
2012-01-01
This paper presents a new trajectory optimization software package developed in the framework of a low-to-high fidelity 3 degrees-of-freedom (DOF)/6-DOF vehicle simulation program named Mission Analysis Simulation Tool in Fortran (MASTIF), and its application to a translunar trajectory optimization problem. The functionality of the developed optimization package is implemented as a new "mode" in a generalized setting to make it applicable to general trajectory optimization problems. In doing so, a direct optimization method using collocation is employed for solving the problem. Trajectory optimization problems in MASTIF are transcribed to a constrained nonlinear programming (NLP) problem and solved with SNOPT, a commercially available NLP solver. A detailed description of the optimization software developed is provided, as well as the transcription specifics for the translunar injection (TLI) problem. The analysis includes a 3-DOF TLI trajectory optimization and a 3-DOF vehicle TLI simulation using closed-loop guidance.
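The direct transcription idea described above (discretize states and controls, enforce the dynamics as collocation defect constraints, hand the result to an NLP solver) can be illustrated with a tiny Python sketch. It uses trapezoidal collocation on a double-integrator rest-to-rest problem and scipy's SLSQP instead of SNOPT; the problem, node count, and boundary values are illustrative assumptions, not the TLI transcription.

    import numpy as np
    from scipy.optimize import minimize

    N, T = 21, 1.0                        # collocation nodes and final time
    h = T / (N - 1)

    def unpack(z):
        return z[:N], z[N:2 * N], z[2 * N:]        # position, velocity, control

    def objective(z):
        _, _, u = unpack(z)
        return h * np.sum((u[:-1] ** 2 + u[1:] ** 2) / 2)   # trapezoidal integral of u^2

    def defects(z):
        x, v, u = unpack(z)
        dx = x[1:] - x[:-1] - h * (v[:-1] + v[1:]) / 2       # x' = v collocation defects
        dv = v[1:] - v[:-1] - h * (u[:-1] + u[1:]) / 2       # v' = u collocation defects
        bc = [x[0], v[0], x[-1] - 1.0, v[-1]]                # rest-to-rest boundary conditions
        return np.concatenate([dx, dv, bc])

    z0 = np.concatenate([np.linspace(0, 1, N), np.zeros(N), np.zeros(N)])
    sol = minimize(objective, z0, method="SLSQP",
                   constraints={"type": "eq", "fun": defects},
                   options={"maxiter": 500})
    x, v, u = unpack(sol.x)
    print("converged:", sol.success, " peak control ~", round(np.max(np.abs(u)), 2))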
Software technology testbed softpanel prototype
NASA Technical Reports Server (NTRS)
1991-01-01
The following subject areas are covered: analysis of using Ada for the development of real-time control systems for the Space Station; analysis of the functionality of the Application Generator; analysis of the User Support Environment criteria; analysis of the SSE tools and procedures which are to be used for the development of ground/flight software for the Space Station; analysis of the CBATS tutorial (an Ada tutorial package); analysis of Interleaf; analysis of the Integration, Test and Verification process of the Space Station; analysis of the DMS on-orbit flight architecture; analysis of the simulation architecture.
Green Packaging Management of Logistics Enterprises
NASA Astrophysics Data System (ADS)
Zhang, Guirong; Zhao, Zongjian
From the connotation of green logistics management, we discuss the principles of green packaging and, at the two levels of government and enterprises, put forward specific management strategies. Green packaging management can be promoted both directly and indirectly through laws, regulations, taxation, institutional and other measures. The government can also direct new investment toward the development of green packaging materials and establish specialized institutions to identify new packaging materials; the standardization of packaging must likewise be accomplished through the power of the government. Large-scale enterprises can reduce the use of packaging materials through standardized packaging and containerization, and can develop and use green, easily recyclable packaging materials for proper packaging.
Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS) User's Guide
NASA Technical Reports Server (NTRS)
Chapman, Jeffryes W.; Lavelle, Thomas M.; May, Ryan D.; Litt, Jonathan S.; Guo, Ten-Huei
2014-01-01
The Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS) software package is an open source MATLAB/Simulink toolbox (plug-in) that can be used by industry professionals and academics for the development of thermodynamic and controls simulations.
TRAPR: R Package for Statistical Analysis and Visualization of RNA-Seq Data.
Lim, Jae Hyun; Lee, Soo Youn; Kim, Ju Han
2017-03-01
High-throughput transcriptome sequencing, also known as RNA sequencing (RNA-Seq), is a standard technology for measuring gene expression with unprecedented accuracy. Numerous Bioconductor packages have been developed for the statistical analysis of RNA-Seq data. However, these tools focus on specific aspects of the data analysis pipeline and are difficult to integrate with one another because of their disparate data structures and processing methods. They also lack visualization methods to confirm the integrity of the data and the process. In this paper, we propose an R-based RNA-Seq analysis pipeline called TRAPR, an integrated tool that facilitates the statistical analysis and visualization of RNA-Seq expression data. TRAPR provides various functions for data management, filtering of low-quality data, normalization, transformation, statistical analysis, data visualization, and result visualization, allowing researchers to build customized analysis pipelines.
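For readers unfamiliar with the early steps such a pipeline performs, the following Python sketch shows a generic low-count filter, counts-per-million normalization, and log transformation. The thresholds and toy count matrix are hypothetical, and this is not TRAPR's R interface.

```python
import numpy as np

def cpm_log2(counts, min_count=10, pseudo=1.0):
    """Illustrative RNA-Seq preprocessing: drop low-count genes, normalize
    to counts-per-million per sample, then log2-transform.
    `counts` is a genes x samples array of raw read counts."""
    counts = np.asarray(counts, dtype=float)
    keep = counts.sum(axis=1) >= min_count          # filter low-quality genes
    filtered = counts[keep]
    cpm = filtered / filtered.sum(axis=0) * 1e6     # library-size normalization
    return np.log2(cpm + pseudo), keep

raw = np.array([[100, 150,  90, 120],    # gene A
                [  2,   1,   0,   3],    # gene B (filtered out)
                [500, 480, 700, 650]])   # gene C
log_cpm, kept = cpm_log2(raw)
print(kept, log_cpm.shape)
```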
iScreen: Image-Based High-Content RNAi Screening Analysis Tools.
Zhong, Rui; Dong, Xiaonan; Levine, Beth; Xie, Yang; Xiao, Guanghua
2015-09-01
High-throughput RNA interference (RNAi) screening has opened up a path to investigating functional genomics in a genome-wide pattern. However, such studies are often restricted to assays that have a single readout format. Recently, advanced image technologies have been coupled with high-throughput RNAi screening to develop high-content screening, in which one or more cell image(s), instead of a single readout, were generated from each well. This image-based high-content screening technology has led to genome-wide functional annotation in a wider spectrum of biological research studies, as well as in drug and target discovery, so that complex cellular phenotypes can be measured in a multiparametric format. Despite these advances, data analysis and visualization tools are still largely lacking for these types of experiments. Therefore, we developed iScreen (image-Based High-content RNAi Screening Analysis Tool), an R package for the statistical modeling and visualization of image-based high-content RNAi screening. Two case studies were used to demonstrate the capability and efficiency of the iScreen package. iScreen is available for download on CRAN (http://cran.cnr.berkeley.edu/web/packages/iScreen/index.html). The user manual is also available as a supplementary document. © 2014 Society for Laboratory Automation and Screening.
Efficient population-scale variant analysis and prioritization with VAPr.
Birmingham, Amanda; Mark, Adam M; Mazzaferro, Carlo; Xu, Guorong; Fisch, Kathleen M
2018-04-06
With the growing availability of population-scale whole-exome and whole-genome sequencing, demand for reproducible, scalable variant analysis has spread within genomic research communities. To address this need, we introduce the Python package VAPr (Variant Analysis and Prioritization). VAPr leverages existing annotation tools ANNOVAR and MyVariant.info with MongoDB-based flexible storage and filtering functionality. It offers biologists and bioinformatics generalists easy-to-use and scalable analysis and prioritization of genomic variants from large cohort studies. VAPr is developed in Python and is available for free use and extension under the MIT License. An install package is available on PyPi at https://pypi.python.org/pypi/VAPr, while source code and extensive documentation are on GitHub at https://github.com/ucsd-ccbb/VAPr. kfisch@ucsd.edu.
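To illustrate the style of MongoDB-backed filtering VAPr builds on, the sketch below issues a hypothetical pymongo query for rare, predicted-deleterious variants. The database, collection, and field names are assumptions for illustration, a local MongoDB instance is assumed to be running, and none of this reflects VAPr's actual schema or API.

```python
from pymongo import MongoClient

# Hypothetical database/collection/field names, for illustration only;
# this is not VAPr's API, just the kind of MongoDB filtering it builds on.
client = MongoClient("mongodb://localhost:27017/")
variants = client["cohort_db"]["annotated_variants"]

# Prioritize rare, predicted-deleterious variants across the cohort.
query = {
    "gnomad_af": {"$lt": 0.001},          # rare in population databases
    "cadd_phred": {"$gte": 20},           # high predicted deleteriousness
    "func_refgene": {"$in": ["exonic", "splicing"]},
}
for doc in variants.find(query).limit(10):
    print(doc.get("gene"), doc.get("chrom"), doc.get("pos"))
```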
An analysis of packaging formats for complex digital objects: review of principles
NASA Astrophysics Data System (ADS)
Bekaert, Jeroen L.; Hochstenbach, Patrick; De Kooning, Emiel; Van de Walle, Rik
2003-11-01
During recent years, the number of organizations making digital information available has increased massively. This evolution has encouraged the development of standards for packaging and encoding digital representations of complex objects (such as digital music albums or digitized books and photograph albums). The primary goal of this article is to offer a method to compare these packaging standards and best practices tailored to the needs of the digital library community and the rising digital preservation programs. The contribution of this paper is the definition of an integrated reference model, based on both the OAIS framework and some additional significant properties that affect the quality, usability, encoding and behavior of the digital objects.
clusterProfiler: an R package for comparing biological themes among gene clusters.
Yu, Guangchuang; Wang, Li-Gen; Han, Yanyan; He, Qing-Yu
2012-05-01
Increasing quantitative data generated from transcriptomics and proteomics require integrative strategies for analysis. Here, we present an R package, clusterProfiler, that automates the process of biological-term classification and the enrichment analysis of gene clusters. The analysis module and visualization module were combined into a reusable workflow. Currently, clusterProfiler supports three species: humans, mice, and yeast. Methods provided in this package can be easily extended to other species and ontologies. The clusterProfiler package is released under the Artistic-2.0 License within the Bioconductor project. The source code and vignette are freely available at http://bioconductor.org/packages/release/bioc/html/clusterProfiler.html.
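The core of such an over-representation analysis is a hypergeometric test. The following Python sketch (using SciPy rather than clusterProfiler's R interface, with made-up counts) shows the calculation for a single gene set.

```python
from scipy.stats import hypergeom

def enrichment_pvalue(n_universe, n_category, n_selected, n_overlap):
    """Illustrative over-representation test (the kind of calculation behind
    gene-set enrichment): probability of seeing at least `n_overlap` genes
    from a functional category in a selected gene cluster by chance."""
    # Survival function at n_overlap - 1 gives P(X >= n_overlap).
    return hypergeom.sf(n_overlap - 1, n_universe, n_category, n_selected)

# 20,000 annotated genes, 150 in the pathway, a 300-gene cluster, 12 overlapping.
print(enrichment_pvalue(20000, 150, 300, 12))
```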
Poppr: an R package for genetic analysis of populations with mixed (clonal/sexual) reproduction
USDA-ARS?s Scientific Manuscript database
Poppr is an R package for analysis of population genetic data. It extends the adegenet package and provides several novel tools, particularly with regard to analysis of data from admixed, clonal, and/or sexual populations. Currently, poppr can be used for dominant/codominant and haploid/diploid gene...
Reliability Technology to Achieve Insertion of Advanced Packaging (RELTECH) program
NASA Astrophysics Data System (ADS)
Fayette, Daniel F.; Speicher, Patricia; Stoklosa, Mark J.; Evans, Jillian V.; Evans, John W.; Gentile, Mike; Pagel, Chuck A.; Hakim, Edward
1993-08-01
A joint military-commercial effort to evaluate multichip module (MCM) structures is discussed. The program, Reliability Technology to Achieve Insertion of Advanced Packaging (RELTECH), has been designed to identify the failure mechanisms that are possible in MCM structures. The RELTECH test vehicles, technical assessment task, product evaluation plan, reliability modeling task, accelerated and environmental testing, and post-test physical analysis and failure analysis are described. The information obtained through RELTECH can be used to address standardization issues, through development of cost effective qualification and appropriate screening criteria, for inclusion into a commercial specification and the MIL-H-38534 general specification for hybrid microcircuits.
Use of the MATRIXx Integrated Toolkit on the Microwave Anisotropy Probe Attitude Control System
NASA Technical Reports Server (NTRS)
Ward, David K.; Andrews, Stephen F.; McComas, David C.; ODonnell, James R., Jr.
1999-01-01
Recent advances in analytical software tools allow the analysis, simulation, flight code, and documentation of an algorithm to be generated from a single source, all within one integrated analytical design package. NASA's Microwave Anisotropy Probe project has used one such package, Integrated Systems' MATRIXx suite, in the design of the spacecraft's Attitude Control System. The project's experience with the linear analysis, simulation, code generation, and documentation tools will be presented and compared with more traditional development tools. In particular, the quality of the flight software generated will be examined in detail. Finally, lessons learned on each of the tools will be shared.
Reliability Technology to Achieve Insertion of Advanced Packaging (RELTECH) program
NASA Technical Reports Server (NTRS)
Fayette, Daniel F.; Speicher, Patricia; Stoklosa, Mark J.; Evans, Jillian V.; Evans, John W.; Gentile, Mike; Pagel, Chuck A.; Hakim, Edward
1993-01-01
A joint military-commercial effort to evaluate multichip module (MCM) structures is discussed. The program, Reliability Technology to Achieve Insertion of Advanced Packaging (RELTECH), has been designed to identify the failure mechanisms that are possible in MCM structures. The RELTECH test vehicles, technical assessment task, product evaluation plan, reliability modeling task, accelerated and environmental testing, and post-test physical analysis and failure analysis are described. The information obtained through RELTECH can be used to address standardization issues, through development of cost effective qualification and appropriate screening criteria, for inclusion into a commercial specification and the MIL-H-38534 general specification for hybrid microcircuits.
An R package for state-trace analysis.
Prince, Melissa; Hawkins, Guy; Love, Jonathon; Heathcote, Andrew
2012-09-01
State-trace analysis (Bamber, Journal of Mathematical Psychology, 19, 137-181, 1979) is a graphical analysis that can determine whether one or more than one latent variable mediates an apparent dissociation between the effects of two experimental manipulations. State-trace analysis makes only ordinal assumptions and so is not confounded by the range effects that plague alternative methods, especially when performance is measured on a bounded scale (such as accuracy). We describe and illustrate the application of a freely available, GUI-driven package, StateTrace, for the R language. StateTrace automates many aspects of a state-trace analysis of accuracy and other binary response data, including customizable graphics and the efficient management of computationally intensive Bayesian methods for quantifying evidence about the outcomes of a state-trace experiment, developed by Prince, Brown, and Heathcote (Psychological Methods, 17, 78-99, 2012).
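The following Python sketch (with hypothetical condition means; not the StateTrace API) illustrates the basic idea of a state-trace plot: condition means of one dependent measure are plotted against the other, and a single monotonic trace is consistent with a single mediating latent variable.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical condition means for two dependent measures.
conditions = ["A1", "A2", "B1", "B2"]
accuracy_task1 = np.array([0.62, 0.71, 0.78, 0.85])
accuracy_task2 = np.array([0.55, 0.66, 0.70, 0.81])

# If one latent variable mediates both measures, the points fall on a
# single monotonic curve; non-monotonicity suggests more than one.
order = np.argsort(accuracy_task1)
monotone = np.all(np.diff(accuracy_task2[order]) >= 0)
print("consistent with a single latent variable:", monotone)

plt.plot(accuracy_task1, accuracy_task2, "o")
for name, x, y in zip(conditions, accuracy_task1, accuracy_task2):
    plt.annotate(name, (x, y))
plt.xlabel("Task 1 accuracy")
plt.ylabel("Task 2 accuracy")
plt.title("State-trace plot (illustrative)")
plt.show()
```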
Long-term Preservation of Data Analysis Capabilities
NASA Astrophysics Data System (ADS)
Gabriel, C.; Arviset, C.; Ibarra, A.; Pollock, A.
2015-09-01
While the long-term preservation of scientific data obtained by large astrophysics missions is ensured through science archives, the issue of data analysis software preservation has hardly been addressed. Efforts by large data centres have contributed so far to maintain some instrument or mission-specific data reduction packages on top of high-level general purpose data analysis software. However, it is always difficult to keep software alive without support and maintenance once the active phase of a mission is over. This is especially difficult in the budgetary model followed by space agencies. We discuss the importance of extending the lifetime of dedicated data analysis packages and review diverse strategies under development at ESA using new paradigms such as Virtual Machines, Cloud Computing, and Software as a Service for making possible full availability of data analysis and calibration software for decades at minimal cost.
MC-GenomeKey: a multicloud system for the detection and annotation of genomic variants.
Elshazly, Hatem; Souilmi, Yassine; Tonellato, Peter J; Wall, Dennis P; Abouelhoda, Mohamed
2017-01-20
Next-generation genome sequencing techniques have become affordable for massive sequencing efforts devoted to the clinical characterization of human diseases. However, the cost of providing cloud-based data analysis of the mounting datasets remains a concerning bottleneck for providing cost-effective clinical services. To address this computational problem, it is important to optimize the variant analysis workflow and the analysis tools used, so as to reduce the overall computational processing time and, concomitantly, the processing cost. It is also important to capitalize on recent developments in the cloud computing market, which has seen more providers competing in terms of products and prices. In this paper, we present a new package called MC-GenomeKey (Multi-Cloud GenomeKey) that efficiently executes the variant analysis workflow for detecting and annotating mutations using cloud resources from different commercial cloud providers. Our package supports the Amazon, Google, and Azure clouds, as well as any other cloud platform based on OpenStack. It allows different scenarios of execution with different levels of sophistication, up to one where a workflow is executed using a cluster whose nodes come from different clouds. MC-GenomeKey also supports scenarios that exploit the spot instance model of Amazon in combination with other cloud platforms to provide significant cost reduction. To the best of our knowledge, this is the first solution that optimizes the execution of the workflow using computational resources from different cloud providers. MC-GenomeKey provides an efficient multicloud-based solution to detect and annotate mutations. The package can run on different commercial cloud platforms, which enables the user to seize the best offers, and it provides a reliable means to make use of the low-cost spot instance model of Amazon, offering an efficient solution to the sudden termination of spot machines following a sudden price increase. The package has a web interface and is available free for academic use.
Clark, Robin A; Shoaib, Mohammed; Hewitt, Katherine N; Stanford, S Clare; Bate, Simon T
2012-08-01
InVivoStat is a free-to-use statistical software package for analysis of data generated from animal experiments. The package is designed specifically for researchers in the behavioural sciences, where exploiting the experimental design is crucial for reliable statistical analyses. This paper compares the analysis of three experiments conducted using InVivoStat with other widely used statistical packages: SPSS (V19), PRISM (V5), UniStat (V5.6) and Statistica (V9). We show that InVivoStat provides results that are similar to those from the other packages and, in some cases, are more advanced. This investigation provides evidence of further validation of InVivoStat and should strengthen users' confidence in this new software package.
mcaGUI: microbial community analysis R-Graphical User Interface (GUI).
Copeland, Wade K; Krishnan, Vandhana; Beck, Daniel; Settles, Matt; Foster, James A; Cho, Kyu-Chul; Day, Mitch; Hickey, Roxana; Schütte, Ursel M E; Zhou, Xia; Williams, Christopher J; Forney, Larry J; Abdo, Zaid
2012-08-15
Microbial communities have an important role in natural ecosystems and have an impact on animal and human health. Intuitive graphic and analytical tools that can facilitate the study of these communities are in short supply. This article introduces Microbial Community Analysis GUI, a graphical user interface (GUI) for the R programming language (R Development Core Team, 2010). With this application, researchers can input aligned and clustered sequence data to create custom abundance tables and perform analyses specific to their needs. This GUI provides a flexible modular platform, expandable to include other statistical tools for microbial community analysis in the future. The mcaGUI package and source are freely available as part of Bioconductor at http://www.bioconductor.org/packages/release/bioc/html/mcaGUI.html
Using R to implement spatial analysis in open source environment
NASA Astrophysics Data System (ADS)
Shao, Yixi; Chen, Dong; Zhao, Bo
2007-06-01
R is an open source (GPL) language and environment for spatial analysis, statistical computing and graphics that provides a wide variety of statistical and graphical techniques and is highly extensible; it plays an important role in spatial analysis within the open source environment. Implementing spatial analysis in the open source environment, which we call open source geocomputation, means using the R data analysis language integrated with GRASS GIS and MySQL or PostgreSQL. This paper explains the architecture of the open source GIS environment and emphasizes the role R plays in spatial analysis. Furthermore, an illustration of the functions of R is given through the project of constructing CZPGIS (Cheng Zhou Population GIS), supported by the Changzhou Government, China. In this project we use R to implement geostatistics in the open source GIS environment to evaluate the spatial correlation of land price and estimate it by kriging interpolation. We also use R integrated with MapServer and PHP to show how R and other open source software cooperate in a WebGIS environment, which demonstrates the advantages of using R for spatial analysis in an open source GIS environment. Finally, we point out that the packages for spatial analysis in R are still scattered and that limited memory remains a bottleneck when a large number of clients connect at the same time. Further work is therefore to organize the existing packages or design normative packages, to make R cooperate better with commercial software such as ArcIMS, and to develop packages for land price evaluation.
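As a minimal sketch of the kriging step mentioned above, the following Python code implements ordinary kriging at a single prediction location with an exponential variogram. The variogram parameters and land-price values are hypothetical; a real analysis would fit the variogram to the data rather than assume it.

```python
import numpy as np

def exp_variogram(h, nugget=0.0, sill=1.0, rng=500.0):
    """Exponential semivariogram model (practical range `rng`)."""
    return nugget + (sill - nugget) * (1.0 - np.exp(-3.0 * h / rng))

def ordinary_kriging(xy, z, xy0, **vario):
    """Minimal ordinary kriging predictor at one location xy0."""
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    gamma = exp_variogram(d, **vario)
    # Ordinary kriging system in semivariogram form with a Lagrange multiplier.
    A = np.ones((n + 1, n + 1)); A[:n, :n] = gamma; A[n, n] = 0.0
    b = np.ones(n + 1); b[:n] = exp_variogram(np.linalg.norm(xy - xy0, axis=1), **vario)
    w = np.linalg.solve(A, b)
    return float(w[:n] @ z)

# Hypothetical land-price samples (coordinates in metres, price per m^2).
pts = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [80.0, 90.0]])
price = np.array([1200.0, 1100.0, 1350.0, 1280.0])
print(ordinary_kriging(pts, price, np.array([50.0, 50.0]), sill=2e4, rng=300.0))
```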
New Mexico Play Fairway Analysis: Particle Tracking ArcGIS Map Packages
Jeff Pepin
2015-11-15
These are map packages used to visualize geochemical particle-tracking analysis results in ArcGIS. It includes individual map packages for several regions of New Mexico including: Acoma, Rincon, Gila, Las Cruces, Socorro and Truth or Consequences.
Student Development of Educational Software: Spin-Offs from Classroom Use of DIAS.
ERIC Educational Resources Information Center
Harrington, John A., Jr.; And Others
1988-01-01
Describes several college courses which encourage students to develop computer software programs in the areas of remote sensing and geographic information systems. A microcomputer-based tutorial package, the Digital Image Analysis System (DIAS), teaches the principles of digital processing. (LS)
Improvement of calculation method for electrical parameters of short network of ore-thermal furnaces
NASA Astrophysics Data System (ADS)
Aliferov, A. I.; Bikeev, R. A.; Goreva, L. P.
2017-10-01
The paper describes a new calculation method for the active and inductive resistances of split interleaved current lead packages in ore-thermal electric furnaces. The method is developed on the basis of a regression analysis of the dependence of the active and inductive resistances of the packages on their geometrical parameters, mutual arrangement and interleaving pattern. These multi-parametric calculations have been performed with ANSYS software. The proposed method allows the electrical parameters of split current leads in ore-thermal furnaces to be minimized and balanced.
PresenceAbsence: An R package for presence absence analysis
Elizabeth A. Freeman; Gretchen Moisen
2008-01-01
The PresenceAbsence package for R provides a set of functions useful when evaluating the results of presence-absence analysis, for example, models of species distribution or the analysis of diagnostic tests. The package provides a toolkit for selecting the optimal threshold for translating a probability surface into presence-absence maps specifically tailored to their...
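A minimal Python sketch of the threshold-selection idea (not the PresenceAbsence R interface) follows: it scans candidate cutoffs and keeps the one that maximizes sensitivity plus specificity (Youden's J). The observations and probabilities are toy values.

```python
import numpy as np

def best_threshold(observed, predicted_prob, thresholds=None):
    """Illustrative threshold selection for presence-absence maps: pick the
    cutoff that maximizes sensitivity + specificity (Youden's J)."""
    if thresholds is None:
        thresholds = np.linspace(0.01, 0.99, 99)
    observed = np.asarray(observed, dtype=bool)
    best_t, best_j = None, -np.inf
    for t in thresholds:
        pred = np.asarray(predicted_prob) >= t
        sens = np.mean(pred[observed]) if observed.any() else 0.0
        spec = np.mean(~pred[~observed]) if (~observed).any() else 0.0
        j = sens + spec - 1.0
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j

obs = np.array([1, 1, 0, 0, 1, 0, 0, 1])
prob = np.array([0.9, 0.7, 0.4, 0.2, 0.6, 0.3, 0.5, 0.8])
print(best_threshold(obs, prob))
```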
Hyperspectral imaging for differentiation of foreign materials from pinto beans
NASA Astrophysics Data System (ADS)
Mehrubeoglu, Mehrube; Zemlan, Michael; Henry, Sam
2015-09-01
Food safety and quality in packaged products are paramount in the food processing industry. To ensure that packaged products are free of foreign materials, such as debris and pests, unwanted materials mixed with the targeted products must be detected before packaging. A portable hyperspectral imaging system in the visible-to-NIR range has been used to acquire hyperspectral data cubes from pinto beans that have been mixed with foreign matter. Bands and band ratios have been identified as effective features to develop a classification scheme for detection of foreign materials in pinto beans. A support vector machine has been implemented with a quadratic kernel to separate pinto beans and background (Class 1) from all other materials (Class 2) in each scene. After creating a binary classification map for the scene, further analysis of these binary images allows separation of false positives from true positives for proper removal action during packaging.
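A compact Python sketch of this classification scheme is given below, using scikit-learn's SVC with a degree-2 polynomial (quadratic) kernel on band-ratio features. The band indices, random spectra, and labels are placeholders, since the paper's actual bands and training data are not reproduced here.

```python
import numpy as np
from sklearn.svm import SVC

# Illustrative two-class pixel classifier in the spirit of the paper:
# band ratios as features, SVM with a quadratic (degree-2 polynomial) kernel.
rng = np.random.default_rng(0)
n_pixels, n_bands = 200, 50
cube_pixels = rng.random((n_pixels, n_bands))          # stand-in spectra
labels = rng.integers(0, 2, n_pixels)                  # 1 = bean/background

def band_ratio_features(spectra, pairs=((10, 30), (5, 45))):
    """Build band-ratio features from hypothetical band index pairs."""
    eps = 1e-9
    return np.column_stack([spectra[:, a] / (spectra[:, b] + eps) for a, b in pairs])

X = band_ratio_features(cube_pixels)
clf = SVC(kernel="poly", degree=2, C=1.0).fit(X, labels)
binary_map = clf.predict(X)        # per-pixel class map for the scene
print("training accuracy:", np.mean(binary_map == labels))
```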
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bronowski, D.R.; Madsen, M.M.
The Heat Source/Radioisotopic Thermoelectric Generator shipping container is a Type B packaging design currently under development by Los Alamos National Laboratory. Type B packaging for transporting radioactive material is required to maintain containment and shielding after being exposed to the normal and hypothetical accident environments defined in Title 10 Code of Federal Regulations Part 71. A combination of testing and analysis is used to verify the adequacy of this package design. This report documents the test program portion of the design verification, using several prototype packages. Four types of testing were performed: 30-foot hypothetical accident condition drop tests in three orientations, 40-inch hypothetical accident condition puncture tests in five orientations, a 21 psi external overpressure test, and a normal conditions of transport test consisting of a water spray and a 4 foot drop test. 18 refs., 104 figs., 13 tabs.
Compact DFB laser modules with integrated isolator at 935 nm
NASA Astrophysics Data System (ADS)
Reggentin, M.; Thiem, H.; Tsianos, G.; Malach, M.; Hofmann, J.; Plocke, T.; Kneier, M.; Richter, L.
2018-02-01
New developments in industrial applications and applications under rough environmental conditions within the field of spectroscopy and quantum technology in the 935 nm wavelength regime demand new compact, stable and robust laser systems. Besides a stable laser source, the integration of a compact optical isolator is necessary to reduce the size and power consumption of the whole laser system. The integration of a suitable optical isolator efficiently suppresses back reflections from the following optical system. However, the miniaturization of the optics inside the package leads to high optical power density levels that make a more detailed analysis of the components and their laser damage thresholds necessary. We present test results on compact, stable DFB laser sources (butterfly-style packages) with newly integrated optical isolators operating around 935 nm. The presented data include performance and lifetime tests for the laser diodes as well as the package components. Overall performance data of the packaged laser diodes are shown as well.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kung, F.; Deru, M.; Bonnema, E.
2013-10-01
Few third-party guidance documents or tools are available for evaluating thermal energy storage (TES) integrated with packaged air conditioning (AC), as this type of TES is relatively new compared to TES integrated with chillers or hot water systems. To address this gap, researchers at the National Renewable Energy Laboratory conducted a project to improve the ability of potential technology adopters to evaluate TES technologies. Major project outcomes included: development of an evaluation framework to describe key metrics, methodologies, and issues to consider when assessing the performance of TES systems integrated with packaged AC; application of multiple concepts from the evaluation framework to analyze performance data from four demonstration sites; and production of a new simulation capability that enables modeling of TES integrated with packaged AC in EnergyPlus. This report includes the evaluation framework and analysis results from the project.
Small Cold Temperature Instrument Packages
NASA Astrophysics Data System (ADS)
Clark, P. E.; Millar, P. S.; Yeh, P. S.; Feng, S.; Brigham, D.; Beaman, B.
We are developing a small cold-temperature instrument package concept that integrates a cold-temperature power system with ultra-low-temperature, ultra-low-power electronics components and power supplies now under development into a 'cold temperature surface operational' version of a planetary surface instrument package. We are already in the process of developing a lower-power, lower-temperature version of an instrument of mutual interest to SMD and ESMD to support the search for volatiles (the mass spectrometer VAPoR, Volatile Analysis by Pyrolysis of Regolith), both as a stand-alone instrument and as part of an environmental monitoring package. We build on our previous work to develop strategies for incorporating Ultra Low Temperature/Ultra Low Power (ULT/ULP) electronics and lower-voltage power supplies, as well as innovative thermal design concepts for instrument packages. Cryotesting has indicated that our small Si RHBD CMOS chips can deliver >80% of room-temperature performance at 40 K (the nominal minimum lunar surface temperature). We leverage collaborations, past and current, with the JPL battery development program to increase power system efficiency in extreme environments. We harness advances in MOSFET technology that provide lower voltage thresholds for the power switching circuits incorporated into our low-voltage power supply concept. Conventional power conversion has a lower efficiency; our low-power circuit concept based on 'synchronous rectification' could produce stable voltages as low as 0.6 V with 85% efficiency. Our distributed micro-battery-based power supply concept incorporates cold-temperature power supplies operating with a 4 V or 8 V battery. This work will allow us to provide guidelines for applying the low-temperature, low-power system approaches generically to the widest range of surface instruments.
Bailén, Gloria; Guillén, Fabián; Castillo, Salvador; Serrano, María; Valero, Daniel; Martínez-Romero, Domingo
2006-03-22
Ethylene triggers the ripening process of tomato, affecting storage durability and shelf life (loss of quality) and inducing fruit decay. In this paper, an active packaging has been developed based on the combination of modified atmosphere packaging (MAP) and the addition of granular activated carbon (GAC), either alone or impregnated with palladium as a catalyst (GAC-Pd). The steady-state atmosphere was 4 kPa O2 and 10 kPa CO2 in control packages and 8 kPa O2 and 7 kPa CO2 in treated ones. The addition of GAC-Pd led to the lowest ethylene accumulation inside packages, while the highest was obtained in controls. The parameters related to ripening showed that treated tomatoes exhibited a reduction in color evolution, softening, and weight loss, especially for the GAC-Pd treatment. Moreover, these treatments were also effective in delaying tomato decay. In the sensory panel, tomatoes treated with GAC-Pd received the highest scores in terms of sweetness, firmness, juiciness, color, odor, and flavor. Results from the GC-MS analysis of the MAP headspace showed that 23 volatile compounds were identified in control packages, and these volatiles were significantly reduced in MAP-treated packages, which correlated with the odor intensity detected by panelists after bag opening.
MODEL 9977 B(M)F-96 SAFETY ANALYSIS REPORT FOR PACKAGING
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abramczyk, G; Paul Blanton, P; Kurt Eberl, K
2006-05-18
This Safety Analysis Report for Packaging (SARP) documents the analysis and testing performed on and for the 9977 Shipping Package, referred to as the General Purpose Fissile Package (GPFP). The performance evaluation presented in this SARP documents the compliance of the 9977 package with the regulatory safety requirements for Type B packages. Per 10 CFR 71.59, for the 9977 packages evaluated in this SARP, the value of "N" is 50, and the Transport Index based on nuclear criticality control is 1.0. The 9977 package is designed with a high degree of single containment. The 9977 complies with 10 CFR 71 (2002), Department of Energy (DOE) Order 460.1B, DOE Order 460.2, and 10 CFR 20 (2003) for As Low As Reasonably Achievable (ALARA) principles. The 9977 also satisfies the requirements of the Regulations for the Safe Transport of Radioactive Material--1996 Edition (Revised)--Requirements, IAEA Safety Standards, Safety Series No. TS-R-1 (ST-1, Rev.), International Atomic Energy Agency, Vienna, Austria (2000). The 9977 package is designed, analyzed and fabricated in accordance with Section III of the American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel (B&PV) Code, 1992 edition.
Ceramic ball grid array package stress analysis
NASA Astrophysics Data System (ADS)
Badri, S. H. B. S.; Aziz, M. H. A.; Ong, N. R.; Sauli, Z.; Alcain, J. B.; Retnasamy, V.
2017-09-01
The ball grid array (BGA), a form of chip scale package (CSP), was developed as one of the most advanced surface mount devices; it can be assembled by ordinary surface mount processes, with solder ball bumps used instead of plated nickel and gold (Ni/Au) bumps. Assembly and reliability of the BGA's printed circuit board (PCB), which is soldered by conventional surface mount technology, are considered in this study. The ceramic ball grid array (CBGA) is a rectangular or square ceramic package that uses solder balls for external electrical connections instead of leads or wires. The solder balls are arranged in an array or grid at the bottom of the ceramic package body. In this study, ANSYS software is used to investigate the stress on 2-ball and 4-ball CBGA packages with forces in the range of 1-3 Newtons applied to the top of the die, the top of the substrate and the side of the substrate. The highest maximum stress was analyzed, and the maximum equivalent stress was observed on the solder ball and the die. The simulation results show that a CBGA package with fewer solder balls experiences higher stress than a package with many solder balls. Therefore, a smaller number of solder balls on a CBGA package results in higher stress and critically affects the reliability of the solder balls, substrate and die, which can lead to solder cracks and also die cracks.
Evaluation of RDBMS packages for use in astronomy
NASA Technical Reports Server (NTRS)
Page, C. G.; Davenhall, A. C.
1992-01-01
Tabular data sets arise in many areas of astronomical data analysis, from raw data (such as photon event lists) to final results (such as source catalogs). The Starlink catalog access and reporting package, SCAR, was originally developed to handle IRAS data and it has been the principal relational DBMS in the Starlink software collection for several years. But SCAR has many limitations and is VMS-specific, while Starlink is in transition from VMS to Unix. Rather than attempt a major re-write of SCAR for Unix, it seemed more sensible to see whether any existing database packages are suitable for general astronomical use. The authors first drew up a list of desirable properties for such a system and then used these criteria to evaluate a number of packages, both free ones and those commercially available. It is already clear that most commercial DBMS packages are not very well suited to the requirements; for example, most cannot carry out efficiently even fairly basic operations such as joining two catalogs on an approximate match of celestial positions. This paper reports the results of the evaluation exercise and notes the problems in using a standard DBMS package to process scientific data. In parallel with this the authors have started to develop a simple database engine that can handle tabular data in a range of common formats including simple direct-access files (such as SCAR and Exosat DBMS tables) and FITS tables (both ASCII and binary).
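The positional-join operation that the authors note is poorly supported by standard DBMS packages can be sketched directly. The following Python code performs a brute-force approximate match of two small catalogs within an angular tolerance using the haversine separation; the coordinates and tolerance are illustrative.

```python
import numpy as np

def cross_match(ra1, dec1, ra2, dec2, tol_arcsec=2.0):
    """Illustrative approximate positional join of two catalogs: for every
    source in catalog 1, find catalog-2 sources within an angular tolerance.
    A brute-force O(N*M) sketch; real tools use spatial indexing."""
    ra1, dec1 = np.radians(ra1), np.radians(dec1)
    ra2, dec2 = np.radians(ra2), np.radians(dec2)
    tol = np.radians(tol_arcsec / 3600.0)
    matches = []
    for i in range(len(ra1)):
        # Angular separation via the haversine formula.
        dra = ra2 - ra1[i]
        ddec = dec2 - dec1[i]
        a = np.sin(ddec / 2)**2 + np.cos(dec1[i]) * np.cos(dec2) * np.sin(dra / 2)**2
        sep = 2 * np.arcsin(np.sqrt(a))
        for j in np.flatnonzero(sep <= tol):
            matches.append((i, int(j), np.degrees(sep[j]) * 3600.0))
    return matches

cat1 = ([10.684, 150.1], [41.269, 2.2])           # RA, Dec in degrees
cat2 = ([10.6845, 200.0], [41.2693, -30.0])
print(cross_match(cat1[0], cat1[1], cat2[0], cat2[1]))
```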
Development of pre-deployment primary healthcare training for Combat Medical Technicians.
Parsons, Iain T; Rawden, M P; Wheatley, R J
2014-09-01
To develop and run a primary healthcare (PHC) refresher package to address the range of clinical presentations to Combat Medical Technicians (CMTs) on deployment and improve their confidence and capability in providing PHC for Op HERRICK 18, with particular regard to the first month of deployment. A regimental-level, two-and-a-half day refresher package was developed following analysis of the PHC conditions most likely to be seen on Op HERRICK 18. It consisted of lectures and skill stations with written and case-based assessment phases to demonstrate effective and safe use of CMT clinical protocols on simulated patients. Internal feedback assessed the CMTs' subjective understanding of each individual section. A qualitative questionnaire was used to retrospectively evaluate the package after 1 month of deployment. Immediate feedback showed that the refresher training was well received. Following the first month of deployment, CMTs who had attended the PHC refresher package felt more confident in managing PHC patients and felt they had received training for the majority of PHC conditions witnessed during their deployment, in comparison with CMTs who had not. By delivering a training package acceptable to the majority of medics, we have increased the confidence and capability of CMTs in delivering PHC within the context of their protocols and prepared them for their first month of deployment. This suggests that PHC delivery can be improved by such a package and that consideration should be given to formalising this into a military training qualification. Published by the BMJ Publishing Group Limited.
Lee, Keun Taik
2010-09-01
This article explores the effects of physically manipulated packaging materials on the quality and safety of meat products. Recently, innovative measures for improving quality and extending the shelf-life of packaged meat products have been developed, utilizing technologies including barrier film, active packaging, nanotechnology, microperforation, irradiation, plasma and far-infrared ray (FIR) treatments. Despite these developments, each technology has peculiar drawbacks which will need to be addressed by meat scientists in the future. To develop successful meat packaging systems, key product characteristics affecting stability, environmental conditions during storage until consumption, and consumers' packaging expectations must all be taken into consideration. Furthermore, the safety issues related to packaging materials must also be taken into account when processing, packaging and storing meat products.
To Duc, Khanh
2017-11-18
Receiver operating characteristic (ROC) surface analysis is usually employed to assess the accuracy of a medical diagnostic test when there are three ordered disease statuses (e.g. non-diseased, intermediate, diseased). In practice, verification bias can occur due to missingness of the true disease status and can lead to a distorted conclusion on diagnostic accuracy. In such situations, bias-corrected inference tools are required. This paper introduces an R package, named bcROCsurface, which provides utility functions for verification bias-corrected ROC surface analysis. A Shiny web application for verification bias-corrected estimation of the ROC surface has also been developed. bcROCsurface may become an important tool for the statistical evaluation of three-class diagnostic markers in the presence of verification bias. The R package, readme and example data are available on CRAN. The web interface enables users less familiar with R to evaluate the accuracy of diagnostic tests, and can be found at http://khanhtoduc.shinyapps.io/bcROCsurface_shiny/ .
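For readers new to ROC surfaces, the sketch below computes an empirical volume under the ROC surface (VUS) for a single marker and three ordered classes in Python; it ignores ties and verification bias, uses toy data, and is not the bcROCsurface interface.

```python
import numpy as np
from itertools import product

def empirical_vus(x1, x2, x3):
    """Illustrative empirical volume under the ROC surface for a single
    marker and three ordered disease classes (non-diseased, intermediate,
    diseased): the fraction of triples, one value from each class, that are
    correctly ordered x1 < x2 < x3. Ties are ignored for simplicity.
    Chance level for an uninformative marker is 1/6."""
    count = sum(1 for a, b, c in product(x1, x2, x3) if a < b < c)
    return count / (len(x1) * len(x2) * len(x3))

non_diseased = np.array([0.8, 1.1, 1.3, 0.9])
intermediate = np.array([1.2, 1.6, 1.5])
diseased = np.array([2.1, 1.9, 2.4])
print(empirical_vus(non_diseased, intermediate, diseased))
```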
Discrete Event Simulation of a Suppression of Enemy Air Defenses (SEAD) Mission
2008-03-01
component-based DES developed in Java® using the Simkit simulation package. Analysis of ship self air defense system selection (Turan, 1999) is another... Institute of Technology, Wright-Patterson AFB OH, March 2003 (ADA445279). Turan, Bulent. A Comparative Analysis of Ship Self Air Defense (SSAD) Systems
13 CFR 130.340 - SBDC services and restrictions on service.
Code of Federal Regulations, 2011 CFR
2011-01-01
... access to capital, such as business plan development, financial statement preparation and analysis, and cash flow preparation and analysis. (2) SBDCs should help prepare their clients to represent themselves... financial packages, the SBDCs may not take a direct role in representing clients in loan negotiations. (3...
Increasing Flexibility in Energy Code Compliance: Performance Packages
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hart, Philip R.; Rosenberg, Michael I.
Energy codes and standards have provided significant increases in building efficiency over the last 38 years, since the first national energy code was published in late 1975. The most commonly used path in energy codes, the prescriptive path, appears to be reaching a point of diminishing returns. As the code matures, the prescriptive path becomes more complicated and also more restrictive. It is likely that an approach that considers the building as an integrated system will be necessary to achieve the next real gains in building efficiency. Performance code paths are increasing in popularity; however, there remains significant design team overhead in following the performance path, especially for smaller buildings. This paper focuses on the development of one alternative format, prescriptive packages. A method to develop building-specific prescriptive packages is reviewed, based on multiple runs of prototypical building models used in a parametric decision analysis that determines a set of packages with equivalent energy performance. The approach is designed to be cost-effective and flexible for the design team while achieving a desired level of energy efficiency performance. A demonstration of the approach based on mid-sized office buildings with two HVAC system types is shown, along with a discussion of potential applicability in the energy code process.
FTOOLS: A FITS Data Processing and Analysis Software Package
NASA Astrophysics Data System (ADS)
Blackburn, J. Kent; Greene, Emily A.; Pence, William
1993-05-01
FTOOLS, a highly modular collection of utilities for processing and analyzing data in the FITS (Flexible Image Transport System) format, has been developed in support of the HEASARC (High Energy Astrophysics Research Archive Center) at NASA's Goddard Space Flight Center. Each utility performs a single simple task such as presentation of file contents, extraction of specific rows or columns, appending or merging tables, binning values in a column or selecting subsets of rows based on a boolean expression. Individual utilities can easily be chained together in scripts to achieve more complex operations such as the generation and displaying of spectra or light curves. The collection of utilities provides both generic processing and analysis utilities and utilities common to high energy astrophysics data sets. The FTOOLS software package is designed to be both compatible with IRAF and completely stand alone in a UNIX or VMS environment. The user interface is controlled by standard IRAF parameter files. The package is self documenting through the IRAF help facility and a stand alone help task. Software is written in ANSI C and FORTRAN to provide portability across most computer systems. The data format dependencies between hardware platforms are isolated through the FITSIO library package.
NASA Astrophysics Data System (ADS)
Ramkilowan, A.; Griffith, D. J.
2017-10-01
Surveillance modelling in terms of the standard Detect, Recognise and Identify (DRI) thresholds remains a key requirement for determining the effectiveness of surveillance sensors. With readily available computational resources it has become feasible to perform statistically representative evaluations of the effectiveness of these sensors. A new capability for performing this Monte-Carlo type analysis is demonstrated in the MORTICIA (Monte-Carlo Optical Rendering for Theatre Investigations of Capability under the Influence of the Atmosphere) software package developed at the Council for Scientific and Industrial Research (CSIR). This first-generation, Python-based open-source integrated software package, currently in the alpha stage of development, aims to provide all the functionality required to perform statistical investigations of the effectiveness of optical surveillance systems in specific or generic deployment theatres. This includes modelling of the mathematical and physical processes that govern, among other components of a surveillance system, a sensor's detector and optical components, a target and its background, as well as the intervening atmospheric influences. In this paper we discuss integral aspects of the bespoke framework that are critical to the longevity of all subsequent modelling efforts. Additionally, some preliminary results are presented.
TCGAbiolinks: an R/Bioconductor package for integrative analysis of TCGA data
Colaprico, Antonio; Silva, Tiago C.; Olsen, Catharina; Garofano, Luciano; Cava, Claudia; Garolini, Davide; Sabedot, Thais S.; Malta, Tathiane M.; Pagnotta, Stefano M.; Castiglioni, Isabella; Ceccarelli, Michele; Bontempi, Gianluca; Noushmehr, Houtan
2016-01-01
The Cancer Genome Atlas (TCGA) research network has made public a large collection of clinical and molecular phenotypes of more than 10 000 tumor patients across 33 different tumor types. Using this cohort, TCGA has published over 20 marker papers detailing the genomic and epigenomic alterations associated with these tumor types. Although many important discoveries have been made by TCGA's research network, opportunities still exist to implement novel methods, thereby elucidating new biological pathways and diagnostic markers. However, mining the TCGA data presents several bioinformatics challenges, such as data retrieval and integration with clinical data and other molecular data types (e.g. RNA and DNA methylation). We developed an R/Bioconductor package called TCGAbiolinks to address these challenges and offer bioinformatics solutions by using a guided workflow to allow users to query, download and perform integrative analyses of TCGA data. We combined methods from computer science and statistics into the pipeline and incorporated methodologies developed in previous TCGA marker studies and in our own group. Using four different TCGA tumor types (Kidney, Brain, Breast and Colon) as examples, we provide case studies to illustrate examples of reproducibility, integrative analysis and utilization of different Bioconductor packages to advance and accelerate novel discoveries. PMID:26704973
Packaging Technologies for 500C SiC Electronics and Sensors
NASA Technical Reports Server (NTRS)
Chen, Liang-Yu
2013-01-01
Various SiC electronics and sensors are currently under development for applications in 500C high temperature environments such as hot sections of aerospace engines and the surface of Venus. In order to conduct long-term test and eventually commercialize these SiC devices, compatible packaging technologies for the SiC electronics and sensors are required. This presentation reviews packaging technologies developed for 500C SiC electronics and sensors to address both component and subsystem level packaging needs for high temperature environments. The packaging system for high temperature SiC electronics includes ceramic chip-level packages, ceramic printed circuit boards (PCBs), and edge-connectors. High temperature durable die-attach and precious metal wire-bonding are used in the chip-level packaging process. A high temperature sensor package is specifically designed to address high temperature micro-fabricated capacitive pressure sensors for high differential pressure environments. This presentation describes development of these electronics and sensor packaging technologies, including some testing results of SiC electronics and capacitive pressure sensors using these packaging technologies.
ExGUtils: A Python Package for Statistical Analysis With the ex-Gaussian Probability Density.
Moret-Tatay, Carmen; Gamermann, Daniel; Navarro-Pardo, Esperanza; Fernández de Córdoba Castellá, Pedro
2018-01-01
The study of reaction times and their underlying cognitive processes is an important field in Psychology. Reaction times are often modeled through the ex-Gaussian distribution, because it provides a good fit to multiple empirical data. The complexity of this distribution makes the use of computational tools an essential element. Therefore, there is a strong need for efficient and versatile computational tools for research in this area. In this manuscript we discuss some mathematical details of the ex-Gaussian distribution and apply the ExGUtils package, a set of functions and numerical tools programmed in Python and developed for the numerical analysis of data involving the ex-Gaussian probability density. In order to validate the package, we present an extensive analysis of fits obtained with it, discuss the advantages of and differences between the least squares and maximum likelihood methods, and quantitatively evaluate the goodness of the obtained fits (which is usually an overlooked point in most of the literature in the area). The analysis allows one to identify outliers in the empirical datasets and to determine rigorously whether there is a need for data trimming and, if so, at which points it should be done.
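A minimal Python alternative for the maximum-likelihood part of such an analysis is shown below, using scipy.stats.exponnorm (the exponentially modified Gaussian); the simulated reaction-time parameters are hypothetical, and this is not the ExGUtils interface.

```python
import numpy as np
from scipy import stats

# Illustrative ex-Gaussian analysis with SciPy (not the ExGUtils API):
# the ex-Gaussian is the sum of a normal (mu, sigma) component and an
# exponential component with mean tau.
rng = np.random.default_rng(42)
mu, sigma, tau = 0.40, 0.05, 0.15          # seconds, hypothetical RT parameters
rts = rng.normal(mu, sigma, 1000) + rng.exponential(tau, 1000)

# Maximum-likelihood fit; exponnorm's shape parameter is K = tau / sigma.
K, loc, scale = stats.exponnorm.fit(rts)
print("mu ~", loc, "sigma ~", scale, "tau ~", K * scale)
```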
Glidewell, Liz; Willis, Thomas A; Petty, Duncan; Lawton, Rebecca; McEachan, Rosemary R C; Ingleson, Emma; Heudtlass, Peter; Davies, Andrew; Jamieson, Tony; Hunter, Cheryl; Hartley, Suzanne; Gray-Burrows, Kara; Clamp, Susan; Carder, Paul; Alderson, Sarah; Farrin, Amanda J; Foy, Robbie
2018-02-17
Interpreting evaluations of complex interventions can be difficult without sufficient description of key intervention content. We aimed to develop an implementation package for primary care which could be delivered using typically available resources and could be adapted to target determinants of behaviour for each of four quality indicators: diabetes control, blood pressure control, anticoagulation for atrial fibrillation and risky prescribing. We describe the development and prospective verification of behaviour change techniques (BCTs) embedded within the adaptable implementation packages. We used an overlapping, multi-stage process. We identified evidence-based candidate delivery mechanisms, mainly audit and feedback, educational outreach and computerised prompts and reminders. We drew upon interviews with primary care professionals using the Theoretical Domains Framework to explore likely determinants of adherence to quality indicators. We linked determinants to candidate BCTs. With input from stakeholder panels, we prioritised likely determinants and intervention content prior to piloting the implementation packages. Our content analysis assessed the extent to which embedded BCTs could be identified within the packages and compared them across the delivery mechanisms and four quality indicators. Each implementation package included at least 27 out of 30 potentially applicable BCTs representing 15 of 16 BCT categories. Whilst 23 BCTs were shared across all four implementation packages (e.g. BCTs relating to feedback and comparing behaviour), some BCTs were unique to certain delivery mechanisms (e.g. 'graded tasks' and 'problem solving' for educational outreach). BCTs addressing the determinants 'environmental context' and 'social and professional roles' (e.g. 'restructuring the social and physical environment' and 'adding objects to the environment') were indicator-specific. We found it challenging to operationalise BCTs targeting 'environmental context', 'social influences' and 'social and professional roles' within our chosen delivery mechanisms. We have demonstrated a transparent process for selecting, operationalising and verifying the BCT content in implementation packages adapted to target four quality indicators in primary care. There was considerable overlap in BCTs identified across the four indicators, suggesting that core BCTs can be embedded and verified within delivery mechanisms commonly available to primary care. Whilst feedback reports can include a wide range of BCTs, computerised prompts can deliver BCTs at the time of decision making, and educational outreach can allow for flexibility and individual tailoring in delivery.
High-performance packaging for monolithic microwave and millimeter-wave integrated circuits
NASA Technical Reports Server (NTRS)
Shalkhauser, K. A.; Li, K.; Shih, Y. C.
1992-01-01
Packaging schemes are developed that provide low-loss, hermetic enclosure for enhanced monolithic microwave and millimeter-wave integrated circuits. These package schemes are based on a fused quartz substrate material offering improved RF performance through 44 GHz. The small size and weight of the packages make them useful for a number of applications, including phased array antenna systems. As part of the packaging effort, a test fixture was developed to interface the single chip packages to conventional laboratory instrumentation for characterization of the packaged devices.
pROC: an open-source package for R and S+ to analyze and compare ROC curves.
Robin, Xavier; Turck, Natacha; Hainard, Alexandre; Tiberti, Natalia; Lisacek, Frédérique; Sanchez, Jean-Charles; Müller, Markus
2011-03-17
Receiver operating characteristic (ROC) curves are useful tools to evaluate classifiers in biomedical and bioinformatics applications. However, conclusions are often reached through inconsistent use or insufficient statistical analysis. To support researchers in their ROC curves analysis we developed pROC, a package for R and S+ that contains a set of tools displaying, analyzing, smoothing and comparing ROC curves in a user-friendly, object-oriented and flexible interface. With data previously imported into the R or S+ environment, the pROC package builds ROC curves and includes functions for computing confidence intervals, statistical tests for comparing total or partial area under the curve or the operating points of different classifiers, and methods for smoothing ROC curves. Intermediary and final results are visualised in user-friendly interfaces. A case study based on published clinical and biomarker data shows how to perform a typical ROC analysis with pROC. pROC is a package for R and S+ specifically dedicated to ROC analysis. It proposes multiple statistical tests to compare ROC curves, and in particular partial areas under the curve, allowing proper ROC interpretation. pROC is available in two versions: in the R programming language or with a graphical user interface in the S+ statistical software. It is accessible at http://expasy.org/tools/pROC/ under the GNU General Public License. It is also distributed through the CRAN and CSAN public repositories, facilitating its installation.
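A generic Python sketch of the same kind of analysis (not pROC's R/S+ interface) is shown below: it builds a ROC curve with scikit-learn, computes the AUC, and bootstraps a confidence interval; the simulated scores and class sizes are arbitrary.

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

# Simulated two-class scores: class 1 shifted higher than class 0.
rng = np.random.default_rng(1)
labels = np.array([0] * 50 + [1] * 50)
scores = np.concatenate([rng.normal(0.0, 1.0, 50), rng.normal(1.0, 1.0, 50)])

fpr, tpr, thresholds = roc_curve(labels, scores)
auc = roc_auc_score(labels, scores)

# Simple bootstrap confidence interval for the AUC.
boot = []
for _ in range(1000):
    idx = rng.integers(0, len(labels), len(labels))
    if len(np.unique(labels[idx])) < 2:        # need both classes in the resample
        continue
    boot.append(roc_auc_score(labels[idx], scores[idx]))
ci_low, ci_high = np.percentile(boot, [2.5, 97.5])
print(f"AUC = {auc:.3f} (95% bootstrap CI {ci_low:.3f}-{ci_high:.3f})")
```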
Advanced statistical methods for improved data analysis of NASA astrophysics missions
NASA Technical Reports Server (NTRS)
Feigelson, Eric D.
1992-01-01
The investigators under this grant studied ways to improve the statistical analysis of astronomical data. They looked at existing techniques, the development of new techniques, and the production and distribution of specialized software to the astronomical community. Abstracts of nine papers that were produced are included, as well as brief descriptions of four software packages. The articles that are abstracted discuss analytical and Monte Carlo comparisons of six different linear least squares fits, a (second) paper on linear regression in astronomy, two reviews of public domain software for the astronomer, subsample and half-sample methods for estimating sampling distributions, a nonparametric estimation of survival functions under dependent competing risks, censoring in astronomical data due to nondetections, an astronomy survival analysis computer package called ASURV, and improving the statistical methodology of astronomical data analysis.
Cornwell, MacIntosh; Vangala, Mahesh; Taing, Len; Herbert, Zachary; Köster, Johannes; Li, Bo; Sun, Hanfei; Li, Taiwen; Zhang, Jian; Qiu, Xintao; Pun, Matthew; Jeselsohn, Rinath; Brown, Myles; Liu, X Shirley; Long, Henry W
2018-04-12
RNA sequencing has become a ubiquitous technology used throughout life sciences as an effective method of measuring RNA abundance quantitatively in tissues and cells. The increase in use of RNA-seq technology has led to the continuous development of new tools for every step of analysis from alignment to downstream pathway analysis. However, effectively using these analysis tools in a scalable and reproducible way can be challenging, especially for non-experts. Using the workflow management system Snakemake we have developed a user friendly, fast, efficient, and comprehensive pipeline for RNA-seq analysis. VIPER (Visualization Pipeline for RNA-seq analysis) is an analysis workflow that combines some of the most popular tools to take RNA-seq analysis from raw sequencing data, through alignment and quality control, into downstream differential expression and pathway analysis. VIPER has been created in a modular fashion to allow for the rapid incorporation of new tools to expand the capabilities. This capacity has already been exploited to include very recently developed tools that explore immune infiltrate and T-cell CDR (Complementarity-Determining Regions) reconstruction abilities. The pipeline has been conveniently packaged such that minimal computational skills are required to download and install the dozens of software packages that VIPER uses. VIPER is a comprehensive solution that performs most standard RNA-seq analyses quickly and effectively with a built-in capacity for customization and expansion.
NASA Astrophysics Data System (ADS)
Nishizawa, Tomoaki; Sugimoto, Nobuo; Shimizu, Atsushi; Uno, Itsushi; Hara, Yukari; Kudo, Rei
2018-04-01
We deployed multi-wavelength Mie-Raman lidars (MMRL) at three sites of the AD-Net and have conducted continuous measurements using them since 2013. To analyze the MMRL data and better understand the external mixing state of the main aerosol components (e.g., dust, sea-salt, and black carbon) in the atmosphere, we developed an integrated package of aerosol component retrieval algorithms, which have already been developed or are being developed, to estimate vertical profiles of the aerosol components. This package is also applicable to data from other ground-based lidar networks (e.g., EARLINET) and to satellite-borne lidar data (e.g., CALIOP/CALIPSO and ATLID/EarthCARE), as well as to other lidar data from the AD-Net.
Environmental assessment of packaging: Sense and sensibility
NASA Astrophysics Data System (ADS)
Kooijman, Jan M.
1993-09-01
The functions of packaging are derived from product requirements; thus, for insight into the environmental effects of packaging, the actual combination of product and package has to be evaluated along the production and distribution system. This extension to all related environmental aspects adds realism to the environmental analysis and provides guidance for design while preventing an overly detailed investigation of parts of the production system. This approach is contrary to current environmental studies, where packaging is always treated as an independent object, neglecting the more important environmental effects of the product that are influenced by packaging. The general analysis and quantification stages for this approach are described, and the currently available methods for the assessment of environmental effects are reviewed. To limit the workload involved in an environmental assessment, a step-by-step analysis and the use of feedback are recommended. First the dominant environmental effects of a particular product and its production and distribution are estimated. Then, on the basis of these preliminary results, the appropriate system boundaries are chosen and the need for further or more detailed environmental analysis is determined. For typical food and drink applications, the effect of different system boundaries on the outcome of environmental assessments and the advantage of the step-by-step analysis of the food supply system are shown. It appears that, depending on the consumer group, different advice for reduction of environmental effects has to be given. Furthermore, because of interrelated environmental effects of the food supply system, the continuing quest for more detailed and accurate analysis of the package components is not necessary for improved management of the environmental effects of packaging.
Padilla-Sanchez, Victor; Gao, Song; Kim, Hyung Rae; Kihara, Daisuke; Sun, Lei; Rossmann, Michael G; Rao, Venigalla B
2014-03-06
Tailed bacteriophages and herpesviruses contain a structurally well-conserved dodecameric portal at a special 5-fold vertex of the capsid. The portal plays critical roles in head assembly, genome packaging, neck/tail attachment, and genome ejection. Although the structures of portals from phages φ29, SPP1, and P22 have been determined, their mechanistic roles have not been well understood. Structural analysis of phage T4 portal (gp20) has been hampered because of its unusual interaction with the Escherichia coli inner membrane. Here, we predict atomic models for the T4 portal monomer and dodecamer, and we fit the dodecamer into the cryo-electron microscopy density of the phage portal vertex. The core structure, like that from other phages, is cone shaped with the wider end containing the "wing" and "crown" domains inside the phage head. A long "stem" encloses a central channel, and a narrow "stalk" protrudes outside the capsid. A biochemical approach was developed to analyze portal function by incorporating plasmid-expressed portal protein into phage heads and determining the effect of mutations on head assembly, DNA translocation, and virion production. We found that the protruding loops of the stalk domain are involved in assembling the DNA packaging motor. A loop that connects the stalk to the channel might be required for communication between the motor and the portal. The "tunnel" loops that project into the channel are essential for sealing the packaged head. These studies established that the portal is required throughout the DNA packaging process, with different domains participating at different stages of genome packaging.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gates, A.A.; McCarthy, P.G.; Edl, J.W.
1975-05-01
Elemental tritium is shipped at low pressure in a stainless steel container (LP-50) surrounded by an aluminum vessel and Celotex insulation at least 4 in. thick in a steel drum. Each package contains a large quantity (greater than a Type A quantity) of nonfissile material, as defined in AECM 0529. This report provides the details of the safety analysis performed for this type container.
MOSAIC: Software for creating mosaics from collections of images
NASA Technical Reports Server (NTRS)
Varosi, F.; Gezari, D. Y.
1992-01-01
We have developed a powerful, versatile image processing and analysis software package called MOSAIC, designed specifically for the manipulation of digital astronomical image data obtained with (but not limited to) two-dimensional array detectors. The software package is implemented using the Interactive Data Language (IDL), and incorporates new methods for processing, calibration, analysis, and visualization of astronomical image data, stressing effective methods for the creation of mosaic images from collections of individual exposures, while at the same time preserving the photometric integrity of the original data. Since IDL is available on many computers, the MOSAIC software runs on most UNIX and VAX workstations with the X-Windows or SunView graphics interface.
Manned Mars Mission program concepts
NASA Technical Reports Server (NTRS)
Hamilton, E. C.; Johnson, P.; Pearson, J.; Tucker, W.
1988-01-01
This paper describes the SRS Manned Mars Mission and Program Analysis study designed to support a manned expedition to Mars contemplated by NASA for the purposes of initiating human exploration and eventual habitation of this planet. The capabilities of the interactive software package presently being developed by SRS for mission/program analysis are described, and it is shown that the interactive package can be used to investigate the impact of various mission concepts on the sensitivity of mass required in LEO, schedules, relative costs, and risk. The results, to date, indicate the need for an earth-to-orbit transportation system much larger than the present STS, reliable long-life support systems, and either advanced propulsion or aerobraking technology.
High pressure single-crystal micro X-ray diffraction analysis with GSE_ADA/RSV software
NASA Astrophysics Data System (ADS)
Dera, Przemyslaw; Zhuravlev, Kirill; Prakapenka, Vitali; Rivers, Mark L.; Finkelstein, Gregory J.; Grubor-Urosevic, Ognjen; Tschauner, Oliver; Clark, Simon M.; Downs, Robert T.
2013-08-01
GSE_ADA/RSV is a free software package for custom analysis of single-crystal micro X-ray diffraction (SCμXRD) data, developed with particular emphasis on data from samples enclosed in diamond anvil cells and subject to high pressure conditions. The package has been in extensive use at the high pressure beamlines of Advanced Photon Source (APS), Argonne National Laboratory and Advanced Light Source (ALS), Lawrence Berkeley National Laboratory. The software is optimized for processing of wide-rotation images and includes a variety of peak intensity corrections and peak filtering features, which are custom-designed to make processing of high pressure SCμXRD easier and more reliable.
SEAPAK user's guide, version 2.0. Volume 2: Descriptions of programs
NASA Technical Reports Server (NTRS)
Mcclain, Charles R.; Darzi, Michael; Firestone, James K.; Fu, Gary; Yeh, Eueng-Nan; Endres, Daniel L.
1991-01-01
SEAPAK is a user-interactive satellite data analysis package that was developed for the processing and interpretation of Nimbus-7/Coastal Zone Color Scanner (CZCS) and the NOAA Advanced Very High Resolution Radiometer (AVHRR) data. Significant revisions were made since version 1.0, and the ancillary environmental data analysis module was greatly expanded. The package continues to be user friendly and user interactive. Also, because the scientific goals of the ocean color research being conducted have shifted to large space and time scales, batch processing capabilities for both satellite and ancillary environmental data analyses were enhanced, thus allowing for large quantities of data to be ingested and analyzed.
MIDAS: Software for the detection and analysis of lunar impact flashes
NASA Astrophysics Data System (ADS)
Madiedo, José M.; Ortiz, José L.; Morales, Nicolás; Cabrera-Caño, Jesús
2015-06-01
Since 2009 we have been running a project to identify flashes produced by the impact of meteoroids on the surface of the Moon. For this purpose we are employing small telescopes and high-sensitivity CCD video cameras. To automatically identify these events a software package called MIDAS was developed and tested. This package can also perform the photometric analysis of these flashes and estimate the value of the luminous efficiency. In addition, we have implemented in MIDAS a new method to establish the likely source of the meteoroids (a known meteoroid stream or the sporadic background). The main features of this computer program are analyzed here, and some examples of lunar impact events are presented.
SEAPAK user's guide, version 2.0. Volume 1: System description
NASA Technical Reports Server (NTRS)
Mcclain, Charles R.; Darzi, Michael; Firestone, James K.; Fu, Gary; Yeh, Eueng-Nan; Endres, Daniel L.
1991-01-01
SEAPAK is a user-interactive satellite data analysis package that was developed for the processing and interpretation of Nimbus-7/Coastal Zone Color Scanner (CZCS) and the NOAA Advanced Very High Resolution Radiometer (AVHRR) data. Significant revisions were made to version 1.0 of the guide, and the ancillary environmental data analysis module was expanded. The package continues to emphasize user friendliness and user-interactive data analyses. Additionally, because the scientific goals of the ocean color research being conducted have shifted to large space and time scales, batch processing capabilities for both satellite and ancillary environmental data analyses were enhanced, thus allowing large quantities of data to be ingested and analyzed in background.
Goglia, R; Spiteri, M; Ménard, C; Dumas, C; Combris, P; Labarbe, B; Soler, L G; Volatier, J L
2010-11-01
To assess developments in the nutritional quality of food products in various food groups in France, an Observatory of Food Quality (Oqali) was created in 2008. To achieve its aims, Oqali built up a new database to describe each specific food item at the most detailed level, and also included economic parameters (market share and mean prices). The objective of this paper is to give a detailed analysis of the monitoring of the ready-to-eat breakfast cereals (RTEBCs) sector in order to show the benefits of the Oqali database. Analysis was limited to products with nutritional information on labels. Packaging was provided by manufacturers or retailers, or obtained by buying products in regular stores. Economic parameters were obtained from surveys on French food consumption and data from consumer purchase panels. The breakfast cereal sector was divided into 10 categories and 5 types of brand. Oqali has developed anonymous indicators to describe product characteristics for each category of RTEBC and each type of brand by cross-referencing nutritional values with economic data. Packaging-related data were also analysed. The major nutritional parameters studied were energy, protein, fat, saturated fat, carbohydrates, sugars, fibre and sodium. Analysis was performed on the basis of descriptive statistics, multivariate statistics and a Kruskal-Wallis test. For the RTEBC, there is large variability in nutrient content throughout the sector, both within and between product categories. There is no systematic relation between brand type and nutritional quality within each product category, and the proportion of brand type within each product category is different. Nutritional labels, claims and pictograms are widespread on packages but vary according to the type of brand. These findings form the basis for monitoring developments in the nutritional composition and packaging-related data for breakfast cereals in the future. The final objective is to expand the approach illustrated here to all food sectors progressively.
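The brand-type comparison described above can be outlined in a few lines of base R; the sketch below uses invented sugar-content values, not Oqali data, purely to illustrate the descriptive-statistics and Kruskal-Wallis steps.

    # Hypothetical example: sugar content (g/100 g) compared across brand types
    # within one RTEBC category. All values are invented.
    cereals <- data.frame(
      brand_type = factor(rep(c("national", "retailer", "hard_discount"), each = 5)),
      sugar      = c(28, 30, 25, 27, 33, 26, 24, 29, 31, 22, 35, 30, 28, 33, 29)
    )

    aggregate(sugar ~ brand_type, data = cereals, FUN = median)  # descriptive statistics
    kruskal.test(sugar ~ brand_type, data = cereals)             # non-parametric comparison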
Telescoping Solar Array Concept for Achieving High Packaging Efficiency
NASA Technical Reports Server (NTRS)
Mikulas, Martin; Pappa, Richard; Warren, Jay; Rose, Geoff
2015-01-01
Lightweight, high-efficiency solar arrays are required for future deep space missions using high-power Solar Electric Propulsion (SEP). Structural performance metrics for state-of-the art 30-50 kW flexible blanket arrays recently demonstrated in ground tests are approximately 40 kW/cu m packaging efficiency, 150 W/kg specific power, 0.1 Hz deployed stiffness, and 0.2 g deployed strength. Much larger arrays with up to a megawatt or more of power and improved packaging and specific power are of interest to mission planners for minimizing launch and life cycle costs of Mars exploration. A new concept referred to as the Compact Telescoping Array (CTA) with 60 kW/cu m packaging efficiency at 1 MW of power is described herein. Performance metrics as a function of array size and corresponding power level are derived analytically and validated by finite element analysis. Feasible CTA packaging and deployment approaches are also described. The CTA was developed, in part, to serve as a NASA reference solar array concept against which other proposed designs of 50-1000 kW arrays for future high-power SEP missions could be compared.
MK3TOOLS & NetCDF - storing VLBI data in a machine independent array oriented data format
NASA Astrophysics Data System (ADS)
Hobiger, T.; Koyama, Y.; Kondo, T.
2007-07-01
At the beginning of 2002 the International VLBI Service (IVS) agreed to introduce a Platform-independent VLBI exchange format (PIVEX), which permits the exchange of observational data and stimulates research across different analysis groups. Unfortunately, PIVEX has never been implemented, and many analysis software packages still depend on prior processing (e.g., ambiguity resolution and computation of ionosphere corrections) done by CALC/SOLVE. Thus MK3TOOLS, which handles MK3 databases without CALC/SOLVE being installed, has been developed. It uses the NetCDF format to store the data, and since interfaces exist for a variety of programming languages (FORTRAN, C/C++, Java, Perl, Python), it can be easily incorporated into existing and upcoming analysis software packages.
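MK3TOOLS itself targets FORTRAN, but the value of NetCDF is that the stored observables can be read from almost any environment. As an illustration of that interoperability only, and not as part of MK3TOOLS, the following R sketch reads a variable with the ncdf4 package; the file and variable names are invented.

    # Illustrative only: read one variable from a hypothetical NetCDF file.
    library(ncdf4)

    nc <- nc_open("session_2007_07_01.nc")    # open the NetCDF file
    print(nc)                                 # list its dimensions and variables
    delay <- ncvar_get(nc, "group_delay")     # read one variable into an R array
    nc_close(nc)

    summary(as.vector(delay))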
DCGL v2.0: an R package for unveiling differential regulation from differential co-expression.
Yang, Jing; Yu, Hui; Liu, Bao-Hong; Zhao, Zhongming; Liu, Lei; Ma, Liang-Xiao; Li, Yi-Xue; Li, Yuan-Yuan
2013-01-01
Differential co-expression analysis (DCEA) has emerged in recent years as a novel, systematic investigation into gene expression data. While most DCEA studies or tools focus on the co-expression relationships among genes, some are developing a potentially more promising research domain, differential regulation analysis (DRA). In our previously proposed R package DCGL v1.0, we provided functions to facilitate basic differential co-expression analyses; however, the output from DCGL v1.0 could not be translated into differential regulation mechanisms in a straightforward manner. To advance from DCEA to DRA, we upgraded the DCGL package from v1.0 to v2.0. A new module named "Differential Regulation Analysis" (DRA) was designed, which consists of three major functions: DRsort, DRplot, and DRrank. DRsort selects differentially regulated genes (DRGs) and differentially regulated links (DRLs) according to the transcription factor (TF)-to-target information. DRrank prioritizes the TFs in terms of their potential relevance to the phenotype of interest. DRplot graphically visualizes differentially co-expressed links (DCLs) and/or TF-to-target links in a network context. In addition to these new modules, we streamlined the code from v1.0. The evaluation results showed that our differential regulation analysis is able to capture regulators relevant to the biological subject under study. With ample functions to facilitate differential regulation analysis, DCGL v2.0 was upgraded from a DCEA tool to a DRA tool, which may unveil the underlying differential regulation from the observed differential co-expression. DCGL v2.0 can be applied to a wide range of gene expression data in order to systematically identify novel regulators that have not yet been documented as critical. The DCGL v2.0 package is available at http://cran.r-project.org/web/packages/DCGL/index.html or at our project home page http://lifecenter.sgst.cn/main/en/dcgl.jsp.
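As background for the quantity that DCEA builds on, the base R sketch below computes a differential co-expression score for a single gene pair: the change in its correlation between two conditions. It uses simulated data and deliberately does not call the DCGL API.

    # Illustrative only (not the DCGL API): differential co-expression of one
    # gene pair between two conditions, using simulated expression values.
    set.seed(1)
    n <- 30
    g1_A <- rnorm(n); g2_A <- g1_A + rnorm(n, sd = 0.3)   # condition A: co-expressed
    g1_B <- rnorm(n); g2_B <- rnorm(n)                    # condition B: correlation lost

    cor_A <- cor(g1_A, g2_A)
    cor_B <- cor(g1_B, g2_B)
    dc_score <- abs(cor_A - cor_B)    # a simple differential co-expression score
    c(cor_A = cor_A, cor_B = cor_B, dc_score = dc_score)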
NASA Technical Reports Server (NTRS)
1979-01-01
A plan for the production of two PEP flight systems is defined. The task's milestones are described. Provisions for the development and assembly of new ground support equipment required for both testing and launch operations are included.
Vehicle Sketch Pad: a Parametric Geometry Modeler for Conceptual Aircraft Design
NASA Technical Reports Server (NTRS)
Hahn, Andrew S.
2010-01-01
The conceptual aircraft designer is faced with a dilemma: how to strike the best balance between productivity and fidelity? Historically, handbook methods have required only the coarsest of geometric parameterizations in order to perform analysis. Increasingly, there has been a drive to upgrade analysis methods, but these require considerably more precise and detailed geometry. Attempts have been made to use computer-aided design packages to fill this void, but their cost and steep learning curve have made them unwieldy at best. Vehicle Sketch Pad (VSP) has been developed over several years to better fill this void. While no substitute for the full feature set of computer-aided design packages, VSP allows even novices to quickly become proficient in defining three-dimensional, watertight aircraft geometries that are adequate for producing multi-disciplinary meta-models for higher order analysis methods, wind tunnel and display models, as well as a starting point for animation models. This paper will give an overview of the development and future course of VSP.
Test Cases for Modeling and Validation of Structures with Piezoelectric Actuators
NASA Technical Reports Server (NTRS)
Reaves, Mercedes C.; Horta, Lucas G.
2001-01-01
A set of benchmark test articles was developed to validate techniques for modeling structures containing piezoelectric actuators using commercially available finite element analysis packages. The paper presents the development, modeling, and testing of two structures: an aluminum plate with surface-mounted patch actuators and a composite box beam with surface-mounted actuators. Three approaches for modeling structures containing piezoelectric actuators using the commercially available packages MSC/NASTRAN and ANSYS are presented. The approaches, applications, and limitations are discussed. Data for both test articles are compared in terms of frequency response functions from deflection and strain data to input voltage to the actuator. Frequency response function results using the three different analysis approaches provided comparable test/analysis results. It is shown that global versus local behavior of the analytical model and test article must be considered when comparing different approaches. Also, improper bonding of actuators greatly reduces the electrical-to-mechanical effectiveness of the actuators, producing anti-resonance errors.
MPA Portable: A Stand-Alone Software Package for Analyzing Metaproteome Samples on the Go.
Muth, Thilo; Kohrs, Fabian; Heyer, Robert; Benndorf, Dirk; Rapp, Erdmann; Reichl, Udo; Martens, Lennart; Renard, Bernhard Y
2018-01-02
Metaproteomics, the mass spectrometry-based analysis of proteins from multispecies samples, faces severe challenges concerning data analysis and results interpretation. To overcome these shortcomings, we here introduce the MetaProteomeAnalyzer (MPA) Portable software. In contrast to the original server-based MPA application, this newly developed tool no longer requires computational expertise for installation and is now independent of any relational database system. In addition, MPA Portable now supports state-of-the-art database search engines and a convenient command line interface for high-performance data processing tasks. While search engine results can easily be combined to increase the protein identification yield, an additional two-step workflow is implemented to provide sufficient analysis resolution for further postprocessing steps, such as protein grouping as well as taxonomic and functional annotation. Our new application has been developed with a focus on intuitive usability, adherence to data standards, and adaptation to Web-based workflow platforms. The open source software package can be found at https://github.com/compomics/meta-proteome-analyzer.
Pathview Web: user friendly pathway visualization and data integration
Pant, Gaurav; Bhavnasi, Yeshvant K.; Blanchard, Steven G.; Brouwer, Cory
2017-01-01
Pathway analysis is widely used in omics studies. Pathway-based data integration and visualization is a critical component of the analysis. To address this need, we recently developed a novel R package called Pathview. Pathview maps, integrates and renders a large variety of biological data onto molecular pathway graphs. Here we developed the Pathview Web server to make pathway visualization and data integration accessible to all scientists, including those without special computing skills or resources. Pathview Web features an intuitive graphical web interface and a user-centered design. The server not only expands the core functions of Pathview, but also provides many useful features not available in the offline R package. Importantly, the server presents a comprehensive workflow for both regular and integrated pathway analysis of multiple omics data. In addition, the server provides a RESTful API for programmatic access and convenient integration into third-party software or workflows. Pathview Web is openly and freely accessible at https://pathview.uncc.edu/. PMID:28482075
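For reference, a minimal call to the offline pathview() function that the server builds on might look like the sketch below; the demo data object and the KEGG pathway identifier are assumptions used for illustration.

    # Minimal offline Pathview sketch; the demo data object and pathway ID are
    # assumptions for illustration.
    library(pathview)

    data(gse16873.d)                         # demo expression data assumed to ship with the package
    pv <- pathview(gene.data  = gse16873.d[, 1],
                   pathway.id = "04110",     # a KEGG pathway (cell cycle)
                   species    = "hsa",
                   out.suffix = "demo")
    # The call writes an annotated pathway graph (PNG) to the working directory.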
elevatr: Access Elevation Data from Various APIs
Several web services are available that provide access to elevation data. This package provides access to several of those services and returns elevation data either as a SpatialPointsDataFrame from point elevation services or as a raster object from raster elevation services. Currently, the package supports access to the Mapzen Elevation Service, Mapzen Terrain Service, and the USGS Elevation Point Query Service. The R language for statistical computing is increasingly used for spatial data analysis. This R package, elevatr, responds to this trend and provides access to elevation data from various sources directly in R. The impact of `elevatr` is that it will 1) facilitate spatial analysis in R by providing access to a foundational dataset for many types of analyses (e.g., hydrology, limnology), 2) open up a new set of users and uses for APIs widely used outside of R, and 3) provide an excellent example of federal open source development as promoted by the Federal Source Code Policy (https://sourcecode.cio.gov/).
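A minimal use of the package as described might look like the following sketch; the function names (get_elev_point, get_elev_raster) and their arguments are stated here as assumptions about the elevatr interface, and the coordinates are arbitrary.

    # Sketch of point and raster elevation retrieval with elevatr; function
    # names and arguments are assumptions, and the coordinates are arbitrary.
    library(elevatr)

    pts <- data.frame(x = c(-105.27, -105.30), y = c(40.01, 40.05))  # lon/lat points

    # Point elevations from a point-elevation service
    elev_pts <- get_elev_point(pts, prj = "+proj=longlat +datum=WGS84", src = "epqs")

    # A raster of elevations covering the points, from a raster tile service
    elev_ras <- get_elev_raster(pts, prj = "+proj=longlat +datum=WGS84", z = 10)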
Matsumoto, Keiichi; Endo, Keigo
2013-06-01
Two kinds of Japanese guidelines for the data acquisition protocol of oncology fluoro-D-glucose-positron emission tomography (FDG-PET)/computed tomography (CT) scans were created by the joint task force of the Japanese Society of Nuclear Medicine Technology (JSNMT) and the Japanese Society of Nuclear Medicine (JSNM), and published in Kakuigaku-Gijutsu 27(5): 425-456, 2007 and 29(2): 195-235, 2009. These guidelines aim to standardize PET image quality among facilities and different PET/CT scanner models. The objective of this study was to develop a personal computer-based performance measurement and image quality processor for the two kinds of Japanese guidelines for oncology (18)F-FDG PET/CT scans. We call this software package the "PET quality control tool" (PETquact). Microsoft Corporation's Windows(™) is used as the operating system for PETquact, which requires 1070×720 image resolution and includes 12 different applications. The accuracy was examined for numerous applications of PETquact. For example, in the sensitivity application, the system sensitivity measurement results were equivalent when comparing two PET sinograms obtained from the PETquact and the report. PETquact is suited to analysis under the two kinds of Japanese guidelines, and it performs well for performance measurements and image quality analysis. PETquact can be used at any facility if the software package is installed on a laptop computer.
Hawkins, Benjamin; Holden, Chris; Mackinder, Sophie
2018-03-09
Despite the extensive literature on the tobacco industry, there has been little attempt to study how transnational tobacco companies (TTCs) coordinate their political activities globally, or to theorise TTC strategies within the context of global governance structures and policy processes. This article draws on three concepts from political science - policy transfer, multi-level governance and venue shifting - to analyse TTCs' integrated, global strategies to oppose augmented packaging requirements across multiple jurisdictions. Following Uruguay's introduction of extended labelling requirements, Australia became the first country in the world to require tobacco products to be sold in standardised ('plain') packaging in 2012. Governments in the European Union, including in the United Kingdom and Ireland, adopted similar laws, with other member states due to follow. TTCs vehemently opposed these measures and developed coordinated, global strategies to oppose their implementation, exploiting the complexity of contemporary global governance arrangements. These included a series of legal challenges in various jurisdictions, alongside political lobbying and public relations campaigns. This article draws on analysis of public documents and 32 semi-structured interviews with key policy actors. It finds that TTCs developed coordinated and highly integrated strategies to oppose packaging restrictions across multiple jurisdictions and levels of governance.
A high-level 3D visualization API for Java and ImageJ.
Schmid, Benjamin; Schindelin, Johannes; Cardona, Albert; Longair, Mark; Heisenberg, Martin
2010-05-21
Current imaging methods such as Magnetic Resonance Imaging (MRI), Confocal microscopy, Electron Microscopy (EM) or Selective Plane Illumination Microscopy (SPIM) yield three-dimensional (3D) data sets in need of appropriate computational methods for their analysis. The reconstruction, segmentation and registration are best approached from the 3D representation of the data set. Here we present a platform-independent framework based on Java and Java 3D for accelerated rendering of biological images. Our framework is seamlessly integrated into ImageJ, a free image processing package with a vast collection of community-developed biological image analysis tools. Our framework enriches the ImageJ software libraries with methods that greatly reduce the complexity of developing image analysis tools in an interactive 3D visualization environment. In particular, we provide high-level access to volume rendering, volume editing, surface extraction, and image annotation. The ability to rely on a library that removes the low-level details enables concentrating software development efforts on the algorithm implementation parts. Our framework enables biomedical image software development to be built with 3D visualization capabilities with very little effort. We offer the source code and convenient binary packages along with extensive documentation at http://3dviewer.neurofly.de.
Large Scale Software Building with CMake in ATLAS
NASA Astrophysics Data System (ADS)
Elmsheuser, J.; Krasznahorkay, A.; Obreshkov, E.; Undrus, A.; ATLAS Collaboration
2017-10-01
The offline software of the ATLAS experiment at the Large Hadron Collider (LHC) serves as the platform for detector data reconstruction, simulation and analysis. It is also used in the detector’s trigger system to select LHC collision events during data taking. The ATLAS offline software consists of several million lines of C++ and Python code organized in a modular design of more than 2000 specialized packages. Because of different workflows, many stable numbered releases are in parallel production use. To accommodate specific workflow requests, software patches with modified libraries are distributed on top of existing software releases on a daily basis. The different ATLAS software applications also require a flexible build system that strongly supports unit and integration tests. Within the last year this build system was migrated to CMake. A CMake configuration has been developed that allows one to easily set up and build the above mentioned software packages. This also makes it possible to develop and test new and modified packages on top of existing releases. The system also allows one to detect and execute partial rebuilds of the release based on single package changes. The build system makes use of CPack for building RPM packages out of the software releases, and CTest for running unit and integration tests. We report on the migration and integration of the ATLAS software to CMake and show working examples of this large scale project in production.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCoy, J.C.
1994-08-01
The Type B drum packages (TBD) are conceptualized as a family of containers in which a single 208 L or 114 L (55 gal or 30 gal) drum containing Type B quantities of radioactive material (RAM) can be packaged for shipment. The TBD containers are being developed to fill a void in the packaging and transportation capabilities of the U.S. Department of Energy, as no existing container for packaging single drums of Type B RAM offers double containment. Several multiple-drum containers currently exist, as well as a number of shielded casks, but the size and weight of these containers present many operational challenges for single-drum shipments. As an alternative, the TBD containers will offer up to three shielded versions (light, medium, and heavy) and one unshielded version, each offering single or optional double containment for a single drum. To reduce operational complexity, all versions will share similar design and operational features where possible. The primary users of the TBD containers are envisioned to be any organization desiring to ship single drums of Type B RAM, such as laboratories, waste retrieval activities, emergency response teams, etc. Currently, the TBD conceptual design is being developed with the final design and analysis to be completed in 1995 to 1996. Testing and certification of the unshielded version are planned to be completed in 1996 to 1997 with production to begin in 1997 to 1998.
Mlalila, Nichrous; Kadam, Dattatreya M; Swai, Hulda; Hilonga, Askwar
2016-09-01
In recent decades, there has been global advancement in the manufacturing industry due to increased applications of nanotechnology. The food industry has also been changing tremendously, from passive packaging to innovative packaging, to cope with global trends, technological advancements, and consumer preferences. Active research is taking place in the food industry and other scientific fields to develop innovative packages, including smart, intelligent and active food packaging, for more effective and efficient packaging materials with balanced environmental impacts. However, in the food industry the features behind smart packaging are narrowly defined, to distinguish it from intelligent packaging, unlike in other scientific fields where smart materials are under critical investigation. This review presents some scientific concepts and features pertaining to innovative food packaging. The review opens a new research window in innovative food packaging to address existing disparities and support further precise research and development in the food packaging industry.
Development of deployable structures for large space platform systems, volume 1
NASA Technical Reports Server (NTRS)
1982-01-01
Generic deployable spacecraft configurations and deployable platform systems concepts were identified. Sizing, building block concepts, orbiter packaging, thermal analysis, cost analysis, and mass properties analysis as related to platform systems integration are considered. Technology needs are examined and the major criteria used in concept selection are delineated. Requirements for deployable habitat modules, tunnels, and OTV hangars are considered.
Packaging Software Assets for Reuse
NASA Astrophysics Data System (ADS)
Mattmann, C. A.; Marshall, J. J.; Downs, R. R.
2010-12-01
The reuse of existing software assets such as code, architecture, libraries, and modules in current software and systems development projects can provide many benefits, including reduced costs, in time and effort, and increased reliability. Many reusable assets are currently available in various online catalogs and repositories, usually broken down by disciplines such as programming language (Ibiblio for Maven/Java developers, PyPI for Python developers, CPAN for Perl developers, etc.). The way these assets are packaged for distribution can play a role in their reuse - an asset that is packaged simply and logically is typically easier to understand, install, and use, thereby increasing its reusability. A well-packaged asset has advantages in being more reusable and thus more likely to provide benefits through its reuse. This presentation will discuss various aspects of software asset packaging and how they can affect the reusability of the assets. The characteristics of well-packaged software will be described. A software packaging domain model will be introduced, and some existing packaging approaches examined. An example case study of a Reuse Enablement System (RES), currently being created by near-term Earth science decadal survey missions, will provide information about the use of the domain model. Awareness of these factors will help software developers package their reusable assets so that they can provide the most benefits for software reuse.
Performance Analysis and Electronics Packaging of the Optical Communications Demonstrator
NASA Technical Reports Server (NTRS)
Jeganathan, M.; Monacos, S.
1998-01-01
The Optical Communications Demonstrator (OCD), under development at the Jet Propulsion Laboratory (JPL), is a laboratory-based lasercomm terminal designed to validate several key technologies, primarily precision beam pointing, high bandwidth tracking, and beacon acquisition.
Effects of a Training Package to Improve the Accuracy of Descriptive Analysis Data Recording
ERIC Educational Resources Information Center
Mayer, Kimberly L.; DiGennaro Reed, Florence D.
2013-01-01
Functional behavior assessment is an important precursor to developing interventions to address a problem behavior. Descriptive analysis, a type of functional behavior assessment, is effective in informing intervention design only if the gathered data accurately capture relevant events and behaviors. We investigated a training procedure to improve…
FIESTA—An R estimation tool for FIA analysts
Tracey S. Frescino; Paul L. Patterson; Gretchen G. Moisen; Elizabeth A. Freeman
2015-01-01
FIESTA (Forest Inventory ESTimation for Analysis) is a user-friendly R package that was originally developed to support the production of estimates consistent with current tools available for the Forest Inventory and Analysis (FIA) National Program, such as FIDO (Forest Inventory Data Online) and EVALIDator. FIESTA provides an alternative data retrieval and reporting...
Bax, Leon; Yu, Ly-Mee; Ikeda, Noriaki; Tsuruta, Harukazu; Moons, Karel G M
2006-10-13
Meta-analysis has become a well-known method for synthesis of quantitative data from previously conducted research in applied health sciences. So far, meta-analysis has been particularly useful in evaluating and comparing therapies and in assessing causes of disease. Consequently, the number of software packages that can perform meta-analysis has increased over the years. Unfortunately, it can take a substantial amount of time to get acquainted with some of these programs and most contain little or no interactive educational material. We set out to create and validate an easy-to-use and comprehensive meta-analysis package that would be simple enough programming-wise to remain available as a free download. We specifically aimed at students and researchers who are new to meta-analysis, with important parts of the development oriented towards creating internal interactive tutoring tools and designing features that would facilitate usage of the software as a companion to existing books on meta-analysis. We took an unconventional approach and created a program that uses Excel as a calculation and programming platform. The main programming language was Visual Basic, as implemented in Visual Basic 6 and Visual Basic for Applications in Excel 2000 and higher. The development took approximately two years and resulted in the 'MIX' program, which can be downloaded from the program's website free of charge. Next, we set out to validate the MIX output with two major software packages as reference standards, namely STATA (metan, metabias, and metatrim) and Comprehensive Meta-Analysis Version 2. Eight meta-analyses that had been published in major journals were used as data sources. All numerical and graphical results from analyses with MIX were identical to their counterparts in STATA and CMA. The MIX program distinguishes itself from most other programs by the extensive graphical output, the click-and-go (Excel) interface, and the educational features. The MIX program is a valid tool for performing meta-analysis and may be particularly useful in educational environments. It can be downloaded free of charge via http://www.mix-for-meta-analysis.info or http://sourceforge.net/projects/meta-analysis.
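The core computation that MIX, STATA's metan, and CMA all implement can be written compactly; the R sketch below shows generic fixed-effect (inverse-variance) pooling with invented study data and is not MIX's own code.

    # Generic fixed-effect (inverse-variance) meta-analysis, illustrative only.
    yi  <- c(-0.37, -0.10, -0.52, -0.25)   # hypothetical study effects (e.g., log odds ratios)
    sei <- c( 0.18,  0.22,  0.30,  0.15)   # hypothetical standard errors

    w      <- 1 / sei^2                    # inverse-variance weights
    pooled <- sum(w * yi) / sum(w)         # pooled effect estimate
    se_p   <- sqrt(1 / sum(w))             # standard error of the pooled effect
    ci     <- pooled + c(-1, 1) * 1.96 * se_p

    c(pooled = pooled, lower = ci[1], upper = ci[2])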
Yu, Hwan Hee; Song, Myung Wook; Kim, Tae-Kyung; Choi, Yun-Sang; Cho, Gyu Yong; Lee, Na-Kyoung; Paik, Hyun-Dong
2018-01-01
The objective of this study was to compare the physicochemical, microbiological, and sensory characteristics of Hanwoo eye of round under various packaging methods [wrapped packaging (WP), modified atmosphere packaging (MAP), vacuum packaging (VP) with three different vacuum films, and vacuum skin packaging (VSP)] at a small scale. Packaged Hanwoo beef samples were stored in refrigerated conditions (4±1°C) for 28 days. Packaged beef was sampled on days 0, 7, 14, 21, and 28. Physicochemical [pH, surface color, thiobarbituric acid reactive substances (TBARS), and volatile basic nitrogen (VBN) values], microbiological, and sensory analyses of the packaged beef samples were performed. VP and VSP samples showed low TBARS and VBN values, and pH and surface color did not change substantially during the 28-day period. For VSP, total viable bacteria, psychrotrophic bacteria, lactic acid bacteria, and coliform counts were lower than those for other packaging systems. Salmonella spp. and Escherichia coli O157:H7 were not detected in any packaged beef samples. A sensory analysis showed that the scores for appearance, flavor, color, and overall acceptability did not change significantly until day 7. Overall, VSP was effective for Hanwoo packaging, with significantly higher a* values, physicochemical stability, and microbial safety (p<0.05). PMID:29805283
Ares, Gastón; Besio, Mariángela; Giménez, Ana; Deliza, Rosires
2010-10-01
Consumers perceive functional foods as members of the particular food category to which they belong. In this context, apart from health and sensory characteristics, non-sensory factors such as packaging might have a key role in determining consumers' purchase decisions regarding functional foods. The aims of the present work were to study the influence of different package attributes on consumer willingness to purchase regular and functional chocolate milk desserts, and to assess whether the influence of these attributes was affected by consumers' level of involvement with the product. A conjoint analysis task was carried out with 107 regular milk dessert consumers, who were asked to score their willingness to purchase 16 milk dessert package concepts varying in five features of the package, and to complete a personal involvement inventory questionnaire. Consumers' level of involvement with the product affected their interest in the evaluated products and their reaction towards the considered conjoint variables, suggesting that it could be a useful segmentation tool during food development. Package colour and the presence of a picture on the label were the variables with the highest relative importance, regardless of consumers' involvement with the product. The importance of these variables was higher than that of the type of dessert, indicating that packaging may play an important role in consumers' perception and purchase intention of functional foods.
Design and development of conformal antenna composite structure
NASA Astrophysics Data System (ADS)
Xie, Zonghong; Zhao, Wei; Zhang, Peng; Li, Xiang
2017-09-01
In the manufacturing process of the common smart skin antenna, the adhesive covering the radiating elements of the antenna led to severe deviation of the resonant frequency, which degraded the electromagnetic performance of the antenna. In this paper, a new component called a package cover was adopted to prevent the adhesive from covering the radiating elements of the microstrip antenna array. The package cover and the microstrip antenna array were bonded together as a packaged antenna, which was then embedded into the composite sandwich structure to develop a new structure called the conformal antenna composite structure (CACS). The geometric parameters of the microstrip antenna array and the CACS were optimized with the commercial software CST Microwave Studio. According to the optimal results, the microstrip antenna array and the CACS were manufactured and tested. The experimental and numerical results of electromagnetic performance showed that the resonant frequency of the CACS was close to that of the microstrip antenna array (with error less than 1%) and that the CACS had a higher gain (about 2 dB) than the microstrip antenna array. The package system would increase the electromagnetic radiating energy at the design frequency by nearly 66%. The numerical model generated by CST Microwave Studio in this study could successfully predict the electromagnetic performance of the microstrip antenna array and the CACS with relatively good accuracy. The mechanical analysis results showed that the CACS had better flexural properties than the composite sandwich structure without the embedment of the packaged antenna. The comparison of the electromagnetic performance of the CACS and the MECSSA showed that the package system was useful and effective.
Object-oriented design of medical imaging software.
Ligier, Y; Ratib, O; Logean, M; Girard, C; Perrier, R; Scherrer, J R
1994-01-01
A special software package for interactive display and manipulation of medical images was developed at the University Hospital of Geneva, as part of a hospital-wide Picture Archiving and Communication System (PACS). This software package, called Osiris, was especially designed to be easily usable and adaptable to the needs of non-computer-oriented physicians. The Osiris software has been developed to allow the visualization of medical images obtained from any imaging modality. It provides generic manipulation tools, processing tools, and analysis tools more specific to clinical applications. This software, based on an object-oriented paradigm, is portable and extensible. Osiris is available on two different platforms: Unix workstations running X11/OSF-Motif, and the Macintosh family.
Guidelines for the analysis of free energy calculations.
Klimovich, Pavel V; Shirts, Michael R; Mobley, David L
2015-05-01
Free energy calculations based on molecular dynamics simulations show considerable promise for applications ranging from drug discovery to prediction of physical properties and structure-function studies. But these calculations are still difficult and tedious to analyze, and best practices for analysis are not well defined or propagated. Essentially, each group analyzing these calculations needs to decide how to conduct the analysis and, usually, develop its own analysis tools. Here, we review and recommend best practices for analysis yielding reliable free energies from molecular simulations. Additionally, we provide a Python tool, alchemical-analysis.py, freely available on GitHub as part of the pymbar package (located at http://github.com/choderalab/pymbar), that implements the analysis practices reviewed here for several reference simulation packages, which can be adapted to handle data from other packages. Both this review and the tool cover analysis of alchemical calculations generally, including free energy estimates via both thermodynamic integration and free energy perturbation-based estimators. Our Python tool also handles output from multiple types of free energy calculations, including expanded ensemble and Hamiltonian replica exchange, as well as standard fixed ensemble calculations. We also survey a range of statistical and graphical ways of assessing the quality of the data and free energy estimates, and provide prototypes of these in our tool. We hope this tool and discussion will serve as a foundation for more standardization of and agreement on best practices for analysis of free energy calculations.
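For reference, the two classes of estimators mentioned, thermodynamic integration (TI) and free energy perturbation (FEP), take the standard forms below (generic notation, not specific to alchemical-analysis.py):

    \Delta G_{\mathrm{TI}} = \int_0^1 \left\langle \frac{\partial U(\lambda)}{\partial \lambda} \right\rangle_{\lambda} d\lambda
    \Delta G_{\mathrm{FEP}} = -k_B T \ln \left\langle e^{-(U_1 - U_0)/k_B T} \right\rangle_0

where U(\lambda) interpolates between the end-state potentials U_0 and U_1, and the angle brackets denote ensemble averages in the indicated state.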
A Description and Analysis of the German Packaging Take-Back System
ERIC Educational Resources Information Center
Nakajima, Nina; Vanderburg, Willem H.
2006-01-01
The German packaging ordinance is an example of legislated extended producer responsibility (also known as product take-back). Consumers can leave packaging with retailers, and packagers are required to pay for their recycling and disposal. It can be considered to be successful in reducing waste, spurring the redesign of packaging to be more…
'spup' - an R package for uncertainty propagation analysis in spatial environmental modelling
NASA Astrophysics Data System (ADS)
Sawicka, Kasia; Heuvelink, Gerard
2017-04-01
Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Currently, advances in uncertainty propagation and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition for universal applicability or for the ability to deal with case studies involving spatial models and spatial model inputs. Due to the growing popularity and applicability of the open source R programming language, we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining the uncertainty propagation starting from input data and model parameters, via the environmental model onto model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo (MC) techniques, as well as several uncertainty visualization functions. Uncertain environmental variables are represented in the package as objects whose attribute values may be uncertain and described by probability distributions. Both numerical and categorical data types are handled. Spatial auto-correlation within an attribute and cross-correlation between attributes are also accommodated. For uncertainty propagation the package has implemented the MC approach with efficient sampling algorithms, i.e. stratified random sampling and Latin hypercube sampling. The design includes facilitation of parallel computing to speed up MC computation. The MC realizations may be used as an input to the environmental models called from R, or externally. Selected visualization methods that are understandable by non-experts with limited background in statistics can be used to summarize and visualize uncertainty about the measured input, model parameters and output of the uncertainty propagation. We demonstrate that the 'spup' package is an effective and easy-to-apply tool that can be used in multi-disciplinary research and model-based decision support.
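The Monte Carlo propagation described above can be pictured without the package itself; the base R sketch below is a generic MC uncertainty propagation for a toy model with invented input distributions, and it deliberately does not use the spup API.

    # Generic Monte Carlo uncertainty propagation, illustrative only (not spup).
    # Toy model: y = a * x^b with uncertain inputs.
    set.seed(42)
    n_mc <- 1000

    x <- rnorm(n_mc, mean = 10, sd = 1)     # uncertain measured input
    a <- rnorm(n_mc, mean = 2,  sd = 0.2)   # uncertain model parameter
    b <- runif(n_mc, min = 0.9, max = 1.1)  # uncertain model parameter

    y <- a * x^b                            # run the model for each realization

    quantile(y, c(0.05, 0.5, 0.95))         # summarize the propagated uncertainty
    hist(y, main = "Propagated uncertainty", xlab = "model output y")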
Hirsch, Robert M.; De Cicco, Laura A.
2015-01-01
Evaluating long-term changes in river conditions (water quality and discharge) is an important use of hydrologic data. To carry out such evaluations, the hydrologist needs tools to facilitate several key steps in the process: acquiring the data records from a variety of sources, structuring it in ways that facilitate the analysis, processing the data with routines that extract information about changes that may be happening, and displaying findings with graphical techniques. A pair of tightly linked R packages, called dataRetrieval and EGRET (Exploration and Graphics for RivEr Trends), have been developed for carrying out each of these steps in an integrated manner. They are designed to easily accept data from three sources: U.S. Geological Survey hydrologic data, U.S. Environmental Protection Agency (EPA) STORET data, and user-supplied flat files. The dataRetrieval package not only serves as a “front end” to the EGRET package, it can also be used to easily download many types of hydrologic data and organize it in ways that facilitate many other hydrologic applications. The EGRET package has components oriented towards the description of long-term changes in streamflow statistics (high flow, average flow, and low flow) as well as changes in water quality. For the water-quality analysis, it uses Weighted Regressions on Time, Discharge and Season (WRTDS) to describe long-term trends in both concentration and flux. EGRET also creates a wide range of graphical presentations of the water-quality data and of the WRTDS results. This report serves as a user guide to these two R packages, providing detailed guidance on installation and use of the software, documentation of the analysis methods used, as well as guidance on some of the kinds of questions and approaches that the software can facilitate.
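A typical opening step with the pair of packages might look like the sketch below; the site number, parameter code, and dates are arbitrary examples, and the function names are stated as assumptions about the dataRetrieval/EGRET entry points.

    # Sketch of a dataRetrieval + EGRET streamflow workflow; site, parameter
    # code (00060 = discharge), and dates are arbitrary, and the function names
    # are assumptions about the packages' entry points.
    library(dataRetrieval)
    library(EGRET)

    siteNumber <- "06892350"
    startDate  <- "1990-01-01"
    endDate    <- "2010-12-31"

    Daily <- readNWISDaily(siteNumber, "00060", startDate, endDate)   # daily discharge
    INFO  <- readNWISInfo(siteNumber, "00060", interactive = FALSE)   # site metadata

    eList <- as.egret(INFO, Daily)     # bundle the data into an EGRET object
    plotFlowSingle(eList, istat = 5)   # long-term behaviour of a flow statistic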
fgui: A Method for Automatically Creating Graphical User Interfaces for Command-Line R Packages
Hoffmann, Thomas J.; Laird, Nan M.
2009-01-01
The fgui R package is designed for developers of R packages, to help rapidly, and sometimes fully automatically, create a graphical user interface for a command-line R package. The interface is built upon the Tcl/Tk graphical interface included in R. The package further facilitates the developer by loading the help files from the command-line functions to provide context-sensitive help to the user with no additional effort from the developer. Passing a function as the argument to the routines in the fgui package creates a graphical interface for the function, and further options are available to tweak this interface for those who want more flexibility. PMID:21625291
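In outline, fgui turns an ordinary R function into a dialog. The sketch below wraps an invented function with fgui's gui() entry point; the wrapped function and its arguments are made up for illustration.

    # Sketch: generate a dialog for a plain R function with fgui. The wrapped
    # function add_noise is invented; gui() is assumed to be fgui's entry point
    # for building an interface from a function.
    library(fgui)

    add_noise <- function(n = 100, mean = 0, sd = 1) {
      x <- rnorm(n, mean, sd)
      hist(x, main = "Simulated values")   # simple graphical output
      invisible(x)
    }

    gui(add_noise)   # opens a Tcl/Tk dialog whose fields mirror the arguments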
Safety analysis report for packaging (onsite) multicanister overpack cask
DOE Office of Scientific and Technical Information (OSTI.GOV)
Edwards, W.S.
1997-07-14
This safety analysis report for packaging (SARP) documents the safety of shipments of irradiated fuel elements in the Multicanister Overpack (MCO) and MCO Cask for a highway route controlled quantity, Type B fissile package. This SARP evaluates the package during transfers of (1) water-filled MCOs from the K Basins to the Cold Vacuum Drying Facility (CVDF) and (2) sealed and cold vacuum dried MCOs from the CVDF in the 100 K Area to the Canister Storage Building in the 200 East Area.
Shek, Daniel T L; Chan, Stephen C F
2013-01-01
To help university teachers to understand Service-Learning and develop Service-Learning subjects, a 3-h+ e-learning package was developed at The Hong Kong Polytechnic University (PolyU). There are seven units in this e-learning package: introduction session (Unit 1), what is Service-Learning? (Unit 2), impact and benefits of Service-Learning (Unit 3), myths and positive attitudes toward Service-Learning (Unit 4), developing a Service-Learning subject at PolyU (Unit 5), self-reflection about Service-Learning (Unit 6), and concluding session (Unit 7). To understand the views of the users on the e-learning package, the package was offered before formal launching. For the first offering, three focus group sessions were held. Results showed that the users were satisfied with the structural arrangement of the e-learning package and agreed that the e-learning package was useful for them to understand more about Service-Learning. For the second offering, colleagues were generally satisfied with the e-learning package and demonstrated gain in knowledge on Service-Learning. Suggestions for improvement were noted.
IFT Scientific Status Summary 2008: Innovative Food Packaging Solutions
USDA-ARS?s Scientific Manuscript database
Food and beverage packaging comprises 55-65% of the $110 billion value of packaging in the United States. This review provides a summary of innovative technology developments in food packaging. The expanded role of food and beverage packaging is reviewed. Active and intelligent food packaging, ba...
NASA Technical Reports Server (NTRS)
1981-01-01
This phase consists of the engineering design, fabrication, assembly, operation, economic analysis, and process support R&D for an Experimental Process System Development Unit (EPSDU). The mechanical bid package was issued and the bid responses are under evaluation. Similarly, the electrical bid package was issued; however, responses are not yet due. The majority of all equipment is on order or has been received at the EPSDU site. The pyrolysis/consolidation process design package was issued. Preparation of the process and instrumentation diagram for the free-space reactor was started. In the area of melting/consolidation, Kayex successfully melted chunk silicon and produced silicon shot. The free-space reactor powder was successfully transported pneumatically from a storage bin to the auger feeder twenty-five feet up and was melted. The fluid-bed PDU has successfully operated at silane feed concentrations up to 21%. The writing of the operating manual has started. Overall, the design phase is nearing completion.
MWASTools: an R/bioconductor package for metabolome-wide association studies.
Rodriguez-Martinez, Andrea; Posma, Joram M; Ayala, Rafael; Neves, Ana L; Anwar, Maryam; Petretto, Enrico; Emanueli, Costanza; Gauguier, Dominique; Nicholson, Jeremy K; Dumas, Marc-Emmanuel
2018-03-01
MWASTools is an R package designed to provide an integrated pipeline to analyse metabonomic data in large-scale epidemiological studies. Key functionalities of our package include: quality control analysis; metabolome-wide association analysis using various models (partial correlations, generalized linear models); visualization of statistical outcomes; metabolite assignment using statistical total correlation spectroscopy (STOCSY); and biological interpretation of metabolome-wide association studies results. The MWASTools R package is implemented in R (version >= 3.4) and is available from Bioconductor: https://bioconductor.org/packages/MWASTools/.
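The metabolome-wide association step can be pictured with plain base R; the sketch below is not the MWASTools API, just an illustration of fitting one model per metabolite feature against a phenotype using simulated data.

    # Illustrative only (not the MWASTools API): one linear model per metabolite
    # feature against a phenotype, with simulated data.
    set.seed(7)
    n_samples  <- 100
    n_features <- 50

    phenotype  <- rnorm(n_samples)                                   # e.g., a clinical trait
    metabolome <- matrix(rnorm(n_samples * n_features), n_samples)   # spectral features
    metabolome[, 1] <- metabolome[, 1] + 0.5 * phenotype             # plant one true signal

    pvals <- apply(metabolome, 2, function(feature) {
      summary(lm(phenotype ~ feature))$coefficients["feature", "Pr(>|t|)"]
    })

    p_adj <- p.adjust(pvals, method = "BH")   # correct for multiple testing
    head(sort(p_adj))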
Dual Arm Work Package performance estimates and telerobot task network simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Draper, J.V.; Blair, L.M.
1997-02-01
This paper describes the methodology and results of a network simulation study of the Dual Arm Work Package (DAWP), to be employed for dismantling the Argonne National Laboratory CP-5 reactor. The development of the simulation model was based upon the results of a task analysis for the same system. This study was performed by the Oak Ridge National Laboratory (ORNL), in the Robotics and Process Systems Division. Funding was provided by the US Department of Energy's Office of Technology Development, Robotics Technology Development Program (RTDP). The RTDP is developing methods of computer simulation to estimate telerobotic system performance. Data were collected to provide point estimates to be used in a task network simulation model. Three skilled operators performed six repetitions of a pipe cutting task representative of typical teleoperation cutting operations.
Ryberg, Karen R.; Vecchia, Aldo V.
2012-01-01
Hydrologic time series data and associated anomalies (multiple components of the original time series representing variability at longer-term and shorter-term time scales) are useful for modeling trends in hydrologic variables, such as streamflow, and for modeling water-quality constituents. An R package, called waterData, has been developed for importing daily hydrologic time series data from U.S. Geological Survey streamgages into the R programming environment. In addition to streamflow, data retrieval may include gage height and continuous physical property data, such as specific conductance, pH, water temperature, turbidity, and dissolved oxygen. The package allows for importing daily hydrologic data into R, plotting the data, fixing common data problems, summarizing the data, and the calculation and graphical presentation of anomalies.
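The anomaly decomposition described above can be illustrated with a minimal sketch, assuming a simple two-component split (a 365-day running mean as the longer-term component and the daily residual as the shorter-term component); this is not the waterData code, and the synthetic series below stands in for a streamgage record:

```python
# Illustrative sketch (not the waterData implementation): split a daily streamflow
# series into a long-term anomaly (365-day running mean minus the overall mean) and
# a short-term anomaly (daily value minus the running mean), on log-transformed flow.
import numpy as np
import pandas as pd

def anomalies(daily_flow: pd.Series, window: int = 365):
    logq = np.log10(daily_flow)
    overall = logq.mean()
    longterm = logq.rolling(window, center=True, min_periods=1).mean() - overall
    shortterm = logq - (longterm + overall)
    return overall, longterm, shortterm

# Hypothetical example with a synthetic seasonal flow record
idx = pd.date_range("2000-01-01", periods=3 * 365, freq="D")
q = pd.Series(55 + 20 * np.sin(2 * np.pi * idx.dayofyear / 365), index=idx)
mean_level, lt, st = anomalies(q)
print(lt.head(), st.head())
```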
DOT National Transportation Integrated Search
2014-09-01
The objective of this research was to provide an improved understanding of pedestrian-vehicle interaction at mid-block pedestrian crossings and develop methods that can be used in traffic operational analysis and microsimulation packages. Models ...
Models and techniques for evaluating the effectiveness of aircraft computing systems
NASA Technical Reports Server (NTRS)
Meyer, J. F.
1978-01-01
Progress in the development of system models and techniques for the formulation and evaluation of aircraft computer system effectiveness is reported. Topics covered include: analysis of functional dependence; a prototype software package, METAPHOR, developed to aid the evaluation of performability; and a comprehensive performability modeling and evaluation exercise involving the SIFT computer.
Novel Techniques for Millimeter-Wave Packages
NASA Technical Reports Server (NTRS)
Herman, Martin I.; Lee, Karen A.; Kolawa, Elzbieta A.; Lowry, Lynn E.; Tulintseff, Ann N.
1995-01-01
A new millimeter-wave package architecture with supporting electrical, mechanical, and materials science experiments and analysis is presented. This package is well suited for discrete devices, monolithic microwave integrated circuits (MMIC's), and multichip module (MCM) applications. It has low-loss wide-band RF transitions which are necessary to overcome manufacturing tolerances leading to lower per-unit cost. Potential applications of this new packaging architecture which go beyond the standard requirements of device protection include integration of antennas, compatibility with photonic networks, and direct transitions to waveguide systems. Techniques for electromagnetic analysis, thermal control, and hermetic sealing were explored. Three-dimensional electromagnetic analysis was performed using a finite difference time-domain (FDTD) algorithm and experimentally verified for millimeter-wave package input and output transitions. New multi-material system concepts (AlN, Cu, and diamond thin films), which allow excellent surface finishes to be achieved with enhanced thermal management, have been investigated. A new approach utilizing block copolymer coatings was employed to hermetically seal packages, which met MIL-STD-883.
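The package transitions were analyzed with a full three-dimensional FDTD solver; the toy one-dimensional sketch below only illustrates the leapfrog E/H update that the method rests on, in normalized units with hypothetical grid size, source location, and pulse parameters:

```python
# Minimal 1-D FDTD sketch showing the leapfrog E/H update at the heart of the
# finite-difference time-domain method (the paper uses a full 3-D solver; this toy
# version only illustrates the update equations, in normalized units, Courant = 1).
import numpy as np

nz, nt = 400, 800
ez = np.zeros(nz)          # electric field on the primary grid
hy = np.zeros(nz - 1)      # magnetic field on the staggered grid
src = nz // 4              # hypothetical source location

for n in range(nt):
    hy += ez[1:] - ez[:-1]                        # H update
    ez[1:-1] += hy[1:] - hy[:-1]                  # E update in the interior
    ez[src] += np.exp(-((n - 60) / 20.0) ** 2)    # soft Gaussian pulse source

print("peak |Ez| after propagation:", np.abs(ez).max())
```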
GenomeGraphs: integrated genomic data visualization with R.
Durinck, Steffen; Bullard, James; Spellman, Paul T; Dudoit, Sandrine
2009-01-06
Biological studies involve a growing number of distinct high-throughput experiments to characterize samples of interest. There is a lack of methods to visualize these different genomic datasets in a versatile manner. In addition, genomic data analysis requires integrated visualization of experimental data along with constantly changing genomic annotation and statistical analyses. We developed GenomeGraphs, as an add-on software package for the statistical programming environment R, to facilitate integrated visualization of genomic datasets. GenomeGraphs uses the biomaRt package to perform on-line annotation queries to Ensembl and translates these to gene/transcript structures in viewports of the grid graphics package. This allows genomic annotation to be plotted together with experimental data. GenomeGraphs can also be used to plot custom annotation tracks in combination with different experimental data types together in one plot using the same genomic coordinate system. GenomeGraphs is a flexible and extensible software package which can be used to visualize a multitude of genomic datasets within the statistical programming environment R.
Padilla-Sanchez, Victor; Gao, Song; Kim, Hyung Rae; Kihara, Daisuke; Sun, Lei; Rossmann, Michael G.; Rao, Venigalla B.
2013-01-01
Tailed bacteriophages and herpesviruses consist of a structurally well conserved dodecameric portal at a special five-fold vertex of the capsid. The portal plays critical roles in head assembly, genome packaging, neck/tail attachment, and genome ejection. Although the structures of portals from phages φ29, SPP1 and P22 have been determined, their mechanistic roles have not been well understood. Structural analysis of phage T4 portal (gp20) has been hampered because of its unusual interaction with the E. coli inner membrane. Here, we predict atomic models for the T4 portal monomer and dodecamer, and fit the dodecamer into the cryoEM density of the phage portal vertex. The core structure, like that from other phages, is cone-shaped with the wider end containing the “wing” and “crown” domains inside the phage head. A long “stem” encloses a central channel, and a narrow “stalk” protrudes outside the capsid. A biochemical approach was developed to analyze portal function by incorporating plasmid-expressed portal protein into phage heads and determining the effect of mutations on head assembly, DNA translocation, and virion production. We found that the protruding loops of the stalk domain are involved in assembling the DNA packaging motor. A loop that connects the stalk to the channel might be required for communication between the motor and portal. The “tunnel” loops that project into the channel are essential for sealing the packaged head. These studies established that the portal is required throughout the DNA packaging process, with different domains participating at different stages of genome packaging. PMID:24126213
Toward the greening of nuclear energy: A content analysis of nuclear energy frames from 1991 to 2008
NASA Astrophysics Data System (ADS)
Miller, Sonya R.
Framing theory has emerged as one of the predominant theories employed in mass communications research in the 21st century. Frames are identified as interpretive packages for content where some issue attributes are highlighted over other attributes. While framing effects studies appear plentiful, longitudinal studies assessing trends in dominant framing packages and story elements for an issue appear to be less understood. Through content analysis, this study examines dominant frame packages, story elements, headline tone, story tone, stereotypes, and source attribution for nuclear energy from 1991-2008 in the New York Times, USA Today, the Wall Street Journal, and the Washington Post. Unlike many content analysis studies, this study compares intercoder reliability among three indices: percentage agreement, proportional reduction of loss, and Scott's Pi. The newspapers represented in this study possess a commonality in the types of dominant frame packages employed. Significant dominant frame packages among the four newspapers include human/health, proliferation, procedural, and marketplace. While the procedural frame package was more likely to appear prior to the 1997 Kyoto Protocol, the proliferation frame package was more likely to appear after the Kyoto Protocol. Over time, the sustainable frame package demonstrated increased significance. This study is part of the growing literature regarding the function of frames over time.
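For readers unfamiliar with the reliability indices compared in the study, the short sketch below computes percentage agreement and Scott's Pi for two coders assigning frame packages to the same set of stories; the coding data are invented for illustration:

```python
# Illustrative sketch: percentage agreement and Scott's Pi for two coders assigning
# dominant frame packages to the same stories (hypothetical coding data).
from collections import Counter

def percent_agreement(coder1, coder2):
    return sum(a == b for a, b in zip(coder1, coder2)) / len(coder1)

def scotts_pi(coder1, coder2):
    po = percent_agreement(coder1, coder2)
    counts = Counter(coder1) + Counter(coder2)            # pooled marginals
    total = len(coder1) + len(coder2)
    pe = sum((c / total) ** 2 for c in counts.values())   # chance agreement
    return (po - pe) / (1 - pe)

c1 = ["health", "proliferation", "marketplace", "procedural", "health", "health"]
c2 = ["health", "proliferation", "sustainable", "procedural", "health", "marketplace"]
print(percent_agreement(c1, c2), round(scotts_pi(c1, c2), 3))
```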
Systemic bioinformatics analysis of skeletal muscle gene expression profiles of sepsis
Yang, Fang; Wang, Yumei
2018-01-01
Sepsis is a type of systemic inflammatory response syndrome with high morbidity and mortality. Skeletal muscle dysfunction is one of the major complications of sepsis that may also influence the outcome of sepsis. The aim of the present study was to explore and identify potential mechanisms and therapeutic targets of sepsis. Systemic bioinformatics analysis of skeletal muscle gene expression profiles from the Gene Expression Omnibus was performed. Differentially expressed genes (DEGs) in samples from patients with sepsis and control samples were screened out using the limma package. Differential co-expression and coregulation (DCE and DCR, respectively) analysis was performed based on the Differential Co-expression Analysis package to identify differences in gene co-expression and coregulation patterns between the control and sepsis groups. Gene Ontology terms and Kyoto Encyclopedia of Genes and Genomes pathways of DEGs were identified using the Database for Annotation, Visualization and Integrated Discovery, and inflammatory, cancer and skeletal muscle development-associated biological processes and pathways were identified. DCE and DCR analysis revealed several potential therapeutic targets for sepsis, including genes and transcription factors. The results of the present study may provide a basis for the development of novel therapeutic targets and treatment methods for sepsis. PMID:29805480
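As a rough illustration of the differential co-expression idea (not the R package used in the study), one can compute gene-gene correlations separately in the control and sepsis groups and rank pairs by the absolute change in correlation; the expression matrices below are simulated:

```python
# Illustrative sketch of a differential co-expression screen: compute gene-gene
# Pearson correlations in the control and sepsis groups and rank gene pairs by the
# absolute change in correlation between groups (simulated expression data).
import numpy as np

def differential_coexpression(expr_control, expr_sepsis, top=10):
    """Both inputs: (n_genes, n_samples) arrays with matching gene order."""
    r_ctrl = np.corrcoef(expr_control)
    r_sep = np.corrcoef(expr_sepsis)
    diff = np.abs(r_sep - r_ctrl)
    iu = np.triu_indices_from(diff, k=1)          # unique gene pairs
    order = np.argsort(diff[iu])[::-1]
    return [(int(iu[0][i]), int(iu[1][i]), float(diff[iu][i])) for i in order[:top]]

rng = np.random.default_rng(1)
pairs = differential_coexpression(rng.normal(size=(30, 12)), rng.normal(size=(30, 15)))
print(pairs[0])
```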
DOE Office of Scientific and Technical Information (OSTI.GOV)
Driscoll, Frederick; Platt, Andrew; Sirnivas, Senu
This project was performed under the Work for Others—Funds in Agreement FIA-14-1793 between Statoil and the Alliance for Sustainable Energy, manager and operator of the National Renewable Energy Laboratory (NREL). To support the development of a 6-MW spar-mounted offshore wind turbine, Statoil funded NREL to perform tasks in the following three categories: 1. Design and analysis 2. Wake modeling 3. Concept resource assessment. This document reports on the design and analysis work package, which built a FAST [Jonkman & Buhl, 2005] computer model of the Hywind 6-MW floating wind turbine system and used this tool to evaluate the performance of a set of design load cases (DLCs). The FAST model was also used in Work Package 2: Wake Modeling.
An Interactive Tool for Discrete Phase Analysis in Two-Phase Flows
NASA Technical Reports Server (NTRS)
Dejong, Frederik J.; Thoren, Stephen J.
1993-01-01
Under a NASA MSFC SBIR Phase 1 effort an interactive software package has been developed for the analysis of discrete (particulate) phase dynamics in two-phase flows in which the discrete phase does not significantly affect the continuous phase. This package contains a Graphical User Interface (based on the X Window system and the Motif tool kit) coupled to a particle tracing program, which allows the user to interactively set up and run a case for which a continuous phase grid and flow field are available. The software has been applied to a solid rocket motor problem, to demonstrate its ease of use and its suitability for problems of engineering interest, and has been delivered to NASA Marshall Space Flight Center.
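A minimal sketch of the one-way-coupled discrete-phase tracking that such a tool performs is shown below; the swirl-like gas velocity field, particle relaxation time, and time step are hypothetical, and the real package works on an externally supplied grid and flow field:

```python
# Illustrative sketch of one-way-coupled particle tracing: a spherical particle is
# advanced through a prescribed gas velocity field with Stokes drag; flow field,
# particle properties, and time step are hypothetical.
import numpy as np

def gas_velocity(x):
    # simple analytic swirl about the origin plus a uniform axial component
    return 0.5 * np.array([-x[1], x[0]]) + np.array([0.2, 0.0])

def trace_particle(x0, v0, tau_p=0.01, dt=1e-3, steps=2000):
    x, v, path = np.array(x0, float), np.array(v0, float), []
    for _ in range(steps):
        drag = (gas_velocity(x) - v) / tau_p   # Stokes drag relaxation toward the gas
        v = v + drag * dt                      # explicit Euler velocity update
        x = x + v * dt                         # position update
        path.append(x.copy())
    return np.array(path)

path = trace_particle(x0=[1.0, 0.0], v0=[0.0, 0.0])
print(path[-1])
```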
NASA Technical Reports Server (NTRS)
Hirt, E. F.; Fox, G. L.
1982-01-01
Two specific NASTRAN preprocessors and postprocessors are examined. A postprocessor for dynamic analysis and a graphical interactive package for model generation and review of results are presented. A computer program that provides response spectrum analysis capability based on data from a NASTRAN finite element model is described, and the GIFTS system, a graphics processor used to augment NASTRAN, is introduced.
Initial Results: An Ultra-Low-Background Germanium Crystal Array
2010-09-01
data (focused on γ-γ coincidence signatures) (Smith et al., 2004) and the Multi-Isotope Coincidence Analysis code (MICA) (Warren et al., 2006). ... The follow-on “CASCADES” project aims to develop a multicoincidence data-analysis package and make robust fission-product demonstration measurements ... sensitivity. This effort is focused on improving gamma analysis capabilities for nuclear detonation detection (NDD) applications, e.g., nuclear treaty
Bax, Leon; Yu, Ly-Mee; Ikeda, Noriaki; Tsuruta, Harukazu; Moons, Karel GM
2006-01-01
Background Meta-analysis has become a well-known method for synthesis of quantitative data from previously conducted research in applied health sciences. So far, meta-analysis has been particularly useful in evaluating and comparing therapies and in assessing causes of disease. Consequently, the number of software packages that can perform meta-analysis has increased over the years. Unfortunately, it can take a substantial amount of time to get acquainted with some of these programs and most contain little or no interactive educational material. We set out to create and validate an easy-to-use and comprehensive meta-analysis package that would be simple enough programming-wise to remain available as a free download. We specifically aimed at students and researchers who are new to meta-analysis, with important parts of the development oriented towards creating internal interactive tutoring tools and designing features that would facilitate usage of the software as a companion to existing books on meta-analysis. Results We took an unconventional approach and created a program that uses Excel as a calculation and programming platform. The main programming language was Visual Basic, as implemented in Visual Basic 6 and Visual Basic for Applications in Excel 2000 and higher. The development took approximately two years and resulted in the 'MIX' program, which can be downloaded from the program's website free of charge. Next, we set out to validate the MIX output with two major software packages as reference standards, namely STATA (metan, metabias, and metatrim) and Comprehensive Meta-Analysis Version 2. Eight meta-analyses that had been published in major journals were used as data sources. All numerical and graphical results from analyses with MIX were identical to their counterparts in STATA and CMA. The MIX program distinguishes itself from most other programs by the extensive graphical output, the click-and-go (Excel) interface, and the educational features. Conclusion The MIX program is a valid tool for performing meta-analysis and may be particularly useful in educational environments. It can be downloaded free of charge from the program's website. PMID:17038197
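The basic calculation that meta-analysis programs such as MIX carry out can be illustrated with an inverse-variance fixed-effect pooling sketch; the study effect sizes and standard errors below are made up, and this is not the MIX (Visual Basic) code itself:

```python
# Illustrative sketch of inverse-variance fixed-effect pooling, the core calculation
# behind meta-analysis software such as MIX, STATA's metan, or CMA; the effect sizes
# and standard errors are hypothetical.
import math

def fixed_effect(effects, ses):
    weights = [1.0 / se ** 2 for se in ses]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)
    return pooled, pooled_se, ci

log_odds_ratios = [0.10, -0.25, 0.05, -0.40]   # hypothetical study effects
standard_errors = [0.12, 0.20, 0.15, 0.30]
print(fixed_effect(log_odds_ratios, standard_errors))
```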
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jiang, Wei; Gowri, Krishnan; Thornton, Brian A.
2010-06-30
This paper presents the process, methodology, and assumptions for development of the 50% Energy Savings Design Technology Packages for Highway Lodging Buildings, a design guidance document that provides specific recommendations for achieving 50% energy savings in roadside motels (highway lodging) above the requirements of ANSI/ASHRAE/IESNA Standard 90.1-2004. This 50% solution represents a further step toward realization of the U.S. Department of Energy’s net-zero energy building goal, and goes beyond the 30% savings in the Advanced Energy Design Guide series (upon which this work was built). This work can serve as the technical feasibility study for the development of a 50% savings Advanced Energy Design Guide for highway lodging, and thus should greatly expedite the development process. The purpose of this design package is to provide user-friendly design assistance to designers, developers, and owners of highway lodging properties. It is intended to encourage energy-efficient design by providing prescriptive energy-efficiency recommendations for each climate zone that attain the 50% energy savings target. This paper describes the steps that were taken to demonstrate the technical feasibility of achieving a 50% reduction in whole-building energy use with practical and commercially available technologies. The energy analysis results are presented, indicating the recommended energy-efficient measures achieved a national-weighted average energy savings of 55%, relative to Standard 90.1-2004. The cost-effectiveness of the recommended technology package is evaluated and the result shows an average simple payback of 11.3 years.
In-Package Chemistry Abstraction
DOE Office of Scientific and Technical Information (OSTI.GOV)
E. Thomas
2004-11-09
This report was developed in accordance with the requirements in ''Technical Work Plan for: Regulatory Integration Modeling and Analysis of the Waste Form and Waste Package'' (BSC 2004 [DIRS 171583]). The purpose of the in-package chemistry model is to predict the bulk chemistry inside of a breached waste package and to provide simplified expressions of that chemistry as a function of time after breach to Total Systems Performance Assessment for the License Application (TSPA-LA). The scope of this report is to describe the development and validation of the in-package chemistry model. The in-package model is a combination of two models, a batch reactor model that uses the EQ3/6 geochemistry-modeling tool, and a surface complexation model that is applied to the results of the batch reactor model. The batch reactor model considers chemical interactions of water with the waste package materials and the waste form for commercial spent nuclear fuel (CSNF) waste packages and codisposed waste packages that contain both high-level waste glass (HLWG) and DOE spent fuel. The surface complexation model includes the impact of fluid-surface interactions (i.e., surface complexation) on the resulting fluid composition. The model examines two types of water influx: (1) the condensation of water vapor that diffuses into the waste package, and (2) seepage water that enters the waste package from the drift as a liquid. (1) Vapor Influx Case: The condensation of vapor onto the waste package internals is simulated as pure H2O and enters at a rate determined by the water vapor pressure for representative temperature and relative humidity conditions. (2) Water Influx Case: The water entering a waste package from the drift is simulated as typical groundwater and enters at a rate determined by the amount of seepage available to flow through openings in a breached waste package. TSPA-LA uses the vapor influx case for the nominal scenario for simulations where the waste package has been breached but the drip shield remains intact, so all of the seepage flow is diverted from the waste package. The chemistry from the vapor influx case is used to determine the stability of colloids and the solubility of radionuclides available for transport by diffusion, and to determine the degradation rates for the waste forms. TSPA-LA uses the water influx case for the seismic scenario, where the waste package has been breached and the drip shield has been damaged such that seepage flow is actually directed into the waste package. The chemistry from the water influx case, which is a function of the flow rate, is used to determine the stability of colloids and the solubility of radionuclides available for transport by diffusion and advection, and to determine the degradation rates for the CSNF and HLW glass. TSPA-LA does not use this model for the igneous scenario. Outputs from the in-package chemistry model implemented inside TSPA-LA include pH, ionic strength, and total carbonate concentration. These inputs to TSPA-LA will be linked to the following principal factors: dissolution rates of the CSNF and HLWG, dissolved concentrations of radionuclides, and colloid generation.
REddyProc: Enabling researchers to process Eddy-Covariance data
NASA Astrophysics Data System (ADS)
Wutzler, Thomas; Moffat, Antje; Migliavacca, Mirco; Knauer, Jürgen; Menzer, Olaf; Sickel, Kerstin; Reichstein, Markus
2017-04-01
Analysing Eddy-Covariance measurements involves extensive processing, which places a substantial technical burden on researchers. There is a need to overcome difficulties in data processing associated with deploying, adapting, and using existing software and online tools. We tackled that need by developing the REddyProc package in the open-source cross-platform language R, which provides standard processing routines for reading half-hourly files from different formats, including the recently released FLUXNET 2015 dataset, uStar threshold estimation and associated uncertainty, gap-filling, flux partitioning (either night-time or daytime based), and visualization of results. Although different in some features, the package mimics the online tool that has been extensively used by many users and site Principal Investigators (PIs) in recent years and is available on the website of the Max Planck Institute for Biogeochemistry. Generally, REddyProc results are statistically equal to results based on the state-of-the-art tools. The provided routines can be easily installed, configured, used, and integrated with further analysis. Hence the eddy covariance community will benefit from using the provided package, allowing easier integration of standard processing with extended analysis. This complements activities by AmeriFlux, ICOS, NEON, and other regional networks for developing codes for standardized data processing of multiple sites in FLUXNET.
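As a simplified stand-in for the look-up-table style gap-filling such a package provides (this is not the REddyProc algorithm), a missing half-hourly flux value can be replaced by the mean of valid observations at the same time of day within a +/- 7 day window:

```python
# Illustrative sketch of mean-diurnal-variation gap filling for half-hourly fluxes:
# a missing value is replaced by the mean of valid observations at the same time of
# day within a +/- 7 day window (synthetic data; not the REddyProc routines).
import numpy as np
import pandas as pd

def fill_gaps(flux: pd.Series, window_days: int = 7) -> pd.Series:
    filled = flux.copy()
    per_day = 48                                   # half-hourly records per day
    for i in np.flatnonzero(flux.isna().to_numpy()):
        same_slot = np.arange(i - window_days * per_day,
                              i + window_days * per_day + 1, per_day)
        same_slot = same_slot[(same_slot >= 0) & (same_slot < len(flux))]
        candidates = flux.iloc[same_slot].dropna()
        if len(candidates):
            filled.iloc[i] = candidates.mean()
    return filled

idx = pd.date_range("2015-06-01", periods=48 * 30, freq="30min")
nee = pd.Series(np.sin(np.linspace(0, 60, len(idx))), index=idx)
nee.iloc[200:230] = np.nan                         # artificial gap
print(fill_gaps(nee).iloc[200:205])
```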
NASA Astrophysics Data System (ADS)
Chen, Kewei; Ge, Xiaolin; Yao, Li; Bandy, Dan; Alexander, Gene E.; Prouty, Anita; Burns, Christine; Zhao, Xiaojie; Wen, Xiaotong; Korn, Ronald; Lawson, Michael; Reiman, Eric M.
2006-03-01
Having approved fluorodeoxyglucose positron emission tomography (FDG PET) for the diagnosis of Alzheimer's disease (AD) in some patients, the Centers for Medicare and Medicaid Services suggested the need to develop and test analysis techniques to optimize diagnostic accuracy. We developed an automated computer package comparing an individual's FDG PET image to those of a group of normal volunteers. The normal control group includes FDG-PET images from 82 cognitively normal subjects, 61.89+/-5.67 years of age, who were characterized demographically, clinically, neuropsychologically, and by their apolipoprotein E genotype (known to be associated with a differential risk for AD). In addition, AD-affected brain regions functionally defined as based on a previous study (Alexander, et al, Am J Psychiatr, 2002) were also incorporated. Our computer package permits the user to optionally select control subjects, matching the individual patient for gender, age, and educational level. It is fully streamlined to require minimal user intervention. With one mouse click, the program runs automatically, normalizing the individual patient image, setting up a design matrix for comparing the single subject to a group of normal controls, performing the statistics, calculating the glucose reduction overlap index of the patient with the AD-affected brain regions, and displaying the findings in reference to the AD regions. In conclusion, the package automatically contrasts a single patient to a normal subject database using sound statistical procedures. With further validation, this computer package could be a valuable tool to assist physicians in decision making and communicating findings with patients and patient families.
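A minimal sketch of the underlying single-subject-versus-database comparison is shown below, assuming a voxelwise z-score against the control mean and standard deviation and an overlap index with a predefined mask of AD-affected regions; the arrays are synthetic, and this is not the described package:

```python
# Illustrative sketch: compare a single patient's FDG-PET image to a control
# database voxel by voxel via z-scores, then compute an overlap index with a
# predefined mask of AD-affected regions (synthetic arrays, hypothetical threshold).
import numpy as np

def hypometabolism_overlap(patient, controls, ad_mask, z_thresh=-1.64):
    """patient: 3-D array; controls: (n_subjects, ...) stack; ad_mask: boolean array."""
    mu = controls.mean(axis=0)
    sd = controls.std(axis=0, ddof=1)
    z = (patient - mu) / np.where(sd > 0, sd, np.inf)
    hypo = z < z_thresh                           # voxels with reduced glucose uptake
    overlap = np.logical_and(hypo, ad_mask).sum() / ad_mask.sum()
    return z, overlap

rng = np.random.default_rng(2)
controls = rng.normal(100, 10, size=(82, 16, 16, 16))  # hypothetical normal database
patient = rng.normal(95, 10, size=(16, 16, 16))
ad_mask = np.zeros((16, 16, 16), dtype=bool)
ad_mask[4:8, 4:8, 4:8] = True
zmap, overlap_index = hypometabolism_overlap(patient, controls, ad_mask)
print(round(overlap_index, 3))
```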
Control system design and analysis using the INteractive Controls Analysis (INCA) program
NASA Technical Reports Server (NTRS)
Bauer, Frank H.; Downing, John P.
1987-01-01
The INteractive Controls Analysis (INCA) program was developed at the Goddard Space Flight Center to provide a user-friendly, efficient environment for the design and analysis of linear control systems. Since its inception, INCA has found extensive use in the design, development, and analysis of control systems for spacecraft, instruments, robotics, and pointing systems. Moreover, the results of the analytic tools embedded in INCA have been flight proven with at least three currently orbiting spacecraft. This paper describes the INCA program and illustrates, using a flight-proven example, how the package can perform complex design analyses with relative ease.
MAP - a mapping and analysis program for harvest planning
Robert N. Eli; Chris B. LeDoux; Penn A. Peters
1984-01-01
The Northeastern Forest Experiment Station and the Department of Civil Engineering at West Virginia University are cooperating in the development of a Mapping and Analysis Program, to be named MAP. The goal of this computer software package is to significantly improve the planning and harvest efficiency of small to moderately sized harvest units located in mountainous...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bower, G.
We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS), an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities.
Mishima, Hiroyuki; Lidral, Andrew C; Ni, Jun
2008-05-28
Genetic association studies have been used to map disease-causing genes. A newly introduced statistical method, called exhaustive haplotype association study, analyzes genetic information consisting of different numbers and combinations of DNA sequence variations along a chromosome. Such studies involve a large number of statistical calculations and subsequently high computing power. It is possible to develop parallel algorithms and codes to perform the calculations on a high performance computing (HPC) system. However, most existing commonly used statistical packages for genetic studies are non-parallel versions. Alternatively, one may use the cutting-edge technology of grid computing and its packages to conduct non-parallel genetic statistical packages on a centralized HPC system or distributed computing systems. In this paper, we report the utilization of a queuing scheduler built on the Grid Engine and run on a Rocks Linux cluster for our genetic statistical studies. Analysis of both consecutive and combinational window haplotypes was conducted by the FBAT (Laird et al., 2000) and Unphased (Dudbridge, 2003) programs. The dataset consisted of 26 loci from 277 extended families (1484 persons). Using the Rocks Linux cluster with 22 compute-nodes, FBAT jobs performed about 14.4-15.9 times faster, while Unphased jobs performed 1.1-18.6 times faster compared to the accumulated computation duration. Execution of exhaustive haplotype analysis using non-parallel software packages on a Linux-based system is an effective and efficient approach in terms of cost and performance.
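A small sketch of how the consecutive and combinational window sets can be enumerated before being submitted as independent cluster jobs is shown below; the locus names and window sizes are hypothetical, and job submission itself (e.g., through the Grid Engine scheduler) is omitted:

```python
# Illustrative sketch of enumerating exhaustive haplotype windows: consecutive
# windows are contiguous runs of loci, combinational windows are all subsets of a
# given size (hypothetical locus names and window sizes).
from itertools import combinations

def consecutive_windows(loci, size):
    return [tuple(loci[i:i + size]) for i in range(len(loci) - size + 1)]

def combinational_windows(loci, size):
    return list(combinations(loci, size))

loci = [f"SNP{i}" for i in range(1, 27)]          # 26 loci, as in the study
jobs = consecutive_windows(loci, 3) + combinational_windows(loci, 2)
print(len(jobs), jobs[0], jobs[-1])
# Each tuple would become one FBAT or Unphased job submitted through the scheduler.
```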
Mishima, Hiroyuki; Lidral, Andrew C; Ni, Jun
2008-01-01
Background Genetic association studies have been used to map disease-causing genes. A newly introduced statistical method, called exhaustive haplotype association study, analyzes genetic information consisting of different numbers and combinations of DNA sequence variations along a chromosome. Such studies involve a large number of statistical calculations and subsequently high computing power. It is possible to develop parallel algorithms and codes to perform the calculations on a high performance computing (HPC) system. However, most existing commonly used statistical packages for genetic studies are non-parallel versions. Alternatively, one may use the cutting-edge technology of grid computing and its packages to conduct non-parallel genetic statistical packages on a centralized HPC system or distributed computing systems. In this paper, we report the utilization of a queuing scheduler built on the Grid Engine and run on a Rocks Linux cluster for our genetic statistical studies. Results Analysis of both consecutive and combinational window haplotypes was conducted by the FBAT (Laird et al., 2000) and Unphased (Dudbridge, 2003) programs. The dataset consisted of 26 loci from 277 extended families (1484 persons). Using the Rocks Linux cluster with 22 compute-nodes, FBAT jobs performed about 14.4–15.9 times faster, while Unphased jobs performed 1.1–18.6 times faster compared to the accumulated computation duration. Conclusion Execution of exhaustive haplotype analysis using non-parallel software packages on a Linux-based system is an effective and efficient approach in terms of cost and performance. PMID:18541045
R package to estimate intracluster correlation coefficient with confidence interval for binary data.
Chakraborty, Hrishikesh; Hossain, Akhtar
2018-03-01
The Intracluster Correlation Coefficient (ICC) is a major parameter of interest in cluster randomized trials that measures the degree to which responses within the same cluster are correlated. Several types of ICC estimators and their confidence intervals (CI) have been suggested in the literature for binary data. Studies have compared the relative weaknesses and advantages of ICC estimators and their CIs for binary data and suggested situations where one is advantageous in practical research. The commonly used statistical computing systems currently facilitate estimation of only a very few variants of the ICC and its CI. To address the limitations of current statistical packages, we developed an R package, ICCbin, to facilitate estimating the ICC and its CI for binary responses using different methods. The ICCbin package is designed to provide estimates of the ICC in 16 different ways, including analysis of variance methods, moment-based estimation, direct probabilistic methods, correlation-based estimation, and a resampling method. The CI of the ICC is estimated using 5 different methods. It also generates cluster binary data using an exchangeable correlation structure. The ICCbin package provides two functions for users. The function rcbin() generates cluster binary data and the function iccbin() estimates the ICC and its CI. The users can choose the appropriate ICC and CI estimates from the wide selection of estimates in the outputs. The R package ICCbin presents very flexible and easy-to-use ways to generate cluster binary data and to estimate the ICC and its CI for binary responses using different methods. The package ICCbin is freely available for use with R from the CRAN repository (https://cran.r-project.org/package=ICCbin). We believe that this package can be a very useful tool for researchers to design cluster randomized trials with binary outcomes. Copyright © 2017 Elsevier B.V. All rights reserved.
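One of the estimator families mentioned above, the classical one-way ANOVA estimator of the ICC, can be sketched directly for clustered binary responses; this is an illustration in Python rather than the ICCbin code, and the clusters below are invented:

```python
# Illustrative sketch (not the ICCbin code): one-way ANOVA estimator of the
# intracluster correlation coefficient applied to clustered binary (0/1) responses;
# `clusters` is a hypothetical list of per-cluster response vectors.
import numpy as np

def anova_icc(clusters):
    sizes = np.array([len(c) for c in clusters], float)
    n, k = sizes.sum(), len(clusters)
    grand = np.concatenate(clusters).mean()
    means = np.array([np.mean(c) for c in clusters])
    ssb = np.sum(sizes * (means - grand) ** 2)                       # between clusters
    ssw = sum(np.sum((np.asarray(c) - m) ** 2) for c, m in zip(clusters, means))
    msb, msw = ssb / (k - 1), ssw / (n - k)
    k0 = (n - np.sum(sizes ** 2) / n) / (k - 1)                      # effective cluster size
    return (msb - msw) / (msb + (k0 - 1) * msw)

clusters = [[1, 1, 1, 0], [0, 0, 0], [1, 1, 0, 0, 1], [0, 0, 1, 0]]
print(round(anova_icc(clusters), 4))
```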
Natural biopolymers in organic food packaging
NASA Astrophysics Data System (ADS)
Wieczynska, Justyna; Cavoski, Ivana; Chami, Ziad Al; Mondelli, Donato; Di Donato, Paola; Di Terlizzi, Biagio
2014-05-01
Concerns about environmental and waste problems caused by the use of non-biodegradable, non-renewable plastic packaging have caused increased interest in developing biodegradable packaging from renewable natural biopolymers. Recently, different types of biopolymers like starch, cellulose, chitosan, casein, whey protein, collagen, egg white, soybean protein, corn zein, gelatin and wheat gluten have attracted considerable attention as potential food packaging materials. Recyclable or biodegradable packaging material is preferable where possible under organic processing standards, but specific principles of packaging are not precisely defined and standards have to be assessed. There is evidence that consumers of organic products have specific expectations not only with respect to quality characteristics of processed food but also in social and environmental aspects of food production. Growing consumer sophistication is leading to a proliferation of food eco-labels such as the carbon footprint. Biopolymer-based packaging for organic products can help to create a green industry. Moreover, biopolymers can be appropriate materials for the development of active surfaces designed to deliver incorporated natural antimicrobials into the environment surrounding packaged food. Active packaging is an innovative mode of packaging in which the product and the environment interact to prolong shelf life or enhance safety or sensory properties, while maintaining the quality of the product. The work will discuss the various techniques that have been used for the development of active antimicrobial biodegradable packaging materials, focusing on recent findings in research studies, with a particular focus on exploring a new generation of biopolymer-based food packaging materials with possible applications in organic food packaging. Keywords: organic food, active packaging, biopolymers, green technology
Thornberg, Steven M [Peralta, NM
2012-07-31
A system is provided for testing the hermeticity of a package, such as a microelectromechanical systems package containing a sealed gas volume. A sampling device isolates the package and breaches the gas seal; it is connected to a pulse valve that can controllably transmit volumes as small as 2 nanoliters to a gas chamber for analysis using gas chromatography/mass spectrometry diagnostics.
Taminau, Jonatan; Meganck, Stijn; Lazar, Cosmin; Steenhoff, David; Coletta, Alain; Molter, Colin; Duque, Robin; de Schaetzen, Virginie; Weiss Solís, David Y; Bersini, Hugues; Nowé, Ann
2012-12-24
With an abundant amount of microarray gene expression data sets available through public repositories, new possibilities lie in combining multiple existing data sets. In this new context, analysis itself is no longer the problem, but retrieving and consistently integrating all this data before delivering it to the wide variety of existing analysis tools becomes the new bottleneck. We present the newly released inSilicoMerging R/Bioconductor package which, together with the earlier released inSilicoDb R/Bioconductor package, allows consistent retrieval, integration and analysis of publicly available microarray gene expression data sets. Inside the inSilicoMerging package a set of five visual and six quantitative validation measures are available as well. By providing (i) access to uniformly curated and preprocessed data, (ii) a collection of techniques to remove the batch effects between data sets from different sources, and (iii) several validation tools enabling the inspection of the integration process, these packages enable researchers to fully explore the potential of combining gene expression data for downstream analysis. The power of using both packages is demonstrated by programmatically retrieving and integrating gene expression studies from the InSilico DB repository [https://insilicodb.org/app/].
GIS-Based System of Hydrologic and Hydraulic Applications for Highway Engineering
DOT National Transportation Integrated Search
1999-10-01
In this research project, a GIS has been developed to assist in the design of highway drainage facilities by utilizing hydrologic spatial data to calculate the input parameters for standard hydrologic software packages. This GIS reduces the analysis ...
Lora, Antonio; Cosentino, Ugo; Gandini, Anna; Zocchetti, Carlo
2007-01-01
The treatment of schizophrenic disorders is the most important challenge for community care. The analysis focuses on packages of care provided to 23,602 patients with an ICD-10 diagnosis of schizophrenic disorder and treated in 2001 by the Departments of Mental Health in Lombardy, Italy. Packages of care refer to a mix of treatments provided to each patient during the year in different settings. Direct costs of the packages were calculated. Linear Discriminant Analysis (LDA) was used to link socio-demographic and diagnostic sub-groups of the patients to packages of care. People with schizophrenic disorders received relatively few care packages: only four packages each involved more than 5% of patients. Two thirds of the patients received only care provided by Community Mental Health Centres (CMHCs). In the other two packages with a percentage over 5%, the activity was provided by CMHCs jointly with General Hospitals or Day Care Facilities. Complex care packages were rare (only 6%). Both the intensity and the variety of care provided by CMHCs increased with the complexity of the care packages. In Lombardy more than half of the resources were spent on schizophrenia. The range of the costs per package was very wide. LDA failed to link characteristics of the patients to packages of care. Care packages are useful tools to understand better how the mental health system works and how resources have been spent, and to point out problems in the quality of care.
The gputools package enables GPU computing in R.
Buckner, Joshua; Wilson, Justin; Seligman, Mark; Athey, Brian; Watson, Stanley; Meng, Fan
2010-01-01
By default, the R statistical environment does not make use of parallelism. Researchers may resort to expensive solutions such as cluster hardware for large analysis tasks. Graphics processing units (GPUs) provide an inexpensive and computationally powerful alternative. Using R and the CUDA toolkit from Nvidia, we have implemented several functions commonly used in microarray gene expression analysis for GPU-equipped computers. R users can take advantage of the better performance provided by an Nvidia GPU. The package is available from CRAN, the R project's repository of packages, at http://cran.r-project.org/web/packages/gputools. More information about our gputools R package is available at http://brainarray.mbni.med.umich.edu/brainarray/Rgpgpu.
Instrumentation for motor-current signature analysis using synchronous sampling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Castleberry, K.N.
1996-07-01
Personnel in the Instrumentation and Controls Division at Oak Ridge National Laboratory, in association with the United States Enrichment Corporation, the U.S. Navy, and various Department of Energy sponsors, have been involved in the development and application of motor-current signature analysis for several years. In that time, innovation in the field has resulted in major improvements in signal processing, analysis, and system performance and capabilities. Recent work has concentrated on industrial implementation of one of the most promising new techniques. This report describes the developed method and the instrumentation package that is being used to investigate and develop potential applications.
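The core computation behind motor-current signature analysis can be sketched as follows: estimate the spectrum of the sampled stator current and inspect the sidebands around the line frequency. The waveform, sampling rate, and 5 Hz fault sidebands below are synthetic, and the actual instrumentation uses synchronous sampling and more refined processing than this simple windowed FFT:

```python
# Illustrative sketch of the basic spectral computation behind motor-current
# signature analysis: look for fault-related sidebands around the 60 Hz line
# frequency in the sampled stator current (synthetic waveform and fault frequencies).
import numpy as np

fs, dur = 2000.0, 10.0                              # sampling rate (Hz), duration (s)
t = np.arange(0, dur, 1 / fs)
current = (np.sin(2 * np.pi * 60 * t)               # line-frequency component
           + 0.02 * np.sin(2 * np.pi * 55 * t)      # lower sideband from a fault
           + 0.02 * np.sin(2 * np.pi * 65 * t)      # upper sideband from a fault
           + 0.01 * np.random.default_rng(3).normal(size=t.size))

spectrum = np.abs(np.fft.rfft(current * np.hanning(t.size))) / t.size
freqs = np.fft.rfftfreq(t.size, 1 / fs)
band = (freqs > 40) & (freqs < 80)
for f, a in sorted(zip(freqs[band], spectrum[band]), key=lambda p: -p[1])[:5]:
    print(f"{f:6.2f} Hz  amplitude {a:.4f}")
```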
Single-Crystal Sapphire Optical Fiber Sensor Instrumentation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pickrell, Gary; Scott, Brian; Wang, Anbo
2013-12-31
This report summarizes technical progress on the program “Single-Crystal Sapphire Optical Fiber Sensor Instrumentation,” funded by the National Energy Technology Laboratory of the U.S. Department of Energy, and performed by the Center for Photonics Technology of the Bradley Department of Electrical and Computer Engineering at Virginia Tech. This project was completed in three phases, each with a separate focus. Phase I of the program, from October 1999 to April 2002, was devoted to development of sensing schemes for use in high temperature, harsh environments. Different sensing designs were proposed and tested in the laboratory. Phase II of the program, from April 2002 to April 2009, focused on bringing the sensor technologies, which had already been successfully demonstrated in the laboratory, to a level where the sensors could be deployed in harsh industrial environments and eventually become commercially viable through a series of field tests. Also, a new sensing scheme was developed and tested in Phase II with numerous advantages over all previous ones. Phase III of the program, September 2009 to December 2013, focused on development of the new sensing scheme for field testing in conjunction with materials engineering to improve sensor packaging lifetimes. In Phase I, three different sensing principles were studied: sapphire air-gap extrinsic Fabry-Perot sensors; intensity-based polarimetric sensors; and broadband polarimetric sensors. Black body radiation tests and corrosion tests were also performed in this phase. The outcome of the first phase of this program was the selection of broadband polarimetric differential interferometry (BPDI) for further prototype instrumentation development. This approach is based on the measurement of the optical path difference (OPD) between two orthogonally polarized light beams in a single-crystal sapphire disk. At the beginning of Phase II, in June 2004, the BPDI sensor was tested at the Wabash River coal gasifier facility in Terre Haute, Indiana. Due to business conditions at the industrial partner and several logistical problems, this field test was not successful. An alternative high-temperature sensing system using sapphire wafer-based extrinsic Fabry-Perot interferometry was then developed as a significant improvement over the BPDI solution. From June 2006 to June 2008, three consecutive field tests were performed with the new sapphire wafer sensors at the TECO coal gasifier in Tampa, Florida. One of the sensors survived in the industrial coal gasifier for 7 months, over which time the existing thermocouples were replaced twice. The outcome of these TECO field tests suggests that the sapphire wafer sensor has very good potential to be commercialized. However, packaging and sensor protection issues need additional development. During Phase III, several major improvements in the design and fabrication process of the sensor were achieved through experiments and theoretical analysis. Studies on the properties of the key components in the sensor head, including the sapphire fiber and sapphire wafer, were also conducted for a better understanding of the sensor behavior. A final design based on all of this knowledge and experience was developed, free of the issues encountered during the earlier research. Sensors with this design performed as expected in long-term laboratory tests and were deployed in the sensing probe of the final coal-gasifier field test.
Sensor packaging and protection were improved through materials engineering and through testing of packaging designs in two blank probe packaging tests at Eastman Chemical in Kingsport, TN. Performance analysis of the blank probe packaging resulted in improved package designs, culminating in a third-generation probe packaging utilized for the full field test of the sapphire optical sensor and the materials-designed sensor packaging.
Food packaging history and innovations.
Risch, Sara J
2009-09-23
Food packaging has evolved from simply a container to hold food to something today that can play an active role in food quality. Many packages are still simply containers, but they have properties that have been developed to protect the food. These include barriers to oxygen, moisture, and flavors. Active packaging, or that which plays an active role in food quality, includes some microwave packaging as well as packaging that has absorbers built in to remove oxygen from the atmosphere surrounding the product or to provide antimicrobials to the surface of the food. Packaging has allowed access to many foods year-round that otherwise could not be preserved. It is interesting to note that some packages have actually allowed the creation of new categories in the supermarket. Examples include microwave popcorn and fresh-cut produce, which owe their existence to the unique packaging that has been developed.
Brandwein, Michael; Al-Quntar, Abed; Goldberg, Hila; Mosheyev, Gregory; Goffer, Moshe; Marin-Iniesta, Fulgencio; López-Gómez, Antonio; Steinberg, Doron
2016-01-01
Various surfaces associated with the storage and packing of food are known to harbor distinct bacterial pathogens. Conspicuously absent among the plethora of studies implicating food packaging materials and machinery is the study of corrugated cardboard packaging, the worldwide medium for transporting fresh produce. In this study, we observed the microbial communities of three different store-bought fruits and vegetables, along with their analog cardboard packaging, using high-throughput sequencing technology. We further developed an anti-biofilm polymer meant to coat corrugated cardboard surfaces and mediate bacterial biofilm growth on said surfaces. Integration of a novel thiazolidinedione derivative into the acrylic emulsion polymers was assessed using Energy Dispersive X-ray Spectrometry (EDS) analysis, and surface topography was visualized and quantified on corrugated cardboard surfaces. Biofilm growth was measured using q-PCR targeting the gene encoding 16S rRNA. Additionally, the architectural structure of the biofilm was observed using SEM. The uniform integration of the thiazolidinedione derivative TZD-6 was confirmed, and it was determined via q-PCR to reduce biofilm growth by ~80% on tested surfaces. A novel and effective method for reducing microbial load and preventing contamination on food packaging is thereby proposed.
Photonics and nanophotonics and information and communication technologies in modern food packaging.
Sarapulova, Olha; Sherstiuk, Valentyn; Shvalagin, Vitaliy; Kukhta, Aleksander
2015-01-01
The analysis of the problem of conjunction of information and communication technologies (ICT) with the packaging industry and food production was made. The perspective of combining the latest advances of nanotechnology, including nanophotonics, with ICT for creating modern smart packaging was shown. Luminescent films with zinc oxide nanoparticles, which change luminescence intensity as nano-ZnO interacts with decay compounds of food products, were investigated for active and intelligent packaging. Highly luminescent transparent films were obtained from a colloidal suspension of ZnO and polyvinylpyrrolidone (PVP). The influence of molecular mass, concentration of nano-ZnO, and film thickness on the luminescent properties of the films was studied in order to optimize the content of the compositions. The possibility of covering the obtained films with polyvinyl alcohol was considered for eliminating the water-soluble properties of PVP. The luminescent properties of films with different covers were studied. A water-insoluble composition based on ZnO stabilized with colloidal silicon dioxide and PVP in polymethylmethacrylate was developed, and the luminescent properties of the films were investigated. The compositions are non-toxic, safe, and suitable for applying to the inner surface of active and intelligent packaging by printing techniques, such as screen printing, flexography, inkjet, and pad printing.
Photonics and Nanophotonics and Information and Communication Technologies in Modern Food Packaging
NASA Astrophysics Data System (ADS)
Sarapulova, Olha; Sherstiuk, Valentyn; Shvalagin, Vitaliy; Kukhta, Aleksander
2015-05-01
The analysis of the problem of conjunction of information and communication technologies (ICT) with the packaging industry and food production was made. The perspective of combining the latest advances of nanotechnology, including nanophotonics, with ICT for creating modern smart packaging was shown. Luminescent films with zinc oxide nanoparticles, which change luminescence intensity as nano-ZnO interacts with decay compounds of food products, were investigated for active and intelligent packaging. Highly luminescent transparent films were obtained from a colloidal suspension of ZnO and polyvinylpyrrolidone (PVP). The influence of molecular mass, concentration of nano-ZnO, and film thickness on the luminescent properties of the films was studied in order to optimize the content of the compositions. The possibility of covering the obtained films with polyvinyl alcohol was considered for eliminating the water-soluble properties of PVP. The luminescent properties of films with different covers were studied. A water-insoluble composition based on ZnO stabilized with colloidal silicon dioxide and PVP in polymethylmethacrylate was developed, and the luminescent properties of the films were investigated. The compositions are non-toxic, safe, and suitable for applying to the inner surface of active and intelligent packaging by printing techniques, such as screen printing, flexography, inkjet, and pad printing.
White, Gary C.; Hines, J.E.
2004-01-01
The reality is that the statistical methods used for analysis of data depend upon the availability of software. Analysis of marked animal data is no different than the rest of the statistical field. The methods used for analysis are those that are available in reliable software packages. Thus, the critical importance of having reliable, up-to-date software available to biologists is obvious. Statisticians have continued to develop more robust models, ever expanding the suite of potential analysis methods available. But without software to implement these newer methods, they will languish in the abstract, and not be applied to the problems deserving them. In the Computers and Software Session, two new software packages are described, a comparison of implementations of methods for the estimation of nest survival is provided, and a more speculative paper about how the next generation of software might be structured is presented. Rotella et al. (2004) compare nest survival estimation with different software packages: SAS logistic regression, SAS non-linear mixed models, and Program MARK. Nests are assumed to be visited at various, possibly infrequent, intervals. All of the approaches described compute nest survival with the same likelihood, and require that the age of the nest is known to account for nests that eventually hatch. However, each approach offers advantages and disadvantages, explored by Rotella et al. (2004). Efford et al. (2004) present a new software package called DENSITY. The package computes population abundance and density from trapping arrays and other detection methods with a new and unique approach. DENSITY represents the first major addition to the analysis of trapping arrays in 20 years. Barker & White (2004) discuss how existing software such as Program MARK requires that each new model's likelihood be programmed specifically for that model. They wishfully think that future software might allow the user to combine pieces of likelihood functions together to generate estimates. The idea is interesting, and maybe some bright young statistician can work out the specifics to implement the procedure. Choquet et al. (2004) describe MSURGE, a software package that implements the multistate capture-recapture models. The unique feature of MSURGE is that the design matrix is constructed with an interpreted language called GEMACO. Because MSURGE is limited to just multistate models, the special requirements of these likelihoods can be provided. The software and methods presented in these papers give biologists and wildlife managers an expanding range of possibilities for data analysis. Although ease of use is generally getting better, it does not replace the need for understanding of the requirements and structure of the models being computed. The internet provides access to many free software packages as well as user-discussion groups to share knowledge and ideas. (A starting point for wildlife-related applications is http://www.phidot.org.)
System analyses on advanced nuclear fuel cycle and waste management
NASA Astrophysics Data System (ADS)
Cheon, Myeongguk
To evaluate the impacts of the accelerator-driven transmutation of waste (ATW) fuel cycle on a geological repository, two mathematical models are developed: a reactor system analysis model and a high-level waste (HLW) conditioning model. With the former, fission products and residual trans-uranium (TRU) contained in HLW generated from reference ATW plant operations are quantified and the reduction of the TRU inventory included in commercial spent nuclear fuel (CSNF) is evaluated. With the latter, an optimized waste loading and composition in solidification of HLW are determined and the volume reduction of waste packages associated with CSNF is evaluated. WACOM, a reactor system analysis code developed in this study for burnup calculation, is validated against ORIGEN2.1 and MCNP. WACOM is used to perform multicycle analysis for the reference lead-bismuth eutectic (LBE) cooled transmuter. By applying the results of this analysis to the reference ATW deployment scenario considered in the ATW roadmap, the HLW generated from the ATW fuel cycle is quantified and the reduction of the TRU inventory contained in CSNF is evaluated. A linear programming (LP) model has been developed for determination of an optimized waste loading and composition in solidification of HLW. The model has been applied to a US-defense HLW. The optimum waste loading evaluated by the LP model was compared with that estimated by the Defense Waste Processing Facility (DWPF) in the US, and good agreement was observed. The LP model was then applied to the volume reduction of waste packages associated with CSNF. Based on the obtained reduction factors, the expansion of Yucca Mountain Repository (YMR) capacity is evaluated. It is found that with the reference ATW system, the TRU contained in CSNF could be reduced by a factor of ~170 in terms of inventory and by a factor of ~40 in terms of toxicity under the assumed scenario. The number of waste packages related to CSNF could be reduced by a factor of ~8 in terms of volume and by a factor of ~10 on the basis of electricity generation when a sufficient cooling time for discharged spent fuel and zero process chemicals in HLW are assumed. The expansion factor of the Yucca Mountain Repository capacity is estimated to be 2.4, much smaller than the reduction factor for CSNF waste packages, due to the existence of DOE-owned spent fuel and HLW. The YMR, however, could support 10 times greater electricity generation as long as the statutory capacity for DOE-owned SNF and HLW remains unchanged. This study also showed that the reduction of the number of waste packages is strongly subject to the heat generation rate of the HLW and the amount of process chemicals contained in the HLW. For a greater reduction of the number of waste packages, a sufficient cooling time for discharged fuel and efforts to minimize the amount of process chemicals contained in HLW are crucial.
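The waste-loading optimization can be posed as a small linear program, conceptually similar to (but not) the LP model in the study; in the sketch below the oxide contents and composition limits are made up, and the objective is simply to maximize the waste fraction of the glass:

```python
# Illustrative sketch of posing HLW glass waste loading as a linear program with
# scipy.optimize.linprog: maximize the waste fraction of the glass subject to
# hypothetical composition limits (the oxide contents and limits are invented).
import numpy as np
from scipy.optimize import linprog

# decision variables: mass fractions of [waste, frit A, frit B] in the glass
c = np.array([-1.0, 0.0, 0.0])           # maximize the waste fraction
A_ub = np.array([
    [0.20, 0.00, 0.00],                  # troublesome oxide content <= 12 wt%
    [-0.10, -0.75, -0.60],               # SiO2 content >= 40 wt% (written as <=)
])
b_ub = np.array([0.12, -0.40])
A_eq = np.array([[1.0, 1.0, 1.0]])       # mass fractions sum to one
b_eq = np.array([1.0])

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, 1)] * 3, method="highs")
print("optimal waste loading:", round(res.x[0], 3))
```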
PynPoint code for exoplanet imaging
NASA Astrophysics Data System (ADS)
Amara, A.; Quanz, S. P.; Akeret, J.
2015-04-01
We announce the public release of PynPoint, a Python package that we have developed for analysing exoplanet data taken with the angular differential imaging observing technique. In particular, PynPoint is designed to model the point spread function of the central star and to subtract its flux contribution to reveal nearby faint companion planets. The current version of the package does this correction by using a principal component analysis method to build a basis set for modelling the point spread function of the observations. We demonstrate the performance of the package by reanalysing publicly available data on the exoplanet β Pictoris b, which consists of close to 24,000 individual image frames. We show that PynPoint is able to analyse this typical data in roughly 1.5 min on a Mac Pro, when the number of images is reduced by co-adding in sets of 5. The main computational work, the calculation of the Singular-Value-Decomposition, parallelises well as a result of a reliance on the SciPy and NumPy packages. For this calculation the peak memory load is 6 GB, which can be run comfortably on most workstations. A simpler calculation, by co-adding over 50, takes 3 s with a peak memory usage of 600 MB. This can be performed easily on a laptop. In developing the package we have modularised the code so that we will be able to extend functionality in future releases, through the inclusion of more modules, without it affecting the users application programming interface. We distribute the PynPoint package under GPLv3 licence through the central PyPI server, and the documentation is available online (http://pynpoint.ethz.ch).
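The principal component PSF-subtraction idea the abstract describes can be sketched in a few lines of NumPy (this is not PynPoint's implementation): build a PCA basis from the mean-subtracted image stack via an SVD, remove each frame's projection onto the leading components, and average the residuals:

```python
# Illustrative sketch of PCA-based PSF subtraction (not PynPoint's code): the PSF
# model for each frame is its projection onto the first few principal components of
# the mean-subtracted image stack; the synthetic cube below stands in for ADI data.
import numpy as np

def pca_psf_subtract(cube, n_components=5):
    """cube: (n_frames, ny, nx) stack of frames."""
    nf, ny, nx = cube.shape
    flat = cube.reshape(nf, -1)
    centered = flat - flat.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    basis = vt[:n_components]                       # principal components (PSF model)
    residual = centered - centered @ basis.T @ basis
    return residual.reshape(nf, ny, nx).mean(axis=0)

rng = np.random.default_rng(4)
yy, xx = np.mgrid[0:64, 0:64]
psf = 10 * np.exp(-((yy - 32) ** 2 + (xx - 32) ** 2) / 50)   # synthetic stellar PSF
frames = rng.normal(size=(50, 64, 64)) + psf
print(pca_psf_subtract(frames).shape)
```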
Optical sensors for application in intelligent food-packaging technology
NASA Astrophysics Data System (ADS)
McEvoy, Aisling K.; Von Bueltzingsloewen, Christoph; McDonagh, Colette M.; MacCraith, Brian D.; Klimant, Ingo; Wolfbeis, Otto S.
2003-03-01
Modified Atmosphere Packaged (MAP) food employs a protective gas mixture, which normally contains selected amounts of carbon dioxide (CO2) and oxygen (O2), in order to extend the shelf life of food. Conventional MAP analysis of package integrity involves destructive sampling of packages followed by carbon dioxide and oxygen detection. For quality control reasons, as well as to enhance food safety, the concept of optical on-pack sensors for monitoring the gas composition of the MAP package at different stages of the distribution process is very attractive. The objective of this work was to develop printable formulations of oxygen and carbon dioxide sensors for use in food packaging. Oxygen sensing is achieved by detecting the degree of quenching of a fluorescent ruthenium complex entrapped in a sol-gel matrix. In particular, a measurement technique based on the quenching of the fluorescence decay time, phase fluorometric detection, is employed. A scheme for detecting CO2 has been developed which is compatible with the oxygen detection scheme. It is fluorescence-based and uses the pH-sensitive 8-hydroxypyrene-1,3,6-trisulfonic acid (HPTS) indicator dye encapsulated in an organically modified silica (ORMOSIL) glass matrix. Dual Luminophore Referencing (DLR) has been employed as an internal referencing scheme, which provides many of the advantages of lifetime-based fluorometric methods. Oxygen cross-sensitivity was minimised by encapsulating the reference luminophore in dense sol-gel microspheres. The sensor performance compared well with standard methods for both oxygen and carbon dioxide detection. The results of preliminary on-pack print trials are presented and a preliminary design of an integrated dual gas optical read-out device is discussed.
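The phase-fluorometric oxygen measurement can be illustrated with a short sketch: the measured phase shift gives the luminescence lifetime through tan(phi) = omega*tau, and the Stern-Volmer relation tau0/tau = 1 + Ksv*[O2] then yields the oxygen level; the modulation frequency, unquenched lifetime, and Ksv below are hypothetical calibration values, not those of the sensors described:

```python
# Illustrative sketch of phase-fluorometric oxygen sensing: convert a measured phase
# shift to a luminescence lifetime, then invert the Stern-Volmer relation to obtain
# the oxygen level (all calibration constants are hypothetical).
import math

F_MOD = 20e3        # modulation frequency (Hz), hypothetical
TAU0 = 5.0e-6       # unquenched lifetime of the luminophore (s), hypothetical
KSV = 0.05          # Stern-Volmer constant per % O2, hypothetical calibration value

def oxygen_from_phase(phase_deg):
    omega = 2 * math.pi * F_MOD
    tau = math.tan(math.radians(phase_deg)) / omega   # lifetime from phase shift
    return (TAU0 / tau - 1.0) / KSV                   # Stern-Volmer inversion

for phi in (32.1, 25.0, 15.0):                        # example measured phase angles
    print(f"phase {phi:5.1f} deg -> O2 ~ {oxygen_from_phase(phi):5.1f} %")
```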
Status and Trend of Automotive Power Packaging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liang, Zhenxian
2012-01-01
Comprehensive requirements in aspects of cost, reliability, efficiency, form factor, weight, and volume for power electronics modules in modern electric drive vehicles have driven the development of automotive power packaging technology intensively. Innovation in materials, interconnections, and processing techniques is leading to enormous improvements in power modules. In this paper, the technical development of and trends in power module packaging are evaluated by examining technical details with examples of industrial products. The issues and development directions for future automotive power module packaging are also discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roeleveld, J.J.
1985-01-01
This dissertation develops a general model of technological substitution that could be of help to planners and decision makers in industry who are faced with the problems created by continual technological change. The model as presented differs from existing models in the theoretical literature because of its emphasis on analyzing current and potential technologies in an attempt to understand the underlying factors contributing to technological substitution. The general model and the cost model that is part of it belong to that step in the interactive planning cycle called the formulation of the mess. The methodology underlying the cost model is a combination of life-cycle analysis (i.e., from raw materials in nature, through all intermediate products, to waste returned to the environment) and resoumetrics, which is an engineering approach to measuring all physical inputs required to produce a certain level of output. The models are illustrated with a specific field of interest: substitution of primary packaging technologies in the US brewing industry. The physical costs of packaging beer in different containers are compared. Strategic considerations for a brewery deciding to adopt plastic packaging technology are discussed. Attention is given to another potential fruitful application of the model in the field of technology transfer to developing countries.
Localized heating/bonding techniques in MEMS packaging
NASA Astrophysics Data System (ADS)
Mabesa, J. R., Jr.; Scott, A. J.; Wu, X.; Auner, G. W.
2005-05-01
Packaging is used to protect and enable intelligent sensor systems utilized in manned/unmanned ground vehicle systems/subsystems. Because microelectromechanical systems (MEMS) are often used in these sensor or actuation products, they must interact with the surrounding environment, which may be in direct conflict with the desire to isolate the electronics for improved reliability/durability performance. For some very simple devices, performance requirements may allow a high degree of isolation from the environment (e.g., stents and accelerometers). Other more complex devices (i.e. chemical and biological analysis systems, particularly in vivo systems) present extremely complex packaging requirements. Power and communications to MEMS device arrays are also extremely problematic. The following describes the research being performed at the U.S. Army Research, Development, and Engineering Command (RDECOM) Tank and Automotive Research, Development, and Engineering Center (TARDEC), in collaboration with Wayne State University, in Detroit, MI. The focus of the packaging research is limited to six main categories: a) provision for feed-through for electrical, optical, thermal, and fluidic interfaces; b) environmental management including atmosphere, hermeticity, and temperature; c) control of stress and mechanical durability; d) management of thermal properties to minimize absorption and/or emission; e) durability and structural integrity; and f) management of RF/magnetic/electrical and optical interference and/or radiation properties and exposure.
The Role of Packaging in Solid Waste Management 1966 to 1976.
ERIC Educational Resources Information Center
Darnay, Arsen; Franklin, William E.
The goals of waste processors and packagers obviously differ: the packaging industry seeks durable container material that will be unimpaired by external factors. Until recently, no systematic analysis of the relationship between packaging and solid waste disposal had been undertaken. This three-part document defines these interactions, and the…
Teerawattananon, Yot; Tangcharoensathien, Viroj
2004-10-01
In October 2001 Thailand introduced universal healthcare coverage (UC) financed by general tax revenue. This paper aims to assess the design and content of the UC benefit package, focusing on the part of the package concerned with sexual and reproductive health (SRH). The economic concept of need, demand and supply in the process of developing the SRH package was applied to the analysis. The analysis indicated that SRH constitutes a major part of the package, including the control of communicable and non-communicable diseases, the promotion and maintenance of reproductive health, and early detection and management of reproductive health problems. In addition, the authors identified seven areas within three overlapping spheres; namely need, demand and supply. The burden of disease on reproductive conditions was used as a proxy indicator of health needs in the population; the findings of a study of private obstetric practice in public hospitals as a proxy of patients' demands; and the SRH services offered in the UC package as a proxy of general healthcare supply. The authors recommend that in order to ensure that healthcare needs match consumer demand, the inclusion of SRH services not currently offered in the package (e.g. treatment of HIV infection, abortion services) should be considered, if additional resources can be made available. Where health needs exist but consumers do not express demand, and the appropriate SRH services would provide external benefits to society (e.g. the programme for prevention of sexual and gender-related violence), policymakers are encouraged to expand and offer these services. Efforts should be made to create consumer awareness and stimulate demand. Research can play an important role in identifying the services in which supply matches demand but does not necessarily reflect the health needs of the population (e.g. unnecessary investigations and prescriptions). Where only demand or supply exists (e.g. breast cosmetic procedures and unproven effective interventions), these SRH services should be excluded from the package and left to private financing and providers, the government playing a regulatory role. Copyright 2004 Oxford University Press
Chemical Applications for Enhanced World Security
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leibman, Christopher Patrick
The purpose of this project is to reduce the complexity of chemical analysis by combining chemical and physical processing steps into one package; to develop instrumentation that costs less and is easy to use in a field laboratory by non-experts; and to develop this "chemical application" so uranium enrichment can be measured onsite, eliminating the need for radioactive sample transport.
Development of an Operational Data Assimilation Package Using NAAPS and NAVDAS
2009-12-01
EGU2008- A-l 1193, EGU General Assembly 2008. Reid, J. S., H. J. Hyer, D. L. Westphal, R. Scheffe, J. Zhang, and E. M. Prins (2008), Developing a...analysis to gauge model improvement. Included is not only the collection 5 over ocean, but also the NRL provided over land aerosol products generated
DOE Office of Scientific and Technical Information (OSTI.GOV)
Page, R.; Jones, J.R.
1997-07-01
Ensuring that safety analysis needs are met in the future is likely to lead to the development of new codes and the further development of existing codes. It is therefore advantageous to define standards for data interfaces and to develop software interfacing techniques which can readily accommodate changes when they are made. Defining interface standards is beneficial but is necessarily restricted in application if future requirements are not known in detail. Code interfacing methods are of particular relevance with the move towards automatic grid frequency response operation, where the integration of plant dynamic, core follow and fault study calculation tools is considered advantageous. This paper describes the background and features of a new code TALINK (Transient Analysis code LINKage program) used to provide a flexible interface to link the RELAP5 thermal hydraulics code with the PANTHER neutron kinetics and the SIBDYM whole plant dynamic modelling codes used by Nuclear Electric. The complete package enables the codes to be executed in parallel and provides an integrated whole plant thermal-hydraulics and neutron kinetics model. In addition the paper discusses the capabilities and pedigree of the component codes used to form the integrated transient analysis package and the details of the calculation of a postulated Sizewell B loss of offsite power fault transient.
Analysis of Variance: What Is Your Statistical Software Actually Doing?
ERIC Educational Resources Information Center
Li, Jian; Lomax, Richard G.
2011-01-01
Users assume statistical software packages produce accurate results. In this article, the authors systematically examined the Statistical Package for the Social Sciences (SPSS) and the Statistical Analysis System (SAS) for three analysis of variance (ANOVA) designs: mixed-effects ANOVA, fixed-effects analysis of covariance (ANCOVA), and nested ANOVA. For each…
49 CFR 109.9 - Transportation for examination and analysis.
Code of Federal Regulations, 2011 CFR
2011-10-01
(a) An agent may direct a package to be transported to a facility for examination and analysis... the package conforms to subchapter C of this chapter; (2) Conflicting information concerning the...
2010-06-01
scanners, readers, or imagers. These types of ADCS devices use two slightly different technologies. Laser scanners use a photodiode to measure the...structure of a ship, but the LCS utilizes modular mission packages that can be removed and replaced when the threat, environment, or mission changes...would need to support a wide array of business applications and users (Clarion, 2009). The DoD's solution to this deficiency is called IUID. IUID is a
Federal Emergency Management Information System (FEMIS) system administration guide. Version 1.2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burford, M.J.; Burnett, R.A.; Curtis, L.M.
The Federal Emergency Management Information System (FEMIS) is an emergency management planning and analysis tool that is being developed under the direction of the US Army Chemical Biological Defense Command. The FEMIS System Administration Guide defines FEMIS hardware and software requirements and gives instructions for installing the FEMIS system package. System administrators, database administrators, and general users can use this guide to install, configure, and maintain the FEMIS client software package. This document provides a description of the FEMIS environment; distribution media; data, communications, and electronic mail servers; user workstations; and system management.
A versatile computer package for mechanism analysis, part 2: Dynamics and balance
NASA Astrophysics Data System (ADS)
Davies, T.
The algorithms required for the shaking force components, the shaking moment about the crankshaft axis, and the input torque and bearing load components are discussed using the textile machine as a focus for the discussion. The example is also used to provide illustrations of the output for options on the hodograph of the shaking force vector. This provides estimates of the optimum contrarotating masses and their locations for a generalized primary Lanchester balancer. The suitability of generalized Lanchester balancers particularly for textile machinery, and the overall strategy used during the development of the package are outlined.
Resilience Among Students at the Basic Enlisted Submarine School
2016-12-01
reported resilience. The Hayes' Macro in the Statistical Package for the Social Sciences (SPSS) was used to uncover factors relevant to mediation analysis. Findings suggest that the encouragement of...
obitools: a unix-inspired software package for DNA metabarcoding.
Boyer, Frédéric; Mercier, Céline; Bonin, Aurélie; Le Bras, Yvan; Taberlet, Pierre; Coissac, Eric
2016-01-01
DNA metabarcoding offers new perspectives in biodiversity research. This recently developed approach to ecosystem study relies heavily on the use of next-generation sequencing (NGS) and thus calls upon the ability to deal with huge sequence data sets. The obitools package satisfies this requirement thanks to a set of programs specifically designed for analysing NGS data in a DNA metabarcoding context. Their capacity to filter and edit sequences while taking into account taxonomic annotation helps to set up tailor-made analysis pipelines for a broad range of DNA metabarcoding applications, including biodiversity surveys or diet analyses. The obitools package is distributed as an open source software available on the following website: http://metabarcoding.org/obitools. A Galaxy wrapper is available on the GenOuest core facility toolshed: http://toolshed.genouest.org. © 2015 John Wiley & Sons Ltd.
NASA Technical Reports Server (NTRS)
Chen, Liang-Yu; Neudeck, Philip G.; Beheim, Glenn M.; Spry, David J.; Meredith, Roger D.; Hunter, Gary W.
2015-01-01
This paper presents ceramic substrate and thick-film metallization based packaging technologies in development for 500 °C silicon carbide (SiC) electronics and sensors. Prototype high temperature ceramic chip-level packages and printed circuit boards (PCBs) based on ceramic substrates of aluminum oxide (Al2O3) and aluminum nitride (AlN) have been designed and fabricated. These ceramic substrate-based chip-level packages with gold (Au) thick-film metallization have been electrically characterized at temperatures up to 550 °C. The 96% alumina packaging system composed of chip-level packages and PCBs has been successfully tested with high temperature SiC discrete transistor devices at 500 °C for over 10,000 hours. In addition to tests in a laboratory environment, a SiC junction field-effect-transistor (JFET) with a packaging system composed of a 96% alumina chip-level package and an alumina printed circuit board was tested in low Earth orbit for eighteen months via a NASA International Space Station experiment. In addition to packaging systems for electronics, a spark-plug type sensor package based on this high temperature interconnection system for high temperature SiC capacitive pressure sensors was also developed and tested. In order to further significantly improve the performance of the packaging system for higher packaging density, higher operation frequency, power rating, and even higher temperatures, some fundamental material challenges must be addressed. This presentation will discuss previous development and some of the challenges in material science (technology) to improve high temperature dielectrics for packaging applications.
Pathview Web: user friendly pathway visualization and data integration.
Luo, Weijun; Pant, Gaurav; Bhavnasi, Yeshvant K; Blanchard, Steven G; Brouwer, Cory
2017-07-03
Pathway analysis is widely used in omics studies. Pathway-based data integration and visualization is a critical component of the analysis. To address this need, we recently developed a novel R package called Pathview. Pathview maps, integrates and renders a large variety of biological data onto molecular pathway graphs. Here we developed the Pathview Web server, to make pathway visualization and data integration accessible to all scientists, including those without special computing skills or resources. Pathview Web features an intuitive graphical web interface and a user centered design. The server not only expands the core functions of Pathview, but also provides many useful features not available in the offline R package. Importantly, the server presents a comprehensive workflow for both regular and integrated pathway analysis of multiple omics data. In addition, the server also provides a RESTful API for programmatic access and convenient integration into third-party software or workflows. Pathview Web is openly and freely accessible at https://pathview.uncc.edu/. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
Joyce, Brendan; Lee, Danny; Rubio, Alex; Ogurtsov, Aleksey; Alves, Gelio; Yu, Yi-Kuo
2018-03-15
RAId is a software package that has been actively developed for the past 10 years for computationally and visually analyzing MS/MS data. Founded on rigorous statistical methods, RAId's core program computes accurate E-values for peptides and proteins identified during database searches. Making this robust tool readily accessible to the proteomics community by developing a graphical user interface (GUI) is our main goal here. We have constructed a graphical user interface to facilitate the use of RAId on users' local machines. Written in Java, RAId_GUI not only makes it easy to execute RAId but also provides tools for data/spectra visualization, MS-product analysis, molecular isotopic distribution analysis, and graphing the retrieval versus the proportion of false discoveries. The results viewer displays the analysis results and allows users to download them. Both the knowledge-integrated organismal databases and the code package (containing source code, the graphical user interface, and a user manual) are available for download at https://www.ncbi.nlm.nih.gov/CBBresearch/Yu/downloads/raid.html .
Patel, Jayshree; Mulhall, Brian; Wolf, Heinz; Klohr, Steven; Guazzo, Dana Morton
2011-01-01
A leak test performed according to ASTM F2338-09 Standard Test Method for Nondestructive Detection of Leaks in Packages by Vacuum Decay Method was developed and validated for container-closure integrity verification of a lyophilized product in a parenteral vial package system. This nondestructive leak test method is intended for use in manufacturing as an in-process package integrity check, and for testing product stored on stability in lieu of sterility tests. Method development and optimization challenge studies incorporated artificially defective packages representing a range of glass vial wall and sealing surface defects, as well as various elastomeric stopper defects. Method validation required 3 days of random-order replicate testing of a test sample population of negative-control, no-defect packages and positive-control, with-defect packages. Positive-control packages were prepared using vials each with a single hole laser-drilled through the glass vial wall. Hole creation and hole size certification was performed by Lenox Laser. Validation study results successfully demonstrated the vacuum decay leak test method's ability to accurately and reliably detect those packages with laser-drilled holes greater than or equal to approximately 5 μm in nominal diameter. Total test time is less than 1 min per package. All development and validation studies were performed at Whitehouse Analytical Laboratories in Whitehouse, NJ, under the direction of consultant Dana Guazzo of RxPax, LLC, using a VeriPac 455 Micro Leak Test System by Packaging Technologies & Inspection (Tuckahoe, NY). Bristol Myers Squibb (New Brunswick, NJ) fully subsidized all work.
System Operations Studies : Feeder System Model. User's Manual.
DOT National Transportation Integrated Search
1982-11-01
The Feeder System Model (FSM) is one of the analytic models included in the System Operations Studies (SOS) software package developed for urban transit systems analysis. The objective of the model is to assign a proportion of the zone-to-zone travel...
EPA and EFSA approaches for Benchmark Dose modeling
Benchmark dose (BMD) modeling has become the preferred approach in the analysis of toxicological dose-response data for the purpose of deriving human health toxicity values. The software packages most often used are Benchmark Dose Software (BMDS, developed by EPA) and PROAST (de...
ERIC Educational Resources Information Center
Kurtz, Peter; And Others
This report is concerned with the implementation of two interrelated computer systems: an automatic document analysis and classification package, and an on-line interactive information retrieval system which utilizes the information gathered during the automatic classification phase. Well-known techniques developed by Salton and Dennis have been…
Analysis, annotation, and profiling of the oat seed transcriptome
USDA-ARS?s Scientific Manuscript database
Novel high-throughput next generation sequencing (NGS) technologies are providing opportunities to explore genomes and transcriptomes in a cost-effective manner. To construct a gene expression atlas of developing oat (Avena sativa) seeds, two software packages specifically designed for RNA-seq (Trin...
Preliminary design review package on air flat plate collector for solar heating and cooling system
NASA Technical Reports Server (NTRS)
1977-01-01
Guidelines to be used in the development and fabrication of a prototype air flat plate collector subsystem containing 320 square feet (10-4 ft x 8 ft panels) of collector area are presented. Topics discussed include: (1) verification plan; (2) thermal analysis; (3) safety hazard analysis; (4) drawing list; (5) special handling, installation and maintenance tools; (6) structural analysis; and (7) selected drawings.
Chang, Lin; Bi, Pengyu; Li, Xiaochen; Wei, Yun
2015-06-15
A novel trace analytical method based on solvent sublation (SS) and gas chromatography-mass spectrometry (GC-MS) was developed for the trace determination of twenty-two phthalate esters (PAEs) from plastic beverage packaging. In the solvent sublation step, the effects of solution pH, NaCl concentration, nitrogen flow rate, and sublation time on the sublation efficiency were investigated in detail, and the optimal conditions were obtained. The trace PAEs that migrated from plastic beverage packaging into food simulants were separated and concentrated by solvent sublation, and the trace target compounds in the concentrated solution were then analyzed by GC-MS. According to the European Union Regulation, food simulants comprising distilled water for normal beverages and acetic acid solution (3%) for the acidic beverage (yogurt) were prepared for migration tests. The trace analysis method showed good linearity, low limits of detection (LODs) of 1.6-183.5 ng/L, and satisfactory recoveries (67.3-113.7%). Copyright © 2015 Elsevier Ltd. All rights reserved.
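Limits of detection of the kind reported above are commonly derived from calibration-curve statistics (LOD ≈ 3.3·σ/slope). The sketch below shows that standard calculation with made-up calibration points for a single PAE; the numbers are placeholders, not data from this study.

```python
import numpy as np

# Hypothetical calibration points for one PAE: concentration (ng/L) vs. peak area.
conc = np.array([10.0, 50.0, 100.0, 500.0, 1000.0])
area = np.array([120.0, 610.0, 1190.0, 6050.0, 11900.0])

slope, intercept = np.polyfit(conc, area, 1)
residual_sd = np.std(area - (slope * conc + intercept), ddof=2)

lod = 3.3 * residual_sd / slope    # common calibration-based LOD estimate
loq = 10.0 * residual_sd / slope   # corresponding quantification limit
print(f"LOD ~ {lod:.1f} ng/L, LOQ ~ {loq:.1f} ng/L")
```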
BWR ASSEMBLY SOURCE TERMS FOR WASTE PACKAGE DESIGN
DOE Office of Scientific and Technical Information (OSTI.GOV)
T.L. Lotz
1997-02-15
This analysis is prepared by the Mined Geologic Disposal System (MGDS) Waste Package Development Department (WPDD) to provide boiling water reactor (BWR) assembly radiation source term data for use during Waste Package (WP) design. The BWR assembly radiation source terms are to be used for evaluation of radiolysis effects at the WP surface, and for personnel shielding requirements during assembly or WP handling operations. The objectives of this evaluation are to generate BWR assembly radiation source terms that bound selected groupings of BWR assemblies, with regard to assembly average burnup and cooling time, which comprise the anticipated MGDS BWR commercial spent nuclear fuel (SNF) waste stream. The source term data is to be provided in a form which can easily be utilized in subsequent shielding/radiation dose calculations. Since these calculations may also be used for Total System Performance Assessment (TSPA), with appropriate justification provided by TSPA, or radionuclide release rate analysis, the grams of each element and additional cooling times out to 25 years will also be calculated and the data included in the output files.
RNAstructure: software for RNA secondary structure prediction and analysis.
Reuter, Jessica S; Mathews, David H
2010-03-15
To understand an RNA sequence's mechanism of action, the structure must be known. Furthermore, target RNA structure is an important consideration in the design of small interfering RNAs and antisense DNA oligonucleotides. RNA secondary structure prediction, using thermodynamics, can be used to develop hypotheses about the structure of an RNA sequence. RNAstructure is a software package for RNA secondary structure prediction and analysis. It uses thermodynamics and utilizes the most recent set of nearest neighbor parameters from the Turner group. It includes methods for secondary structure prediction (using several algorithms), prediction of base pair probabilities, bimolecular structure prediction, and prediction of a structure common to two sequences. This contribution describes new extensions to the package, including a library of C++ classes for incorporation into other programs, a user-friendly graphical user interface written in JAVA, and new Unix-style text interfaces. The original graphical user interface for Microsoft Windows is still maintained. The extensions to RNAstructure serve to make RNA secondary structure prediction user-friendly. The package is available for download from the Mathews lab homepage at http://rna.urmc.rochester.edu/RNAstructure.html.
Computational models for the viscous/inviscid analysis of jet aircraft exhaust plumes
NASA Astrophysics Data System (ADS)
Dash, S. M.; Pergament, H. S.; Thorpe, R. D.
1980-05-01
Computational models which analyze viscous/inviscid flow processes in jet aircraft exhaust plumes are discussed. These models are component parts of a NASA-LaRC method for the prediction of nozzle afterbody drag. Inviscid/shock processes are analyzed by the SCIPAC code which is a compact version of a generalized shock capturing, inviscid plume code (SCIPPY). The SCIPAC code analyzes underexpanded jet exhaust gas mixtures with a self-contained thermodynamic package for hydrocarbon exhaust products and air. A detailed and automated treatment of the embedded subsonic zones behind Mach discs is provided in this analysis. Mixing processes along the plume interface are analyzed by two upgraded versions of an overlaid, turbulent mixing code (BOAT) developed previously for calculating nearfield jet entrainment. The BOATAC program is a frozen chemistry version of BOAT containing the same aircraft thermodynamic package as SCIPAC; BOATAB is an afterburning version with a self-contained aircraft (hydrocarbon/air) finite-rate chemistry package. The coupling of viscous and inviscid flow processes is achieved by an overlaid procedure with interactive effects accounted for by a displacement thickness type correction to the inviscid plume interface.
NASA Technical Reports Server (NTRS)
Dash, S. M.; Pergament, H. S.; Thorpe, R. D.
1980-01-01
Computational models which analyze viscous/inviscid flow processes in jet aircraft exhaust plumes are discussed. These models are component parts of a NASA-LaRC method for the prediction of nozzle afterbody drag. Inviscid/shock processes are analyzed by the SCIPAC code which is a compact version of a generalized shock capturing, inviscid plume code (SCIPPY). The SCIPAC code analyzes underexpanded jet exhaust gas mixtures with a self-contained thermodynamic package for hydrocarbon exhaust products and air. A detailed and automated treatment of the embedded subsonic zones behind Mach discs is provided in this analysis. Mixing processes along the plume interface are analyzed by two upgraded versions of an overlaid, turbulent mixing code (BOAT) developed previously for calculating nearfield jet entrainment. The BOATAC program is a frozen chemistry version of BOAT containing the same aircraft thermodynamic package as SCIPAC; BOATAB is an afterburning version with a self-contained aircraft (hydrocarbon/air) finite-rate chemistry package. The coupling of viscous and inviscid flow processes is achieved by an overlaid procedure with interactive effects accounted for by a displacement thickness type correction to the inviscid plume interface.
FluxPyt: a Python-based free and open-source software for 13C-metabolic flux analyses.
Desai, Trunil S; Srivastava, Shireesh
2018-01-01
13C-Metabolic flux analysis (MFA) is a powerful approach to estimate intracellular reaction rates which could be used in strain analysis and design. Processing and analysis of labeling data for calculation of fluxes and associated statistics is an essential part of MFA. However, various software currently available for data analysis employ proprietary platforms and thus limit accessibility. We developed FluxPyt, a Python-based truly open-source software package for conducting stationary 13C-MFA data analysis. The software is based on the efficient elementary metabolite unit framework. The standard deviations in the calculated fluxes are estimated using the Monte-Carlo analysis. FluxPyt also automatically creates flux maps based on a template for visualization of the MFA results. The flux distributions calculated by FluxPyt for two separate models: a small tricarboxylic acid cycle model and a larger Corynebacterium glutamicum model, were found to be in good agreement with those calculated by a previously published software. FluxPyt was tested in Microsoft™ Windows 7 and 10, as well as in Linux Mint 18.2. The availability of a free and open 13C-MFA software that works in various operating systems will enable more researchers to perform 13C-MFA and to further modify and develop the package.
FluxPyt: a Python-based free and open-source software for 13C-metabolic flux analyses
Desai, Trunil S.
2018-01-01
13C-Metabolic flux analysis (MFA) is a powerful approach to estimate intracellular reaction rates which could be used in strain analysis and design. Processing and analysis of labeling data for calculation of fluxes and associated statistics is an essential part of MFA. However, various software currently available for data analysis employ proprietary platforms and thus limit accessibility. We developed FluxPyt, a Python-based truly open-source software package for conducting stationary 13C-MFA data analysis. The software is based on the efficient elementary metabolite unit framework. The standard deviations in the calculated fluxes are estimated using the Monte-Carlo analysis. FluxPyt also automatically creates flux maps based on a template for visualization of the MFA results. The flux distributions calculated by FluxPyt for two separate models: a small tricarboxylic acid cycle model and a larger Corynebacterium glutamicum model, were found to be in good agreement with those calculated by a previously published software. FluxPyt was tested in Microsoft™ Windows 7 and 10, as well as in Linux Mint 18.2. The availability of a free and open 13C-MFA software that works in various operating systems will enable more researchers to perform 13C-MFA and to further modify and develop the package. PMID:29736347
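The Monte-Carlo estimation of flux standard deviations mentioned in the FluxPyt abstracts can be sketched generically: perturb the labeling measurements within their error model, refit, and take the spread of the refitted values. In the sketch below a toy averaging step stands in for the actual EMU-based flux fitting, which is far more involved; all numbers are illustrative assumptions, not FluxPyt's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_flux(measurements):
    """Toy stand-in for an EMU-based 13C-MFA fit: here just a mean."""
    return measurements.mean()          # placeholder "flux estimate"

# Hypothetical measured labeling fractions and their standard deviations.
measured = np.array([0.42, 0.31, 0.27])
sigma = np.array([0.01, 0.01, 0.02])

# Monte Carlo: perturb measurements within their errors and refit each time.
n_samples = 1000
flux_samples = np.array([
    fit_flux(rng.normal(measured, sigma)) for _ in range(n_samples)
])

print("flux estimate:", fit_flux(measured))
print("Monte Carlo standard deviation:", flux_samples.std(ddof=1))
```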
Unidata: 30 Years of FOSS for the Geosciences
NASA Astrophysics Data System (ADS)
Davis, E.; Ramamurthy, M. K.; Young, J. W.; Fisher, W. I.; Rew, R. K.
2015-12-01
Unidata's core mission is to serve academic research and education communities by facilitating access and use of real-time weather data. To this end, Unidata develops, distributes, and supports several Free and Open Source Software (FOSS) packages. These packages are largely focused on data management, access, analysis and visualization. This presentation will discuss the lessons Unidata has gathered over thirty years of FOSS development, support, and community building. These lessons include what it takes to be a successful FOSS organization, how to adapt to changing "best practices" and the emergence of new FOSS tools and services, and techniques for dealing with software end-of-life. We will also discuss our approach when supporting a varied user community spanning end users and software developers. Strong user support has been an important key to Unidata's successful community building.
New developments in the McStas neutron instrument simulation package
NASA Astrophysics Data System (ADS)
Willendrup, P. K.; Knudsen, E. B.; Klinkby, E.; Nielsen, T.; Farhi, E.; Filges, U.; Lefmann, K.
2014-07-01
The McStas neutron ray-tracing software package is a versatile tool for building accurate simulators of neutron scattering instruments at reactors, short- and long-pulsed spallation sources such as the European Spallation Source. McStas is extensively used for design and optimization of instruments, virtual experiments, data analysis and user training. McStas was founded as a scientific, open-source collaborative code in 1997. This contribution presents the project at its current state and gives an overview of the main new developments in McStas 2.0 (December 2012) and McStas 2.1 (expected fall 2013), including many new components, component parameter uniformisation, partial loss of backward compatibility, updated source brilliance descriptions, developments toward new tools and user interfaces, web interfaces and a new method for estimating beam losses and background from neutron optics.
Efficient analysis of large-scale genome-wide data with two R packages: bigstatsr and bigsnpr.
Privé, Florian; Aschard, Hugues; Ziyatdinov, Andrey; Blum, Michael G B
2017-03-30
Genome-wide datasets produced for association studies have dramatically increased in size over the past few years, with modern datasets commonly including millions of variants measured in tens of thousands of individuals. This increase in data size is a major challenge severely slowing down genomic analyses, leading to some software becoming obsolete and researchers having limited access to diverse analysis tools. Here we present two R packages, bigstatsr and bigsnpr, allowing the analysis of large-scale genomic data to be performed within R. To address large data size, the packages use memory-mapping for accessing data matrices stored on disk instead of in RAM. To perform data pre-processing and data analysis, the packages integrate most of the tools that are commonly used, either through transparent system calls to existing software, or through updated or improved implementations of existing methods. In particular, the packages implement fast and accurate computations of principal component analysis and association studies, functions to remove SNPs in linkage disequilibrium, and algorithms to learn polygenic risk scores on millions of SNPs. We illustrate applications of the two R packages by analyzing a case-control genomic dataset for celiac disease, performing an association study and computing polygenic risk scores. Finally, we demonstrate the scalability of the R packages by analyzing a simulated genome-wide dataset including 500,000 individuals and 1 million markers on a single desktop computer. https://privefl.github.io/bigstatsr/ & https://privefl.github.io/bigsnpr/. florian.prive@univ-grenoble-alpes.fr & michael.blum@univ-grenoble-alpes.fr. Supplementary materials are available at Bioinformatics online.
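The memory-mapping idea described above (keeping the data matrix on disk and streaming over it instead of loading it into RAM) can be illustrated in Python with numpy.memmap; this is a generic sketch of the mechanism, not the file-backed format used by bigstatsr/bigsnpr, and the matrix dimensions and file name are assumptions.

```python
import numpy as np

# Create a file-backed genotype matrix (2,000 individuals x 20,000 variants,
# int8 codes), written to disk rather than held in RAM.
geno = np.memmap("genotypes.dat", dtype=np.int8, mode="w+",
                 shape=(2_000, 20_000))
geno[:] = np.random.default_rng(0).integers(0, 3, size=geno.shape, dtype=np.int8)
geno.flush()

# Later analyses reopen the mapping read-only and stream over column blocks,
# e.g. computing per-variant allele frequencies without loading everything.
geno_ro = np.memmap("genotypes.dat", dtype=np.int8, mode="r",
                    shape=(2_000, 20_000))
block = 5_000
freqs = np.concatenate([
    geno_ro[:, j:j + block].mean(axis=0) / 2.0
    for j in range(0, geno_ro.shape[1], block)
])
print(freqs[:5])
```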
49 CFR 173.36 - Hazardous materials in Large Packagings.
Code of Federal Regulations, 2012 CFR
2012-10-01
... Packagings (e.g., 51H) are only authorized for use with flexible inner packagings. (3) Friction. The nature and thickness of the outer packaging must be such that friction during transportation is not likely to... transportation in inner packagings appropriately resistant to an increase of internal pressure likely to develop...
A ChIP-Seq Data Analysis Pipeline Based on Bioconductor Packages.
Park, Seung-Jin; Kim, Jong-Hwan; Yoon, Byung-Ha; Kim, Seon-Young
2017-03-01
Nowadays, huge volumes of chromatin immunoprecipitation-sequencing (ChIP-Seq) data are generated to increase the knowledge on DNA-protein interactions in the cell, and accordingly, many tools have been developed for ChIP-Seq analysis. Here, we provide an example of a streamlined workflow for ChIP-Seq data analysis composed of only four packages in Bioconductor: dada2, QuasR, mosaics, and ChIPseeker. 'dada2' performs trimming of the high-throughput sequencing data. 'QuasR' and 'mosaics' perform quality control and mapping of the input reads to the reference genome and peak calling, respectively. Finally, 'ChIPseeker' performs annotation and visualization of the called peaks. This workflow runs well independently of operating systems (e.g., Windows, Mac, or Linux) and processes the input fastq files into various results in one run. R code is available at github: https://github.com/ddhb/Workflow_of_Chipseq.git.
A ChIP-Seq Data Analysis Pipeline Based on Bioconductor Packages
Park, Seung-Jin; Kim, Jong-Hwan; Yoon, Byung-Ha; Kim, Seon-Young
2017-01-01
Nowadays, huge volumes of chromatin immunoprecipitation-sequencing (ChIP-Seq) data are generated to increase the knowledge on DNA-protein interactions in the cell, and accordingly, many tools have been developed for ChIP-Seq analysis. Here, we provide an example of a streamlined workflow for ChIP-Seq data analysis composed of only four packages in Bioconductor: dada2, QuasR, mosaics, and ChIPseeker. ‘dada2’ performs trimming of the high-throughput sequencing data. ‘QuasR’ and ‘mosaics’ perform quality control and mapping of the input reads to the reference genome and peak calling, respectively. Finally, ‘ChIPseeker’ performs annotation and visualization of the called peaks. This workflow runs well independently of operating systems (e.g., Windows, Mac, or Linux) and processes the input fastq files into various results in one run. R code is available at github: https://github.com/ddhb/Workflow_of_Chipseq.git. PMID:28416945
Ardin, Maude; Cahais, Vincent; Castells, Xavier; Bouaoun, Liacine; Byrnes, Graham; Herceg, Zdenko; Zavadil, Jiri; Olivier, Magali
2016-04-18
The nature of somatic mutations observed in human tumors at single gene or genome-wide levels can reveal information on past carcinogenic exposures and mutational processes contributing to tumor development. While large amounts of sequencing data are being generated, the associated analysis and interpretation of mutation patterns that may reveal clues about the natural history of cancer present complex and challenging tasks that require advanced bioinformatics skills. To make such analyses accessible to a wider community of researchers with no programming expertise, we have developed within the web-based user-friendly platform Galaxy a first-of-its-kind package called MutSpec. MutSpec includes a set of tools that perform variant annotation and use advanced statistics for the identification of mutation signatures present in cancer genomes and for comparing the obtained signatures with those published in the COSMIC database and other sources. MutSpec offers an accessible framework for building reproducible analysis pipelines, integrating existing methods and scripts developed in-house with publicly available R packages. MutSpec may be used to analyse data from whole-exome, whole-genome or targeted sequencing experiments performed on human or mouse genomes. Results are provided in various formats including rich graphical outputs. An example is presented to illustrate the package functionalities, the straightforward workflow analysis and the richness of the statistics and publication-grade graphics produced by the tool. MutSpec offers an easy-to-use graphical interface embedded in the popular Galaxy platform that can be used by researchers with limited programming or bioinformatics expertise to analyse mutation signatures present in cancer genomes. MutSpec can thus effectively assist in the discovery of complex mutational processes resulting from exogenous and endogenous carcinogenic insults.
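Mutation signature analysis of the kind MutSpec wraps is commonly based on non-negative matrix factorization (NMF) of a samples-by-96-trinucleotide-context count matrix. The sketch below uses scikit-learn's NMF on random counts purely to illustrate the decomposition; it is a generic illustration, not MutSpec's implementation, and all data and parameter choices are assumptions.

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)

# Hypothetical mutation catalogue: 30 tumours x 96 trinucleotide contexts.
counts = rng.poisson(lam=5.0, size=(30, 96)).astype(float)

# Factorise counts ~ exposures @ signatures with k candidate signatures.
model = NMF(n_components=3, init="nndsvda", max_iter=500, random_state=0)
exposures = model.fit_transform(counts)   # per-tumour signature activities
signatures = model.components_            # 3 x 96 signature profiles

# Normalise each signature so its 96 context probabilities sum to one,
# the usual form for comparison against published (e.g. COSMIC) signatures.
signatures = signatures / signatures.sum(axis=1, keepdims=True)
print(exposures.shape, signatures.shape)
```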
Packaging of silicon photonic devices: from prototypes to production
NASA Astrophysics Data System (ADS)
Morrissey, Padraic E.; Gradkowski, Kamil; Carroll, Lee; O'Brien, Peter
2018-02-01
The challenges associated with the photonic packaging of silicon devices are often underestimated, and packaging remains technically challenging. In this paper, we review some key enabling technologies that will allow us to overcome the current bottleneck in silicon photonic packaging, while also describing recent developments in standardisation, including the establishment of PIXAPP as the world's first open-access PIC packaging and assembly Pilot Line. These developments will allow the community to move from low volume prototype photonic packaged devices to large scale volume manufacturing, where the full commercialisation of PIC technology can be realised.
ERIC Educational Resources Information Center
Ahrens, Fred; Mistry, Rajendra
2005-01-01
In product engineering there often arise design analysis problems for which a commercial software package is either unavailable or cost prohibitive. Further, these calculations often require successive iterations that can be time intensive when performed by hand, thus development of a software application is indicated. This case relates to the…
NASA Astrophysics Data System (ADS)
Chen, Shuming; Wang, Dengfeng; Liu, Bo
This paper investigates the optimization of the thickness of the sound package of a passenger automobile. The major performance indexes selected to evaluate the process are the SPL of the exterior noise and the weight of the sound package, and the corresponding parameters of the sound package are the thickness of the glass wool with aluminum foil for the first layer, the thickness of the glass fiber for the second layer, and the thickness of the PE foam for the third layer. Because the process fundamentally involves multiple performance characteristics, grey relational analysis, which uses the grey relational grade as the performance index, is employed to determine the optimal combination of layer thicknesses for the designed sound package. Additionally, in order to evaluate the weighting values corresponding to the various performance characteristics, principal component analysis is used to reflect their relative importance properly and objectively. The results of the confirmation experiments show that grey relational analysis coupled with principal component analysis can successfully be applied to find the optimal combination of thicknesses for each layer of the sound package material. Therefore, the presented method can be an effective tool to reduce the vehicle exterior noise and lower the weight of the sound package. In addition, it will also be helpful for other applications in the automotive industry, such as the First Automobile Works in China, Changan Automobile in China, etc.
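A minimal sketch of the grey relational grade computation described above is given below, using made-up response values for the two criteria (exterior SPL and package weight, both smaller-the-better) and fixed weights standing in for the PCA-derived weighting; none of the numbers come from the paper.

```python
import numpy as np

# Hypothetical responses for 6 thickness combinations: [SPL in dB, weight in kg].
responses = np.array([
    [72.1, 4.8], [71.4, 5.2], [70.9, 5.6],
    [71.8, 4.6], [70.5, 5.9], [71.0, 5.1],
])
weights = np.array([0.6, 0.4])   # stand-in for the PCA-derived weights
zeta = 0.5                       # distinguishing coefficient

# Smaller-the-better normalisation of each response to [0, 1].
norm = (responses.max(axis=0) - responses) / (responses.max(axis=0) - responses.min(axis=0))

# Deviation from the ideal (all-ones) reference sequence.
delta = np.abs(1.0 - norm)
grc = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())

# Grey relational grade: weighted mean of the coefficients per experiment.
grade = grc @ weights
print("best thickness combination:", int(np.argmax(grade)) + 1)
```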
Solar heating and cooling technical data and systems analysis
NASA Technical Reports Server (NTRS)
Christensen, D. L.
1976-01-01
The accomplishments of a project to study solar heating and air conditioning are outlined. Presentation materials (data packages, slides, charts, and visual aids) were developed. Bibliographies and source materials on materials and coatings, solar water heaters, systems analysis computer models, solar collectors and solar projects were developed. Detailed MIRADS computer formats for primary data parameters were developed and updated. The following data were included: climatic, architectural, topography, heating and cooling equipment, thermal loads, and economics. Data sources in each of these areas were identified as well as solar radiation data stations and instruments.
InterFace: A software package for face image warping, averaging, and principal components analysis.
Kramer, Robin S S; Jenkins, Rob; Burton, A Mike
2017-12-01
We describe InterFace, a software package for research in face recognition. The package supports image warping, reshaping, averaging of multiple face images, and morphing between faces. It also supports principal components analysis (PCA) of face images, along with tools for exploring the "face space" produced by PCA. The package uses a simple graphical user interface, allowing users to perform these sophisticated image manipulations without any need for programming knowledge. The program is available for download in the form of an app, which requires that users also have access to the (freely available) MATLAB Runtime environment.
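The "face space" produced by PCA can be illustrated with a small NumPy sketch: flatten aligned face images into vectors, run PCA via SVD, and project faces into the component space. The image sizes and random data are placeholders, and this is not InterFace's MATLAB implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for aligned, shape-normalised face images: 50 faces of 60x40 pixels.
faces = rng.random((50, 60, 40))
X = faces.reshape(50, -1)

# PCA via SVD of the mean-centred data; rows of Vt are the components.
mean_face = X.mean(axis=0)
U, s, Vt = np.linalg.svd(X - mean_face, full_matrices=False)

# Coordinates of each face in the "face space" spanned by the first k components.
k = 10
face_space_coords = (X - mean_face) @ Vt[:k].T

# An average of several faces is the pixelwise mean, projected the same way.
average_face = X[:5].mean(axis=0)
avg_coords = (average_face - mean_face) @ Vt[:k].T
print(face_space_coords.shape, avg_coords.shape)
```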
Selection of software for mechanical engineering undergraduates
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cheah, C. T.; Yin, C. S.; Halim, T.
A major problem with the undergraduate mechanical engineering course is the limited exposure of students to software packages, coupled with the long learning curve of the existing packages. This work proposes the use of appropriate software packages across the entire mechanical engineering curriculum to ensure students get sufficient exposure to real-life design problems. A variety of software packages are highlighted as being suitable for undergraduate work in mechanical engineering, e.g. simultaneous non-linear equations; uncertainty analysis; 3-D modeling software with FEA; analysis tools for the solution of problems in thermodynamics, fluid mechanics, mechanical system design, and solid mechanics.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
This Safety Analysis Report for Packaging for the Oak Ridge Y-12 Plant for the Model DC-1 package with highly enriched uranium (HEU) oxide contents has been prepared in accordance with governing regulations from the Nuclear Regulatory Commission and the Department of Transportation and orders from the Department of Energy. The fundamental safety requirements addressed by these regulations and orders pertain to the containment of radioactive material, radiation shielding, and nuclear subcriticality. This report demonstrates how these requirements are met.
Combining multiple tools outperforms individual methods in gene set enrichment analyses.
Alhamdoosh, Monther; Ng, Milica; Wilson, Nicholas J; Sheridan, Julie M; Huynh, Huy; Wilson, Michael J; Ritchie, Matthew E
2017-02-01
Gene set enrichment (GSE) analysis allows researchers to efficiently extract biological insight from long lists of differentially expressed genes by interrogating them at a systems level. In recent years, there has been a proliferation of GSE analysis methods and hence it has become increasingly difficult for researchers to select an optimal GSE tool based on their particular dataset. Moreover, the majority of GSE analysis methods do not allow researchers to simultaneously compare gene set level results between multiple experimental conditions. The ensemble of gene set enrichment analyses (EGSEA) is a method developed for RNA-sequencing data that combines results from twelve algorithms and calculates collective gene set scores to improve the biological relevance of the highest ranked gene sets. EGSEA's gene set database contains around 25 000 gene sets from sixteen collections. It has multiple visualization capabilities that allow researchers to view gene sets at various levels of granularity. EGSEA has been tested on simulated data and on a number of human and mouse datasets and, based on biologists' feedback, consistently outperforms the individual tools that have been combined. Our evaluation demonstrates the superiority of the ensemble approach for GSE analysis, and its utility to effectively and efficiently extrapolate biological functions and potential involvement in disease processes from lists of differentially regulated genes. EGSEA is available as an R package at http://www.bioconductor.org/packages/EGSEA/ . The gene sets collections are available in the R package EGSEAdata from http://www.bioconductor.org/packages/EGSEAdata/ . monther.alhamdoosh@csl.com.au mritchie@wehi.edu.au. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
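The ensemble idea of combining results from several GSE algorithms can be illustrated generically by aggregating per-method ranks of gene sets. The aggregation rule below (mean rank across methods) is a deliberate simplification for illustration and is not EGSEA's actual scoring; the gene sets and p-values are made up.

```python
import numpy as np

gene_sets = ["Set_A", "Set_B", "Set_C", "Set_D"]

# Hypothetical p-values for the same gene sets from three different GSE methods.
pvals = np.array([
    [0.001, 0.20, 0.03, 0.50],   # method 1
    [0.004, 0.10, 0.01, 0.60],   # method 2
    [0.002, 0.30, 0.05, 0.40],   # method 3
])

# Rank gene sets within each method (1 = most significant), then average.
ranks = pvals.argsort(axis=1).argsort(axis=1) + 1
ensemble_score = ranks.mean(axis=0)

for name, score in sorted(zip(gene_sets, ensemble_score), key=lambda t: t[1]):
    print(name, score)
```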
Developing Listening and Speaking Skills. Learning Package No. 46.
ERIC Educational Resources Information Center
Hyslop, Nancy, Comp.; Smith, Carl, Ed.
Originally developed as part of a project for the Department of Defense Schools (DoDDS) system, this learning package on developing listening and speaking skills is designed for teachers who wish to upgrade or expand their teaching skills on their own. The package includes an overview of the project; a comprehensive search of the ERIC database; a…
Developing Thinking Skills through Literature. Learning Package No. 19.
ERIC Educational Resources Information Center
Collins, Norma; Smith, Carl, Comp.
Originally developed for the Department of Defense Schools (DoDDS) system, this learning package on developing thinking skills through literature is designed for teachers who wish to upgrade or expand their teaching skills on their own. The package includes a comprehensive search of the ERIC database; a lecture giving an overview on the topic; the…
Developing Oral Language. Learning Package No. 1.
ERIC Educational Resources Information Center
Hong, Zhang; Smith, Carl, Comp.
Originally developed for the Department of Defense Schools (DoDDS) system, this learning package on developing oral language is designed for teachers who wish to upgrade or expand their teaching skills on their own. The package includes a comprehensive search of the ERIC database; a lecture giving an overview on the topic; the full text of several…
Hydropy: Python package for hydrological time series handling based on Python Pandas
NASA Astrophysics Data System (ADS)
Van Hoey, Stijn; Balemans, Sophie; Nopens, Ingmar; Seuntjens, Piet
2015-04-01
Most hydrologists deal with time series frequently. Reading in time series, transforming them and extracting specific periods for visualisation are part of the daily work. Spreadsheet software is used a lot for these operations, but has some major drawbacks: it is mostly not reproducible, it is prone to errors and it is not easy to automate, which results in repetitive work when dealing with large amounts of data. Scripting languages like R and Python, on the other hand, provide flexibility, enable automation and reproducibility and, hence, increase efficiency. Python has gained popularity over the last years and currently, tools for many aspects of scientific computing are readily available in Python. Increased support for controlling and managing the dependencies between packages (e.g. the Anaconda environment) allows a wide audience to use the huge variety of available packages. Pandas is a powerful Python package for data analysis and has a lot of functionality related to time series. As such, the package is of special interest to hydrologists. Some other packages focussing on hydrology (e.g. Hydroclimpy by Pierre Gerard-Marchant and Hydropy by Javier Rovegno Campos) stopped active development, mainly due to the superior implementation of Pandas. We present a (revised) version of the Hydropy package that is inspired by the aforementioned packages and builds on the power of Pandas. The main idea is to add hydrological domain knowledge to the already existing Pandas functionalities. Besides, the package attempts to make time series handling intuitive and easy to perform, with a clear syntax. Some illustrative examples of the current implementation, starting from a Pandas DataFrame named flowdata:
Creating the object flow to work with: flow = HydroAnalysis(flowdata)
Retrieve only the data during winter (across all years): flow.get_season('winter')
Retrieve only the data during summer of 2010: flow.get_season('summer').get_year('2010'), which is equivalent to flow.get_year('2010').get_season('summer')
Retrieve only the data of July and get the peak values above the 95th percentile: flow.get_season('july').get_highpeaks(above_percentile=0.95)
Retrieve only the data between two specified days, selecting only the rising limbs: flow.get_date_range('01/10/2008', '15/2/2014').get_climbing()
Calculate the annual sum and make a plot of it: flow.frequency_resample('A', 'sum').plot()
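Since Hydropy builds on Pandas, selections of the kind shown in the examples above can also be expressed directly in plain pandas. The sketch below is a generic pandas illustration under assumed data (the column name "flow" and the synthetic series are placeholders); it is not Hydropy's API.

```python
import numpy as np
import pandas as pd

# Hypothetical daily flow series; the column name "flow" is an assumption.
idx = pd.date_range("2008-01-01", "2014-12-31", freq="D")
flowdata = pd.DataFrame(
    {"flow": np.random.default_rng(0).gamma(2.0, 5.0, len(idx))}, index=idx)

# Winter months (DJF) across all years.
winter = flowdata[flowdata.index.month.isin([12, 1, 2])]

# Summer of 2010 only.
year_2010 = flowdata.loc["2010"]
summer_2010 = year_2010[year_2010.index.month.isin([6, 7, 8])]

# July peaks above the 95th percentile.
july = flowdata[flowdata.index.month == 7]
july_peaks = july[july["flow"] > july["flow"].quantile(0.95)]

# Annual sums.
annual_sum = flowdata.groupby(flowdata.index.year).sum()
print(winter.shape, summer_2010.shape, july_peaks.shape, annual_sum.shape)
```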
Maldives. Package on population education for special interest groups developed.
1995-01-01
The Population Education Program of the Non-Formal Education Center has developed a package of Population Education for Special Interest Groups comprising a learning package and fieldworker's guide. The learning package is especially developed for teaching population education to out-of-school populations. Special interest groups in Maldives include newly married couples, adolescents, and working youth. Produced under the guidance of UNESCO, Bangkok, the package contains 36 different materials such as posters, charts, leaflets, booklets, stories, and illustrated booklets which may be taught in 36 to 45 periods. The materials deal with eight themes, namely, family size and family welfare, population and resources, delayed marriage and parenthood, responsible parenthood, population-related values and beliefs, women in development, AIDS/STD, and respect for old people. Accompanying the learning package is the fieldworker's guide used to teach the package. It contains individual guides for each of the 36 learning materials. The guide gives the titles of the materials, format, objectives of the materials, messages, target groups, and an overview of the content of each learning material. The methodologies used for teaching the learning materials include role playing, group discussion, questioning, brainstorming, survey, creative writing, problem-solving and evaluation. The package will be used by fieldworkers to conduct island-based population education courses.
Current status of circularity for aluminum from household waste in Austria.
Warrings, R; Fellner, J
2018-02-20
Aluminum (Al) represents the metal with the highest consumption growth in the last few decades. Besides its increasing usage in the transport (lightweight construction of vehicles) and building sectors, Al is used ever more frequently for household goods like packaging material, which represents a readily available source for secondary aluminum due to its short lifetime. The present paper investigates the extent to which this potential source for recycling of Al is already utilized in Austria and highlights areas for future improvements. To this end, a detailed material flow analysis for Al used in packaging & household non-packaging in 2013 was conducted. In practice, all Al flows starting from market entrance through waste collection and processing until its final recycling or disposal have been investigated. The results indicate that about 25,100 t/a (2.96 kg/cap/a) of Al packaging & household non-packaging arose as waste. At present about 9800 t/a, or 39%, are recycled as secondary Al, of which 26% is regained from separate collection and sorting, 8% from bottom ash and 5% from mechanical treatment. The type of Al packaging & household non-packaging affects the recycling rate: 82% of the total recycled quantities come from rigid packaging & household non-packaging, while only 3% of the total recycled Al derives from flexible materials. A significant amount of Al was lost during thermal waste treatment due to oxidation (10%) and insufficient recovery of Al from both waste incineration bottom ash and municipal solid waste treated in mechanical biological treatment plants (49%). Overall it can be concluded that once Al ends up in commingled waste the recovery of Al becomes less likely and its material quality is reduced. Although Austria can refer to a highly developed recycling system, the Austrian packaging industry, collection and recovery systems and waste management need to increase their efforts to comply with future recycling targets. Copyright © 2018 Elsevier Ltd. All rights reserved.
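The recycling rates quoted above follow directly from the reported mass flows; the short sketch below simply reproduces that arithmetic, with all figures taken from the abstract.

```python
# Figures quoted in the abstract (tonnes per year of Al packaging &
# household non-packaging waste in Austria, 2013).
waste_total = 25_100
recycled_total = 9_800

overall_rate = recycled_total / waste_total
print(f"overall recycling rate: {overall_rate:.0%}")        # ~39%

# Contributions by route, expressed as shares of the total waste stream.
routes = {"separate collection and sorting": 0.26,
          "bottom ash recovery": 0.08,
          "mechanical treatment": 0.05}
print(f"sum of route shares: {sum(routes.values()):.0%}")    # ~39%
```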
Variations in algorithm implementation among quantitative texture analysis software packages
NASA Astrophysics Data System (ADS)
Foy, Joseph J.; Mitta, Prerana; Nowosatka, Lauren R.; Mendel, Kayla R.; Li, Hui; Giger, Maryellen L.; Al-Hallaq, Hania; Armato, Samuel G.
2018-02-01
Open-source texture analysis software allows for the advancement of radiomics research. Variations in texture features, however, result from discrepancies in algorithm implementation. Anatomically matched regions of interest (ROIs) that captured normal breast parenchyma were placed in the magnetic resonance images (MRI) of 20 patients at two time points. Six first-order features and six gray-level co-occurrence matrix (GLCM) features were calculated for each ROI using four texture analysis packages. Features were extracted using package-specific default GLCM parameters and using GLCM parameters modified to yield the greatest consistency among packages. Relative change in the value of each feature between time points was calculated for each ROI. Distributions of relative feature value differences were compared across packages. Absolute agreement among feature values was quantified by the intra-class correlation coefficient. Among first-order features, significant differences were found for max, range, and mean, and only kurtosis showed poor agreement. All six second-order features showed significant differences using package-specific default GLCM parameters, and five second-order features showed poor agreement; with modified GLCM parameters, no significant differences among second-order features were found, and all second-order features showed poor agreement. While relative texture change discrepancies existed across packages, these differences were not significant when consistent parameters were used.
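The sensitivity of second-order features to GLCM parameters such as distance, angle, and gray-level quantization can be illustrated with a short sketch. The following is a minimal example using scikit-image (assumed available, version 0.19 or later for the graycomatrix/graycoprops names); the ROI is random stand-in data, not the breast MRI data of the study.

```python
# A minimal sketch of how GLCM parameter choices change second-order texture
# features; none of this code comes from the four packages compared in the study.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

rng = np.random.default_rng(0)
roi = rng.integers(0, 64, size=(32, 32), dtype=np.uint8)  # stand-in for a parenchyma ROI

# Two parameter sets: a "default-like" one and a "modified" one.
for levels, distance in [(64, 1), (32, 2)]:
    quantized = (roi.astype(float) * levels / 64).astype(np.uint8)
    glcm = graycomatrix(quantized, distances=[distance], angles=[0],
                        levels=levels, symmetric=True, normed=True)
    contrast = graycoprops(glcm, "contrast")[0, 0]
    homogeneity = graycoprops(glcm, "homogeneity")[0, 0]
    print(f"levels={levels}, distance={distance}: "
          f"contrast={contrast:.2f}, homogeneity={homogeneity:.3f}")
```

Running the same ROI through both parameter sets shows why features are only comparable across packages when the GLCM parameters are harmonized.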
Preliminary analysis techniques for ring and stringer stiffened cylindrical shells
NASA Technical Reports Server (NTRS)
Graham, J.
1993-01-01
This report outlines methods of analysis for the buckling of thin-walled circumferentially and longitudinally stiffened cylindrical shells. Methods of analysis for the various failure modes are presented in one cohesive package. Where applicable, more than one method of analysis for a failure mode is presented along with standard practices. The results of this report are primarily intended for use in launch vehicle design in the elastic range. A Microsoft Excel worksheet with accompanying macros has been developed to automate the analysis procedures.
Statistical principle and methodology in the NISAN system.
Asano, C
1979-01-01
The NISAN system is a new interactive statistical analysis program package constructed by an organization of Japanese statisticians. The package is suited to both statistical situations, confirmatory analysis and exploratory analysis, and is intended to capture statistical expertise and to help senior statisticians choose an optimal process of statistical analysis.
Practical fundamentals of glass, rubber, and plastic sterile packaging systems.
Sacha, Gregory A; Saffell-Clemmer, Wendy; Abram, Karen; Akers, Michael J
2010-01-01
Sterile product packaging systems consist of glass, rubber, and plastic materials that are in intimate contact with the formulation. These materials can significantly affect the stability of the formulation. The interaction between the packaging materials and the formulation can also affect the appropriate delivery of the product. Therefore, a parenteral formulation actually consists of the packaging system as well as the product that it contains. However, the majority of formulation development time only considers the product that is contained in the packaging system. Little time is spent studying the interaction of the packaging materials with the contents. Interaction between the packaging and the contents only becomes a concern when problems are encountered. For this reason, there are few scientific publications that describe the available packaging materials, their advantages and disadvantages, and their important product attributes. This article was created as a reference for product development and describes some of the packaging materials and systems that are available for parenteral products.
Interactive Multimedia Package in Ameliorating Communicative Skill in English
ERIC Educational Resources Information Center
Singaravelu, G.
2011-01-01
The study highlights the effectiveness of an Interactive Multimedia Package in developing communicative skill in English at standard VI. Present methods of developing communicative skill are ineffective in improving students' communicative competencies in English. A challenging Interactive Multimedia Package helps to enhance the…
Recent advances in biopolymers and biopolymer-based nanocomposites for food packaging materials.
Tang, X Z; Kumar, P; Alavi, S; Sandeep, K P
2012-01-01
Plastic packaging for food and non-food applications is non-biodegradable, and also uses up valuable and scarce non-renewable resources like petroleum. With the current focus on exploring alternatives to petroleum and emphasis on reduced environmental impact, research is increasingly being directed at development of biodegradable food packaging from biopolymer-based materials. The proposed paper will present a review of recent developments in biopolymer-based food packaging materials including natural biopolymers (such as starches and proteins), synthetic biopolymers (such as poly lactic acid), biopolymer blends, and nanocomposites based on natural and synthetic biopolymers. The paper will discuss the various techniques that have been used for developing cost-effective biodegradable packaging materials with optimum mechanical strength and oxygen and moisture barrier properties. This is a timely review as there has been a recent renewed interest in research studies, both in the industry and academia, towards development of a new generation of biopolymer-based food packaging materials with possible applications in other areas.
Model reduction methods for control design
NASA Technical Reports Server (NTRS)
Dunipace, K. R.
1988-01-01
Several different model reduction methods are developed and detailed implementation information is provided for those methods. Command files to implement the model reduction methods in a proprietary control law analysis and design package are presented. A comparison and discussion of the various reduction techniques is included.
Zaknun, John J; Rajabi, Hossein; Piepsz, Amy; Roca, Isabel; Dondi, Maurizio
2011-01-01
Under the auspices of the International Atomic Energy Agency, a new-generation, platform-independent, and x86-compatible software package was developed for the analysis of scintigraphic renal dynamic imaging studies. It provides nuclear medicine professionals cost-free access to the most recent developments in the field. The software package is a step forward towards harmonization and standardization. Embedded functionalities render it a suitable tool for education, research, and obtaining remote expert opinions. Another objective of this effort is to introduce clinically useful parameters of drainage, including normalized residual activity and outflow efficiency. Furthermore, it provides an effective teaching tool for young professionals who are being introduced to dynamic kidney studies through selected teaching case studies. The software facilitates a better understanding by letting users explore different variables and settings and their effect on the numerical results. An effort was made to introduce instruments of quality assurance at the various levels of the program's execution, including visual inspection, automatic detection and correction of patient motion, automatic placement of regions of interest around the kidneys and cortical regions, and placement of reproducible background regions on both primary dynamic and postmicturition studies. The user can calculate the differential renal function through two independent methods, the integral or the Rutland-Patlak approach. Standardized digital reports, storage and retrieval of regions of interest, and built-in database operations allow the generation and tracing of full image reports and numerical outputs. The software package is undergoing quality assurance procedures to verify accuracy and interuser reproducibility, with the final aim of launching the program for use by professionals and teaching institutions worldwide.
Electromagnetic Field Effects in Semiconductor Crystal Growth
NASA Technical Reports Server (NTRS)
Dulikravich, George S.
1996-01-01
This proposed two-year research project was to involve development of an analytical model, a numerical algorithm for its integration, and software for the analysis of a solidification process under the influence of electric and magnetic fields in microgravity. Due to the complexity of the analytical model that was developed and its boundary conditions, only a preliminary version of the numerical algorithm was developed, while development of the software package was not completed.
An experimental analysis of the effectiveness and sustainability of a Chinese tutoring package.
Wu, Hang; Miller, L Keith
2012-01-01
This experiment evaluated the effects of training tutors to use an instructional package to teach pronunciation and translation of the Chinese language. Tutors' correct use of the package increased from 68% of trials to 92% after training, and student correct pronunciation increased from 45% to 90%, with similar effects for translation. Continued use of the package, high social validity, and extended follow-up suggest that use of the package may be sustainable.
NASA Astrophysics Data System (ADS)
Anwar, R. W.; Sugiarto; Warsiki, E.
2018-03-01
Contamination of products after processing, during storage, distribution and marketing, is one of the main causes of food safety issues. Handling of food products after processing can be addressed during the packaging process. Antimicrobial (AM) active packaging is one concept of packaging product development that utilizes the interaction between the product and the packaging environment to delay bacterial damage by killing bacteria or reducing bacterial growth. The active system is formed by incorporating an antimicrobial agent into a packaging matrix that functions as a carrier. Many incorporation methods have been developed for this packaging concept, including direct mixing, polishing, and encapsulation. The aims of this research were to examine the differences in AM packaging performance, including stability and effectiveness, produced by the three different methods. The stability of the packaging function was analyzed by looking at the diffusivity of the active ingredient into the matrix using SEM. The effectiveness was analyzed by the ability of the packaging to prevent microbial growth. The results showed that different incorporation methods resulted in different characteristics of the AM packaging.
Comparison of TAPS Packages for Engineering
ERIC Educational Resources Information Center
Sidhu, S. Manjit
2008-01-01
Purpose: This paper aims to present the development of technology-assisted problem solving (TAPS) packages at University Tenaga Nasional (UNITEN). The project extends earlier work on the development of interactive multimedia-based packages targeted at students who have problems understanding the subject of engineering mechanics dynamics.…
Analysis of the Variable Pressure Growth Chamber using the CASE/A simulation package
NASA Technical Reports Server (NTRS)
Mcfadden, Carl D.; Edeen, Marybeth A.
1992-01-01
A computer simulation of the Variable Pressure Growth Chamber (VPGC), located at the NASA Johnson Space Center, has been developed using the Computer Aided Systems Engineering and Analysis (CASE/A) package. The model has been used to perform several analyses of the VPGC. The analyses consisted of a study of the effects of a human metabolic load on the VPGC and a study of two new configurations for the temperature and humidity control (THC) subsystem in the VPGC. The objective of the human load analysis was to study the effects of a human metabolic load on the air revitalization and THC subsystems. This included the effects on the quantity of carbon dioxide injected and oxygen removed from the chamber and the effects of the additional sensible and latent heat loads. The objective of the configuration analysis was to compare the two new THC configurations against the current THC configuration to determine which had the best performance.
Dunet, Vincent; Klein, Ran; Allenbach, Gilles; Renaud, Jennifer; deKemp, Robert A; Prior, John O
2016-06-01
Several analysis software packages for myocardial blood flow (MBF) quantification from cardiac PET studies exist, but they have not been compared using concordance analysis, which can characterize precision and bias separately. Reproducible measurements are needed for quantification to fully develop its clinical potential. Fifty-one patients underwent dynamic Rb-82 PET at rest and during adenosine stress. Data were processed with PMOD and FlowQuant (Lortie model). MBF and myocardial flow reserve (MFR) polar maps were quantified and analyzed using a 17-segment model. Comparisons used Pearson's correlation ρ (measuring precision), Bland-Altman limits of agreement and Lin's concordance correlation ρc = ρ·Cb (Cb measuring systematic bias). Lin's concordance and Pearson's correlation values were very similar, suggesting no systematic bias between software packages, with an excellent precision ρ for MBF (ρ = 0.97, ρc = 0.96, Cb = 0.99) and good precision for MFR (ρ = 0.83, ρc = 0.76, Cb = 0.92). On a per-segment basis, no mean bias was observed on Bland-Altman plots, although PMOD provided slightly higher values than FlowQuant at higher MBF and MFR values (P < .0001). Concordance between software packages was excellent for MBF and MFR, despite higher values by PMOD at higher MBF values. Both software packages can be used interchangeably for quantification in daily practice of Rb-82 cardiac PET.
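For readers unfamiliar with the decomposition ρc = ρ·Cb, the sketch below computes Pearson's ρ, Lin's concordance correlation, and the bias factor Cb with NumPy. The per-segment MBF values are hypothetical placeholders, not data from the study.

```python
import numpy as np

def concordance_stats(x, y):
    """Return Pearson's rho (precision), Lin's rho_c, and the bias factor C_b."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    rho = np.corrcoef(x, y)[0, 1]
    sx, sy = x.std(), y.std()
    # Lin's concordance correlation coefficient
    rho_c = 2 * rho * sx * sy / (sx**2 + sy**2 + (x.mean() - y.mean())**2)
    c_b = rho_c / rho  # systematic-bias factor (1.0 means no systematic bias)
    return rho, rho_c, c_b

# hypothetical per-segment MBF values (mL/min/g) from two software packages
pkg_a = np.array([0.8, 1.1, 0.9, 2.4, 2.9, 1.7])
pkg_b = np.array([0.9, 1.0, 1.0, 2.6, 3.1, 1.8])
print(concordance_stats(pkg_a, pkg_b))
```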
Mangaraj, S; K Goswami, T; Mahajan, P V
2015-07-01
MAP is a dynamic system in which respiration of the packaged product and gas permeation through the packaging film take place simultaneously. The desired level of O2 and CO2 in a package is achieved by matching the film permeation rates for O2 and CO2 with the respiration rate of the packaged product. A mathematical model for MAP of fresh fruits applying an enzyme-kinetics-based respiration equation coupled with an Arrhenius-type model was developed. The model was solved numerically using a MATLAB program. The model was used to determine the time to reach the equilibrium concentration inside the MA package and the level of O2 and CO2 concentration at the equilibrium state. The developed model for prediction of equilibrium O2 and CO2 concentrations was validated using experimental data for MA packaging of apple, guava and litchi.
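A minimal sketch of this kind of gas-balance model is shown below: Michaelis-Menten (enzyme-kinetics) respiration coupled with film permeation, integrated with SciPy rather than MATLAB. All parameter values (permeabilities, respiration constants, package geometry) are illustrative assumptions, not the fitted values of the paper.

```python
# Sketch of a MAP headspace balance: O2 consumed by Michaelis-Menten respiration,
# O2/CO2 exchanged with ambient air through the film. Parameters are placeholders.
import numpy as np
from scipy.integrate import solve_ivp

W = 0.5                           # produce mass, kg
V = 2.0e-3                        # package free volume, m^3
A, L = 0.05, 30e-6                # film area (m^2) and thickness (m)
P_O2, P_CO2 = 1.0e-12, 4.0e-12    # film permeabilities, mol·m/(m^2·s·Pa) (illustrative)
Vm, Km, RQ = 1.0e-7, 3.0, 1.0     # respiration parameters (mol/kg/s, kPa) and quotient
y_O2_out, y_CO2_out = 21.0, 0.03  # ambient partial pressures, kPa
to_kPa_per_mol = 8.314 * 293 / (V * 1e3)   # converts mol/s to kPa/s in the headspace

def rhs(t, y):
    o2, co2 = y                                # in-package partial pressures, kPa
    r_o2 = Vm * o2 / (Km + o2)                 # Michaelis-Menten O2 uptake, mol/kg/s
    perm_o2 = P_O2 * A / L * (y_O2_out - o2) * 1e3     # kPa difference -> Pa
    perm_co2 = P_CO2 * A / L * (y_CO2_out - co2) * 1e3
    d_o2 = to_kPa_per_mol * (perm_o2 - r_o2 * W)
    d_co2 = to_kPa_per_mol * (perm_co2 + RQ * r_o2 * W)
    return [d_o2, d_co2]

sol = solve_ivp(rhs, (0, 5 * 24 * 3600), [21.0, 0.03], max_step=600)
print("O2, CO2 near equilibrium (kPa):", sol.y[0, -1], sol.y[1, -1])
```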
Developing a Decision-Making Plan for the Reading Teacher. Learning Package No. 25.
ERIC Educational Resources Information Center
Smith, Carl, Comp.
Originally developed for the Department of Defense Schools (DoDDS) system, this learning package on developing a decision-making plan for the reading teacher is designed for teachers who wish to upgrade or expand their teaching skills on their own. The package includes a comprehensive search of the ERIC database; a lecture giving an overview on…
ERIC Educational Resources Information Center
Mustofa; Yuwana, H. Setya
2016-01-01
Literature learning should be undertaken to instill recognition, familiarity and enjoyment of literature as a vehicle for character education. Literature learning must be packaged properly so that students become interested in composition competence, which calls for developing literature learning models. In an effort to assist students in understanding the success of…
ERIC Educational Resources Information Center
Ford, Alan R.; Burns, William A.; Reeve, Scott W.
2004-01-01
A version of the classic gas phase infrared experiment was developed for students at Arkansas State University to address the shortcomings of the rotationally resolved infrared experiment. Chem Spec II is a noncommercial Windows-based software package developed to aid in the potentially complicated problem of assigning quantum numbers to observed…
1985-06-01
of chemical analysis and sensitivity testing on material samples. At this time, these samples must be packaged and... preparation at a rate of three samples per hour. One analyst doing both sample preparation and the HPLC analysis can run 16 samples in an 8-hour day. ...study, sensitivity testing was reviewed to enable recommendations for complete analysis of contaminated soils. Materials handling techniques,
2007-10-01
Regional Morphology Empirical Analysis Package (RMAP): Orthogonal Function Analysis, Background and Examples. ERDC TN-SWWRP-07-9, October 2007.
2011-11-01
Oxygen and Water Vapor Transmission Rate for Non-Retort Military Ration Packaging, by Danielle Froio, Alan Wright, Nicole Favreau, and Sarah... Keywords: retort, storage, shelf life, retort pouches, sensory analysis, oxygen, crackers, packaging. Figure: Packaging for MRE. (a) MRE Retort Pouch Quad-Laminate Structure; (b) MRE Non-Retort Pouch Tri-Laminate Structure.
ERIC Educational Resources Information Center
Logan, Robert S.
The authoring process and authoring aids which facilitate development of instructional materials have recently emerged as an area of concern in the field of instructional systems development (ISD). This process includes information gathering, its conversion to learning packages, its revision, and its formal publication. The purpose of this…
Software design for analysis of multichannel intracardial and body surface electrocardiograms.
Potse, Mark; Linnenbank, André C; Grimbergen, Cornelis A
2002-11-01
Analysis of multichannel ECG recordings (body surface maps (BSMs) and intracardial maps) requires special software. We created a software package and a user interface on top of a commercial data analysis package (MATLAB) by a combination of high-level and low-level programming. Our software was created to satisfy the needs of a diverse group of researchers. It can handle a large variety of recording configurations. It allows for interactive usage through a fast and robust user interface, and batch processing for the analysis of large amounts of data. The package is user-extensible, includes routines for both common and experimental data processing tasks, and works on several computer platforms. The source code is made intelligible using software for structured documentation and is available to the users. The package is currently used by more than ten research groups analysing ECG data worldwide.
NASA Technical Reports Server (NTRS)
Cruse, T. A.
1987-01-01
The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included as well as three types of structural models - probabilistic finite-element method (PFEM); probabilistic approximate analysis methods (PAAM); and probabilistic boundary element methods (PBEM). The purpose in doing probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.
NASA Technical Reports Server (NTRS)
Cruse, T. A.; Burnside, O. H.; Wu, Y.-T.; Polch, E. Z.; Dias, J. B.
1988-01-01
The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included as well as three types of structural models - probabilistic finite-element method (PFEM); probabilistic approximate analysis methods (PAAM); and probabilistic boundary element methods (PBEM). The purpose in doing probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.
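As a toy illustration of the kind of output these tools target, the sketch below estimates a cumulative probability of exceedance for maximum stress by plain Monte Carlo sampling of uncertain load, geometry, and strength. The limit-state model is a simple axial-stress placeholder and is not one of the PFEM/PAAM/PBEM formulations described above.

```python
# Toy Monte Carlo probability-of-exceedance (1 - CDF) calculation for a structural
# response; all distributions and parameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
load = rng.normal(50e3, 5e3, n)                           # axial load, N
area = rng.normal(4.0e-4, 2.0e-5, n)                      # cross-section area, m^2
yield_strength = rng.lognormal(np.log(250e6), 0.05, n)    # Pa

max_stress = load / area
thresholds = np.linspace(80e6, 200e6, 7)
exceedance = [(max_stress > t).mean() for t in thresholds]  # empirical exceedance probability
p_yield = (max_stress > yield_strength).mean()

for t, p in zip(thresholds, exceedance):
    print(f"P(max stress > {t/1e6:.0f} MPa) = {p:.4f}")
print("estimated probability of yielding:", p_yield)
```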
Metlagel, Zoltan; Kikkawa, Yayoi S; Kikkawa, Masahide
2007-01-01
Helical image analysis in combination with electron microscopy has been used to study three-dimensional structures of various biological filaments or tubes, such as microtubules, actin filaments, and bacterial flagella. A number of packages have been developed to carry out helical image analysis. Some biological specimens, however, have a symmetry break (seam) in their three-dimensional structure, even though their subunits are mostly arranged in a helical manner. We refer to these objects as "asymmetric helices". All the existing packages are designed for helically symmetric specimens, and do not allow analysis of asymmetric helical objects, such as microtubules with seams. Here, we describe Ruby-Helix, a new set of programs for the analysis of "helical" objects with or without a seam. Ruby-Helix is built on top of the Ruby programming language and is the first implementation of asymmetric helical reconstruction for practical image analysis. It also allows easier and semi-automated analysis, performing iterative unbending and accurate determination of the repeat length. As a result, Ruby-Helix enables us to analyze motor-microtubule complexes with higher throughput to higher resolution.
Space shuttle food system study: Food and beverage package development, modification 8S
NASA Technical Reports Server (NTRS)
1976-01-01
A new, highly utile rehydration package was developed for foods in zero gravity. Rehydratable foods will become more acceptable as a result of their overall rehydration capability and improved palatability. This new package design is greatly enhanced by the specified spacecraft condition of atmospheric pressure; the pressure differential between the atmosphere and the package carries the functional responsibility for rapid food rehydration without excess package manipulation by the consumer. Crew acceptance will further be enhanced by less manipulation, hotter rehydration water temperatures and the ability to hold the foods at preparation temperatures until they are consumed.
ERIC Educational Resources Information Center
Ott, Dana B.
1988-01-01
This article discusses developments in food packaging, processing, and preservation techniques in terms of packaging materials, technologies, consumer benefits, and current and potential food product applications. Covers implications due to consumer life-style changes, cost-effectiveness of packaging materials, and the ecological impact of…
Propensity Score Analysis in R: A Software Review
ERIC Educational Resources Information Center
Keller, Bryan; Tipton, Elizabeth
2016-01-01
In this article, we review four software packages for implementing propensity score analysis in R: "Matching, MatchIt, PSAgraphics," and "twang." After briefly discussing essential elements for propensity score analysis, we apply each package to a data set from the Early Childhood Longitudinal Study in order to estimate the…
User-driven integrated software lives: ``Paleomag'' paleomagnetics analysis on the Macintosh
NASA Astrophysics Data System (ADS)
Jones, Craig H.
2002-12-01
"PaleoMag," a paleomagnetics analysis package originally developed for the Macintosh operating system in 1988, allows examination of demagnetization of individual samples and analysis of directional data from collections of samples. Prior to recent reinvigorated development of the software for both Macintosh and Windows, it was widely used despite not running properly on machines and operating systems sold after 1995. This somewhat surprising situation demonstrates that there is a continued need for integrated analysis software within the earth sciences, in addition to well-developed scripting and batch-mode software. One distinct advantage of software like PaleoMag is in the ability to combine quality control with analysis within a unique graphical environment. Because such demands are frequent within the earth sciences, means of nurturing the development of similar software should be found.
Environmental Assessment of Packaging: The Consumer Point of View
Van Dam YK
1996-09-01
When marketing environmentally responsible packaged products, the producer is confronted with consumer beliefs concerning the environmental friendliness of packaging materials. When making environmentally conscious packaging decisions, these consumer beliefs should be taken into account alongside the technical guidelines. Dutch consumer perceptions of the environmental friendliness of packaged products are reported and compared with the results of a life-cycle analysis assessment. It is shown that consumers judge environmental friendliness mainly from material and returnability. Furthermore, the consumer perception of the environmental friendliness of packaging material is based on the postconsumption waste, whereas the environmental effects of production are ignored. Implications for packaging policy and for environmental policy are deduced from these consumer beliefs concerning environmental friendliness. KEY WORDS: Consumer behavior; Environment; Food; Packaging; Perception; Waste
Carter, Faustin Wirkus; Khaire, Trupti S.; Novosad, Valentyn; ...
2016-11-07
We present "scraps" (SuperConducting Analysis and Plotting Software), a Python package designed to aid in the analysis and visualization of large amounts of superconducting resonator data, specifically complex transmission as a function of frequency, acquired at many different temperatures and driving powers. The package includes a least-squares fitting engine as well as a Monte-Carlo Markov Chain sampler for sampling the posterior distribution given priors, marginalizing over nuisance parameters, and estimating covariances. A set of plotting tools for generating publication-quality figures is also provided in the package. Lastly, we discuss the functionality of the software and provide some examples of itsmore » utility on data collected from a niobium-nitride coplanar waveguide resonator fabricated at Argonne National Laboratory.« less
Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool
NASA Technical Reports Server (NTRS)
Maul, William A.; Fulton, Christopher E.
2011-01-01
This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool proved to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation - also provided with the ETA Tool software release package - that were used to generate the reports presented in the manual.
Youth Work Training Package Review: More of the Same or Radical Rationalisation?
ERIC Educational Resources Information Center
Corney, Tim; Broadbent, Robyn
2007-01-01
The development of a national youth work training package in Australia began over 15 years ago. The current package sits under the umbrella of the general Community Services Industry Training Package. The first stage of a review of this package has been completed and the subsequent report not only confirms the recent trend towards the…
Development and Use of an Open-Source, User-Friendly Package to Simulate Voltammetry Experiments
ERIC Educational Resources Information Center
Wang, Shuo; Wang, Jing; Gao, Yanjing
2017-01-01
An open-source electrochemistry simulation package has been developed that simulates the electrode processes of four reaction mechanisms and two typical electroanalysis techniques: cyclic voltammetry and chronoamperometry. Unlike other open-source simulation software, this package balances the features with ease of learning and implementation and…
Developing a Package Training System for Industry
ERIC Educational Resources Information Center
Battersby, D. L. N.
1974-01-01
The hotel and catering industry is one of Great Britain's largest. A packaged training system has been developed to satisfy the needs of this industry, an ever-growing occupational field with multiple categories. The material provided in each package outlines short pieces of instruction and helps the trainer create appropriate training. (DS)
ANALYSIS/PLOT: a graphics package for use with the SORT/ANALYSIS data bases
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sady, C.A.
1983-08-01
This report describes a graphics package that is used with the SORT/ANALYSIS data bases. The data listed by the SORT/ANALYSIS program can be presented in pie, bar, line, or Gantt chart form. Instructions for the use of the plotting program and descriptions of the subroutines are given in the report.
CentiServer: A Comprehensive Resource, Web-Based Application and R Package for Centrality Analysis.
Jalili, Mahdi; Salehzadeh-Yazdi, Ali; Asgari, Yazdan; Arab, Seyed Shahriar; Yaghmaie, Marjan; Ghavamzadeh, Ardeshir; Alimoghaddam, Kamran
2015-01-01
Various disciplines are trying to solve one of the most noteworthy queries and broadly used concepts in biology, essentiality. Centrality is a primary index and a promising method for identifying essential nodes, particularly in biological networks. The newly created CentiServer is a comprehensive online resource that provides over 110 definitions of different centrality indices, their computational methods, and algorithms in the form of an encyclopedia. In addition, CentiServer allows users to calculate 55 centralities with the help of an interactive web-based application tool and provides a numerical result as a comma separated value (csv) file format or a mapped graphical format as a graph modeling language (GML) file. The standalone version of this application has been developed in the form of an R package. The web-based application (CentiServer) and R package (centiserve) are freely available at http://www.centiserver.org/.
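A rough sense of what such a centrality calculation produces can be had with networkx (assumed available); the sketch below computes a few common indices on a stand-in graph and writes them out as CSV, the same output format the web tool offers. It does not use the CentiServer site or the centiserve R package.

```python
# Compute a few centrality indices on a stand-in network and export them as CSV.
import csv
import networkx as nx

g = nx.karate_club_graph()   # placeholder for a biological network
centralities = {
    "degree": nx.degree_centrality(g),
    "betweenness": nx.betweenness_centrality(g),
    "closeness": nx.closeness_centrality(g),
    "eigenvector": nx.eigenvector_centrality(g, max_iter=1000),
}

with open("centralities.csv", "w", newline="") as fh:
    writer = csv.writer(fh)
    writer.writerow(["node"] + list(centralities))
    for node in g.nodes:
        writer.writerow([node] + [round(centralities[m][node], 4) for m in centralities])
```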
Meteorological Instruction Software
NASA Technical Reports Server (NTRS)
1990-01-01
At Florida State University and the Naval Postgraduate School, meteorology students have the opportunity to apply theoretical studies to current weather phenomena, even prepare forecasts and see how their predictions stand up utilizing GEMPAK. GEMPAK can display data quickly in both conventional and non-traditional ways, allowing students to view multiple perspectives of the complex three-dimensional atmospheric structure. With GEMPAK, mathematical equations come alive as students do homework and laboratory assignments on the weather events happening around them. Since GEMPAK provides data on a 'today' basis, each homework assignment is new. At the Naval Postgraduate School, students are now using electronically-managed environmental data in the classroom. The School's Departments of Meteorology and Oceanography have developed the Interactive Digital Environment Analysis (IDEA) Laboratory. GEMPAK is the IDEA Lab's general purpose display package; the IDEA image processing package is a modified version of NASA's Device Management System. Bringing the graphic and image processing packages together is NASA's product, the Transportable Application Executive (TAE).
Khramtsova, Ekaterina A; Stranger, Barbara E
2017-02-01
Over the last decade, genome-wide association studies (GWAS) have generated vast amounts of analysis results, requiring development of novel tools for data visualization. Quantile–quantile (QQ) plots and Manhattan plots are classical tools which have been utilized to visually summarize GWAS results and identify genetic variants significantly associated with traits of interest. However, static visualizations are limiting in the information that can be shown. Here, we present Assocplots, a Python package for viewing and exploring GWAS results not only using classic static Manhattan and QQ plots, but also through a dynamic extension which allows to interactively visualize the relationships between GWAS results from multiple cohorts or studies. The Assocplots package is open source and distributed under the MIT license via GitHub (https://github.com/khramts/assocplots) along with examples, documentation and installation instructions. ekhramts@medicine.bsd.uchicago.edu or bstranger@medicine.bsd.uchicago.edu
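The static QQ plot that such tools build on can be sketched in a few lines of plain matplotlib; the example below uses synthetic null p-values with a handful of injected associations and is not the Assocplots API.

```python
# QQ plot of observed vs. expected -log10 p-values for a synthetic "GWAS".
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
pvals = rng.uniform(size=50_000)            # null p-values standing in for GWAS results
pvals[:50] = rng.uniform(0, 1e-6, 50)       # a handful of "associated" variants

observed = -np.log10(np.sort(pvals))
expected = -np.log10((np.arange(1, pvals.size + 1) - 0.5) / pvals.size)

plt.figure(figsize=(4, 4))
plt.plot(expected, observed, ".", markersize=2)
plt.plot([0, expected.max()], [0, expected.max()], "r--")   # y = x reference line
plt.xlabel("expected $-\\log_{10}(p)$")
plt.ylabel("observed $-\\log_{10}(p)$")
plt.tight_layout()
plt.savefig("qq_plot.png", dpi=150)
```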
Method to simulate and analyse induced stresses for laser crystal packaging technologies.
Ribes-Pleguezuelo, Pol; Zhang, Site; Beckert, Erik; Eberhardt, Ramona; Wyrowski, Frank; Tünnermann, Andreas
2017-03-20
A method to simulate induced stresses for a laser crystal packaging technique, and the consequent study of birefringent effects inside the laser cavities, has been developed. The method is based on thermo-mechanical simulations performed with ANSYS 17.0. The ANSYS results were then imported into VirtualLab Fusion software, where input/output beams were analysed in terms of wavelengths and polarization. The study was carried out in the context of a low-stress soldering technique for glass or crystal optics packaging called the solderjet bumping technique. The outcome of the analysis showed almost no difference between the input and output laser beams for the laser cavity constructed with an yttrium aluminum garnet active laser crystal, a second-harmonic-generator beta-barium borate, and an output laser mirror made of fused silica, assembled by the low-stress solderjet bumping technique.
NASA Astrophysics Data System (ADS)
Xie, Dexuan
2014-10-01
The Poisson-Boltzmann equation (PBE) is a widely used implicit solvent continuum model for calculating the electrostatic potential energy of biomolecules in ionic solvent, but its numerical solution remains a challenge due to the strong singularity and nonlinearity caused by its singular distribution source terms and exponential nonlinear terms. To deal effectively with this challenge, in this paper, new solution decomposition and minimization schemes are proposed, together with a new PBE analysis of solution existence and uniqueness. Moreover, a PBE finite element program package is developed in Python based on the FEniCS program library and GAMer, a molecular surface and volumetric mesh generation program package. Numerical tests on proteins and a nonlinear Born ball model with an analytical solution validate the new solution decomposition and minimization schemes, and demonstrate the effectiveness and efficiency of the new PBE finite element program package.
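The exponential nonlinearity that makes the PBE difficult can be illustrated on a one-dimensional dimensionless analogue, u'' = kappa^2 sinh(u), solved by Newton iteration on a finite-difference grid. The sketch below is a didactic stand-in, not the paper's FEniCS-based package or its solution-decomposition scheme; the boundary value and kappa are arbitrary.

```python
# Newton iteration for a 1D dimensionless Poisson-Boltzmann analogue
# u'' = kappa^2 sinh(u), with u fixed at both ends; finite differences on a uniform grid.
import numpy as np

n, length, kappa = 200, 5.0, 1.0
x = np.linspace(0.0, length, n)
h = x[1] - x[0]
u = np.zeros(n)
u[0] = 4.0                      # "surface" potential (illustrative); far boundary stays 0

for _ in range(30):             # Newton iteration on interior nodes
    ui = u[1:-1]
    res = (u[:-2] - 2 * ui + u[2:]) / h**2 - kappa**2 * np.sinh(ui)
    jac = (np.diag(-2.0 / h**2 - kappa**2 * np.cosh(ui))
           + np.diag(np.ones(n - 3) / h**2, 1)
           + np.diag(np.ones(n - 3) / h**2, -1))
    delta = np.linalg.solve(jac, -res)
    u[1:-1] += delta
    if np.max(np.abs(delta)) < 1e-10:
        break

print("potential at a few nodes:", u[::40])
```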
Multi-Purpose Crew Vehicle Camera Asset Planning: Imagery Previsualization
NASA Technical Reports Server (NTRS)
Beaulieu, K.
2014-01-01
Using JSC-developed and other industry-standard off-the-shelf 3D modeling, animation, and rendering software packages, the Image Science Analysis Group (ISAG) supports Orion Project imagery planning efforts through dynamic 3D simulation and realistic previsualization of ground-, vehicle-, and air-based camera output.
Brandwein, Michael; Al-Quntar, Abed; Goldberg, Hila; Mosheyev, Gregory; Goffer, Moshe; Marin-Iniesta, Fulgencio; López-Gómez, Antonio; Steinberg, Doron
2016-01-01
Various surfaces associated with the storage and packing of food are known to harbor distinct bacterial pathogens. Conspicuously absent among the plethora of studies implicating food packaging materials and machinery is the study of corrugated cardboard packaging, the worldwide medium for transporting fresh produce. In this study, we observed the microbial communities of three different store-bought fruits and vegetables, along with their analog cardboard packaging using high throughput sequencing technology. We further developed an anti-biofilm polymer meant to coat corrugated cardboard surfaces and mediate bacterial biofilm growth on said surfaces. Integration of a novel thiazolidinedione derivative into the acrylic emulsion polymers was assessed using Energy Dispersive X-ray Spectrometry (EDS) analysis and surface topography was visualized and quantified on corrugated cardboard surfaces. Biofilm growth was measured using q-PCR targeting the gene encoding 16s rRNA. Additionally, architectural structure of the biofilm was observed using SEM. The uniform integration of the thiazolidinedione derivative TZD-6 was confirmed, and it was determined via q-PCR to reduce biofilm growth by ~80% on tested surfaces. A novel and effective method for reducing microbial load and preventing contamination on food packaging is thereby proposed. PMID:26909074
M4SF-17LL010301071: Thermodynamic Database Development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zavarin, M.; Wolery, T. J.
2017-09-05
This progress report (Level 4 Milestone Number M4SF-17LL010301071) summarizes research conducted at Lawrence Livermore National Laboratory (LLNL) within the Argillite Disposal R&D Work Package Number M4SF-17LL01030107. The DR Argillite Disposal R&D control account is focused on the evaluation of important processes in the analysis of disposal design concepts and related materials for nuclear fuel disposal in clay-bearing repository media. The objectives of this work package are to develop model tools for evaluating impacts of THMC processes on long-term disposal of spent fuel in argillite rocks, and to establish the scientific basis for high thermal limits. This work is contributing to the GDSA model activities to identify gaps, develop process models, provide parameter feeds and support requirements, providing the capability for a robust repository performance assessment model by 2020.
Development of Fuel Shuffling Module for PHISICS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Allan Mabe; Andrea Alfonsi; Cristian Rabiti
2013-06-01
The PHISICS (Parallel and Highly Innovative Simulation for the INL Code System) [4] code toolkit has been under development at the Idaho National Laboratory. This package is intended to provide a modern analysis tool for reactor physics investigation. It is designed to maximize accuracy for a given availability of computational resources and to give state-of-the-art tools to the modern nuclear engineer. This is obtained by implementing several different algorithms and meshing approaches among which the user is able to choose, in order to balance computational resources and accuracy needs. The software is completely modular in order to simplify the independent development of modules by different teams and future maintenance. The package is coupled with the thermal-hydraulic code RELAP5-3D [3]. In the following, the structure of the different PHISICS modules is briefly recalled, focusing on the new shuffling module (SHUFFLE), the subject of this paper.
Safety analysis report for the SR-101 inert reservoir package
DOT National Transportation Integrated Search
1998-11-01
Department of Energy (DOE) AL Weapons Surety Division (WSD) requires the SR-101 Inert Reservoir Package to meet applicable hazardous material transportation requirements. This Safety Analysis Report (SAR) is based on requirements in place at the ...
Facilitating hydrological data analysis workflows in R: the RHydro package
NASA Astrophysics Data System (ADS)
Buytaert, Wouter; Moulds, Simon; Skoien, Jon; Pebesma, Edzer; Reusser, Dominik
2015-04-01
The advent of new technologies such as web-services and big data analytics holds great promise for hydrological data analysis and simulation. Driven by the need for better water management tools, it allows for the construction of much more complex workflows, that integrate more and potentially more heterogeneous data sources with longer tool chains of algorithms and models. With the scientific challenge of designing the most adequate processing workflow comes the technical challenge of implementing the workflow with a minimal risk for errors. A wide variety of new workbench technologies and other data handling systems are being developed. At the same time, the functionality of available data processing languages such as R and Python is increasing at an accelerating pace. Because of the large diversity of scientific questions and simulation needs in hydrology, it is unlikely that one single optimal method for constructing hydrological data analysis workflows will emerge. Nevertheless, languages such as R and Python are quickly gaining popularity because they combine a wide array of functionality with high flexibility and versatility. The object-oriented nature of high-level data processing languages makes them particularly suited for the handling of complex and potentially large datasets. In this paper, we explore how handling and processing of hydrological data in R can be facilitated further by designing and implementing a set of relevant classes and methods in the experimental R package RHydro. We build upon existing efforts such as the sp and raster packages for spatial data and the spacetime package for spatiotemporal data to define classes for hydrological data (HydroST). In order to handle simulation data from hydrological models conveniently, a HM class is defined. Relevant methods are implemented to allow for an optimal integration of the HM class with existing model fitting and simulation functionality in R. Lastly, we discuss some of the design challenges of the RHydro package, including integration with big data technologies, web technologies, and emerging data models in hydrology.
Varsos, Constantinos; Patkos, Theodore; Oulas, Anastasis; Pavloudi, Christina; Gougousis, Alexandros; Ijaz, Umer Zeeshan; Filiopoulou, Irene; Pattakos, Nikolaos; Vanden Berghe, Edward; Fernández-Guerra, Antonio; Faulwetter, Sarah; Chatzinikolaou, Eva; Pafilis, Evangelos; Bekiari, Chryssoula; Doerr, Martin; Arvanitidis, Christos
2016-01-01
Parallel data manipulation using R has previously been addressed by members of the R community; however, most of these studies produce ad hoc solutions that are not readily available to the average R user. Our targeted users, ranging from the expert ecologist/microbiologists to computational biologists, often experience difficulties in finding optimal ways to exploit the full capacity of their computational resources. In addition, improving performance of commonly used R scripts becomes increasingly difficult especially with large datasets. Furthermore, the implementations described here can be of significant interest to expert bioinformaticians or R developers. Therefore, our goals can be summarized as: (i) description of a complete methodology for the analysis of large datasets by combining capabilities of diverse R packages, (ii) presentation of their application through a virtual R laboratory (RvLab) that makes execution of complex functions and visualization of results easy and readily available to the end-user. In this paper, the novelty stems from implementations of parallel methodologies which rely on the processing of data on different levels of abstraction and the availability of these processes through an integrated portal. Parallel implementation R packages, such as the pbdMPI (Programming with Big Data - Interface to MPI) package, are used to implement Single Program Multiple Data (SPMD) parallelization on primitive mathematical operations, allowing for interplay with functions of the vegan package. The dplyr and RPostgreSQL R packages are further integrated, offering connections to dataframe-like objects (databases) as secondary storage solutions whenever memory demands exceed available RAM resources. The RvLab is running on a PC cluster, using R version 3.1.2 (2014-10-31) on an x86_64-pc-linux-gnu (64-bit) platform, and offers an intuitive virtual environment interface enabling users to perform analysis of ecological and microbial communities based on optimized vegan functions. A beta version of the RvLab is available after registration at: https://portal.lifewatchgreece.eu/.
NASA Astrophysics Data System (ADS)
Yussup, N.; Rahman, N. A. A.; Ibrahim, M. M.; Mokhtar, M.; Salim, N. A. A.; Soh@Shaari, S. C.; Azman, A.
2017-01-01
The Neutron Activation Analysis (NAA) process has been established in the Malaysian Nuclear Agency (Nuclear Malaysia) since the 1980s. Most of the established procedures, especially from sample registration to sample analysis, are performed manually. These manual procedures carried out by the NAA laboratory personnel are time consuming and inefficient. Hence, software to support system automation was developed to provide an effective method that replaces redundant manual data entries and produces a faster sample analysis and calculation process. This paper describes the design and development of automation software for the NAA process, which consists of three sub-programs. The sub-programs are sample registration; hardware control and data acquisition; and sample analysis. The data flow and connections between the sub-programs are explained. The software is developed using the National Instruments LabVIEW development package.
Developments in blade shape design for a Darrieus vertical axis wind turbine
NASA Astrophysics Data System (ADS)
Ashwill, T. D.; Leonard, T. M.
1986-09-01
A new computer program package has been developed that determines the troposkein shape for a Darrieus Vertical Axis Wind Turbine Blade with any geometrical configuration or rotation rate. This package allows users to interact and develop a buildable blade whose shape closely approximates the troposkein. Use of this package can significantly reduce flatwise mean bending stresses in the blade and increase fatigue life.
Victor, Ken G; Levac, Lauren; Timmins, Michael; Veale, James
2017-01-01
USP <1207.1> Section 3.5 states that "A deterministic leak test method having the ability to detect leaks at the product's maximum allowable leakage limit is preferred when establishing the inherent integrity of a container-closure system." Ideally, container closure integrity of parenteral packaging would be evaluated by measuring a physical property that is sensitive to the presence of any package defect that breaches package integrity by increasing its leakage above its maximum allowable leakage limit. The primary goals of the work presented herein were to demonstrate the viability of the nondestructive, deterministic method known as laser-based gas headspace analysis for evaluating container closure integrity and to provide a physical model for predicting leak rates for a variety of container volumes, headspace conditions, and defect sizes. The results demonstrate that laser-based headspace analysis provides sensitive, accurate, and reproducible measurements of the gas ingress into glass vial-stopper package assemblies that are under either diffusive or effusive leak conditions. Two different types of positive controls were examined. First, laser-drilled micro-holes in thin metal disks that were crimped on top of 15R glass vials served as positive controls with a well-characterized defect geometry. For these, a strong correlation was observed between the measured ingress parameter and the size of the defect for both diffusive and effusive conditions. Second, laser-drilled holes in the wall of glass vials served as controls that more closely simulate real-world defects. Due to their complex defect geometries, their diffusive and effusive ingress parameters did not necessarily correlate; this is an important observation that has significant implications for standardizing the characterization of container defects. Regardless, laser-based headspace analysis could readily differentiate positive and negative controls for all leak conditions, and the results provide a guide for method development of container closure integrity tests. LAY ABSTRACT: The new USP 39 <1207>, "Package Integrity Evaluation-Sterile Products", states in section 3.4.1: "tracer gas tests performed using … laser-based gas headspace analysis [have] been shown to be sensitive enough to quantitatively analyze leakage through the smallest leak paths found to pose the smallest chance of liquid leakage or microbial ingress in rigid packaging." In addition, USP <1207> also states that "for such methods, the limit of detection can be mathematically predicted on the basis of gas flow kinetics." Using the above statements as a foundation, this paper presents a theoretical basis for predicting the gas ingress through well-defined defects in product vials sealed under a variety of headspace conditions. These calculated predictions were experimentally validated by comparing them to measurements of changes in the headspace oxygen content or total pressure for several different positive controls using laser-based headspace analysis. The results demonstrated that laser-based headspace analysis can, by readily differentiating between negative controls and positive controls with a range of defect sizes on the micron scale, be used to assess container closure integrity. The work also demonstrated that caution must be used when attempting to correlate a leak rate to an idealized defect-size parameter.
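The kind of gas-flow prediction referred to above can be sketched for the simplest case: Fickian diffusion of oxygen through an idealized straight cylindrical defect into a nitrogen-purged headspace, which reduces to a first-order approach to the ambient concentration. All dimensions and the diffusion coefficient below are illustrative assumptions, and the model ignores the effusive (pressure-driven) regime treated in the paper.

```python
# Idealized diffusive oxygen ingress into a nitrogen-purged vial headspace through a
# straight cylindrical defect; geometry and transport values are illustrative only.
import numpy as np

D = 2.0e-5          # O2-in-N2 diffusion coefficient, m^2/s (approximate)
d = 5.0e-6          # defect diameter, m (a 5-micron hole)
path = 1.0e-4       # defect path length, m (disk thickness)
V = 3.0e-6          # vial headspace volume, m^3 (3 mL)
c_amb = 20.9        # ambient O2, % v/v

area = np.pi * (d / 2) ** 2
k = D * area / (path * V)                   # first-order ingress rate constant, 1/s

t = np.array([0, 1, 7, 30, 90]) * 86400.0   # seconds
c_o2 = c_amb * (1.0 - np.exp(-k * t))       # headspace O2 starting from 0%
for ti, ci in zip(t / 86400.0, c_o2):
    print(f"day {ti:5.0f}: {ci:6.3f} % O2")
```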
Food packages for Space Shuttle
NASA Technical Reports Server (NTRS)
Fohey, M. F.; Sauer, R. L.; Westover, J. B.; Rockafeller, E. F.
1978-01-01
The paper reviews food packaging techniques used in space flight missions and describes the system developed for the Space Shuttle. Attention is directed to bite-size food cubes used in Gemini, Gemini rehydratable food packages, Apollo spoon-bowl rehydratable packages, thermostabilized flex pouch for Apollo, tear-top commercial food cans used in Skylab, polyethylene beverage containers, Skylab rehydratable food package, Space Shuttle food package configuration, duck-bill septum rehydration device, and a drinking/dispensing nozzle for Space Shuttle liquids. Constraints and testing of packaging is considered, a comparison of food package materials is presented, and typical Shuttle foods and beverages are listed.
Ge, Changfeng; Cheng, Yujie; Shen, Yan
2013-01-01
This study demonstrates an attempt to predict the temperature of a perishable product such as a vaccine inside an insulated packaging container during transport through finite element analysis (FEA) modeling. In order to use standard FEA software for the simulation, an equivalent heat conduction coefficient is proposed and calculated to describe the heat transfer of the air trapped inside the insulated packaging container. The three-dimensional insulated packaging container is regarded as a combination of six panels, and the heat flow at each side panel is treated as a one-dimensional diffusion process. A transient thermal analysis was applied to simulate the heat transfer from the ambient environment to the inside of the container. Field measurements were carried out to collect temperatures during transport, and the collected data were compared to the FEA simulation results. Insulated packaging containers are used to transport temperature-sensitive products such as vaccines and other pharmaceutical products. The container is usually made of extruded polystyrene foam filled with gel packs. World Health Organization guidelines recommend that all vaccines except oral polio vaccine be distributed in an environment where the temperature ranges between +2 and +8 °C. The primary concerns in designing vaccine packaging are how much foam thickness and how many gel packs should be used to keep the temperature in the desired range, and how to prevent the vaccine from exposure to freezing temperatures. This study uses numerical simulation to predict temperature change within an insulated packaging container in the vaccine cold chain. It is our hope that this simulation will provide the vaccine industry with an alternative engineering tool to validate vaccine packaging and project thermal equilibrium within the insulated packaging container.
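The per-panel treatment described above, one-dimensional transient diffusion through each wall, can be pictured with an explicit finite-difference sketch. The Python example below is a minimal illustration with assumed foam properties, panel thickness and boundary temperatures; it is not the study's FEA model and does not include the equivalent-conduction treatment of the trapped air.

```python
# Hedged sketch: explicit 1-D transient conduction through one insulated panel
# (all parameter values are assumed for illustration).
import numpy as np

k, rho, cp = 0.03, 35.0, 1300.0   # foam conductivity W/m.K, density kg/m^3, heat capacity J/kg.K
alpha = k / (rho * cp)            # thermal diffusivity, m^2/s
L, nx = 0.04, 41                  # panel thickness 4 cm, number of grid points
dx = L / (nx - 1)
dt = 0.4 * dx**2 / alpha          # time step within the explicit stability limit

T = np.full(nx, 5.0)              # panel initially at 5 C
T_out, T_in = 30.0, 5.0           # ambient side and payload side temperatures

t, t_end = 0.0, 3600.0            # simulate one hour
while t < t_end:
    T[0], T[-1] = T_out, T_in     # fixed-temperature boundaries
    T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
    t += dt

q_in = k * (T[-2] - T[-1]) / dx   # heat flux reaching the payload side, W/m^2
print(f"Heat flux into the payload after 1 h: {q_in:.2f} W/m^2")
```

Repeating this calculation for each of the six panels and summing the resulting heat flows gives the kind of panel-by-panel energy budget the abstract describes.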
POLYSITE - An interactive package for the selection and refinement of Landsat image training sites
NASA Technical Reports Server (NTRS)
Mack, Marilyn J. P.
1986-01-01
A versatile multifunction package, POLYSITE, developed for Goddard's Land Analysis System, is described; it simplifies the process of interactively selecting and correcting the sites used to study Landsat TM and MSS images. Switching between the zoomed and nonzoomed image, cursor color and shape changes and location display, and bit-plane erase or color change are global functions that are active at all times. Local functions include manipulation of intensive study areas, new site definition, mensuration, and new image copying. The program is illustrated with the example of a full TM master scene of metropolitan Washington, DC.
Realistic Simulations of Coronagraphic Observations with WFIRST
NASA Astrophysics Data System (ADS)
Rizzo, Maxime; Zimmerman, Neil; Roberge, Aki; Lincowski, Andrew; Arney, Giada; Stark, Chris; Jansen, Tiffany; Turnbull, Margaret; WFIRST Science Investigation Team (Turnbull)
2018-01-01
We present a framework to simulate observing scenarios with the WFIRST Coronagraphic Instrument (CGI). The Coronagraph and Rapid Imaging Spectrograph in Python (crispy) is an open-source package that can be used to create CGI data products for analysis and development of post-processing routines. The software convolves time-varying coronagraphic PSFs with realistic astrophysical scenes which contain a planetary architecture, a consistent dust structure, and a background field composed of stars and galaxies. The focal plane can be read out by a WFIRST electron-multiplying CCD model directly, or passed through a WFIRST integral field spectrograph model first. Several elementary post-processing routines are provided as part of the package.
Chimera: a Bioconductor package for secondary analysis of fusion products.
Beccuti, Marco; Carrara, Matteo; Cordero, Francesca; Lazzarato, Fulvio; Donatelli, Susanna; Nadalin, Francesca; Policriti, Alberto; Calogero, Raffaele A
2014-12-15
Chimera is a Bioconductor package that organizes, annotates, analyses and validates fusions reported by different fusion detection tools; current implementation can deal with output from bellerophontes, chimeraScan, deFuse, fusionCatcher, FusionFinder, FusionHunter, FusionMap, mapSplice, Rsubread, tophat-fusion and STAR. The core of Chimera is a fusion data structure that can store fusion events detected with any of the aforementioned tools. Fusions are then easily manipulated with standard R functions or through the set of functionalities specifically developed in Chimera with the aim of supporting the user in managing fusions and discriminating false-positive results. © The Author 2014. Published by Oxford University Press.
[Evaluation of the quality of poultry meat and its processing for vacuum packaging].
Swiderski, F; Russel, S; Waszkiewicz-Robak, B; Cholewińska, E
1997-01-01
The aim of the study was to evaluate the quality of poultry meat, roasted and smoked chicken, and poultry pie packed under low and high vacuum. All investigated products were stored at +4 degrees C and evaluated by microbiological analysis. It was shown that packing under low and high vacuum inhibited the development of aerobic microorganisms, proteolytic bacteria, yeasts and moulds. Vacuum-packaged storage of poultry meat and its products stimulated the activity of anaerobic, non-spore-forming bacteria. Fast spoilage of fresh poultry meat was observed under both vacuum and conventional storage. The microbiological quality of the poultry products depended on the production technology and the microbiological quality of the raw material.
IRISpy: Analyzing IRIS Data in Python
NASA Astrophysics Data System (ADS)
Ryan, Daniel; Christe, Steven; Mumford, Stuart; Baruah, Ankit; Timothy, Shelbe; Pereira, Tiago; De Pontieu, Bart
2017-08-01
IRISpy is a new community-developed open-source software library for analysing IRIS level 2 data. It is written in Python, a free, cross-platform, general-purpose, high-level programming language. A wide array of scientific computing software packages have already been developed in Python, from numerical computation (NumPy, SciPy, etc.), to visualization and plotting (matplotlib), to solar-physics-specific data analysis (SunPy). IRISpy is currently under development as a SunPy-affiliated package, which means it depends on the SunPy library, follows similar standards and conventions, and is developed with the support of the SunPy development team. IRISpy has two primary data objects, one for analyzing slit-jaw imager data and another for analyzing spectrograph data. Both objects contain basic slicing, indexing, plotting, and animating functionality to allow users to easily inspect, reduce and analyze the data. As part of this functionality the objects can output SunPy Maps, TimeSeries, Spectra, etc. of relevant data slices for easier inspection and analysis. Work is also ongoing to provide additional data analysis functionality, including derivation of systematic measurement errors (e.g., readout noise), exposure time correction, residual wavelength calibration, radiometric calibration, and fine-scale pointing corrections. IRISpy's code base is publicly available through github.com and can be contributed to by anyone. In this poster we demonstrate IRISpy's functionality and the future goals of the project. We also encourage interested users to become involved in further developing IRISpy.
methylPipe and compEpiTools: a suite of R packages for the integrative analysis of epigenomics data.
Kishore, Kamal; de Pretis, Stefano; Lister, Ryan; Morelli, Marco J; Bianchi, Valerio; Amati, Bruno; Ecker, Joseph R; Pelizzola, Mattia
2015-09-29
Numerous methods are available to profile several epigenetic marks, providing data with different genome coverage and resolution. Large epigenomic datasets are then generated, and often combined with other high-throughput data, including RNA-seq, ChIP-seq for transcription factors (TFs) binding and DNase-seq experiments. Despite the numerous computational tools covering specific steps in the analysis of large-scale epigenomics data, comprehensive software solutions for their integrative analysis are still missing. Multiple tools must be identified and combined to jointly analyze histone marks, TFs binding and other -omics data together with DNA methylation data, complicating the analysis of these data and their integration with publicly available datasets. To overcome the burden of integrating various data types with multiple tools, we developed two companion R/Bioconductor packages. The former, methylPipe, is tailored to the analysis of high- or low-resolution DNA methylomes in several species, accommodating (hydroxy-)methyl-cytosines in both CpG and non-CpG sequence context. The analysis of multiple whole-genome bisulfite sequencing experiments is supported, while maintaining the ability of integrating targeted genomic data. The latter, compEpiTools, seamlessly incorporates the results obtained with methylPipe and supports their integration with other epigenomics data. It provides a number of methods to score these data in regions of interest, leading to the identification of enhancers, lncRNAs, and RNAPII stalling/elongation dynamics. Moreover, it allows a fast and comprehensive annotation of the resulting genomic regions, and the association of the corresponding genes with non-redundant GeneOntology terms. Finally, the package includes a flexible method based on heatmaps for the integration of various data types, combining annotation tracks with continuous or categorical data tracks. methylPipe and compEpiTools provide a comprehensive Bioconductor-compliant solution for the integrative analysis of heterogeneous epigenomics data. These packages are instrumental in providing biologists with minimal R skills a complete toolkit facilitating the analysis of their own data, or in accelerating the analyses performed by more experienced bioinformaticians.
GOplot: an R package for visually combining expression data with functional analysis.
Walter, Wencke; Sánchez-Cabo, Fátima; Ricote, Mercedes
2015-09-01
Despite the plethora of methods available for the functional analysis of omics data, obtaining a comprehensive yet detailed understanding of the results remains challenging. This is mainly due to the lack of publicly available tools for the visualization of this type of information. Here we present an R package called GOplot, based on ggplot2, for enhanced graphical representation. Our package takes the output of any general enrichment analysis and generates plots at different levels of detail: from a general overview to identify the most enriched categories (bar plot, bubble plot) to a more detailed view displaying different types of information for molecules in a given set of categories (circle plot, chord plot, cluster plot). The package provides deeper insight into omics data and allows scientists to generate insightful plots with only a few lines of code to easily communicate their findings. The R package GOplot is available via CRAN, the Comprehensive R Archive Network: http://cran.r-project.org/web/packages/GOplot. The shiny web application of the Venn diagram can be found at https://wwalter.shinyapps.io/Venn/. A detailed manual of the package with sample figures can be found at https://wencke.github.io/. Contact: fscabo@cnic.es or mricote@cnic.es. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
77 FR 14445 - Leakage Tests on Packages for Shipment of Radioactive Material
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-09
... NUCLEAR REGULATORY COMMISSION [NRC-2011-0045] Leakage Tests on Packages for Shipment of..., "Leakage Tests on Packages for Radioactive Material." ADDRESSES: You can access publicly available... Materials--Leakage Tests on Packages for Shipment" approved February 1998. The NRC staff developed and...
An Object-Oriented Serial DSMC Simulation Package
NASA Astrophysics Data System (ADS)
Liu, Hongli; Cai, Chunpei
2011-05-01
A newly developed three-dimensional direct simulation Monte Carlo (DSMC) simulation package, named GRASP ("Generalized Rarefied gAs Simulation Package"), is reported in this paper. The package utilizes the concept of a simulation engine, many C++ features, and software design patterns. It has an open architecture that can benefit further development and maintenance of the code. In order to reduce the engineering time for three-dimensional models, a hybrid grid scheme, combined with a flexible data structure written in C++, is implemented in this package. This scheme utilizes a local data structure based on the computational cell to achieve high performance on workstation processors. The data structure allows the DSMC algorithm to be parallelized very efficiently with domain decomposition, and it provides much flexibility in terms of grid types. The package can use traditional structured, unstructured or hybrid grids within the framework of a single code to model arbitrarily complex geometries and to simulate rarefied gas flows. Benchmark test cases indicate that the package has satisfactory accuracy for complex rarefied gas flows.
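The cell-local organization mentioned above can be made concrete with a small sketch. The Python example below is an illustration of the general idea only, not GRASP's C++ implementation: it bins particles into a uniform Cartesian grid of cells and assigns slabs of cells to ranks, with the grid size, particle count and rank count all assumed for the example.

```python
# Hedged sketch: a cell-local particle store with a simple slab decomposition
# (uniform Cartesian grid; illustrative sizes only).
import numpy as np
from collections import defaultdict

def cell_index(pos, lo, cell_size, ncells):
    """Map a particle position to an (i, j, k) cell index."""
    idx = np.floor((pos - lo) / cell_size).astype(int)
    return tuple(np.clip(idx, 0, np.array(ncells) - 1))

rng = np.random.default_rng(0)
lo, hi, ncells = np.zeros(3), np.ones(3), (8, 8, 8)
cell_size = (hi - lo) / np.array(ncells)
positions = rng.uniform(lo, hi, size=(10_000, 3))    # simulated particle positions

# Per-cell particle lists, the unit of work for local collision pairing
cells = defaultdict(list)
for p, pos in enumerate(positions):
    cells[cell_index(pos, lo, cell_size, ncells)].append(p)

# Domain decomposition along x: each rank owns a slab of cells
nranks = 4
rank_of_cell = {c: c[0] * nranks // ncells[0] for c in cells}
print(f"{len(cells)} occupied cells; rank 0 owns "
      f"{sum(1 for r in rank_of_cell.values() if r == 0)} of them")
```

Because collision partners are only ever drawn from within a cell, keeping the particle lists cell-local is what makes a slab (or any other) decomposition straightforward.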
Generic repository design concepts and thermal analysis (FY11).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Howard, Robert; Dupont, Mark; Blink, James A.
2011-08-01
Reference concepts for geologic disposal of used nuclear fuel and high-level radioactive waste in the U.S. are developed, including geologic settings and engineered barriers. Repository thermal analysis is demonstrated for a range of waste types from projected future, advanced nuclear fuel cycles. The results show significant differences among geologic media considered (clay/shale, crystalline rock, salt), and also that waste package size and waste loading must be limited to meet targeted maximum temperature values. In this study, the UFD R&D Campaign has developed a set of reference geologic disposal concepts for a range of waste types that could potentially be generated in advanced nuclear FCs. A disposal concept consists of three components: waste inventory, geologic setting, and concept of operations. Mature repository concepts have been developed in other countries for disposal of spent LWR fuel and HLW from reprocessing UNF, and these serve as starting points for developing this set. Additional design details and EBS concepts will be considered as the reference disposal concepts evolve. The waste inventory considered in this study includes: (1) direct disposal of SNF from the LWR fleet, including Gen III+ advanced LWRs being developed through the Nuclear Power 2010 Program, operating in a once-through cycle; (2) waste generated from reprocessing of LWR UOX UNF to recover U and Pu, and subsequent direct disposal of used Pu-MOX fuel (also used in LWRs) in a modified-open cycle; and (3) waste generated by continuous recycling of metal fuel from fast reactors operating in a TRU burner configuration, with additional TRU material input supplied from reprocessing of LWR UOX fuel. The geologic setting provides the natural barriers, and establishes the boundary conditions for performance of engineered barriers. The composition and physical properties of the host medium dictate design and construction approaches, and determine hydrologic and thermal responses of the disposal system. Clay/shale, salt, and crystalline rock media are selected as the basis for reference mined geologic disposal concepts in this study, consistent with advanced international repository programs, and previous investigations in the U.S. The U.S. pursued deep geologic disposal programs in crystalline rock, shale, salt, and volcanic rock in the years leading up to the Nuclear Waste Policy Act, or NWPA (Rechard et al. 2011). The 1987 NWPA amendment act focused the U.S. program on unsaturated, volcanic rock at the Yucca Mountain site, culminating in the 2008 license application. Additional work on unsaturated, crystalline rock settings (e.g., volcanic tuff) is not required to support this generic study. Reference disposal concepts are selected for the media listed above and for deep borehole disposal, drawing from recent work in the U.S. and internationally. The main features of the repository concepts are discussed in Section 4.5 and summarized in Table ES-1. Temperature histories at the waste package surface and a specified distance into the host rock are calculated for combinations of waste types and reference disposal concepts, specifying waste package emplacement modes. Target maximum waste package surface temperatures are identified, enabling a sensitivity study to inform the tradeoff between the quantity of waste per disposal package, and decay storage duration, with respect to peak temperature at the waste package surface.
For surface storage duration on the order of 100 years or less, waste package sizes for direct disposal of SNF are effectively limited to 4-PWR configurations (or equivalent size and output). Thermal results are summarized, along with recommendations for follow-on work including adding additional reference concepts, verification and uncertainty analysis for thermal calculations, developing descriptions of surface facilities and other system details, and cost estimation to support system-level evaluations.
Implementation and use of direct-flow connections in a coupled ground-water and surface-water model
Swain, Eric D.
1994-01-01
The U.S. Geological Survey's MODFLOW finite-difference ground-water flow model has been coupled with three surface-water packages - the MODBRANCH, River, and Stream packages - to simulate surface water and its interaction with ground water. Prior to the development of the coupling packages, the only interaction between these modeling packages was that leakage values could be passed between MODFLOW and the three surface-water packages. To facilitate wider and more flexible uses of the models, a computer program was developed and added to MODFLOW to allow direct flows or stages to be passed between any of the packages and MODFLOW. The flows or stages calculated in one package can be set as boundary discharges or stages to be used in another package. Several modeling packages can be used in the same simulation depending upon the level of sophistication needed in the various reaches being modeled. This computer program is especially useful when any of the River, Stream, or MODBRANCH packages are used to model a river flowing directly into or out of wetlands in direct connection with the aquifer and represented in the model as an aquifer block. A field case study is shown to illustrate an application.
NASA Technical Reports Server (NTRS)
Willis, Jerry; Willis, Dee Anna; Walsh, Clare; Stephens, Elizabeth; Murphy, Timothy; Price, Jerry; Stevens, William; Jackson, Kevin; Villareal, James A.; Way, Bob
1994-01-01
An important part of NASA's mission involves the secondary application of its technologies in the public and private sectors. One current application under development is LiteraCity, a simulation-based instructional package for adults who do not have functional reading skills. Using fuzzy logic routines and other technologies developed by NASA's Information Systems Directorate, together with hypermedia sound, graphics, and animation technologies, the project attempts to overcome the limited impact of adult literacy assessment and instruction by involving the adult in an interactive simulation of real-life literacy activities. The project uses a recursive instructional development model and authentic instruction theory. This paper describes one component of a project to design, develop, and produce a series of computer-based, multimedia instructional packages. The packages are being developed for use in adult literacy programs, particularly in correctional education centers. They use the concepts of authentic instruction and authentic assessment to guide development. All the packages to be developed are instructional simulations. The first is a simulation of 'finding a friend a job.'
'spup' - an R package for uncertainty propagation in spatial environmental modelling
NASA Astrophysics Data System (ADS)
Sawicka, Kasia; Heuvelink, Gerard
2016-04-01
Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Advances in uncertainty propagation and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition as universally applicable, including for case studies with spatial models and spatial model inputs. Due to the growing popularity and applicability of the open-source R programming language, we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining uncertainty propagation from input data and model parameters, via the environmental model, onto model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo (MC) techniques, as well as several uncertainty visualization functions. Uncertain environmental variables are represented in the package as objects whose attribute values may be uncertain and described by probability distributions. Both numerical and categorical data types are handled. Spatial auto-correlation within an attribute and cross-correlation between attributes are also accommodated. For uncertainty propagation the package implements the MC approach with efficient sampling algorithms, i.e. stratified random sampling and Latin hypercube sampling. The design includes facilitation of parallel computing to speed up MC computation. The MC realizations may be used as input to environmental models called from R, or externally. Selected static and interactive visualization methods that are understandable by non-experts with limited background in statistics can be used to summarize and visualize uncertainty about the measured input, model parameters and output of the uncertainty propagation. We demonstrate that the 'spup' package is an effective and easy-to-apply tool that can be used in multi-disciplinary research and model-based decision support.
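The Monte Carlo propagation idea can be illustrated outside of R. The short Python/SciPy sketch below is not the spup API; it draws Latin hypercube samples for two uncertain inputs described by probability distributions, pushes them through a toy model standing in for the environmental model, and summarizes the uncertainty of the prediction. The variable names and distributions are invented for the example.

```python
# Hedged sketch: Monte Carlo uncertainty propagation with Latin hypercube sampling
# (toy model and distributions; not the spup package, which is written in R).
import numpy as np
from scipy.stats import norm, qmc

n_mc = 1000
sampler = qmc.LatinHypercube(d=2, seed=42)
u = sampler.random(n_mc)                             # samples in the unit square

rainfall = norm(loc=800, scale=120).ppf(u[:, 0])     # uncertain input 1, mm/yr
runoff_c = norm(loc=0.35, scale=0.05).ppf(u[:, 1])   # uncertain input 2, runoff coefficient

runoff = runoff_c * rainfall                         # the "environmental model"

lo95, hi95 = np.percentile(runoff, [2.5, 97.5])
print(f"runoff mean = {runoff.mean():.1f} mm/yr, 95% interval = [{lo95:.1f}, {hi95:.1f}] mm/yr")
```

Replacing the toy product with a call to an external model, and the independent draws with spatially correlated fields, is essentially what a spatial uncertainty-propagation tool has to add on top of this loop.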
Large space telescope, phase A. Volume 4: Scientific instrument package
NASA Technical Reports Server (NTRS)
1972-01-01
The design and characteristics of the scientific instrument package for the Large Space Telescope are discussed. The subjects include: (1) general scientific objectives, (2) package system analysis, (3) scientific instrumentation, (4) imaging photoelectric sensors, (5) environmental considerations, and (6) reliability and maintainability.
A permanent seismic station beneath the Ocean Bottom
NASA Astrophysics Data System (ADS)
Harris, David; Cessaro, Robert K.; Duennebier, Fred K.; Byrne, David A.
1987-03-01
The Hawaii Institute of Geophysics began development of the Ocean Subbottom Seismometer (OSS) system in 1978, and OSS systems were installed in four locations between 1979 and 1982. The OSS system is a permanent, deep ocean borehole seismic recording system composed of a borehole sensor package (tool), an electromechanical cable, a recorder package, and a recovery system. Installed near the bottom of a borehole (drilled by the D/V Glomar Challenger), the tool contains three orthogonal 4.5-Hz geophones, two orthogonal tilt meters, and a temperature sensor. Signals from these sensors are multiplexed, digitized (with a floating point technique), and telemetered through approximately 10 km of electromechanical cable to a recorder package located near the ocean bottom. Electrical power for the tool is supplied from the recorder package. The digital seismic signals are demultiplexed, converted back to analog form, processed through an automatic gain control (AGC) circuit, and recorded along with a time code on magnetic tape cassettes in the recorder package. Data may be recorded continuously for up to two months in the self-contained recorder package. Data may also be recorded in real time (digital format) during the installation and subsequent recorder package servicing. The recorder package is connected to a submerged recovery buoy by a length of buoyant polypropylene rope. The anchor on the recovery buoy is released by activating either of the acoustical command releases. The polypropylene rope may also be seized with a grappling hook to effect recovery. The recorder package may be repeatedly serviced as long as the tool remains functional. A wide range of data has been recovered from the OSS system. Recovered analog records include signals from natural seismic sources such as earthquakes (teleseismic and local), man-made seismic sources such as refraction seismic shooting (explosives and air cannons), and nuclear tests. Lengthy continuous recording has permitted analysis of wideband noise levels and of the slowly varying parameters, temperature and tilt.
MEMS packaging: state of the art and future trends
NASA Astrophysics Data System (ADS)
Bossche, Andre; Cotofana, Carmen V. B.; Mollinger, Jeff R.
1998-07-01
Now that the technology for integrated sensor and MEMS devices has become sufficiently mature to allow mass production, it is expected that the prices of bare chips will drop dramatically. This means that package prices will become a limiting factor in market penetration unless low-cost packaging solutions become available. This paper discusses the developments in packaging technology. Both single-chip and multi-chip packaging solutions are addressed. It starts with a discussion of the different requirements that have to be met, both from a device point of view (open access paths to the environment, vacuum cavities, etc.) and from an application point of view (e.g. environmental hostility). Subsequently, current technologies are judged on their applicability for MEMS and sensor packaging, and a forecast is given for future trends. It is expected that the large majority of sensing devices will be applied in relatively friendly environments for which plastic packages suffice. Therefore, in the short term an important role is foreseen for recently developed plastic packaging techniques such as precision molding and precision dispensing. Just as in standard electronic packaging, complete wafer-level packaging methods for sensing devices still have a long way to go before they can compete with the highly optimized and automated plastic packaging processes.
Using Cell-ID 1.4 with R for Microscope-Based Cytometry
Bush, Alan; Chernomoretz, Ariel; Yu, Richard; Gordon, Andrew
2012-01-01
This unit describes a method for quantifying various cellular features (e.g., volume, total and subcellular fluorescence localization) from sets of microscope images of individual cells. It includes procedures for tracking cells over time. One purposefully defocused transmission image (sometimes referred to as bright-field or BF) is acquired to segment the image and locate each cell. Fluorescent images (one for each of the color channels to be analyzed) are then acquired by conventional wide-field epifluorescence or confocal microscopy. This method uses the image processing capabilities of Cell-ID (Gordon et al., 2007, as updated here) and data analysis by the statistical programming framework R (R-Development-Team, 2008), which we have supplemented with a package of routines for analyzing Cell-ID output. Both Cell-ID and the analysis package are open-source. PMID:23026908
Preliminary Shielding Analysis for HCCB TBM Transport
NASA Astrophysics Data System (ADS)
Miao, Peng; Zhao, Fengchao; Cao, Qixiang; Zhang, Guoshu; Feng, Kaiming
2015-09-01
A preliminary shielding analysis of the transport of the Chinese helium cooled ceramic breeder test blanket module (HCCB TBM) from France back to China after being irradiated in ITER is presented in this contribution. Emphasis was placed on irradiation safety during transport. The dose rate calculated by MCNP/4C for the conceptual package design satisfies the relevant IAEA limit, which requires that the dose rate 3 m from the surface of a package containing low specific activity (LSA-III) material be less than 10 mSv/h. The variation of dose rates with location and their time evolution after shutdown have also been studied. This will be helpful for devising the detailed plan for transporting the HCCB TBM back to China in the near future. Supported by the Major State Basic Research Development Program of China (973 Program) (No. 2013GB108000).
Three-dimensional reconstruction for coherent diffraction patterns obtained by XFEL.
Nakano, Miki; Miyashita, Osamu; Jonic, Slavica; Song, Changyong; Nam, Daewoong; Joti, Yasumasa; Tama, Florence
2017-07-01
The three-dimensional (3D) structural analysis of single particles using an X-ray free-electron laser (XFEL) is a new structural biology technique that enables observations of molecules that are difficult to crystallize, such as flexible biomolecular complexes and living tissue, in a state close to physiological conditions. In order to restore the 3D structure from the diffraction patterns obtained by the XFEL, computational algorithms are necessary, as the orientation of the incident beam with respect to the sample needs to be estimated. A program package for XFEL single-particle analysis based on the Xmipp software package, which is commonly used for image processing in 3D cryo-electron microscopy, has been developed. The reconstruction program has been tested using diffraction patterns of an aerosol nanoparticle obtained by tomographic coherent X-ray diffraction microscopy.
The Analysis of the Regression-Discontinuity Design in R
ERIC Educational Resources Information Center
Thoemmes, Felix; Liao, Wang; Jin, Ze
2017-01-01
This article describes the analysis of regression-discontinuity designs (RDDs) using the R packages rdd, rdrobust, and rddtools. We discuss similarities and differences between these packages and provide directions on how to use them effectively. We use real data from the Carolina Abecedarian Project to show how an analysis of an RDD can be…
Lutz, Sharon M; Thwing, Annie; Schmiege, Sarah; Kroehl, Miranda; Baker, Christopher D; Starling, Anne P; Hokanson, John E; Ghosh, Debashis
2017-07-19
In mediation analysis, if unmeasured confounding is present, the direct and mediated effects may be over- or underestimated. Most methods for the sensitivity analysis of unmeasured confounding in mediation have focused on the mediator-outcome relationship. The Umediation R package enables the user to simulate unmeasured confounding of the exposure-mediator, exposure-outcome, and mediator-outcome relationships in order to see how the results of the mediation analysis would change in the presence of unmeasured confounding. We apply the Umediation package to the Genetic Epidemiology of Chronic Obstructive Pulmonary Disease (COPDGene) study to examine the role of unmeasured confounding due to population stratification on the effect of a single nucleotide polymorphism (SNP) in the CHRNA5/3/B4 locus on pulmonary function decline as mediated by cigarette smoking. Umediation is a flexible R package that examines the role of unmeasured confounding in mediation analysis, allowing for normally distributed or Bernoulli distributed exposures, outcomes, mediators, measured confounders, and unmeasured confounders. Umediation also accommodates multiple measured confounders and multiple unmeasured confounders, and allows for a mediator-exposure interaction on the outcome. Umediation is available as an R package at https://github.com/SharonLutz/Umediation. A tutorial on how to install and use the Umediation package is available in Additional file 1.
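The simulation strategy described above can be sketched in a few lines. The Python example below is a hedged illustration of the idea, not the Umediation API: it generates data with an unmeasured confounder of the mediator-outcome relationship and compares the product-of-coefficients mediated effect estimated with and without adjustment for that confounder. All coefficients and variable names are invented for the example.

```python
# Hedged sketch: bias in the mediated effect under unmeasured mediator-outcome
# confounding (plain NumPy; illustrative coefficients only).
import numpy as np

rng = np.random.default_rng(1)
n = 20_000
U = rng.normal(size=n)                        # unmeasured confounder
A = rng.binomial(1, 0.5, size=n)              # exposure (e.g., a SNP)
M = 0.5 * A + 0.8 * U + rng.normal(size=n)    # mediator (e.g., smoking)
Y = 0.3 * A + 0.4 * M + 0.8 * U + rng.normal(size=n)  # outcome

def ols(y, cols):
    """Least-squares coefficients for y ~ intercept + cols."""
    X = np.column_stack([np.ones(n)] + list(cols))
    return np.linalg.lstsq(X, y, rcond=None)[0]

a_naive, b_naive = ols(M, [A])[1], ols(Y, [A, M])[2]        # U ignored
a_adj,   b_adj   = ols(M, [A, U])[1], ols(Y, [A, M, U])[2]  # U adjusted

print(f"mediated effect ignoring U:  {a_naive * b_naive:.3f}")
print(f"mediated effect adjusting U: {a_adj * b_adj:.3f}  (true value 0.2)")
```

Repeating such simulations over a grid of confounder strengths, and for binary as well as continuous variables, is the kind of sensitivity analysis the package automates.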
Aung, Myo Nyein; Yuasa, Motoyuki; Lorga, Thaworn; Moolphate, Saiyud; Fukuda, Hiroshi; Kitajima, Tsutomu; Yokokawa, Hirohide; Minematsu, Kazuo; Tanimura, Susumu; Hiratsuka, Yoshimune; Ono, Koichi; Naunboonruang, Prissana; Thinuan, Payom; Kawai, Sachio; Suya, Yaoyanee; Chumvicharana, Somboon; Marui, Eiji
2013-12-05
Smoking cessation is a high-priority intervention to prevent CVD events and deaths in developing countries. While several interventions to stop smoking have proved successful, the question of how to increase their effectiveness and practicality in developing countries remains. In this study, a newly devised evidence-based smoking cessation service package will be compared with the existing service in a randomized controlled trial within the community setting of Thailand. This randomized controlled trial will recruit 440 current smokers at CVD risk because of being diabetic and/or hypertensive. Informed, consenting participants will be randomly allocated into the new service-package arm and the routine service arm. The study will take place in the non-communicable disease clinics of the Maetha District Hospital, Lampang, northern Thailand. The new smoking-cessation service package comprises (1) regular patient motivation and coaching from the same primary care nurse over a 3-month period; (2) monthly application of a piCO+ smokerlyzer to sustain the motivation of the smoker's quitting attempt and provide positive feedback over a 3-month period; (3) assistance by an assigned family member; and (4) nicotine replacement chewing gum to relieve withdrawal symptoms. This new service will be compared with the traditional routine service comprising the 5A approach in a 1-year follow-up. Participants who consent to participate in the study but refuse to attempt quitting smoking will be allocated to the non-randomized arm, where they will just be followed up and monitored. The primary outcome of the study is the smoking cessation rate at the 1-year follow-up, proven by breath analysis measuring carbon monoxide in parts per million in expired air. Secondary outcomes are the smoking cessation rate at the 6-month follow-up, blood pressure and heart rate, CVD risk according to the Framingham general cardiovascular risk score, CVD events and deaths at the 12-month follow-up, and the cost-effectiveness of the health service packages. Intention-to-treat analysis will be followed. Factors influencing smoking cessation will be analyzed using a structural equation model. This multicomponent intervention, accessible at primary healthcare clinics and focusing on the individual as well as the family and social environment, is unique and expected to work effectively. Current Controlled Trials ISRCTN89315117.
PWL 1.0 Personal WaveLab: an object-oriented workbench for seismogram analysis on Windows systems
NASA Astrophysics Data System (ADS)
Bono, Andrea; Badiali, Lucio
2005-02-01
Personal WaveLab 1.0 is intended as the starting point for the development, from scratch, of seismic time-series analysis procedures for Windows-based personal computers. Our objective is two-fold. Firstly, as a stand-alone application, it allows basic analysis of digital or digitised seismic waveforms. Secondly, thanks to its architectural characteristics, it can be the basis for the development of more complex and more powerful applications. An expanded version of PWL, called SisPick!, is currently in use at the Istituto Nazionale di Geofisica e Vulcanologia (Italian Institute of Geophysics and Volcanology) for real-time monitoring for civil protection purposes. This means that about 90 users tested the application for more than 1 year, making its features more robust and efficient. SisPick! was also employed in the United Nations Nyiragongo Project in Congo and during the Stromboli emergency in the summer of 2002. The main appeals of the application package are ease of use, object-oriented design, good computational speed, minimal disk space requirements and the complete absence of third-party components (including ActiveX). The Windows environment spares the user scripting or complex interaction with the system. The system is in constant development to answer the needs and suggestions of its users. Microsoft Visual Basic 6 source code, an installation package, test data sets and documentation are available at no cost.
Issues in Television-Centered Instruction.
ERIC Educational Resources Information Center
Richardson, Penelope L.
Current research on the adult learner and on instruction through media has grave flaws, and reviews of research in five areas are needed to assist instructional developers and adopters in making wise decisions. These include a critical analysis of existing telecourse packages, as well as reviews of research on the motivation of various subgroups…
A Study of Imputation Algorithms. Working Paper Series.
ERIC Educational Resources Information Center
Hu, Ming-xiu; Salvucci, Sameena
Many imputation techniques and imputation software packages have been developed over the years to deal with missing data. Different methods may work well under different circumstances, and it is advisable to conduct a sensitivity analysis when choosing an imputation method for a particular survey. This study reviewed about 30 imputation methods…
Project for Global Education: Annotated Bibliography.
ERIC Educational Resources Information Center
Institute for World Order, New York, NY.
Over 260 books, textbooks, articles, pamphlets, periodicals, films, and multi-media packages appropriate for the analysis of global issues at the college level are briefly annotated. Entries include classic books and articles as well as a number of recent (1976-1981) publications. The purpose is to assist students and educators in developing a…
USDA-ARS?s Scientific Manuscript database
Recently, a variant of stochastic dominance called stochastic efficiency with respect to a function (SERF) has been developed and applied. Unlike traditional stochastic dominance approaches, SERF uses the concept of certainty equivalents (CEs) to rank a set of risk-efficient alternatives instead of...
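The certainty-equivalent ranking that SERF relies on can be illustrated briefly. The Python sketch below uses simulated payoff distributions and a negative-exponential (constant absolute risk aversion) utility, one utility form commonly used with SERF; the alternatives, their distributions and the range of risk-aversion coefficients are all assumptions made for the example rather than anything from the manuscript.

```python
# Hedged sketch: ranking alternatives by certainty equivalent (CE) across a
# range of absolute risk aversion coefficients (illustrative data only).
import numpy as np

rng = np.random.default_rng(7)
alternatives = {
    "A": rng.normal(100, 10, 5000),   # simulated net returns
    "B": rng.normal(110, 30, 5000),
    "C": rng.normal(105, 15, 5000),
}

def certainty_equivalent(x, r):
    """CE under negative-exponential (CARA) utility; r -> 0 recovers the mean."""
    if abs(r) < 1e-9:
        return x.mean()
    return -np.log(np.mean(np.exp(-r * x))) / r

for r in (0.0, 0.01, 0.05, 0.10):
    ranking = sorted(alternatives, key=lambda k: -certainty_equivalent(alternatives[k], r))
    print(f"r = {r:4.2f}: ranking = {ranking}")
```

The ranking typically changes as the risk-aversion coefficient grows (the high-variance alternative slips down the list), which is exactly the kind of decision-maker-dependent ordering a SERF analysis reports.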
An R Package for Open, Reproducible Analysis of Urban Water Systems, With Application to Chicago
Urban water systems consist of natural and engineered flows of water interacting in complex ways. System complexity can be understood via mass conservative models that account for the interrelationships among all major flows and storages. We have developed a generic urban water s...
TCC: an R package for comparing tag count data with robust normalization strategies
2013-01-01
Background Differential expression analysis based on “next-generation” sequencing technologies is a fundamental means of studying RNA expression. We recently developed a multi-step normalization method (called TbT) for two-group RNA-seq data with replicates and demonstrated that the statistical methods available in four R packages (edgeR, DESeq, baySeq, and NBPSeq) together with TbT can produce a well-ranked gene list in which true differentially expressed genes (DEGs) are top-ranked and non-DEGs are bottom ranked. However, the advantages of the current TbT method come at the cost of a huge computation time. Moreover, the R packages did not have normalization methods based on such a multi-step strategy. Results TCC (an acronym for Tag Count Comparison) is an R package that provides a series of functions for differential expression analysis of tag count data. The package incorporates multi-step normalization methods, whose strategy is to remove potential DEGs before performing the data normalization. The normalization function based on this DEG elimination strategy (DEGES) includes (i) the original TbT method based on DEGES for two-group data with or without replicates, (ii) much faster methods for two-group data with or without replicates, and (iii) methods for multi-group comparison. TCC provides a simple unified interface to perform such analyses with combinations of functions provided by edgeR, DESeq, and baySeq. Additionally, a function for generating simulation data under various conditions and alternative DEGES procedures consisting of functions in the existing packages are provided. Bioinformatics scientists can use TCC to evaluate their methods, and biologists familiar with other R packages can easily learn what is done in TCC. Conclusion DEGES in TCC is essential for accurate normalization of tag count data, especially when up- and down-regulated DEGs in one of the samples are extremely biased in their number. TCC is useful for analyzing tag count data in various scenarios ranging from unbiased to extremely biased differential expression. TCC is available at http://www.iu.a.u-tokyo.ac.jp/~kadota/TCC/ and will appear in Bioconductor (http://bioconductor.org/) from ver. 2.13. PMID:23837715
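The DEG-elimination idea behind DEGES can be shown with a toy example. The Python sketch below is a conceptual illustration only, not the TCC API or its actual statistics: it computes naive library-size factors, flags putative DEGs with a crude fold-change filter, and recomputes the factors after removing them. The counts, bias levels and threshold are all invented for the example.

```python
# Hedged sketch of a DEG-elimination normalization step (illustrative data).
import numpy as np

rng = np.random.default_rng(3)
n_genes = 2000
counts = rng.negative_binomial(5, 0.1, size=(n_genes, 2)).astype(float)
counts[:200, 1] *= 6       # 10% of genes strongly up-regulated in sample 2
counts[:, 1] *= 1.4        # plus a genuine library-size difference

def size_factors(c):
    """Simple total-count factors, scaled to mean 1."""
    s = c.sum(axis=0)
    return s / s.mean()

f_naive = size_factors(counts)            # distorted by the biased DEGs
norm = counts / f_naive                   # temporary normalization
logfc = np.log2((norm[:, 1] + 1) / (norm[:, 0] + 1))
keep = np.abs(logfc) < 1.0                # crude putative-DEG filter
f_deges = size_factors(counts[keep])      # recomputed on non-DEG genes only

print(f"naive factors {f_naive.round(3)}, DEG-eliminated factors {f_deges.round(3)}")
```

Removing the biased genes before computing the factors is what keeps a strongly one-sided set of DEGs from dragging the normalization, which is the situation the abstract flags as the package's main strength.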
Sorensen, Asta V; Bernard, Shulamit L
2012-02-01
Learning (quality improvement) collaboratives are effective vehicles for driving coordinated organizational improvements. A central element of a learning collaborative is the change package-a catalogue of strategies, change concepts, and action steps that guide participants in their improvement efforts. Despite a vast literature describing learning collaboratives, little to no information is available on how the guiding strategies, change concepts, and action items are identified and developed to a replicable and actionable format that can be used to make measurable improvements within participating organizations. The process for developing the change package for the Health Resources and Services Administration's (HRSA) Patient Safety and Clinical Pharmacy Services Collaborative entailed environmental scan and identification of leading practices, case studies, interim debriefing meetings, data synthesis, and a technical expert panel meeting. Data synthesis involved end-of-day debriefings, systematic qualitative analyses, and the use of grounded theory and inductive data analysis techniques. This approach allowed systematic identification of innovative patient safety and clinical pharmacy practices that could be adopted in diverse environments. A case study approach enabled the research team to study practices in their natural environments. Use of grounded theory and inductive data analysis techniques enabled identification of strategies, change concepts, and actionable items that might not have been captured using different approaches. Use of systematic processes and qualitative methods in identification and translation of innovative practices can greatly accelerate the diffusion of innovations and practice improvements. This approach is effective whether or not an individual organization is part of a learning collaborative.
Delidding and resealing hybrid microelectronic packages
NASA Astrophysics Data System (ADS)
Luce, W. F.
1982-05-01
The objective of this single-phase MM and T contract was to develop the manufacturing technology necessary for the precision removal (delidding) and replacement (resealing) of covers on hermetically sealed hybrid microelectronic packages. The equipment and processes developed provide a rework technique which does not degrade the reliability of the package or the enclosed circuitry. A qualification test was conducted on 88 functional hybrid packages, with excellent results. A petition will be filed, accompanied by this report, requesting that MIL-M-38510 be amended to allow this rework method.
The challenges of packaging combination devices.
Mankel, George
2008-01-01
This article focuses on the development of a packaging format for drug eluting stents where the package not only has to meet the needs of the stent, but also the needs of the drug incorporated into its polymer coating. The package has to allow the transfer of ethylene oxide gas for sterilisation, but when in storage, must provide a barrier to keep out moisture and oxygen. A pouch and commercial scale manufacturing process were developed to incorporate this dual function into one item.
Research and Development of Fully Automatic Alien Smoke Stack and Packaging System
NASA Astrophysics Data System (ADS)
Yang, Xudong; Ge, Qingkuan; Peng, Tao; Zuo, Ping; Dong, Weifu
2017-12-01
To address the low efficiency of manual sorting and packaging at current tobacco distribution centers, a safe, efficient, fully automatic stacking and packaging system for irregularly shaped ("alien") cigarette cartons was developed. The fully automatic alien smoke stack and packaging system is built on PLC control technology, servo control technology, robot technology, image recognition technology and human-computer interaction technology. The characteristics, principles, control process and key technologies of the system are discussed in detail. Installation and commissioning showed that the fully automatic alien smoke stack and packaging system performs well and meets the requirements for shaped-cigarette packaging.
Contamination in food from packaging material.
Lau, O W; Wong, S K
2000-06-16
Packaging has become an indispensable element in the food manufacturing process, and different types of additives, such as antioxidants, stabilizers, lubricants, and anti-static and anti-blocking agents, have been developed to improve the performance of polymeric packaging materials. Recently, however, packaging has been found to be a source of contamination itself through the migration of substances from the packaging into food. Various analytical methods have been developed to analyze the migrants in foodstuffs, and migration evaluation procedures based on theoretical prediction of migration from plastic food contact materials were also introduced recently. In this paper, the regulatory control, analytical methodology, factors affecting migration and migration evaluation are reviewed.
Whiting, Stephen; Postma, Sjoerd; Jamshaid de Lorenzo, Ayesha; Aumua, Audrey
2016-01-01
The Solomon Islands Government is pursuing integrated care with the goal of improving the quality of health service delivery to rural populations. Under the auspices of Universal Health Coverage, integrated service delivery packages were developed which defined the clinical and public health services that should be provided at different levels of the health system. The process of developing integrated service delivery packages helped to identify key policy decisions the government needed to make in order to improve service quality and efficiency. The integrated service delivery packages have instigated the revision of job descriptions and are feeding into the development of a human resource plan for health. They are also being used to guide infrastructure development and health system planning and should lead to better management of resources. The integrated service delivery packages have become a key tool to operationalise the government’s policy to move towards a more efficient, equitable, quality and sustainable health system. PMID:28321177
Cuadros-Inostroza, Alvaro; Caldana, Camila; Redestig, Henning; Kusano, Miyako; Lisec, Jan; Peña-Cortés, Hugo; Willmitzer, Lothar; Hannah, Matthew A
2009-12-16
Metabolite profiling, the simultaneous quantification of multiple metabolites in an experiment, is becoming increasingly popular, particularly with the rise of systems-level biology. The workhorse in this field is gas-chromatography hyphenated with mass spectrometry (GC-MS). The high-throughput of this technology coupled with a demand for large experiments has led to data pre-processing, i.e. the quantification of metabolites across samples, becoming a major bottleneck. Existing software has several limitations, including restricted maximum sample size, systematic errors and low flexibility. However, the biggest limitation is that the resulting data usually require extensive hand-curation, which is subjective and can typically take several days to weeks. We introduce the TargetSearch package, an open source tool which is a flexible and accurate method for pre-processing even very large numbers of GC-MS samples within hours. We developed a novel strategy to iteratively correct and update retention time indices for searching and identifying metabolites. The package is written in the R programming language with computationally intensive functions written in C for speed and performance. The package includes a graphical user interface to allow easy use by those unfamiliar with R. TargetSearch allows fast and accurate data pre-processing for GC-MS experiments and overcomes the sample number limitations and manual curation requirements of existing software. We validate our method by carrying out an analysis against both a set of known chemical standard mixtures and of a biological experiment. In addition we demonstrate its capabilities and speed by comparing it with other GC-MS pre-processing tools. We believe this package will greatly ease current bottlenecks and facilitate the analysis of metabolic profiling data.
Zafeiraki, Effrosyni; Costopoulou, Danae; Vassiliadou, Irene; Bakeas, Evangelos; Leondiadis, Leondios
2014-01-01
Perfluorinated compounds (PFCs) are used in food packaging materials as coatings/additives for oil and moisture resistance. In the current study, foodstuff-packaging materials collected from the Greek market, made of paper, paperboard or aluminum foil, were analyzed for the determination of PFCs. For the analysis of the samples, pressurized liquid extraction (PLE), liquid chromatography–tandem mass spectrometry (LC–MS/MS) and an isotope dilution method were applied to develop a specific and sensitive method for the quantification of 12 PFCs: perfluorobutanoic acid (PFBA), perfluoropentanoic acid (PFPeA), perfluorohexanoic acid (PFHxA), perfluoroheptanoic acid (PFHpA), perfluorooctanoic acid (PFOA), perfluorononanoic acid (PFNA), perfluorodecanoic acid (PFDA), perfluoroundecanoic acid (PFUnDA), perfluorododecanoic acid (PFDoA), perfluorobutane sulfonate (PFBS), perfluorohexane sulfonate (PFHxS) and perfluorooctane sulfonate (PFOS), and for the qualitative detection of 5 more: perfluorotridecanoic acid (PFTrDA), perfluorotetradecanoic acid (PFTeDA), perfluorohexadecanoic acid (PFHxDA), perfluorooctadecanoic acid (PFODA) and perfluorodecane sulfonate (PFDS). No PFCs were quantified in aluminum foil wrappers, baking paper materials or beverage cups. PFTrDA, PFTeDA and PFHxDA were detected in fast food boxes. In the ice cream cup sample only PFHxA was found. On the other hand, several PFCs were quantified and detected in fast food wrappers, while the highest levels of PFCs were found in the microwave popcorn bag. PFOA and PFOS were not detected in any of the samples. Compared to studies from other countries, very low concentrations of PFCs were detected in the packaging materials analyzed. Our results suggest that no serious danger to consumers' health is likely to be associated with PFC contamination of the packaging materials used in Greece.
The Integration of an API619 Screw Compressor Package into the Industrial Internet of Things
NASA Astrophysics Data System (ADS)
Milligan, W. J.; Poli, G.; Harrison, D. K.
2017-08-01
The Industrial Internet of Things (IIoT) is the industrial subset of the Internet of Things (IoT). IIoT incorporates big data technology, harnessing the instrumentation data, machine-to-machine communication and automation technologies that have existed in industrial settings for years. As industry in general trends towards the IIoT, and as the screw compressor packages developed by Howden Compressors are designed with a minimum design life of 25 years, it is imperative this technology is embedded immediately. This paper provides the reader with a description of the Industrial Internet of Things before describing the scope of the problem for an organisation like Howden Compressors, which deploys multiple compressor technologies across multiple locations, and focuses on the critical measurements particular to high-specification screw compressor packages. A brief analysis of how this differs from high-volume package manufacturers deploying similar systems is offered. Then follows a description of how the measured information gets from the tip of the instrument in the process pipework or drive train through the different layers, with a description of each layer, into the final presentation layer. The functions available within the presentation layer are taken in turn and the benefits analysed, with specific focus on efficiency and availability. The paper concludes with how packagers adopting the IIoT can not only optimise their packages but, by utilising machine learning technology and pattern detection applications, can also adopt completely new business models.
Modular vaccine packaging increases packing efficiency
Norman, Bryan A.; Rajgopal, Jayant; Lim, Jung; Gorham, Katrin; Haidari, Leila; Brown, Shawn T.; Lee, Bruce Y.
2015-01-01
Background Within a typical vaccine supply chain, vaccines are packaged into individual cylindrical vials (each containing one or more doses) that are bundled together in rectangular “inner packs” for transport via even larger groupings such as cold boxes and vaccine carriers. The variability of vaccine inner pack and vial size may hinder efficient vaccine distribution because it constrains packing of cold boxes and vaccine carriers to quantities that are often inappropriate or suboptimal in the context of country-specific vaccination guidelines. Methods We developed in Microsoft Excel (Microsoft Corp., Redmond, WA) a spreadsheet model that evaluated the impact of different packing schemes for the Benin routine regimen plus the introduction of the Rotarix vaccine. Specifically, we used the model to compare the current packing scheme to that of a proposed modular packing scheme. Results Conventional packing of a Dometic RCW25 that aims to maximize fully-immunized children (FICs) results in 123 FICs and a packing efficiency of 81.93% compared to a maximum of 155 FICs and 94.1% efficiency for an alternative modular packaging system. Conclusions Our analysis suggests that modular packaging systems could offer significant advantages over conventional vaccine packaging systems with respect to space efficiency and potential FICs, when they are stored in standard vaccine carrying devices. This allows for more vaccines to be stored within the same volume while also simplifying the procedures used by field workers to pack storage devices. Ultimately, modular packaging systems could be a simple way to help increase vaccine coverage worldwide. PMID:25957666
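The core calculation behind such a packing comparison can be sketched simply. The Python example below uses hypothetical cold-box and inner-pack dimensions, not the Benin or Dometic RCW25 figures reported above: it counts how many inner packs fit in an axis-aligned grid and reports the fraction of box volume used, showing how a modular pack sized to tile the box raises packing efficiency.

```python
# Hedged sketch: grid packing of rectangular inner packs into a cold box
# (hypothetical dimensions in millimetres).
from math import floor, prod

def packs_fit(box_mm, pack_mm):
    """Packs per box and volume efficiency for axis-aligned grid packing."""
    n = prod(floor(b / p) for b, p in zip(box_mm, pack_mm))
    eff = n * prod(pack_mm) / prod(box_mm)
    return n, eff

cold_box     = (400, 300, 250)   # interior dimensions (assumed)
conventional = (95, 60, 40)      # an arbitrary inner-pack size (assumed)
modular      = (100, 75, 50)     # a modular pack sized to tile the box (assumed)

for name, pack in (("conventional", conventional), ("modular", modular)):
    n, eff = packs_fit(cold_box, pack)
    print(f"{name:12s}: {n:3d} packs, {eff * 100:5.1f}% of box volume used")
```

Multiplying the pack count by the doses per pack and dividing by the doses a fully-immunized child requires gives the kind of FIC figures the abstract compares.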