IQM: An Extensible and Portable Open Source Application for Image and Signal Analysis in Java
Kainz, Philipp; Mayrhofer-Reinhartshuber, Michael; Ahammer, Helmut
2015-01-01
Image and signal analysis applications play a substantial role in scientific research. Both open source and commercial packages provide a wide range of functions for image and signal analysis, which are sometimes supported very well by the communities in the corresponding fields. Commercial software packages have the major drawback of being expensive and having undisclosed source code, which hampers extending the functionality if there is no plugin interface or similar option available. However, both variants cannot cover all possible use cases and sometimes custom developments are unavoidable, requiring open source applications. In this paper we describe IQM, a completely free, portable and open source (GNU GPLv3) image and signal analysis application written in pure Java. IQM does not depend on any natively installed libraries and is therefore runnable out-of-the-box. Currently, a continuously growing repertoire of 50 image and 16 signal analysis algorithms is provided. The modular functional architecture based on the three-tier model is described along with the most important functionality. Extensibility is achieved using operator plugins, and the development of more complex workflows is supported by a Groovy scripting interface to the JVM. We demonstrate IQM's image and signal processing capabilities in a proof-of-principle analysis and provide example implementations to illustrate the plugin framework and the scripting interface. IQM integrates with the popular ImageJ image processing software and aims at complementing functionality rather than competing with existing open source software. Machine learning can also be integrated into more complex algorithms via the WEKA software package, enabling the development of transparent and robust methods for image and signal analysis. PMID:25612319
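The abstract describes extensibility through operator plugins but does not show the API. The sketch below illustrates the general operator-plugin idea in Java, IQM's implementation language; the interface and method names (ImageOperator, apply) are invented for illustration and do not reflect IQM's actual plugin API.

```java
// Hypothetical sketch of an IQM-style image operator plugin.
// The ImageOperator interface is invented for illustration;
// IQM's real plugin API differs (see its documentation).
import java.awt.image.BufferedImage;

interface ImageOperator {
    String name();
    BufferedImage apply(BufferedImage input);
}

public class InvertOperator implements ImageOperator {
    public String name() { return "Invert"; }

    // Invert each 8-bit RGB channel of the input image.
    public BufferedImage apply(BufferedImage input) {
        BufferedImage out = new BufferedImage(
                input.getWidth(), input.getHeight(), BufferedImage.TYPE_INT_RGB);
        for (int y = 0; y < input.getHeight(); y++) {
            for (int x = 0; x < input.getWidth(); x++) {
                int rgb = input.getRGB(x, y);
                out.setRGB(x, y, ~rgb & 0x00FFFFFF); // flip the R, G, B bits
            }
        }
        return out;
    }
}
```

An application core that knows only the interface can then discover and chain such operators, which is the essence of an operator-plugin architecture.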
ERIC Educational Resources Information Center
Thompson, Douglas E.
2013-01-01
In today's complex music software packages, many features can remain unexplored and unused. Software plug-ins--available in nearly every music software package, yet easily overlooked in the software's basic operations--are one such feature. In this article, I introduce readers to plug-ins and offer tips for purchasing plug-ins I have…
T-MATS Toolbox for the Modeling and Analysis of Thermodynamic Systems
NASA Technical Reports Server (NTRS)
Chapman, Jeffryes W.
2014-01-01
The Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS) is a MATLAB/Simulink (The MathWorks, Inc.) plug-in for creating and simulating thermodynamic systems and controls. The package contains generic parameterized components that can be combined with a variable input iterative solver and optimization algorithm to create complex system models, such as gas turbines.
AGScan: a pluggable microarray image quantification software based on the ImageJ library.
Cathelin, R; Lopez, F; Klopp, Ch
2007-01-15
Many different programs are available to analyze microarray images. Most are commercial packages; some are free. In the latter group, only a few offer automatic grid alignment and a batch mode. More often than not, a program implements only one quantification algorithm. AGScan is an open source program that works on all major platforms. It is based on the ImageJ library [Rasband (1997-2006)] and offers a plug-in extension system to add new functions to manipulate images, align grids and quantify spots. It is appropriate for daily laboratory use and also as a framework for new algorithms. The program is freely distributed under the X11 Licence. The install instructions can be found in the user manual. The software can be downloaded from http://mulcyber.toulouse.inra.fr/projects/agscan/. Questions and plug-ins can be sent to the contact listed below.
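For readers unfamiliar with how ImageJ-library-based tools expose processing steps, a minimal ImageJ 1.x plugin looks like the sketch below. This uses the real ij.plugin.filter.PlugInFilter API; it only illustrates the extension mechanism and is unrelated to AGScan's own plug-in interface.

```java
// Minimal ImageJ plugin sketch (real ImageJ 1.x API).
// The underscore in the class name marks it as a plugin to ImageJ.
import ij.ImagePlus;
import ij.plugin.filter.PlugInFilter;
import ij.process.ImageProcessor;

public class Smooth_Demo implements PlugInFilter {
    public int setup(String arg, ImagePlus imp) {
        return DOES_8G; // accept 8-bit grayscale images only
    }

    public void run(ImageProcessor ip) {
        ip.smooth(); // simple 3x3 mean filter as a placeholder operation
    }
}
```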
Xi-cam: Flexible High Throughput Data Processing for GISAXS
NASA Astrophysics Data System (ADS)
Pandolfi, Ronald; Kumar, Dinesh; Venkatakrishnan, Singanallur; Sarje, Abinav; Krishnan, Hari; Pellouchoud, Lenson; Ren, Fang; Fournier, Amanda; Jiang, Zhang; Tassone, Christopher; Mehta, Apurva; Sethian, James; Hexemer, Alexander
With increasing capabilities and data demand for GISAXS beamlines, supporting software is under development to handle larger data rates, volumes, and processing needs. We aim to provide a flexible and extensible approach to GISAXS data treatment as a solution to these rising needs. Xi-cam is the CAMERA platform for data management, analysis, and visualization. The core of Xi-cam is an extensible plugin-based GUI platform which provides users an interactive interface to processing algorithms. Plugins are available for SAXS/GISAXS data and data series visualization, as well as forward modeling and simulation through HipGISAXS. With Xi-cam's advanced mode, data processing steps are designed as a graph-based workflow, which can be executed locally or remotely. Remote execution utilizes HPC or de-localized resources, allowing for effective reduction of high-throughput data. Xi-cam is open-source and cross-platform. The processing algorithms in Xi-cam include parallel CPU and GPU processing optimizations, also taking advantage of external processing packages such as pyFAI. Xi-cam is available for download online.
Bioclipse: an open source workbench for chemo- and bioinformatics.
Spjuth, Ola; Helmus, Tobias; Willighagen, Egon L; Kuhn, Stefan; Eklund, Martin; Wagener, Johannes; Murray-Rust, Peter; Steinbeck, Christoph; Wikberg, Jarl E S
2007-02-22
There is a need for software applications that provide users with a complete and extensible toolkit for chemo- and bioinformatics accessible from a single workbench. Commercial packages are expensive and closed source, hence they do not allow end users to modify algorithms and add custom functionality. Existing open source projects are more focused on providing a framework for integrating existing, separately installed bioinformatics packages, rather than providing user-friendly interfaces. No open source chemoinformatics workbench has previously been published, and no successful attempts have been made to integrate chemo- and bioinformatics into a single framework. Bioclipse is an advanced workbench for resources in chemo- and bioinformatics, such as molecules, proteins, sequences, spectra, and scripts. It provides 2D-editing, 3D-visualization, file format conversion, calculation of chemical properties, and much more; all fully integrated into a user-friendly desktop application. Editing supports standard functions such as cut and paste, drag and drop, and undo/redo. Bioclipse is written in Java and based on the Eclipse Rich Client Platform with a state-of-the-art plugin architecture. This gives Bioclipse an advantage over other systems as it can easily be extended with functionality in any desired direction. Bioclipse is a powerful workbench for bio- and chemoinformatics as well as an advanced integration platform. The rich functionality, intuitive user interface, and powerful plugin architecture make Bioclipse the most advanced and user-friendly open source workbench for chemo- and bioinformatics. Bioclipse is released under Eclipse Public License (EPL), an open source license which sets no constraints on external plugin licensing; it is totally open for both open source plugins as well as commercial ones. Bioclipse is freely available at http://www.bioclipse.net.
DOE Office of Scientific and Technical Information (OSTI.GOV)
SADE is a software package for rapidly assembling analytic pipelines to manipulate data. The package consists of the engine that manages the data and coordinates the movement of data between the tasks performing a function; a set of core libraries consisting of plugins that perform common tasks; and a framework to extend the system, supporting the development of new plugins. Currently, through configuration files, a pipeline can be defined that maps the routing of data through a series of plugins. Pipelines can be run in a batch mode or can process streaming data; they can be executed from the command line or run through a Windows background service. There currently exist over a hundred plugins and over fifty pipeline configurations, and the software is now being used by about a half-dozen projects.
Beyond filtered backprojection: A reconstruction software package for ion beam microtomography data
NASA Astrophysics Data System (ADS)
Habchi, C.; Gordillo, N.; Bourret, S.; Barberet, Ph.; Jovet, C.; Moretto, Ph.; Seznec, H.
2013-01-01
A new version of the TomoRebuild data reduction software package is presented, for the reconstruction of scanning transmission ion microscopy tomography (STIMT) and particle induced X-ray emission tomography (PIXET) images. First, we present a state of the art of the reconstruction codes available for ion beam microtomography. The algorithm proposed here brings several advantages. It is a portable, multi-platform code, designed in C++ with well-separated classes for easier use and evolution. Data reduction is separated into different steps and the intermediate results may be checked if necessary. Although no additional graphic library or numerical tool is required to run the program as a command line, a user-friendly interface was designed in Java, as an ImageJ plugin. All experimental and reconstruction parameters may be entered either through this plugin or directly in text format files. A simple standard format is proposed for the input of experimental data. Optional graphic applications using the ROOT interface may be used separately to display and fit energy spectra. Regarding the reconstruction process, the filtered backprojection (FBP) algorithm, already present in the previous version of the code, was optimized so that it is now about 10 times faster. In addition, Maximum Likelihood Expectation Maximization (MLEM) and its accelerated version Ordered Subsets Expectation Maximization (OSEM) algorithms were implemented. A detailed user guide in English is available. A reconstruction example of experimental data from a biological sample is given. It shows the capability of the code to reduce noise in the sinograms and to deal with incomplete data, which puts a new perspective on tomography using a low number of projections or a limited angle.
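For readers unfamiliar with the reconstruction methods named above, the standard MLEM update (of which OSEM is the ordered-subsets acceleration) has the following textbook form; the notation here is generic, not necessarily that of the TomoRebuild code:

```latex
\lambda_j^{(k+1)} \;=\; \frac{\lambda_j^{(k)}}{\sum_i a_{ij}}
\sum_i a_{ij}\,\frac{p_i}{\sum_{j'} a_{ij'}\,\lambda_{j'}^{(k)}}
```

where $\lambda_j^{(k)}$ is the current estimate of voxel $j$, $p_i$ the measured value of projection bin $i$, and $a_{ij}$ the system-matrix contribution of voxel $j$ to bin $i$. OSEM applies this update over successive subsets of the projections, which accelerates convergence at some cost in theoretical guarantees.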
Volumetric neuroimage analysis extensions for the MIPAV software package.
Bazin, Pierre-Louis; Cuzzocreo, Jennifer L; Yassa, Michael A; Gandler, William; McAuliffe, Matthew J; Bassett, Susan S; Pham, Dzung L
2007-09-15
We describe a new collection of publicly available software tools for performing quantitative neuroimage analysis. The tools perform semi-automatic brain extraction, tissue classification, Talairach alignment, and atlas-based measurements within a user-friendly graphical environment. They are implemented as plug-ins for MIPAV, a freely available medical image processing software package from the National Institutes of Health. Because the plug-ins and MIPAV are implemented in Java, both can be utilized on nearly any operating system platform. In addition to the software plug-ins, we have also released a digital version of the Talairach atlas that can be used to perform regional volumetric analyses. Several studies are conducted applying the new tools to simulated and real neuroimaging data sets.
Castaño-Díez, Daniel; Kudryashev, Mikhail; Arheit, Marcel; Stahlberg, Henning
2012-05-01
Dynamo is a new software package for subtomogram averaging of cryo-electron tomography (cryo-ET) data with three main goals: first, Dynamo allows user-transparent adaptation to a variety of high-performance computing platforms such as GPUs or CPU clusters. Second, Dynamo implements user-friendliness through GUI interfaces and scripting resources. Third, Dynamo offers user-flexibility through a plugin API. Besides the alignment and averaging procedures, Dynamo includes native tools for visualization and analysis of results and data, as well as support for third party visualization software, such as UCSF Chimera or EMAN2. As a demonstration of these functionalities, we studied bacterial flagellar motors and showed automatically detected classes with absent and present C-rings. Subtomogram averaging is a common task in current cryo-ET pipelines, which requires extensive computational resources and follows a well-established workflow. However, due to the data diversity, many existing packages offer slight variations of the same algorithm to improve results. One of the main purposes behind Dynamo is to provide explicit tools to allow the user the insertion of custom designed procedures - or plugins - to replace or complement the native algorithms in the different steps of the processing pipeline for subtomogram averaging without the burden of handling parallelization. Custom scripts that implement new approaches devised by the user are integrated into the Dynamo data management system, so that they can be controlled by the GUI or the scripting capabilities. Dynamo executables do not require licenses for third party commercial software. Sources, executables and documentation are freely distributed on http://www.dynamo-em.org. Copyright © 2012 Elsevier Inc. All rights reserved.
SASfit: a tool for small-angle scattering data analysis using a library of analytical expressions.
Breßler, Ingo; Kohlbrecher, Joachim; Thünemann, Andreas F
2015-10-01
SASfit is one of the mature programs for small-angle scattering data analysis and has been available for many years. This article describes the basic data processing and analysis workflow along with recent developments in the SASfit program package (version 0.94.6). They include (i) advanced algorithms for reduction of oversampled data sets, (ii) improved confidence assessment in the optimized model parameters and (iii) a flexible plug-in system for custom user-provided models. A scattering function of a mass fractal model of branched polymers in solution is provided as an example for implementing a plug-in. The new SASfit release is available for major platforms such as Windows, Linux and MacOS. To facilitate usage, it includes comprehensive indexed documentation as well as a web-based wiki for peer collaboration and online videos demonstrating basic usage. The use of SASfit is illustrated by interpretation of the small-angle X-ray scattering curves of monomodal gold nanoparticles (NIST reference material 8011) and bimodal silica nanoparticles (EU reference material ERM-FD-102).
Parks, Donovan H.; Mankowski, Timothy; Zangooei, Somayyeh; Porter, Michael S.; Armanini, David G.; Baird, Donald J.; Langille, Morgan G. I.; Beiko, Robert G.
2013-01-01
GenGIS is free and open source software designed to integrate biodiversity data with a digital map and information about geography and habitat. While originally developed with microbial community analyses and phylogeography in mind, GenGIS has been applied to a wide range of datasets. A key feature of GenGIS is the ability to test geographic axes that can correspond to routes of migration or gradients that influence community similarity. Here we introduce GenGIS version 2, which extends the linear gradient tests introduced in the first version to allow comprehensive testing of all possible linear geographic axes. GenGIS v2 also includes a new plugin framework that supports the development and use of graphically driven analysis packages: initial plugins include implementations of linear regression and the Mantel test, calculations of alpha-diversity (e.g., Shannon Index) for all samples, and geographic visualizations of dissimilarity matrices. We have also implemented a recently published method for biomonitoring reference condition analysis (RCA), which compares observed species richness and diversity to predicted values to determine whether a given site has been impacted. The newest version of GenGIS supports vector data in addition to raster files. We demonstrate the new features of GenGIS by performing a full gradient analysis of an Australian kangaroo apple data set, by using plugins and embedded statistical commands to analyze human microbiome sample data, and by applying RCA to a set of samples from Atlantic Canada. GenGIS release versions, tutorials and documentation are freely available at http://kiwi.cs.dal.ca/GenGIS, and source code is available at https://github.com/beiko-lab/gengis. PMID:23922841
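GenGIS v2 ships a Mantel-test plugin among its graphically driven analysis packages. The self-contained Java sketch below shows the permutation logic such a test typically implements; it is an illustration of the technique only, not the GenGIS source (which is not written in Java).

```java
import java.util.Random;

// Illustrative Mantel test: permutation p-value for the correlation
// between two n-by-n distance matrices A and B.
public class MantelTest {
    public static double pValue(double[][] a, double[][] b, int nPerm, long seed) {
        int n = a.length;
        double observed = correlate(a, b, identity(n));
        Random rng = new Random(seed);
        int[] perm = identity(n);
        int atLeast = 0;
        for (int t = 0; t < nPerm; t++) {
            shuffle(perm, rng);
            if (correlate(a, b, perm) >= observed) atLeast++;
        }
        return (atLeast + 1.0) / (nPerm + 1.0);
    }

    // Pearson correlation of upper-triangle entries, with B's rows/cols permuted.
    private static double correlate(double[][] a, double[][] b, int[] p) {
        double sx = 0, sy = 0, sxx = 0, syy = 0, sxy = 0;
        int m = 0;
        for (int i = 0; i < a.length; i++)
            for (int j = i + 1; j < a.length; j++) {
                double x = a[i][j], y = b[p[i]][p[j]];
                sx += x; sy += y; sxx += x * x; syy += y * y; sxy += x * y; m++;
            }
        double cov = sxy - sx * sy / m;
        return cov / Math.sqrt((sxx - sx * sx / m) * (syy - sy * sy / m));
    }

    private static int[] identity(int n) {
        int[] p = new int[n];
        for (int i = 0; i < n; i++) p[i] = i;
        return p;
    }

    private static void shuffle(int[] p, Random rng) { // Fisher-Yates
        for (int i = p.length - 1; i > 0; i--) {
            int j = rng.nextInt(i + 1);
            int tmp = p[i]; p[i] = p[j]; p[j] = tmp;
        }
    }
}
```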
Integrated communications and optical navigation system
NASA Astrophysics Data System (ADS)
Mueller, J.; Pajer, G.; Paluszek, M.
2013-12-01
The Integrated Communications and Optical Navigation System (ICONS) is a flexible navigation system for spacecraft that does not require global positioning system (GPS) measurements. The navigation solution is computed using an Unscented Kalman Filter (UKF) that can accept any combination of range, range-rate, planet chord width, landmark, and angle measurements using any celestial object. Both absolute and relative orbit determination are supported. The UKF employs a full nonlinear dynamical model of the orbit including gravity models and disturbance models. The ICONS package also includes attitude determination algorithms using the UKF algorithm with the Inertial Measurement Unit (IMU). The IMU is used as the dynamical base for the attitude determination algorithms. This makes the sensor a more capable plug-in replacement for a star tracker, thus reducing the integration and test cost of adding this sensor to a spacecraft. Recent additions include an integrated optical communications system, which adds communications as well as integrated range and range-rate measurement and timing. The paper includes test results from trajectories based on the NASA New Horizons spacecraft.
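For reference, the unscented transform at the heart of a UKF propagates the state mean $\hat{x}$ and covariance $P$ through $2n+1$ sigma points; the standard parameterization is shown below and may differ in detail from the one used in ICONS:

```latex
\chi_0 = \hat{x},\qquad
\chi_i = \hat{x} + \big(\sqrt{(n+\lambda)P}\big)_i,\qquad
\chi_{n+i} = \hat{x} - \big(\sqrt{(n+\lambda)P}\big)_i,\quad i = 1,\dots,n
```

with mean weights $W_0 = \lambda/(n+\lambda)$ and $W_i = 1/\big(2(n+\lambda)\big)$ for the remaining points. Each sigma point is pushed through the full nonlinear orbit dynamics, and the transformed set is re-averaged into the posterior mean and covariance, which is what lets the filter use the nonlinear gravity and disturbance models directly.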
A precise goniometer/tensiometer using a low cost single-board computer
NASA Astrophysics Data System (ADS)
Favier, Benoit; Chamakos, Nikolaos T.; Papathanasiou, Athanasios G.
2017-12-01
Measuring the surface tension and the Young contact angle of a droplet is extremely important for many industrial applications. Here, considering the booming interest in small, cheap, yet precise experimental instruments, we have constructed a low-cost contact angle goniometer/tensiometer based on a single-board computer (Raspberry Pi). The device runs an axisymmetric drop shape analysis (ADSA) algorithm written in Python. The code, here named DropToolKit, was developed in-house. We initially present the mathematical framework of our algorithm and then validate our software tool against other well-established ADSA packages, including the commercial ramé-hart DROPimage Advanced as well as the DropAnalysis plugin in ImageJ. After successful tests with various combinations of liquids and solid surfaces, we conclude that our prototype device is a highly beneficial alternative to commercial solutions for industrial applications as well as for scientific research in wetting phenomena.
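Axisymmetric drop shape analysis of this kind fits the observed drop profile to the Young-Laplace equation, which relates the pressure jump across the interface to its curvature (standard ADSA formulation, shown here for context):

```latex
\Delta P \;=\; \gamma\left(\frac{1}{R_1}+\frac{1}{R_2}\right)
```

where $\gamma$ is the surface tension and $R_1$, $R_2$ are the principal radii of curvature of the drop surface. The surface tension is the fitted parameter, and the contact angle is read off where the fitted profile meets the solid.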
A New Network Modeling Tool for the Ground-based Nuclear Explosion Monitoring Community
NASA Astrophysics Data System (ADS)
Merchant, B. J.; Chael, E. P.; Young, C. J.
2013-12-01
Network simulations have long been used to assess the performance of monitoring networks to detect events, for such purposes as planning station deployments and evaluating network resilience to outages. The standard tool has been the SAIC-developed NetSim package. With correct parameters, NetSim can produce useful simulations; however, the package has several shortcomings: an older language (FORTRAN), an emphasis on seismic monitoring with limited support for other technologies, limited documentation, and a limited parameter set. Thus, we are developing NetMOD (Network Monitoring for Optimal Detection), a Java-based tool designed to assess the performance of ground-based networks. NetMOD's advantages include: a modern, multi-platform language; use of modern computing capabilities (e.g., multi-core processors); support for monitoring technologies other than seismic; and a well-validated default parameter set for the IMS stations. NetMOD is designed to be extendable through a plugin infrastructure, so new phenomenological models can be added. Development of the Seismic Detection Plugin is being pursued first. Seismic location and infrasound and hydroacoustic detection plugins will follow. By making NetMOD an open-release package, it can hopefully provide a common tool that the monitoring community can use to produce assessments of monitoring networks and to verify assessments made by others.
NASA Astrophysics Data System (ADS)
Mozaffari, Ahmad; Vajedi, Mahyar; Chehresaz, Maryyeh; Azad, Nasser L.
2016-03-01
The urgent need to meet increasingly tight environmental regulations and new fuel economy requirements has motivated system science researchers and automotive engineers to take advantage of emerging computational techniques to further advance hybrid electric vehicle and plug-in hybrid electric vehicle (PHEV) designs. In particular, research has focused on vehicle powertrain system design optimization, to reduce the fuel consumption and total energy cost while improving the vehicle's driving performance. In this work, two different natural optimization machines, namely the synchronous self-learning Pareto strategy and the elitist non-dominated sorting genetic algorithm, are implemented for component sizing of a specific power-split PHEV platform with a Toyota plug-in Prius as the baseline vehicle. To do this, a high-fidelity model of the Toyota plug-in Prius is employed for the numerical experiments using the Autonomie simulation software. Based on the simulation results, it is demonstrated that Pareto-based algorithms can successfully optimize the design parameters of the vehicle powertrain.
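Both optimizers named above are Pareto-based: candidate powertrain designs are ranked by dominance rather than by a single scalar cost. The minimal Java sketch below shows the dominance test and non-dominated filtering at the heart of such algorithms; it illustrates the concept only and is not the sizing code used in the study.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative Pareto filter for a minimization problem:
// keeps only the non-dominated objective vectors.
public class ParetoFilter {
    // true if x dominates y: no worse in every objective, strictly better in one.
    static boolean dominates(double[] x, double[] y) {
        boolean strictlyBetter = false;
        for (int k = 0; k < x.length; k++) {
            if (x[k] > y[k]) return false;
            if (x[k] < y[k]) strictlyBetter = true;
        }
        return strictlyBetter;
    }

    static List<double[]> nonDominated(List<double[]> population) {
        List<double[]> front = new ArrayList<>();
        for (double[] candidate : population) {
            boolean dominated = false;
            for (double[] other : population)
                if (other != candidate && dominates(other, candidate)) {
                    dominated = true;
                    break;
                }
            if (!dominated) front.add(candidate);
        }
        return front;
    }
}
```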
Integrating segmentation methods from the Insight Toolkit into a visualization application.
Martin, Ken; Ibáñez, Luis; Avila, Lisa; Barré, Sébastien; Kaspersen, Jon H
2005-12-01
The Insight Toolkit (ITK) initiative from the National Library of Medicine has provided a suite of state-of-the-art segmentation and registration algorithms ideally suited to volume visualization and analysis. A volume visualization application that effectively utilizes these algorithms provides many benefits: it allows access to ITK functionality for non-programmers, it creates a vehicle for sharing and comparing segmentation techniques, and it serves as a visual debugger for algorithm developers. This paper describes the integration of image processing functionalities provided by ITK into VolView, a visualization application for high-performance volume rendering. A free version of this visualization application is publicly available and is included in the online version of this paper. The process for developing ITK plugins for VolView according to the publicly available API is described in detail, and an application of ITK VolView plugins to the segmentation of Abdominal Aortic Aneurysms (AAAs) is presented. The source code of the ITK plugins is also publicly available and is included in the online version.
Avogadro: an advanced semantic chemical editor, visualization, and analysis platform.
Hanwell, Marcus D; Curtis, Donald E; Lonie, David C; Vandermeersch, Tim; Zurek, Eva; Hutchison, Geoffrey R
2012-08-13
The Avogadro project has developed an advanced molecule editor and visualizer designed for cross-platform use in computational chemistry, molecular modeling, bioinformatics, materials science, and related areas. It offers flexible, high quality rendering, and a powerful plugin architecture. Typical uses include building molecular structures, formatting input files, and analyzing output of a wide variety of computational chemistry packages. By using the CML file format as its native document type, Avogadro seeks to enhance the semantic accessibility of chemical data types. The work presented here details the Avogadro library, which is a framework providing a code library and application programming interface (API) with three-dimensional visualization capabilities; and has direct applications to research and education in the fields of chemistry, physics, materials science, and biology. The Avogadro application provides a rich graphical interface using dynamically loaded plugins through the library itself. The application and library can each be extended by implementing a plugin module in C++ or Python to explore different visualization techniques, build/manipulate molecular structures, and interact with other programs. We describe some example extensions, one which uses a genetic algorithm to find stable crystal structures, and one which interfaces with the PackMol program to create packed, solvated structures for molecular dynamics simulations. The 1.0 release series of Avogadro is the main focus of the results discussed here. Avogadro offers a semantic chemical builder and platform for visualization and analysis. For users, it offers an easy-to-use builder, integrated support for downloading from common databases such as PubChem and the Protein Data Bank, extracting chemical data from a wide variety of formats, including computational chemistry output, and native, semantic support for the CML file format. For developers, it can be easily extended via a powerful plugin mechanism to support new features in organic chemistry, inorganic complexes, drug design, materials, biomolecules, and simulations. Avogadro is freely available under an open-source license from http://avogadro.openmolecules.net.
CytoCluster: A Cytoscape Plugin for Cluster Analysis and Visualization of Biological Networks.
Li, Min; Li, Dongyan; Tang, Yu; Wu, Fangxiang; Wang, Jianxin
2017-08-31
Nowadays, cluster analysis of biological networks has become one of the most important approaches to identifying functional modules as well as predicting protein complexes and network biomarkers. Furthermore, the visualization of clustering results is crucial to display the structure of biological networks. Here we present CytoCluster, a Cytoscape plugin integrating six clustering algorithms, HC-PIN (Hierarchical Clustering algorithm in Protein Interaction Networks), OH-PIN (identifying Overlapping and Hierarchical modules in Protein Interaction Networks), IPCA (Identifying Protein Complex Algorithm), ClusterONE (Clustering with Overlapping Neighborhood Expansion), DCU (Detecting Complexes based on Uncertain graph model), IPC-MCE (Identifying Protein Complexes based on Maximal Complex Extension), and the BinGO (Biological networks Gene Ontology) function. Users can select different clustering algorithms according to their requirements. The main function of these six clustering algorithms is to detect protein complexes or functional modules. In addition, BinGO is used to determine which Gene Ontology (GO) categories are statistically overrepresented in a set of genes or a subgraph of a biological network. CytoCluster can be easily expanded, so that more clustering algorithms and functions can be added to this plugin. Since it was created in July 2013, CytoCluster has been downloaded more than 9700 times in the Cytoscape App store and has already been applied to the analysis of different biological networks. CytoCluster is available from http://apps.cytoscape.org/apps/cytocluster.
Optimal control of a repowered vehicle: Plug-in fuel cell against plug-in hybrid electric powertrain
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tribioli, L., E-mail: laura.tribioli@unicusano.it; Cozzolino, R.; Barbieri, M.
2015-03-10
This paper describes two different powertrain configurations for the repowering of a conventional vehicle equipped with an internal combustion engine (ICE). A model of a mid-sized ICE vehicle is realized and then modified to model both a parallel plug-in hybrid electric powertrain and a proton exchange membrane (PEM) fuel cell (FC) hybrid powertrain. The vehicle behavior under the application of an optimal control algorithm for energy management is analyzed for the different scenarios, and the results are compared.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
This fact sheet highlights the Toyota Prius plug-in HEV, a plug-in hybrid electric car in the advanced technology vehicle fleet at the National Renewable Energy Laboratory (NREL). In partnership with the University of Colorado, NREL uses the vehicle for grid-integration studies and for testing new hardware and charge-management algorithms. NREL's advanced technology vehicle fleet features promising technologies to increase efficiency and reduce emissions without sacrificing safety or comfort. The fleet serves as a technology showcase, helping visitors learn about innovative vehicles that are available today or are in development. Vehicles in the fleet are representative of current, advanced, prototype, and emerging technologies.
Application Programming in AWIPS II
NASA Technical Reports Server (NTRS)
Smit, Matt; McGrath, Kevin; Burks, Jason; Carcione, Brian
2012-01-01
Since its inception almost 8 years ago, NASA's Short-term Prediction Research and Transition (SPoRT) Center has integrated NASA data into the National Weather Service's decision support system (DSS), the Advanced Weather Interactive Processing System (AWIPS). SPoRT has, in some instances, had to shape and transform data sets into various formats and manipulate configurations to visualize them in AWIPS. With the advent of the next generation of DSS, AWIPS II, developers will be able to develop their own plugins to handle any type of data. Raytheon is developing AWIPS II to be a more extensible package written mainly in Java and built around a Service Oriented Architecture. A plugin architecture will allow users to install their own code modules, and (if all the rules have been properly followed) they will work hand-in-hand with AWIPS II as if they were originally built in. Users can bring in new datasets with existing plugins, tweak plugins to handle a nuance or desired new functionality, or create an entirely new visualization layout for a new dataset. SPoRT is developing plugins to ensure its existing NASA data will be ready for AWIPS II when it is delivered, and to prepare for the future of new instruments on upcoming satellites.
Rajendiran, Nivedita; Durrant, Jacob D
2018-05-05
Molecular dynamics (MD) simulations provide critical insights into many biological mechanisms. Programs such as VMD, Chimera, and PyMOL can produce impressive simulation visualizations, but they lack many advanced rendering algorithms common in the film and video-game industries. In contrast, the modeling program Blender includes such algorithms but cannot import MD-simulation data. MD trajectories often require many gigabytes of memory/disk space, complicating Blender import. We present Pyrite, a Blender plugin that overcomes these limitations. Pyrite allows researchers to visualize MD simulations within Blender, with full access to Blender's cutting-edge rendering techniques. We expect Pyrite-generated images to appeal to students and non-specialists alike. A copy of the plugin is available at http://durrantlab.com/pyrite/, released under the terms of the GNU General Public License Version 3. © 2017 Wiley Periodicals, Inc.
SlideJ: An ImageJ plugin for automated processing of whole slide images.
Della Mea, Vincenzo; Baroni, Giulia L; Pilutti, David; Di Loreto, Carla
2017-01-01
The digital slide, or Whole Slide Image, is a digital image, acquired with specific scanners, that represents a complete tissue sample or cytological specimen at the microscopic level. While Whole Slide Image analysis is recognized among the most interesting opportunities, the typical size of such images (up to gigapixels) can be very demanding in terms of memory requirements. Thus, while algorithms and tools for processing and analysis of single microscopic field images are available, the size of Whole Slide Images makes the direct use of such tools prohibitive or impossible. In this work a plugin for ImageJ, named SlideJ, is proposed with the objective of seamlessly extending the application of image analysis algorithms implemented in ImageJ for single microscopic field images to whole digital slide analysis. The plugin is complemented by example macros in the ImageJ scripting language to demonstrate its use in concrete situations.
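SlideJ's core idea is to bring single-field algorithms to images too large to load whole, which in practice means iterating a tile window over the slide. The Java sketch below shows such a tiling loop; the TileSource interface is an assumption standing in for a format-specific slide decoder, and this is not SlideJ's actual source.

```java
import java.awt.image.BufferedImage;
import java.util.function.Consumer;

// Illustrative tiling loop in the spirit of SlideJ (not its source):
// a gigapixel slide cannot be held in memory, so processing is applied
// to overlapping tiles decoded one at a time.
public class TileProcessor {
    interface TileSource {            // assumption: a reader that decodes one
        int width();                  // region of the slide at a time (e.g.
        int height();                 // backed by a format-specific library)
        BufferedImage read(int x, int y, int w, int h);
    }

    // Requires overlap < tile; the overlap avoids edge artifacts at tile seams.
    static void forEachTile(TileSource slide, int tile, int overlap,
                            Consumer<BufferedImage> op) {
        for (int y = 0; y < slide.height(); y += tile - overlap) {
            for (int x = 0; x < slide.width(); x += tile - overlap) {
                int w = Math.min(tile, slide.width() - x);
                int h = Math.min(tile, slide.height() - y);
                op.accept(slide.read(x, y, w, h)); // run the single-field algorithm
            }
        }
    }
}
```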
Monte Carlo Particle Lists: MCPL
NASA Astrophysics Data System (ADS)
Kittelmann, T.; Klinkby, E.; Knudsen, E. B.; Willendrup, P.; Cai, X. X.; Kanaki, K.
2017-09-01
A binary format with lists of particle state information, for interchanging particles between various Monte Carlo simulation applications, is presented. Portable C code for file manipulation is made available to the scientific community, along with converters and plugins for several popular simulation packages.
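The MCPL layout itself is defined by the project's C library and is not reproduced here; the Java sketch below only illustrates the general pattern of such formats, i.e. a small header followed by fixed-size particle-state records. The field choice and the "DEMO" magic string are invented for this example.

```java
import java.io.DataOutputStream;
import java.io.FileOutputStream;
import java.io.IOException;

// Illustrative writer for a simplified particle-list file.
// NOT the actual MCPL binary layout; a sketch of the record pattern only.
public class ParticleListWriter {
    public static void main(String[] args) throws IOException {
        try (DataOutputStream out = new DataOutputStream(
                new FileOutputStream("particles.bin"))) {
            out.writeBytes("DEMO");          // magic string identifying the format
            out.writeInt(1);                 // number of particle records
            // one particle record:
            out.writeInt(2112);              // PDG code (2112 = neutron)
            for (double v : new double[]{0.0, 0.0, 0.0}) out.writeDouble(v); // position
            for (double v : new double[]{0.0, 0.0, 1.0}) out.writeDouble(v); // direction
            out.writeDouble(0.025e-6);       // kinetic energy (MeV)
            out.writeDouble(0.0);            // time
            out.writeDouble(1.0);            // statistical weight
        }
    }
}
```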
An algorithm for automated ROI definition in water or epoxy-filled NEMA NU-2 image quality phantoms.
Pierce, Larry A; Byrd, Darrin W; Elston, Brian F; Karp, Joel S; Sunderland, John J; Kinahan, Paul E
2016-01-08
Drawing regions of interest (ROIs) in positron emission tomography/computed tomography (PET/CT) scans of the National Electrical Manufacturers Association (NEMA) NU-2 Image Quality (IQ) phantom is a time-consuming process that allows for interuser variability in the measurements. In order to reduce operator effort and allow batch processing of IQ phantom images, we propose a fast, robust, automated algorithm for performing IQ phantom sphere localization and analysis. The algorithm is easily altered to accommodate different configurations of the IQ phantom. The proposed algorithm uses information from both the PET and CT image volumes in order to overcome the challenges of detecting the smallest spheres in the PET volume. This algorithm has been released as an open-source plug-in to the OsiriX medical image viewing software package. We test the algorithm under various noise conditions, positions within the scanner, air bubbles in the phantom spheres, and scanner misalignment conditions. The proposed algorithm shows run-times between 3 and 4 min and has proven to be robust under all tested conditions, with expected sphere localization deviations of less than 0.2 mm and variations of PET ROI mean and maximum values on the order of 0.5% and 2%, respectively, over multiple PET acquisitions. We conclude that the proposed algorithm is stable when challenged with a variety of physical and imaging anomalies, and that the algorithm can be a valuable tool for those who use the NEMA NU-2 IQ phantom for PET/CT scanner acceptance testing and QA/QC.
MSNoise: A framework for Continuous Seismic Noise Analysis
NASA Astrophysics Data System (ADS)
Lecocq, Thomas; Caudron, Corentin; De Plaen, Raphaël; Mordret, Aurélien
2016-04-01
MSNoise is an Open and Free Python package known to be the only complete integrated workflow designed to analyse ambient seismic noise and study relative velocity changes (dv/v) in the crust. It is based on state-of-the-art and well-maintained Python modules, among which ObsPy plays an important role. To our knowledge, it is officially used for continuous monitoring in at least three notable places: the Observatory of the Piton de la Fournaise volcano (OVPF, France), the Auckland Volcanic Field (New Zealand) and on the South Napa earthquake (Berkeley, USA). It is also used by many researchers to process archive data to focus e.g. on fault zones, intraplate Europe, geothermal exploitations or Antarctica. We first present the general working of MSNoise, originally written in 2010 to automatically scan data archives and process seismic data in order to produce dv/v time series. We demonstrate that its modularity provides a new potential to easily test new algorithms for each processing step. For example, one could experiment with new methods of cross-correlation (done by default in the frequency domain), stacking (default is linear stacking, averaging), or dv/v estimation (default is moving window cross-spectrum "MWCS", so-called "doublet"), etc. We present the last major evolution of MSNoise from a "single workflow: data archive to dv/v" to a framework system that allows plugins and modules to be developed and integrated into the MSNoise ecosystem. Small-scale plugins will be shown as examples, such as "continuous PPSD" (à la McNamara & Buland) or "Seismic Amplitude Ratio Analysis" (Taisne, Caudron). We will also present the new MSNoise-TOMO package, using MSNoise as a "cross-correlation" toolbox and demystifying surface wave tomography! Finally, the poster will be a meeting point for all those using or willing to use MSNoise, to meet the developer, exchange ideas and wishes!
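For context, the MWCS ("doublet") step rests on the standard observation that a homogeneous relative velocity change $\delta v/v$ stretches coda arrivals in proportion to their lapse time $t$:

```latex
\frac{\delta t}{t} \;=\; -\,\frac{\delta v}{v}
```

so the slope of the measured delays $\delta t$ against lapse time, taken over successive windows of the correlation coda, yields the relative velocity change.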
Component Pin Recognition Using Algorithms Based on Machine Learning
NASA Astrophysics Data System (ADS)
Xiao, Yang; Hu, Hong; Liu, Ze; Xu, Jiangchang
2018-04-01
The purpose of machine vision for a plug-in machine is to improve the machine's stability and accuracy, and recognition of the component pin is an important part of the vision. This paper focuses on component pin recognition using three different techniques. The first technique involves traditional image processing using the core algorithm for binary large object (BLOB) analysis. The second technique uses the histogram of oriented gradients (HOG) to experimentally compare the support vector machine (SVM) and adaptive boosting (AdaBoost) meta-algorithm classifiers. The third technique is a deep learning method known as the convolutional neural network (CNN), which identifies the pin by comparing a sample against its training data. The main purpose of the research presented in this paper is to increase the knowledge of learning methods used in the plug-in machine industry in order to achieve better results.
Cellular Consequences of Telomere Shortening in Histologically Normal Breast Tissues
2013-09-01
using the open source, Java-based image analysis software package ImageJ (http://rsb.info.nih.gov/ij/) and a custom-designed plugin ("Telometer…"). Tabulated data were stored in a MySQL (http://www.mysql.com) database and viewed through Microsoft Access (Microsoft Corp.).
Android: Call C Functions with the Native Development Kit (NDK)
2016-09-01
guide is intended to assist programmers with how to attach an NDK plugin to an Android Integrated Development Environment and how to call C functions...written in Java to interact with native C/C++. This guide is intended to take programmers through adding an NDK package into Android Studio
Analytical Plug-In Method for Kernel Density Estimator Applied to Genetic Neutrality Study
NASA Astrophysics Data System (ADS)
Troudi, Molka; Alimi, Adel M.; Saoudi, Samir
2008-12-01
The plug-in method enables optimization of the bandwidth of the kernel density estimator in order to estimate probability density functions (pdfs). Here, a faster procedure than that of the common plug-in method is proposed. The mean integrated square error (MISE) depends directly upon $R(f'') = \int f''(x)^2\,\mathrm{d}x$, which is linked to the second-order derivative of the pdf. As we intend to introduce an analytical approximation of $R(f'')$, the pdf is estimated only once, at the end of the iterations. These two kinds of algorithm are tested on different random variables having distributions known for their difficult estimation. Finally, they are applied to genetic data in order to provide a better characterisation of the mean neutrality of Tunisian Berber populations.
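For reference, $R(f'')$ enters through the standard AMISE-optimal bandwidth of a kernel density estimator, which the plug-in method approximates by substituting an estimate of $R(f'')$ (textbook form; the paper's own notation may differ):

```latex
h_{\mathrm{opt}} \;=\; \left[\frac{R(K)}{\mu_2(K)^2\,R(f'')\,n}\right]^{1/5},
\qquad R(g)=\int g(x)^2\,\mathrm{d}x,\quad
\mu_2(K)=\int x^2 K(x)\,\mathrm{d}x
```

where $K$ is the kernel and $n$ the sample size. Classical plug-in schemes iterate because estimating $R(f'')$ itself requires a bandwidth; replacing that inner estimate with an analytical approximation is what removes the repeated density estimation here.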
Cytoprophet: a Cytoscape plug-in for protein and domain interaction networks inference.
Morcos, Faruck; Lamanna, Charles; Sikora, Marcin; Izaguirre, Jesús
2008-10-01
Cytoprophet is a software tool that allows prediction and visualization of protein and domain interaction networks. It is implemented as a plug-in of Cytoscape, an open source software framework for analysis and visualization of molecular networks. Cytoprophet implements three algorithms that predict new potential physical interactions using the domain composition of proteins and experimental assays. The algorithms for protein and domain interaction inference include maximum likelihood estimation (MLE) using expectation maximization (EM); the set cover approach maximum specificity set cover (MSSC) and the sum-product algorithm (SPA). After accepting an input set of proteins with Uniprot ID/Accession numbers and a selected prediction algorithm, Cytoprophet draws a network of potential interactions with probability scores and GO distances as edge attributes. A network of domain interactions between the domains of the initial protein list can also be generated. Cytoprophet was designed to take advantage of the visual capabilities of Cytoscape and be simple to use. An example of inference in a signaling network of myxobacterium Myxococcus xanthus is presented and available at Cytoprophet's website. http://cytoprophet.cse.nd.edu.
Technique and cue selection for graphical presentation of generic hyperdimensional data
NASA Astrophysics Data System (ADS)
Howard, Lee M.; Burton, Robert P.
2013-12-01
Several presentation techniques have been created for visualization of data with more than three variables. Packages have been written, each of which implements a subset of these techniques. However, these packages generally fail to provide all the features needed by the user during the visualization process. Further, packages generally limit support for presentation techniques to a few techniques. A new package called Petrichor accommodates all necessary and useful features together in one system. Any presentation technique may be added easily through an extensible plugin system. Features are supported by a user interface that allows easy interaction with data. Annotations allow users to mark up visualizations and share information with others. By providing a hyperdimensional graphics package that easily accommodates presentation techniques and includes a complete set of features, including those that are rarely or never supported elsewhere, the user is provided with a tool that facilitates improved interaction with multivariate data to extract and disseminate information.
StackSplit - a plugin for multi-event shear wave splitting analyses in SplitLab
NASA Astrophysics Data System (ADS)
Grund, Michael
2017-04-01
The SplitLab package (Wüstefeld et al., Computers and Geosciences, 2008), written in MATLAB, is a powerful and widely used tool for analysing seismological shear wave splitting of single event measurements. However, in many cases, especially for temporary station deployments close to the seaside or for recordings affected by strong anthropogenic noise, only multi-event approaches provide stable and reliable splitting results. In order to extend the original SplitLab environment for such analyses, I present the StackSplit plugin, which can easily be implemented within the well-accepted main program. StackSplit grants easy access to several different analysis approaches within SplitLab, including a new multiple-waveform-based inversion method as well as the most established standard stacking procedures. The possibility to switch between different analysis approaches at any time allows for the most flexible processing of individual multi-event splitting measurements for a single recording station. Besides the provided functions of the plugin, no other external program is needed for the multi-event analyses, since StackSplit performs within the available SplitLab structure.
Plugin-docking system for autonomous charging using particle filter
NASA Astrophysics Data System (ADS)
Koyasu, Hiroshi; Wada, Masayoshi
2017-03-01
Autonomous charging of the robot battery is one of the key functions for expanding the working areas of robots. To realize it, most existing systems use custom docking stations or artificial markers; in other words, they can only charge at a few specific outlets. If this limitation is removed, the working areas of robots expand significantly. In this paper, we describe a plugin-docking system for autonomous charging that does not require any custom docking stations or artificial markers. A single camera is used for recognizing the 3D position of an outlet socket. A particle filter-based image tracking algorithm, which is robust to illumination changes, is applied. The algorithm is implemented on a robot with an omnidirectional moving system. The experimental results show the effectiveness of our system.
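The tracker above is image-based and estimates a 3-D pose; as a compact illustration of the particle-filter machinery it relies on, here is a generic 1-D sequential importance resampling filter in Java. This is a sketch of the technique under simple Gaussian noise assumptions, not the authors' implementation.

```java
import java.util.Random;

// Generic 1-D sequential-importance-resampling particle filter:
// predict, weight by measurement likelihood, estimate, resample.
public class ParticleFilter1D {
    final double[] particles;
    final double[] weights;
    final Random rng = new Random(42);

    ParticleFilter1D(int n) {
        particles = new double[n];
        weights = new double[n];
        for (int i = 0; i < n; i++) {
            particles[i] = rng.nextGaussian(); // initial spread around 0
            weights[i] = 1.0 / n;
        }
    }

    // One predict-update-resample cycle given a noisy measurement z.
    double step(double z, double processNoise, double measNoise) {
        double wSum = 0;
        for (int i = 0; i < particles.length; i++) {
            particles[i] += processNoise * rng.nextGaussian();   // predict (random walk)
            double err = z - particles[i];                       // Gaussian likelihood
            weights[i] = Math.exp(-err * err / (2 * measNoise * measNoise));
            wSum += weights[i];
        }
        double estimate = 0;
        for (int i = 0; i < particles.length; i++) {
            weights[i] /= wSum;                                  // normalize
            estimate += weights[i] * particles[i];               // weighted mean
        }
        resample();
        return estimate;
    }

    // Systematic resampling: duplicate high-weight particles, drop low-weight ones.
    void resample() {
        int n = particles.length;
        double[] next = new double[n];
        double u = rng.nextDouble() / n, c = weights[0];
        for (int i = 0, j = 0; i < n; i++) {
            double target = u + (double) i / n;
            while (c < target && j < n - 1) c += weights[++j];
            next[i] = particles[j];
        }
        System.arraycopy(next, 0, particles, 0, n);
        for (int i = 0; i < n; i++) weights[i] = 1.0 / n;
    }
}
```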
MSNoise: Not Only dv/v! A Framework for Continuous Seismic Data Analysis
NASA Astrophysics Data System (ADS)
Mordret, A.; Lecocq, T.; De Plaen, R.; Caudron, C.; Brenguier, F.
2015-12-01
MSNoise is an Open and Free Python package known to be the only complete integrated workflow designed to analyse ambient seismic noise and study relative velocity changes (dv/v) in the crust. It is based on state-of-the-art and well-maintained Python modules, among which ObsPy plays an important role. To our knowledge, it is officially used for continuous monitoring in at least three notable places: the Observatory of the Piton de la Fournaise volcano (OVPF, France), the Auckland Volcanic Field (New Zealand) and on the South Napa earthquake (Berkeley, USA). It is also used by many researchers to process archive data, e.g. focussing on fault zones, intraplate Europe, geothermal exploitations or Antarctica. We first present the general working of MSNoise, originally written in 2010 to automatically scan data archives and process seismic data in order to produce dv/v time series. We demonstrate that its modularity provides a new potential to easily test new algorithms for each processing step. For example, to experiment with new methods of cross-correlation (done by default in the frequency domain), stacking (default is linear stacking, averaging), or dt/t or dv/v estimation (default is moving window cross-spectrum "MWCS", so-called "doublet"), etc. Finally, we present the last major evolution of MSNoise, from a "single workflow: data archive to dv/v" to a framework system that allows plugins and modules to be developed and integrated into the MSNoise ecosystem. Examples of plugins in development such as continuous PPSD (à la McNamara & Buland) or continuous RSAM/SSAM (Endo & Murray, Stephens) will be presented.
Sarpe, Vladimir; Rafiei, Atefeh; Hepburn, Morgan; Ostan, Nicholas; Schryvers, Anthony B.; Schriemer, David C.
2016-01-01
The Mass Spec Studio package was designed to support the extraction of hydrogen-deuterium exchange and covalent labeling data for a range of mass spectrometry (MS)-based workflows, to integrate with restraint-driven protein modeling activities. In this report, we present an extension of the underlying Studio framework and provide a plug-in for crosslink (XL) detection. To accommodate flexibility in XL methods and applications, while maintaining efficient data processing, the plug-in employs a peptide library reduction strategy via a presearch of the tandem-MS data. We demonstrate that prescoring linear unmodified peptide tags using a probabilistic approach substantially reduces search space by requiring both crosslinked peptides to generate sparse data attributable to their linear forms. The method demonstrates highly sensitive crosslink peptide identification with a low false positive rate. Integration with a Haddock plug-in provides a resource that can combine multiple sources of data for protein modeling activities. We generated a structural model of porcine transferrin bound to TbpB, a membrane-bound receptor essential for iron acquisition in Actinobacillus pleuropneumoniae. Using mutational data and crosslinking restraints, we confirm the mechanism by which TbpB recognizes the iron-loaded form of transferrin, and note the requirement for disparate sources of restraint data for accurate model construction. The software plugin is freely available at www.msstudio.ca. PMID:27412762
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hart, William Eugene
These slides describe different strategies for installing Python software. Although I am a big fan of Python software development, robust strategies for software installation remain a challenge. This talk describes several different installation scenarios. The Good: the user has administrative privileges - Installing on Windows with an installer executable, Installing with a Linux application utility, Installing a Python package from the PyPI repository, and Installing a Python package from source. The Bad: the user does not have administrative privileges - Using a virtual environment to isolate package installations, and Using an installer executable on Windows with a virtual environment. The Ugly: the user needs to install an extension package from source - Installing a Python extension package from source, and PyCoinInstall - Managing builds for Python extension packages. The last item, PyCoinInstall, describes a utility being developed for the COIN-OR software, which is used within the operations research community. COIN-OR includes a variety of Python and C++ software packages, and this script uses a simple plug-in system to support the management of package builds and installation.
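The "virtual environment" scenario from the middle case can be illustrated with the standard library alone; this is a generic sketch rather than the talk's original demo, and the environment name and package are placeholders:

```python
import venv
import subprocess

builder = venv.EnvBuilder(with_pip=True)   # isolated site-packages plus pip
builder.create("isolated-env")             # no administrative privileges needed

# Install into the environment without activating it (POSIX layout shown;
# on Windows the executable lives under isolated-env\Scripts\pip.exe).
subprocess.check_call(["isolated-env/bin/pip", "install", "requests"])
```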
The Effect of Plug-in Electric Vehicles on Harmonic Analysis of Smart Grid
NASA Astrophysics Data System (ADS)
Heidarian, T.; Joorabian, M.; Reza, A.
2015-12-01
In this paper, the effect of plug-in electric vehicles on a smart distribution system is studied using the standard IEEE 30-bus network. At first, harmonic power flow analysis is performed by the Newton-Raphson method, considering a distorted substation voltage. Afterward, proper sizes of capacitors are selected by the cuckoo optimization algorithm to reduce power losses and cost while imposing acceptable limits on total harmonic distortion (THD) and RMS voltages. It is proposed that the impact of the current harmonics generated by electric vehicle battery chargers should be factored into the overall load control strategies of smart appliances. This study is generalized to the different hours of a day by using the daily load curve, and the optimum times for charging electric vehicle batteries in the parking lots are then determined by the cuckoo optimization algorithm. The results show that the harmonic currents injected by plug-in electric vehicles cause a drop in the voltage profile and increase power losses. Moreover, charging the vehicle batteries has more impact on increasing the power losses than the harmonic currents themselves. The findings also showed that the current harmonics have a great influence on increasing THD. Finally, the optimum working times of all parking lots were obtained to reduce the utilization cost.
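The THD figure the study monitors is the ratio of the harmonics' RMS to the fundamental; a small numerical sketch (generic, not the paper's code, with an assumed 50 Hz fundamental) estimating it from an FFT:

```python
import numpy as np

def thd(signal, fs, f0=50.0, n_harmonics=20):
    """Ratio of the harmonics' RMS to the fundamental, from an FFT estimate."""
    spec = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), 1.0 / fs)
    mag = lambda f: spec[np.argmin(np.abs(freqs - f))]   # nearest-bin magnitude
    harmonics = np.sqrt(sum(mag(k * f0) ** 2 for k in range(2, n_harmonics + 1)))
    return harmonics / mag(f0)
```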
Xi-cam: a versatile interface for data visualization and analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pandolfi, Ronald J.; Allan, Daniel B.; Arenholz, Elke
Xi-cam is an extensible platform for data management, analysis and visualization. Xi-cam aims to provide a flexible and extensible approach to synchrotron data treatment as a solution to rising demands for high-volume/high-throughput processing pipelines. The core of Xi-cam is an extensible plugin-based graphical user interface platform which provides users with an interactive interface to processing algorithms. Plugins are available for SAXS/WAXS/GISAXS/GIWAXS, tomography and NEXAFS data. With Xi-cam's 'advanced' mode, data processing steps are designed as a graph-based workflow, which can be executed live, locally or remotely. Remote execution utilizes high-performance computing or de-localized resources, allowing for the effective reduction of high-throughput data. Xi-cam's plugin-based architecture targets cross-facility and cross-technique collaborative development, in support of multi-modal analysis. Xi-cam is open-source and cross-platform, and available for download on GitHub.
Xi-cam: a versatile interface for data visualization and analysis
Pandolfi, Ronald J.; Allan, Daniel B.; Arenholz, Elke; ...
2018-05-31
Xi-cam is an extensible platform for data management, analysis and visualization. Xi-cam aims to provide a flexible and extensible approach to synchrotron data treatment as a solution to rising demands for high-volume/high-throughput processing pipelines. The core of Xi-cam is an extensible plugin-based graphical user interface platform which provides users with an interactive interface to processing algorithms. Plugins are available for SAXS/WAXS/GISAXS/GIWAXS, tomography and NEXAFS data. With Xi-cam's 'advanced' mode, data processing steps are designed as a graph-based workflow, which can be executed live, locally or remotely. Remote execution utilizes high-performance computing or de-localized resources, allowing for the effective reduction of high-throughput data. Xi-cam's plugin-based architecture targets cross-facility and cross-technique collaborative development, in support of multi-modal analysis. Xi-cam is open-source and cross-platform, and available for download on GitHub.
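The "graph-based workflow" idea can be sketched in a few lines: each processing step names its upstream dependencies and the graph is executed in topological order. This is a toy illustration, not Xi-cam's API; the step functions are assumptions:

```python
import graphlib  # Python 3.9+

def run_workflow(steps, deps, data):
    """steps: name -> callable; deps: name -> list of upstream step names."""
    results = {"input": data}
    for name in graphlib.TopologicalSorter(deps).static_order():
        if name in steps:
            results[name] = steps[name](*[results[d] for d in deps[name]])
    return results

# e.g. a two-step pipeline: normalize the detector image, then threshold it
steps = {"normalize": lambda x: x / x.max(), "mask": lambda x: x > 0.5}
deps = {"normalize": ["input"], "mask": ["normalize"]}
```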
Karthikeyan, M; Krishnan, S; Pandey, Anil Kumar; Bender, Andreas; Tropsha, Alexander
2008-04-01
We present the application of a Java remote method invocation (RMI) based open source architecture to distributed chemical computing. This architecture was previously employed for distributed data harvesting of chemical information from the Internet via the Google application programming interface (API; ChemXtreme). Due to its open source character and its flexibility, the underlying server/client framework can be quickly adapted to virtually any computational task that can be parallelized. Here, we present the server/client communication framework as well as an application to the distributed computing of chemical properties on a large scale (currently the size of PubChem; about 18 million compounds), using both the Marvin toolkit and the open source JOELib package. As an application, the agreement of log P and TPSA values between the two packages was compared for this set of compounds. Outliers were found to be mostly non-druglike compounds, and differences could usually be explained by differences in the underlying algorithms. ChemStar is the first open source distributed chemical computing environment built on Java RMI, which is also easily adaptable to user demands due to its "plug-in architecture". The complete source code as well as calculated properties, along with links to PubChem resources, are available on the Internet via a graphical user interface at http://moltable.ncl.res.in/chemstar/.
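ChemStar itself is built on Java RMI; as a language-neutral illustration of the same server/client task-farming pattern, here is a minimal standard-library XML-RPC analog, where the property function is a stand-in for a real descriptor calculation:

```python
from xmlrpc.server import SimpleXMLRPCServer

def compute_property(smiles):
    """Stand-in for a real descriptor calculation (logP, TPSA, ...)."""
    return {"smiles": smiles, "heavy_atoms": sum(c.isalpha() for c in smiles)}

server = SimpleXMLRPCServer(("localhost", 8000), allow_none=True)
server.register_function(compute_property)
# server.serve_forever()   # each client then farms out batches of compounds:
# from xmlrpc.client import ServerProxy
# print(ServerProxy("http://localhost:8000").compute_property("CCO"))
```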
Nyström, Pär; Falck-Ytter, Terje; Gredebäck, Gustaf
2016-06-01
This article describes a new open source scientific workflow system, the TimeStudio Project, dedicated to the behavioral and brain sciences. The program is written in MATLAB and features a graphical user interface for the dynamic pipelining of computer algorithms developed as TimeStudio plugins. TimeStudio includes both a set of general plugins (for reading data files, modifying data structures, visualizing data structures, etc.) and a set of plugins specifically developed for the analysis of event-related eyetracking data as a proof of concept. It is possible to create custom plugins to integrate new or existing MATLAB code anywhere in a workflow, making TimeStudio a flexible workbench for organizing and performing a wide range of analyses. The system also features an integrated sharing and archiving tool for TimeStudio workflows, which can be used to share workflows both during the data analysis phase and after scientific publication. TimeStudio thus facilitates the reproduction and replication of scientific studies, increases the transparency of analyses, and reduces individual researchers' analysis workload. The project website (http://timestudioproject.com) contains the latest releases of TimeStudio, together with documentation and user forums.
2011-01-01
Background Image segmentation is a crucial step in quantitative microscopy that helps to define regions of tissues, cells or subcellular compartments. Depending on the degree of user interaction, segmentation methods can be divided into manual, automated or semi-automated approaches. 3D image stacks usually require automated methods due to their large number of optical sections. However, certain applications benefit from manual or semi-automated approaches. Scenarios include the quantification of 3D images with poor signal-to-noise ratios or the generation of so-called ground truth segmentations that are used to evaluate the accuracy of automated segmentation methods. Results We have developed Gebiss, an ImageJ plugin for the interactive segmentation, visualisation and quantification of 3D microscopic image stacks. We integrated a variety of existing plugins for threshold-based segmentation and volume visualisation. Conclusions We demonstrate the application of Gebiss to the segmentation of nuclei in live Drosophila embryos and the quantification of neurodegeneration in Drosophila larval brains. Gebiss was developed as a cross-platform ImageJ plugin and is freely available on the web at http://imaging.bii.a-star.edu.sg/projects/gebiss/. PMID:21668958
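The threshold-based segmentation and quantification step described above has a compact generic form (Gebiss itself is an ImageJ/Java plugin; this SciPy sketch only illustrates the algorithmic idea, with an assumed global threshold):

```python
import numpy as np
from scipy import ndimage

def segment_stack(stack, threshold):
    """Voxel-wise thresholding followed by 3D connected-component labelling."""
    binary = stack > threshold
    labels, n = ndimage.label(binary)                    # 6-connected by default
    volumes = ndimage.sum(binary, labels, index=np.arange(1, n + 1))
    return labels, volumes                               # per-object voxel counts
```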
2013-06-01
[Fragmentary DTIC record; only parts of the abstract survive.] The toolbox benefits from rapid, automated discrimination of specific predefined signals, and is free-standing (requiring no other plugins or packages). [...] applying labels to a previously labeled dataset, and comparing two labeled datasets. Subject terms: artifact, signal detection, EEG, MATLAB, toolbox.
Supervisory Power Management Control Algorithms for Hybrid Electric Vehicles. A Survey
Malikopoulos, Andreas
2014-03-31
The growing necessity for environmentally benign hybrid propulsion systems has led to the development of advanced power management control algorithms to maximize fuel economy and minimize pollutant emissions. This paper surveys the control algorithms for hybrid electric vehicles (HEVs) and plug-in HEVs (PHEVs) that have been reported in the literature to date. The exposition covers parallel, series, and power-split HEVs and PHEVs, and includes a classification of the algorithms in terms of their implementation and the chronological order of their appearance. Remaining challenges and potential future research directions are also discussed.
Integrated Graphics Operations and Analysis Lab Development of Advanced Computer Graphics Algorithms
NASA Technical Reports Server (NTRS)
Wheaton, Ira M.
2011-01-01
The focus of this project is to aid the IGOAL in researching and implementing algorithms for advanced computer graphics. First, this project focused on porting the current International Space Station (ISS) Xbox experience to the web. Previously, the ISS interior fly-around education and outreach experience only ran on an Xbox 360. One of the desires was to take this experience and make it into something that can be put on NASA's educational site for anyone to access. The current code works in the Unity game engine, which does have cross-platform capability but is not 100% compatible. The tasks for an intern to complete this portion consisted of gaining familiarity with Unity and the current ISS Xbox code, porting the Xbox code to the web as is, and modifying the code to work well as a web application. In addition, a procedurally generated cloud algorithm will be developed. Currently, the clouds used in AGEA animations and the Xbox experiences are texture maps. The desire is to create a procedurally generated cloud algorithm to provide dynamically generated clouds for both AGEA animations and the Xbox experiences. This task consists of gaining familiarity with AGEA and the plug-in interface, developing the algorithm, creating an AGEA plug-in to implement the algorithm inside AGEA, and creating a Unity script to implement the algorithm for the Xbox. This portion of the project could not be completed in the time frame of the internship; however, the IGOAL will continue to work on it in the future.
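A common way to generate procedural clouds, which the abstract does not spell out, is fractal (multi-octave) value noise: summing smoothed random grids of increasing frequency yields a cloud-like texture that can be regenerated on the fly. A minimal sketch under that assumption, not the IGOAL's algorithm:

```python
import numpy as np
from scipy.ndimage import zoom

def cloud_texture(size=256, octaves=5, persistence=0.5, seed=0):
    """Sum smoothed random grids of increasing frequency (fractal value noise)."""
    rng = np.random.default_rng(seed)
    tex = np.zeros((size, size))
    for o in range(octaves):
        cells = 2 ** (o + 2)                       # grid resolution per octave
        layer = zoom(rng.random((cells, cells)), size / cells, order=3)
        tex += (persistence ** o) * layer[:size, :size]
    return (tex - tex.min()) / (tex.max() - tex.min())   # normalize to [0, 1]
```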
Wang, Junhua; Kong, Yumeng; Fu, Ting; Stipancic, Joshua
2017-01-01
This paper presents the use of the Aimsun microsimulation program to simulate vehicle violation behaviors and observe their impact on road traffic crash risk. Plugins for violations of speeding, slow driving, and abrupt stopping were developed using Aimsun's API and SDK module. A safety analysis plugin for investigating the probability of rear-end collisions was developed, and a method for analyzing collision risk is proposed. A Fuzzy C-means clustering algorithm was developed to identify high-risk states in different road segments over time. Results of a simulation experiment based on the G15 Expressway in Shanghai showed that abrupt stopping had the greatest impact on increasing collision risk, and that the impact of violations increased with traffic volume. The methodology allows for the evaluation and monitoring of risks, alerting of road hazards, and identification of hotspots, and could be applied to the operations of existing facilities or the planning of future ones.
Kong, Yumeng; Stipancic, Joshua
2017-01-01
This paper presents the use of the Aimsun microsimulation program to simulate vehicle violation behaviors and observe their impact on road traffic crash risk. Plugins for violations of speeding, slow driving, and abrupt stopping were developed using Aimsun's API and SDK module. A safety analysis plugin for investigating the probability of rear-end collisions was developed, and a method for analyzing collision risk is proposed. A Fuzzy C-means clustering algorithm was developed to identify high-risk states in different road segments over time. Results of a simulation experiment based on the G15 Expressway in Shanghai showed that abrupt stopping had the greatest impact on increasing collision risk, and that the impact of violations increased with traffic volume. The methodology allows for the evaluation and monitoring of risks, alerting of road hazards, and identification of hotspots, and could be applied to the operations of existing facilities or the planning of future ones. PMID:28886141
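The Fuzzy C-means step used above to group road-segment states into risk levels follows the standard alternating update of memberships and centers; a compact NumPy sketch (generic textbook FCM, with the usual fuzzifier m = 2, not the paper's code):

```python
import numpy as np

def fuzzy_cmeans(X, c=3, m=2.0, iters=100, seed=0):
    """X: (n_samples, n_features). Returns cluster centers and memberships."""
    rng = np.random.default_rng(seed)
    U = rng.random((c, len(X)))
    U /= U.sum(axis=0)                           # memberships sum to 1 per sample
    for _ in range(iters):
        W = U ** m                               # fuzzified weights
        centers = (W @ X) / W.sum(axis=1, keepdims=True)
        d = np.linalg.norm(X[None] - centers[:, None], axis=2) + 1e-12
        U = d ** (-2.0 / (m - 1.0))              # u_ik ~ d_ik^(-2/(m-1))
        U /= U.sum(axis=0)
    return centers, U
```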
Interactive visualization tools for the structural biologist.
Porebski, Benjamin T; Ho, Bosco K; Buckle, Ashley M
2013-10-01
In structural biology, management of a large number of Protein Data Bank (PDB) files and raw X-ray diffraction images often presents a major organizational problem. Existing software packages that manipulate these file types were not designed for these kinds of file-management tasks. This is typically encountered when browsing through a folder of hundreds of X-ray images, with the aim of rapidly inspecting the diffraction quality of a data set. To solve this problem, a useful functionality of the Macintosh operating system (OSX) has been exploited that allows custom visualization plugins to be attached to certain file types. Software plugins have been developed for diffraction images and PDB files, which in many scenarios can save considerable time and effort. The direct visualization of diffraction images and PDB structures in the file browser can be used to identify key files of interest simply by scrolling through a list of files.
Dill: an algorithm and a symbolic software package for doing classical supersymmetry calculations
NASA Astrophysics Data System (ADS)
Lučić, Vladan
1995-11-01
An algorithm is presented that formalizes the different steps in a classical supersymmetric (SUSY) calculation. Based on this algorithm, Dill, a symbolic software package that can perform such calculations, was developed in the Mathematica programming language. While the algorithm is quite general, the package was created for the 4-D, N = 1 model. Nevertheless, with little modification, the package could be used for other SUSY models. The package has been tested and some of the results are presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ushizima, Daniela M.; Bianchi, Andrea G. C.; DeBianchi, Christina
We introduce a computational analysis workflow to access properties of solid objects using nondestructive X-ray imaging techniques. The goal is to process and quantify structures from material science sample cross sections. The algorithms can differentiate the porous medium (high density material) from the void (background, low density medium) using a Boolean classifier, so that we can extract features such as volume, surface area, granularity spectrum and porosity, among others. Our workflow, Quant-CT, leverages several algorithms from ImageJ, such as statistical region merging and the 3D object counter. It also includes schemes for bilateral filtering that use a 3D kernel, for parallel processing of sub-stacks, and for handling over-segmentation using histogram similarities. Quant-CT supports fast user interaction, providing the ability for the user to train the algorithm via subsamples to feed its core algorithms with automated parameterization. The Quant-CT plugin is currently available for testing by personnel at the Advanced Light Source and Earth Sciences Divisions and the Energy Frontier Research Center (EFRC), LBNL, as part of their research on porous materials. The goal is to understand the processes in fluid-rock systems for the geologic sequestration of CO2, and to develop technology for the safe storage of CO2 in deep subsurface rock formations. We describe our implementation, and demonstrate our plugin on porous material images. This paper targets end-users, with relevant information for developers to extend its current capabilities.
Interactive deformation registration of endorectal prostate MRI using ITK thin plate splines.
Cheung, M Rex; Krishnan, Karthik
2009-03-01
Magnetic resonance imaging with an endorectal coil allows high-resolution imaging of prostate cancer and the surrounding normal organs. These anatomic details can be used to direct radiotherapy. However, organ deformation introduced by the endorectal coil makes it difficult to register magnetic resonance images for treatment planning. In this study, plug-ins for the volume visualization software VolView were implemented on the basis of algorithms from the National Library of Medicine's Insight Segmentation and Registration Toolkit (ITK). Magnetic resonance images of a phantom simulating human pelvic structures were obtained with and without the endorectal coil balloon inflated. The prostate not deformed by the endorectal balloon was registered to the deformed prostate using an ITK thin plate spline (TPS). This plug-in allows the use of crop planes to limit the deformable registration to the region of interest around the prostate. These crop planes restricted the support of the TPS to the area around the prostate, where most of the deformation occurred. The region outside the crop planes was anchored by grid points. The TPS was more accurate in registering the local deformation of the prostate than a TPS variant, the elastic body spline. The TPS was also applied to register an in vivo T2-weighted endorectal magnetic resonance image. The intraprostatic tumor was accurately registered, which could potentially guide the boosting of intraprostatic targets. The source and target landmarks were placed graphically. This TPS plug-in allows the registration to be undone; the landmarks can be added, removed, and adjusted in real time and in three dimensions between repeated registrations. This interactive TPS plug-in thus allows a user to efficiently reach a level of accuracy satisfactory for a specific application. Because it is open-source software, the imaging community will be able to validate and improve the algorithm.
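Landmark-driven thin plate spline warping of the kind described above can be sketched with SciPy instead of ITK (same family of interpolant; the landmark coordinates below are toy values standing in for the graphically placed source/target pairs):

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Five corresponding landmark pairs (toy values, not patient data).
src = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1], [1, 1, 1]], float)
dst = src + np.array([[.1, 0, 0], [0, .1, 0], [0, 0, .1], [.1, .1, 0], [0, 0, 0]])

tps = RBFInterpolator(src, dst, kernel="thin_plate_spline")
print(tps([[0.5, 0.5, 0.5]]))   # where an interior point moves under the warp
```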
NDAS Hardware Translation Layer Development
NASA Technical Reports Server (NTRS)
Nazaretian, Ryan N.; Holladay, Wendy T.
2011-01-01
The NASA Data Acquisition System (NDAS) project aims to replace all DAS software for NASA's rocket testing facilities. There must be a software-hardware translation layer so the software can properly talk to the hardware. Since the hardware at each test stand varies, drivers for each stand have to be made. These drivers act more like plugins for the software. If the software is being used at E3, then the software should point to the E3 driver package; if the software is being used at B2, then the software should point to the B2 driver package. The driver packages should also contain hardware drivers that are universal to the DAS system. For example, since A1, A2, and B2 all use the Preston 8300AU signal conditioners, the driver for those three stands should be the same and updated collectively.
Motion Law Analysis and Structural Optimization of the Ejection Device of Tray Seeder
NASA Astrophysics Data System (ADS)
Luo, Xin; Hu, Bin; Dong, Chunwang; Huang, Lili
An ejection mechanism consisting of four reset springs, an electromagnet and a seed disk was designed for a tray seeder. The motion conditions of seeds in the seed disk were theoretically analyzed, and the intensity and height of seed ejection were calculated. The motions of the seeds and the seed disk were simulated with multi-body dynamics using the Cosmos modules plugged into the SolidWorks software package. The simulation results were consistent with the theoretical analysis.
Fiji: an open-source platform for biological-image analysis.
Schindelin, Johannes; Arganda-Carreras, Ignacio; Frise, Erwin; Kaynig, Verena; Longair, Mark; Pietzsch, Tobias; Preibisch, Stephan; Rueden, Curtis; Saalfeld, Stephan; Schmid, Benjamin; Tinevez, Jean-Yves; White, Daniel James; Hartenstein, Volker; Eliceiri, Kevin; Tomancak, Pavel; Cardona, Albert
2012-06-28
Fiji is a distribution of the popular open-source software ImageJ focused on biological-image analysis. Fiji uses modern software engineering practices to combine powerful software libraries with a broad range of scripting languages to enable rapid prototyping of image-processing algorithms. Fiji facilitates the transformation of new algorithms into ImageJ plugins that can be shared with end users through an integrated update system. We propose Fiji as a platform for productive collaboration between computer science and biology research communities.
Amber Plug-In for Protein Shop
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oliva, Ricardo
2004-05-10
The Amber Plug-in for ProteinShop has two main components: an AmberEngine library to compute the protein energy models, and a module to solve the energy minimization problem using an optimization algorithm in the OPT++ library. Together, these components allow the visualization of the protein folding process in ProteinShop. AmberEngine is an object-oriented library to compute molecular energies based on the Amber model. The main class is called ProteinEnergy. Its main interface methods are (1) "init", to initialize internal variables needed to compute the energy, and (2) "eval", to evaluate the total energy given a vector of coordinates. Additional methods allow the user to evaluate the individual components of the energy model (bond, angle, dihedral, non-bonded-1-4, and non-bonded energies) and to obtain the energy of each individual atom. The AmberEngine library source code includes examples and test routines that illustrate the use of the library in stand-alone programs. The energy minimization module uses the AmberEngine library and the nonlinear optimization library OPT++. OPT++ is open source software available under the GNU Lesser General Public License. The minimization module currently makes use of the LBFGS optimization algorithm in OPT++ to perform the energy minimization. Future releases may give the user a choice of other algorithms available in OPT++.
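AmberEngine itself is C++; as an illustration of what one of the energy components listed above computes, here is the harmonic bond term of the Amber model, E_bond = sum_i k_i (r_i - r0_i)^2, in a generic NumPy sketch:

```python
import numpy as np

def bond_energy(coords, bonds, k, r0):
    """coords: (n,3); bonds: (m,2) atom index pairs; k, r0: (m,) Amber parameters."""
    vec = coords[bonds[:, 0]] - coords[bonds[:, 1]]
    r = np.linalg.norm(vec, axis=1)               # current bond lengths
    return float(np.sum(k * (r - r0) ** 2))       # harmonic bond term
```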
Azahar: a PyMOL plugin for construction, visualization and analysis of glycan molecules
NASA Astrophysics Data System (ADS)
Arroyuelo, Agustina; Vila, Jorge A.; Martin, Osvaldo A.
2016-08-01
Glycans are key molecules in many physiological and pathological processes. As with other molecules, like proteins, visualization of the 3D structures of glycans adds valuable information for understanding their biological function. Hence, here we introduce Azahar, a computing environment for the creation, visualization and analysis of glycan molecules. Azahar is implemented in Python and works as a plugin for the well-known PyMOL package (Schrodinger in The PyMOL molecular graphics system, version 1.3r1, 2010). Besides the visualization and analysis options already provided by PyMOL, Azahar includes 3 cartoon-like representations and tools for 3D structure characterization, such as a conformational search using a Monte Carlo with minimization routine, and also tools to analyse single glycans or trajectories/ensembles, including the calculation of the radius of gyration, Ramachandran plots and hydrogen bonds. Azahar is freely available to download from http://www.pymolwiki.org/index.php/Azahar and the source code is available at https://github.com/agustinaarroyuelo/Azahar.
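One of the analysis tools listed above, the radius of gyration, is a one-liner for a set of atomic coordinates; a generic mass-weighted sketch (not Azahar's own implementation):

```python
import numpy as np

def radius_of_gyration(coords, masses):
    """Mass-weighted radius of gyration for an (n,3) coordinate array."""
    com = np.average(coords, axis=0, weights=masses)       # center of mass
    sq_dist = np.sum((coords - com) ** 2, axis=1)
    return float(np.sqrt(np.average(sq_dist, weights=masses)))
```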
Shi, Xu; Barnes, Robert O; Chen, Li; Shajahan-Haq, Ayesha N; Hilakivi-Clarke, Leena; Clarke, Robert; Wang, Yue; Xuan, Jianhua
2015-07-15
Identification of protein interaction subnetworks is an important step to help us understand complex molecular mechanisms in cancer. In this paper, we develop the BMRF-Net package, implemented in Java and C++, to identify protein interaction subnetworks based on a bagging Markov random field (BMRF) framework. By integrating gene expression data and protein-protein interaction data, this software tool can be used to identify biologically meaningful subnetworks. A user-friendly graphical user interface was developed as a Cytoscape plugin for the BMRF-Net software to handle the input/output interface. The detailed structure of the identified networks can be conveniently visualized in Cytoscape. The BMRF-Net package has been applied to breast cancer data to identify significant subnetworks related to breast cancer recurrence. The BMRF-Net package is available at http://sourceforge.net/projects/bmrfcjava/. The package was tested under Ubuntu 12.04 (64-bit), Java 7, glibc 2.15 and Cytoscape 3.1.0. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
A New Architecture for Extending the Capabilities of the Copernicus Trajectory Optimization Program
NASA Technical Reports Server (NTRS)
Williams, Jacob
2015-01-01
This paper describes a new plugin architecture developed for the Copernicus spacecraft trajectory optimization program. Details of the software architecture design and development are described, as well as examples of how the capability can be used to extend the tool in order to expand the type of trajectory optimization problems that can be solved. The inclusion of plugins is a significant update to Copernicus, allowing user-created algorithms to be incorporated into the tool for the first time. The initial version of the new capability was released to the Copernicus user community with version 4.1 in March 2015, and additional refinements and improvements were included in the recent 4.2 release. It is proving quite useful, enabling Copernicus to solve problems that it was not able to solve before.
Pteros 2.0: Evolution of the fast parallel molecular analysis library for C++ and python.
Yesylevskyy, Semen O
2015-07-15
Pteros is a high-performance open-source library for molecular modeling and analysis of molecular dynamics trajectories. Starting from version 2.0, Pteros is available for the C++ and Python programming languages with very similar interfaces. This makes it suitable both for writing complex reusable programs in C++ and for simple interactive scripts in Python. The new version improves the facilities for asynchronous trajectory reading and parallel execution of analysis tasks by introducing analysis plugins, which can be written in either C++ or Python in a completely uniform way. The high level of abstraction provided by analysis plugins greatly simplifies the prototyping and implementation of complex analysis algorithms. Pteros is available for free under the Artistic License from http://sourceforge.net/projects/pteros/. © 2015 Wiley Periodicals, Inc.
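The pattern the abstract describes, one reading loop feeding several analysis tasks executed in parallel, can be sketched generically (this is not Pteros' plugin API; task functions are arbitrary per-frame callables):

```python
from concurrent.futures import ThreadPoolExecutor

def run_analysis(frames, tasks):
    """Feed every frame from one reading loop to all analysis tasks in parallel."""
    results = {task.__name__: [] for task in tasks}
    with ThreadPoolExecutor() as pool:
        for frame in frames:                       # single asynchronous reader
            futures = [(task, pool.submit(task, frame)) for task in tasks]
            for task, fut in futures:
                results[task.__name__].append(fut.result())
    return results
```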
Java Application Shell: A Framework for Piecing Together Java Applications
NASA Technical Reports Server (NTRS)
Miller, Philip; Powers, Edward I. (Technical Monitor)
2001-01-01
This session describes the architecture of Java Application Shell (JAS), a Swing-based framework for developing interactive Java applications. Java Application Shell is being developed by Commerce One, Inc. for NASA Goddard Space Flight Center Code 588. The purpose of JAS is to provide a framework for the development of Java applications, with features that make the development process more efficient, consistent and flexible. Fundamentally, JAS is based upon an architecture where an application is considered a collection of 'plugins'. In turn, a plug-in is a collection of Swing actions defined using XML and packaged in a jar file. Plug-ins may be local to the host platform or remotely accessible through HTTP. Local and remote plugins are automatically discovered by JAS upon application startup; plugins may also be loaded dynamically without having to restart the application. Using Extensible Markup Language (XML) to define actions, as opposed to hardcoding them in application logic, allows easier customization of application-specific operations by separating application logic from presentation. Through XML, a developer defines an action that may appear on any number of menus, toolbars, and buttons. Actions maintain and propagate enabled/disabled states and specify icons, tool-tips, titles, etc. Furthermore, JAS allows actions to be implemented in various scripting languages through the use of IBM's Bean Scripting Framework. Scripted action implementation is seamless to the end user. In addition to action implementation, scripts may be used for application- and unit-level testing; in the case of application-level testing, JAS has hooks to assist a script in simulating end-user input. JAS also provides property and user preference management, JavaHelp, undo/redo, multi-document and single-document interfaces, printing, and logging. Finally, Jini technology has also been included in the framework by means of a Jini services browser and the ability to associate services with actions. Several Java technologies have been incorporated into JAS, including Swing, internal frames, JavaBeans, XML, JavaScript, JavaHelp, and Jini. Additional information is contained in the original extended abstract.
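The declarative-actions idea above can be illustrated compactly; the XML layout here is a hypothetical schema (JAS's actual format is not given in the abstract), and the sketch only shows how such definitions might be parsed at startup:

```python
import xml.etree.ElementTree as ET

ACTIONS_XML = """
<actions>
  <action id="open" title="Open..." icon="open.png" script="handlers.open_file"/>
  <action id="quit" title="Quit" icon="quit.png" script="handlers.quit_app"/>
</actions>
"""

def load_actions(xml_text):
    """Parse declarative action definitions into plain dictionaries."""
    return {a.get("id"): {k: a.get(k) for k in ("title", "icon", "script")}
            for a in ET.fromstring(xml_text).iter("action")}

print(load_actions(ACTIONS_XML)["open"]["title"])   # -> Open...
```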
Dynamic PROOF clusters with PoD: architecture and user experience
NASA Astrophysics Data System (ADS)
Manafov, Anar
2011-12-01
PROOF on Demand (PoD) is a tool-set which sets up a PROOF cluster on any resource management system. PoD is a user-oriented product with an easy-to-use GUI and a command-line interface. It is fully automated: no administrative privileges or special knowledge is required to use it. PoD utilizes a plug-in system to support different job submission front-ends. The current PoD distribution is shipped with LSF, Torque (PBS), Grid Engine, Condor, gLite, and SSH plug-ins. The product is being extended, and we therefore plan to implement a plug-in for the AliEn Grid as well. Recently developed algorithms make it possible to efficiently maintain two types of connections: packet-forwarding and native PROOF connections. This helps to properly handle most kinds of workers, with and without firewalls. PoD maintains the PROOF environment automatically and, for example, prevents resource misuse when workers idle for too long. As PoD matures as a product and provides more plug-ins, it is being used as a standard for setting up dynamic PROOF clusters in many different institutions. The GSI Analysis Facility (GSIAF) has been in production since 2007. The static PROOF cluster was phased out at the end of 2009, and GSIAF is now completely based on PoD. Users create private dynamic PROOF clusters on the general purpose batch farm. This provides easier resource sharing between interactive, local batch and Grid usage. The main user communities are FAIR and ALICE.
Development and validation of an open source quantification tool for DSC-MRI studies.
Gordaliza, P M; Mateos-Pérez, J M; Montesinos, P; Guzmán-de-Villoria, J A; Desco, M; Vaquero, J J
2015-03-01
This work presents the development of an open source tool for the quantification of dynamic susceptibility-weighted contrast-enhanced (DSC) perfusion studies. The development of this tool is motivated by the lack of open source tools implemented on open platforms that allow external developers to implement their own quantification methods easily and without the need to pay for a development license. This quantification tool was developed as a plugin for the ImageJ image analysis platform using the Java programming language. A modular approach was used in the implementation of the components, in such a way that new methods can be added without breaking any of the existing functionality. For the validation process, images from seven patients with brain tumors were acquired and quantified with the presented tool and with a widely used clinical software package, and the resulting perfusion parameters were compared. Perfusion parameters and the corresponding parametric images were obtained. When no gamma fitting is used, an excellent agreement with the tool used as a gold standard was obtained (R2 > 0.8, and values are within the 95% CI limits in Bland-Altman plots). An open source tool that performs quantification of perfusion studies using magnetic resonance imaging has been developed and validated against a clinical software package. It works as an ImageJ plugin and the source code has been published with an open source license. Copyright © 2015 Elsevier Ltd. All rights reserved.
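The "gamma fitting" contrasted above usually refers to fitting a gamma-variate bolus model, C(t) = A (t - t0)^alpha exp(-(t - t0)/beta), to the concentration-time curve. A generic SciPy sketch of that step (the plugin itself is Java; parameter values are illustrative):

```python
import numpy as np
from scipy.optimize import curve_fit

def gamma_variate(t, A, t0, alpha, beta):
    """C(t) = A (t - t0)^alpha exp(-(t - t0)/beta), ~zero before bolus arrival."""
    dt = np.clip(t - t0, 1e-9, None)
    return A * dt ** alpha * np.exp(-dt / beta)

t = np.linspace(0, 60, 120)
curve = gamma_variate(t, 5.0, 8.0, 2.0, 4.0)          # synthetic bolus passage
popt, _ = curve_fit(gamma_variate, t, curve, p0=[1.0, 5.0, 1.5, 3.0])
```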
AlgoRun: a Docker-based packaging system for platform-agnostic implemented algorithms.
Hosny, Abdelrahman; Vera-Licona, Paola; Laubenbacher, Reinhard; Favre, Thibauld
2016-08-01
There is a growing need in bioinformatics for easy-to-use software implementations of algorithms that are usable across platforms. At the same time, reproducibility of computational results is critical and often a challenge due to source code changes over time and dependencies. The approach introduced in this paper addresses both of these needs with AlgoRun, a dedicated packaging system for implemented algorithms, using Docker technology. Implemented algorithms, packaged with AlgoRun, can be executed through a user-friendly interface directly from a web browser or via a standardized RESTful web API to allow easy integration into more complex workflows. The packaged algorithm includes the entire software execution environment, thereby eliminating the common problem of software dependencies and the irreproducibility of computations over time. AlgoRun-packaged algorithms can be published on http://algorun.org, a centralized searchable directory to find existing AlgoRun-packaged algorithms. AlgoRun is available at http://algorun.org and the source code under GPL license is available at https://github.com/algorun. Contact: laubenbacher@uchc.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
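Calling such a containerized algorithm over its RESTful API would look roughly like the following; the host, port, route and payload are placeholders, not a verified AlgoRun endpoint (the real URL depends on the deployed package):

```python
import requests

# Hypothetical local deployment: consult the package's page on algorun.org
# for the actual route and input format.
response = requests.post("http://localhost:31331/v1/run",
                         json={"input": "ACGTACGT"})
print(response.json())
```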
Community-driven computational biology with Debian Linux.
Möller, Steffen; Krabbenhöft, Hajo Nils; Tille, Andreas; Paleino, David; Williams, Alan; Wolstencroft, Katy; Goble, Carole; Holland, Richard; Belhachemi, Dominique; Plessy, Charles
2010-12-21
The Open Source movement and its technologies are popular in the bioinformatics community because they provide freely available tools and resources for research. In order to feed the steady demand for updates on software and associated data, a service infrastructure is required for sharing and providing these tools to heterogeneous computing environments. The Debian Med initiative provides ready and coherent software packages for medical informatics and bioinformatics. These packages can be used together in Taverna workflows via the UseCase plugin to manage execution on local or remote machines. If such packages are available in cloud computing environments, the underlying hardware and the analysis pipelines can be shared along with the software. Debian Med closes the gap between developers and users. It provides a simple method for offering new releases of software and data resources, thus provisioning a local infrastructure for computational biology. For geographically distributed teams it can ensure they are working on the same versions of tools, in the same conditions. This contributes to the world-wide networking of researchers.
A High Fuel Consumption Efficiency Management Scheme for PHEVs Using an Adaptive Genetic Algorithm
Lee, Wah Ching; Tsang, Kim Fung; Chi, Hao Ran; Hung, Faan Hei; Wu, Chung Kit; Chui, Kwok Tai; Lau, Wing Hong; Leung, Yat Wah
2015-01-01
A high fuel efficiency management scheme for plug-in hybrid electric vehicles (PHEVs) has been developed. In order to achieve fuel consumption reduction, an adaptive genetic algorithm scheme has been designed to adaptively manage the energy resource usage. The objective function of the genetic algorithm is implemented by a fuzzy logic controller which closely monitors and models the driving conditions and environment of PHEVs, thus trading off petrol versus electricity for optimal driving efficiency. Comparison between calculated results and published data shows that the efficiency achieved by the fuzzified genetic algorithm is 10% better than that of existing schemes. The developed scheme, if fully adopted, would help reduce over 600 tons of CO2 emissions worldwide every day. PMID:25587974
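The genetic-algorithm loop behind such a scheme has a standard shape: score a population of control parameters with the objective, select, recombine, mutate. A minimal generic sketch, where the fitness function is a stand-in for the fuzzy-logic fuel-economy objective:

```python
import numpy as np

def fitness(params):
    """Placeholder for the fuzzy-logic fuel-economy objective."""
    return -np.sum((params - 0.3) ** 2)

rng = np.random.default_rng(1)
pop = rng.random((40, 5))                                  # 40 candidates, 5 genes
for generation in range(100):
    scores = np.array([fitness(p) for p in pop])
    parents = pop[np.argsort(scores)[-20:]]                # truncation selection
    cuts = rng.integers(1, 5, size=20)                     # one-point crossover
    children = np.array([np.r_[parents[i][:c], parents[(i + 1) % 20][c:]]
                         for i, c in enumerate(cuts)])
    mutate = rng.random(children.shape) < 0.2              # sparse mutation
    children += mutate * rng.normal(0.0, 0.05, children.shape)
    pop = np.vstack([parents, children])
best = pop[np.argmax([fitness(p) for p in pop])]
```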
Hajiaghababa, Fatemeh; Marateb, Hamid R; Kermani, Saeed
2018-06-01
Cochlear implants (CIs) are electronic devices restoring partial hearing to deaf individuals with profound hearing loss. In this paper, a new plug-in for traditional IIR filter-banks (FBs) is presented for cochlear implants based on wavelet neural networks (WNNs). Having provided such a plug-in for commercially available CIs, it is possible not only to use hardware already available on the market but also to optimize its performance compared with the state of the art. An online database of Dutch diphone perception was used in our study. The weights of the WNNs were tuned using particle swarm optimization (PSO) on a training set (speech-shaped noise (SSN) at 2 dB SNR), while performance was assessed on a test set in terms of objective and composite measures in a hold-out validation framework. The cost function was defined based on a combination of the mean square error (MSE) and short-time objective intelligibility (STOI) criteria on the training set. A variety of performance indices was used, including segmental signal-to-noise ratio (SNRseg), MSE, STOI, log-likelihood ratio (LLR), weighted spectral slope (WSS), and the composite measures Csig, Cbak and Covl. Meanwhile, the following CI speech processing techniques were used for comparison: traditional FBs, the dual resonance nonlinear (DRNL) model and the simple dual path nonlinear (SPDN) model. The average SNRseg, MSE, and LLR values for the WNN on the entire data set were 2.496 ± 2.794, 0.086 ± 0.025 and 2.323 ± 0.281, respectively. The proposed method significantly improved MSE, SNR, SNRseg, LLR, Csig, Cbak and Covl compared with the other three methods (repeated-measures analysis of variance (ANOVA); P < 0.05). The average running time of the proposed algorithm (written in Matlab R2013a) on the training and test sets for each consonant or vowel on an Intel dual-core 2.10 GHz CPU with 2 GB of RAM was 9.91 ± 0.87 s and 0.19 ± 0.01 s, respectively. The proposed algorithm is accurate and precise and is thus a promising new plug-in for traditional CIs. Although the tuned algorithm is relatively fast, it is necessary to use efficient vectorized implementations for real-time CI speech signal processing. Copyright © 2018 Elsevier B.V. All rights reserved.
TLSpy: An Open-Source Addition to Terrestrial Lidar Workflows
NASA Astrophysics Data System (ADS)
Frechette, J. D.; Weissmann, G. S.; Wawrzyniec, T. F.
2008-12-01
Terrestrial lidar scanners (TLS) that capture three-dimensional (3D) geometry with cm-scale precision present many new opportunities in the Earth Sciences and related fields. However, the lack of domain-specific tools impedes full and efficient utilization of the information contained in these datasets. Most processing and analysis is performed using a variety of manufacturing, surveying, airborne lidar, and GIS software. Although much overlap exists, inevitably some needs are not addressed by these applications. TLSpy provides a plugin-driven framework with 3D visualization capabilities that encourages researchers to fill these gaps. The goal is to free researchers from the intellectual overhead imposed by user- and data-interface design, enabling rapid development of TLS-specific processing and analysis algorithms. We present two plugins as examples of problems that TLSpy is being applied to. The first plugin corrects for the strong influence of target orientation on TLS-measured reflectance intensities. It calculates the distribution of incidence angles and intensities in an input scan and assists the user in fitting a reflectance model to the distribution. The model is then used to normalize input intensities, minimizing the impact of surface orientation and simplifying the extraction of quantitative data from reflectance measurements. Although reasonable default models can be determined, the large number of factors influencing reflectance values requires that the plugin be designed for maximum flexibility, allowing the user to adjust all model parameters and define new reflectance models as needed. The second plugin helps eliminate multipath reflections from water surfaces. Characterized by a lower-intensity mirror image of the subaerial bank appearing below the water surface, these reflections are a common problem in scans containing water. The erroneous reflections can be removed by manually selecting points that lie on the waterline, fitting a plane to the points, and deleting points below that plane. This plugin simplifies the process by automatically identifying waterline points using characteristic changes in geometry and intensity. Automatic identification is often faster and more reliable than manual identification; however, manual control is retained as a fallback for degenerate cases.
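The simplest usable reflectance model for the first plugin's normalization is Lambertian (intensity proportional to the cosine of the incidence angle); the plugin lets users fit and swap models, so treat this sketch as the default case only:

```python
import numpy as np

def normalize_intensity(intensity, incidence_deg):
    """Divide out a Lambertian (cos-theta) reflectance model."""
    cos_t = np.cos(np.radians(incidence_deg))
    cos_t = np.clip(cos_t, 0.05, None)     # guard against grazing-angle blow-up
    return intensity / cos_t
```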
NASA Astrophysics Data System (ADS)
Pinner, J. W., IV
2016-02-01
Data from shipboard oceanographic sensors come in various formats, and collection typically requires multiple data acquisition software packages running on multiple workstations throughout the vessel. Technicians must then corral all or a subset of the resulting data files so that they may be used by shipboard scientists. On many vessels the process of corralling files into a single cruise data package may change from cruise to cruise or even from technician to technician. It is these inconsistencies in the final cruise data packages that pose the greatest challenge when attempting to automate the process of cataloging cruise data for submission to data archives. A second challenge with the management of shipboard data is ensuring its quality. Problems with sensors may go unnoticed simply because the technician/scientist was unaware that the data from a sensor were absent, invalid, or out of range. The Open Vessel Data Management project (OpenVDM) is a ship-wide data management solution developed to address these issues. In the past three years OpenVDM has successfully demonstrated its ability to adapt to the needs of vessels with different capabilities/missions while delivering a consistent cruise data package to scientists and adhering to the recommendations and best practices set forth by third-party data management groups such as R2R. In the last year OpenVDM has implemented a plugin architecture for monitoring data quality. This allows vessel operators to develop custom data quality tests tailored to their vessel's unique raw datasets. Data quality tests are performed in near-real-time and the results are readily available within a web interface. This plugin architecture allows third-party data quality workgroups like SAMOS to migrate their data quality tests to the vessel and obtain an immediate determination of data quality. OpenVDM is currently operating aboard three vessels. The R/V Endeavor, operated by the University of Rhode Island, is a regional-class UNOLS research vessel operating under the traditional NSF, P.I.-driven model. The E/V Nautilus, operated by the Ocean Exploration Trust, specializes in ROV-based, telepresence-enabled oceanographic research. The R/V Falkor, operated by the Schmidt Ocean Institute, is an ocean research platform focusing on cutting-edge technology development.
Motmot, an open-source toolkit for realtime video acquisition and analysis.
Straw, Andrew D; Dickinson, Michael H
2009-07-22
Video cameras sense passively from a distance, offer a rich information stream, and provide intuitively meaningful raw data. Camera-based imaging has thus proven critical for many advances in neuroscience and biology, with applications ranging from cellular imaging of fluorescent dyes to tracking of whole-animal behavior at ecologically relevant spatial scales. Here we present 'Motmot': an open-source software suite for acquiring, displaying, saving, and analyzing digital video in real-time. At the highest level, Motmot is written in the Python computer language. The large amounts of data produced by digital cameras are handled by low-level, optimized functions, usually written in C. This high-level/low-level partitioning and use of select external libraries allow Motmot, with only modest complexity, to perform well as a core technology for many high-performance imaging tasks. In its current form, Motmot allows for: (1) image acquisition from a variety of camera interfaces (package motmot.cam_iface), (2) the display of these images with minimal latency and computer resources using wxPython and OpenGL (package motmot.wxglvideo), (3) saving images with no compression in a single-pass, low-CPU-use format (package motmot.FlyMovieFormat), (4) a pluggable framework for custom analysis of images in realtime and (5) firmware for an inexpensive USB device to synchronize image acquisition across multiple cameras, with analog input, or with other hardware devices (package motmot.fview_ext_trig). These capabilities are brought together in a graphical user interface, called 'FView', allowing an end user to easily view and save digital video without writing any code. One plugin for FView, 'FlyTrax', which tracks the movement of fruit flies in real-time, is included with Motmot, and is described to illustrate the capabilities of FView. Motmot enables realtime image processing and display using the Python computer language. In addition to the provided complete applications, the architecture allows the user to write relatively simple plugins, which can accomplish a variety of computer vision tasks and be integrated within larger software systems. The software is available at http://code.astraw.com/projects/motmot.
Variational Trajectory Optimization Tool Set: Technical description and user's manual
NASA Technical Reports Server (NTRS)
Bless, Robert R.; Queen, Eric M.; Cavanaugh, Michael D.; Wetzel, Todd A.; Moerder, Daniel D.
1993-01-01
The algorithms that comprise the Variational Trajectory Optimization Tool Set (VTOTS) package are briefly described. The VTOTS is a software package for solving nonlinear constrained optimal control problems from a wide range of engineering and scientific disciplines. The VTOTS package was specifically designed to minimize the amount of user programming; in fact, for problems that may be expressed in terms of analytical functions, the user needs only to define the problem in terms of symbolic variables. This version of the VTOTS does not support tabular data; thus, problems must be expressed in terms of analytical functions. The VTOTS package consists of two methods for solving nonlinear optimal control problems: a time-domain finite-element algorithm and a multiple shooting algorithm. These two algorithms, under the VTOTS package, may be run independently or jointly. The finite-element algorithm generates approximate solutions, whereas the shooting algorithm provides a more accurate solution to the optimization problem. A user's manual, some examples with results, and a brief description of the individual subroutines are included.
Ovesný, Martin; Křížek, Pavel; Borkovec, Josef; Švindrych, Zdeněk; Hagen, Guy M.
2014-01-01
Summary: ThunderSTORM is an open-source, interactive and modular plug-in for ImageJ designed for automated processing, analysis and visualization of data acquired by single-molecule localization microscopy methods such as photo-activated localization microscopy and stochastic optical reconstruction microscopy. ThunderSTORM offers an extensive collection of processing and post-processing methods so that users can easily adapt the process of analysis to their data. ThunderSTORM also offers a set of tools for creation of simulated data and quantitative performance evaluation of localization algorithms using Monte Carlo simulations. Availability and implementation: ThunderSTORM and the online documentation are both freely accessible at https://code.google.com/p/thunder-storm/ Contact: guy.hagen@lf1.cuni.cz Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24771516
An object oriented fully 3D tomography visual toolkit.
Agostinelli, S; Paoli, G
2001-04-01
In this paper we present a modern object-oriented Component Object Model (COM) C++ toolkit dedicated to fully 3D cone-beam tomography. The toolkit allows the display and visual manipulation of analytical phantoms, projection sets and volumetric data through a standard Windows graphical user interface. Data input/output is performed using proprietary file formats, but import/export of industry-standard file formats, including raw binary, Windows bitmap and AVI, ACR/NEMA DICOM 3 and NCSA HDF, is available. At the time of writing, built-in data manipulators include a basic phantom ray-tracer and a Matrox Genesis frame-grabbing facility. A COM plug-in interface is provided for user-defined custom backprojector algorithms: a simple Feldkamp ActiveX control, including source code, is provided as an example; our fast Feldkamp plug-in is also available.
Glaser, Johann; Beisteiner, Roland; Bauer, Herbert; Fischmeister, Florian Ph S
2013-11-09
In concurrent EEG/fMRI recordings, EEG data are impaired by fMRI gradient artifacts which exceed the EEG signal by several orders of magnitude. While several algorithms exist to correct the EEG data, these algorithms lack the flexibility to either leave out or add new steps. The open-source MATLAB toolbox FACET presented here is a modular toolbox for the fast and flexible correction and evaluation of imaging artifacts from concurrently recorded EEG datasets. It consists of an Analysis, a Correction and an Evaluation framework, allowing the user to choose from different artifact correction methods with various pre- and post-processing steps to form flexible combinations. The quality of the chosen correction approach can then be evaluated and compared to different settings. FACET was evaluated on a dataset provided with the FMRIB plugin for EEGLAB using two different correction approaches: Averaged Artifact Subtraction (AAS, Allen et al., NeuroImage 12(2):230-239, 2000) and FMRI Artifact Slice Template Removal (FASTR, Niazy et al., NeuroImage 28(3):720-737, 2005). The obtained results were compared to those of the FASTR algorithm implemented in the EEGLAB plugin FMRIB. No differences were found between the FACET implementation of FASTR and the original algorithm across all gradient-artifact-relevant performance indices. The FACET toolbox not only provides facilities for all three modalities: data analysis, artifact correction, and evaluation and documentation of the results, but also offers an easily extendable framework for the development and evaluation of new approaches.
2013-01-01
Background In concurrent EEG/fMRI recordings, EEG data are impaired by fMRI gradient artifacts which exceed the EEG signal by several orders of magnitude. While several algorithms exist to correct the EEG data, these algorithms lack the flexibility to either leave out or add new steps. The open-source MATLAB toolbox FACET presented here is a modular toolbox for the fast and flexible correction and evaluation of imaging artifacts from concurrently recorded EEG datasets. It consists of an Analysis, a Correction and an Evaluation framework, allowing the user to choose from different artifact correction methods with various pre- and post-processing steps to form flexible combinations. The quality of the chosen correction approach can then be evaluated and compared to different settings. Results FACET was evaluated on a dataset provided with the FMRIB plugin for EEGLAB using two different correction approaches: Averaged Artifact Subtraction (AAS, Allen et al., NeuroImage 12(2):230–239, 2000) and FMRI Artifact Slice Template Removal (FASTR, Niazy et al., NeuroImage 28(3):720–737, 2005). The obtained results were compared to those of the FASTR algorithm implemented in the EEGLAB plugin FMRIB. No differences were found between the FACET implementation of FASTR and the original algorithm across all gradient-artifact-relevant performance indices. Conclusion The FACET toolbox not only provides facilities for all three modalities: data analysis, artifact correction, and evaluation and documentation of the results, but also offers an easily extendable framework for the development and evaluation of new approaches. PMID:24206927
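The first of the two correction approaches evaluated above, Averaged Artifact Subtraction (AAS, Allen et al. 2000), assumes the gradient artifact is nearly identical across slice epochs, so the epoch-averaged template can be subtracted from each epoch. A NumPy sketch of that core idea (the toolbox itself is MATLAB):

```python
import numpy as np

def aas_correct(eeg, epoch_len):
    """Subtract the epoch-averaged gradient-artifact template from each epoch."""
    n = (len(eeg) // epoch_len) * epoch_len
    epochs = eeg[:n].reshape(-1, epoch_len)     # one epoch per fMRI slice/volume
    template = epochs.mean(axis=0)              # averaged artifact template
    return (epochs - template).ravel()          # corrected, re-concatenated EEG
```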
Muth, Thilo; García-Martín, Juan A; Rausell, Antonio; Juan, David; Valencia, Alfonso; Pazos, Florencio
2012-02-15
We have implemented in a single package all the features required for extracting, visualizing and manipulating fully conserved positions, as well as those with a family-dependent conservation pattern, in multiple sequence alignments. The program allows the user, among other things, to run different methods for extracting these positions, combine the results and visualize them in protein 3D structures and sequence spaces. JDet is a multiplatform application written in Java. It is freely available, including the source code, at http://csbg.cnb.csic.es/JDet. The package includes two of our recently developed programs for detecting functional positions in protein alignments (Xdet and S3Det), and support for other methods can be added as plug-ins. A help file and a guided tutorial for JDet are also available.
CaFE: a tool for binding affinity prediction using end-point free energy methods.
Liu, Hui; Hou, Tingjun
2016-07-15
Accurate prediction of binding free energy is of particular importance to computational biology and structure-based drug design. Among the methods for binding affinity prediction, end-point approaches such as MM/PBSA and LIE have been widely used because they achieve a good balance between prediction accuracy and computational cost. Here we present an easy-to-use pipeline tool named Calculation of Free Energy (CaFE) to conduct MM/PBSA and LIE calculations. Powered by the VMD and NAMD programs, CaFE is able to handle numerous static coordinate and molecular dynamics trajectory file formats generated by different molecular simulation packages and supports various force field parameters. CaFE source code and documentation are freely available under the GNU General Public License via GitHub at https://github.com/huiliucode/cafe_plugin. It is a VMD plugin written in Tcl and its usage is platform-independent. Contact: tingjunhou@zju.edu.cn. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
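The end-point idea behind MM/PBSA fits in one line: the binding free energy is estimated from ensemble averages over trajectory snapshots, dG_bind ≈ ⟨G_complex⟩ − ⟨G_receptor⟩ − ⟨G_ligand⟩. A generic sketch of that combination step (CaFE itself is a Tcl plugin driven by VMD/NAMD):

```python
import numpy as np

def endpoint_dg(g_complex, g_receptor, g_ligand):
    """Per-snapshot energies in, ensemble-averaged binding free energy out."""
    return np.mean(g_complex) - np.mean(g_receptor) - np.mean(g_ligand)
```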
Flowgen: Flowchart-based documentation for C++ codes
NASA Astrophysics Data System (ADS)
Kosower, David A.; Lopez-Villarejo, J. J.
2015-11-01
We present the Flowgen tool, which generates flowcharts from annotated C++ source code. The tool generates a set of interconnected high-level UML activity diagrams, one for each function or method in the C++ sources. It provides a simple and visual overview of complex implementations of numerical algorithms. Flowgen is complementary to the widely-used Doxygen documentation tool. The ultimate aim is to render complex C++ computer codes accessible, and to enhance collaboration between programmers and algorithm or science specialists. We describe the tool and a proof-of-concept application to the VINCIA plug-in for simulating collisions at CERN's Large Hadron Collider.
Caldas, Victor E A; Punter, Christiaan M; Ghodke, Harshad; Robinson, Andrew; van Oijen, Antoine M
2015-10-01
Recent technical advances have made it possible to visualize single molecules inside live cells. Microscopes with single-molecule sensitivity enable the imaging of low-abundance proteins, allowing for a quantitative characterization of molecular properties. Such data sets contain information on a wide spectrum of important molecular properties, with different aspects highlighted in different imaging strategies. The time-lapsed acquisition of images provides information on protein dynamics over long time scales, giving insight into expression dynamics and localization properties. Rapid burst imaging reveals properties of individual molecules in real-time, informing on their diffusion characteristics, binding dynamics and stoichiometries within complexes. This richness of information, however, adds significant complexity to analysis protocols. In general, large datasets of images must be collected and processed in order to produce statistically robust results and identify rare events. More importantly, as live-cell single-molecule measurements remain on the cutting edge of imaging, few protocols for analysis have been established and thus analysis strategies often need to be explored for each individual scenario. Existing analysis packages are geared towards either single-cell imaging data or in vitro single-molecule data and typically operate with highly specific algorithms developed for particular situations. Our tool, iSBatch, instead allows users to exploit the inherent flexibility of the popular open-source package ImageJ, providing a hierarchical framework in which existing plugins or custom macros may be executed over entire datasets or portions thereof. This strategy affords users freedom to explore new analysis protocols within large imaging datasets, while maintaining hierarchical relationships between experiments, samples, fields of view, cells, and individual molecules.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brant Peery; Sam Alessi; Randy Lee
2014-06-01
There is a need for a spatial decision support application that allows users to create customized metrics for comparing proposed locations of a new solar installation. This document discusses how PVMapper was designed to overcome the customization problem through the development of loosely coupled spatial and decision components in a JavaScript plugin architecture. This allows the user to easily add functionality and data to the system. The paper also explains how PVMapper provides the user with a dynamic and customizable decision tool that enables them to visually modify the formulas that are used in the decision algorithms that convert data to comparable metrics. The technologies that make up the presentation and calculation software stack are outlined. This document also explains the architecture that allows the tool to grow through custom plugins created by the software users. Some discussion is given on the difficulties encountered while designing the system.
NASA Astrophysics Data System (ADS)
Deng, Chao; Ren, Wei; Mao, Yao; Ren, Ge
2017-08-01
A plug-in module acceleration feedback control (Plug-In AFC) strategy based on the disturbance observer (DOB) principle is proposed for charge-coupled device (CCD)-based fast steering mirror (FSM) stabilization systems. In classical FSM tracking systems, dual-loop control (DLC), including velocity feedback and position feedback, is usually utilized to enhance the closed-loop performance. Due to the mechanical resonance of the system and CCD time delay, the closed-loop bandwidth is severely restricted. To solve this problem, cascade acceleration feedback control (AFC), which is a kind of high-precision robust control method, is introduced to strengthen the disturbance rejection property. However, in practical applications, it is difficult to realize an integral algorithm in an acceleration controller to compensate for the quadratic differential contained in the FSM acceleration model, resulting in a challenging controller design and a limited improvement. To optimize the acceleration feedback framework in the FSM system, different from the cascade AFC, the accelerometers are used to construct DOB to compensate for the platform vibrations directly. The acceleration nested loop can be plugged into the velocity loop without changing the system stability, and the controller design is quite simple. A series of comparative experimental results demonstrate that the disturbance rejection property of the CCD-based FSM can be effectively improved by the proposed approach.
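The DOB idea can be illustrated in a few lines: the apparent disturbance is the gap between the measured acceleration and what the nominal model predicts for the applied input, low-pass filtered (the Q filter) and fed back with opposite sign. The first-order nominal model and gains below are illustrative assumptions for a sketch, not the paper's controller.

```python
class SimpleDOB:
    """Minimal acceleration-based disturbance observer (a sketch of the
    principle only, not the Plug-In AFC design). Nominal model: a = b * u."""
    def __init__(self, b, alpha=0.1):
        self.b = b           # nominal input-to-acceleration gain
        self.alpha = alpha   # first-order low-pass (Q filter) coefficient
        self.d_hat = 0.0     # filtered disturbance estimate

    def step(self, u_applied, a_measured):
        raw = a_measured - self.b * u_applied          # apparent disturbance
        self.d_hat += self.alpha * (raw - self.d_hat)  # Q-filter update
        return -self.d_hat / self.b                    # input-side compensation
```

The returned correction would be added to the velocity-loop command each sample; because the observer only reshapes the disturbance path, it can be plugged in without redesigning the outer loops, which mirrors the "plug-in" property claimed in the abstract.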
MassCascade: Visual Programming for LC-MS Data Processing in Metabolomics.
Beisken, Stephan; Earll, Mark; Portwood, David; Seymour, Mark; Steinbeck, Christoph
2014-04-01
Liquid chromatography coupled to mass spectrometry (LC-MS) is commonly applied to investigate the small molecule complement of organisms. Several software tools are typically joined in custom pipelines to semi-automatically process and analyse the resulting data. General workflow environments like the Konstanz Information Miner (KNIME) offer the potential of an all-in-one solution to process LC-MS data by allowing easy integration of different tools and scripts. We describe MassCascade and its workflow plug-in for processing LC-MS data. The Java library integrates frequently used algorithms in a modular fashion, thus enabling it to serve as back-end for graphical front-ends. The functions available in MassCascade have been encapsulated in a plug-in for the workflow environment KNIME, allowing combined use with e.g. statistical workflow nodes from other providers and making the tool intuitive to use without knowledge of programming. The design of the software guarantees a high level of modularity where processing functions can be quickly replaced or concatenated. MassCascade is an open-source library for LC-MS data processing in metabolomics. It embraces the concept of visual programming through its KNIME plug-in, simplifying the process of building complex workflows. The library was validated using open data.
A QR code identification technology in package auto-sorting system
NASA Astrophysics Data System (ADS)
di, Yi-Juan; Shi, Jian-Ping; Mao, Guo-Yong
2017-07-01
Traditional manual sorting operations are not suitable for the development of Chinese logistics. To better sort packages, a QR code recognition technology is proposed to identify the QR code label on packages in a package auto-sorting system. The experimental results, compared with other algorithms in the literature, demonstrate that the proposed method is valid and its performance is superior to the other algorithms.
Community-driven computational biology with Debian Linux
2010-01-01
Background The Open Source movement and its technologies are popular in the bioinformatics community because they provide freely available tools and resources for research. In order to feed the steady demand for updates on software and associated data, a service infrastructure is required for sharing and providing these tools to heterogeneous computing environments. Results The Debian Med initiative provides ready and coherent software packages for medical informatics and bioinformatics. These packages can be used together in Taverna workflows via the UseCase plugin to manage execution on local or remote machines. If such packages are available in cloud computing environments, the underlying hardware and the analysis pipelines can be shared along with the software. Conclusions Debian Med closes the gap between developers and users. It provides a simple method for offering new releases of software and data resources, thus provisioning a local infrastructure for computational biology. For geographically distributed teams it can ensure they are working on the same versions of tools, in the same conditions. This contributes to the world-wide networking of researchers. PMID:21210984
Watermarking spot colors in packaging
NASA Astrophysics Data System (ADS)
Reed, Alastair; Filler, Tomáš; Falkenstern, Kristyn; Bai, Yang
2015-03-01
In January 2014, Digimarc announced Digimarc® Barcode for the packaging industry to improve check-out efficiency and the customer experience for retailers. Digimarc Barcode is a machine-readable code that carries the same information as a traditional Universal Product Code (UPC) and is introduced by adding a robust digital watermark to the package design. It is imperceptible to the human eye but can be read by a modern barcode scanner at the Point of Sale (POS) station. Compared to a traditional linear barcode, Digimarc Barcode covers the whole package with minimal impact on the graphic design. This significantly improves the Items per Minute (IPM) metric, which retailers use to track checkout efficiency since it closely relates to their profitability. Increasing IPM by a few percent could lead to potential savings of millions of dollars for retailers, giving them a strong incentive to add the Digimarc Barcode to their packages. Testing performed by Digimarc showed increases in IPM of at least 33% using the Digimarc Barcode, compared to using a traditional barcode. A method of watermarking print-ready image data used in the commercial packaging industry is described. A significant proportion of packages are printed using spot colors; therefore spot colors need to be supported by the Digimarc Barcode embedder. Digimarc Barcode supports the PANTONE spot color system, which is commonly used in the packaging industry. The Digimarc Barcode embedder allows a user to insert the UPC code in an image while minimizing perceptibility to the Human Visual System (HVS). The Digimarc Barcode is inserted in the printing ink domain, using an Adobe Photoshop plug-in as the last step before printing. Since Photoshop is an industry standard widely used by pre-press shops in the packaging industry, a Digimarc Barcode can be easily inserted and proofed.
[Propensity score matching in SPSS].
Huang, Fuqiang; DU, Chunlin; Sun, Menghui; Ning, Bing; Luo, Ying; An, Shengli
2015-11-01
To realize propensity score matching in the PS Matching module of SPSS and interpret the analysis results. The R software, the plug-in linking R with the corresponding version of SPSS, and the propensity score matching package were installed. A PS Matching module was added to the SPSS interface, and its use was demonstrated with test data. Score estimation and nearest-neighbor matching were achieved with the PS Matching module, and the results of qualitative and quantitative statistical description and evaluation were presented in the form of matching graphs. Propensity score matching can be accomplished conveniently using SPSS software.
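The underlying procedure is easy to sketch outside SPSS: estimate propensity scores with a logistic regression, then greedily pair each treated unit with the nearest unmatched control within a caliper. The sketch below is a generic illustration of that idea; the function name and caliper value are assumptions, not the PS Matching module's code.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def ps_match(X, treated, caliper=0.05):
    """Greedy 1:1 nearest-neighbour matching on the propensity score
    (a sketch of the idea, not the SPSS module's implementation).
    X: covariate matrix; treated: 0/1 array of group labels."""
    ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]
    t_idx = np.where(treated == 1)[0]
    c_idx = list(np.where(treated == 0)[0])
    pairs = []
    for t in t_idx:
        if not c_idx:
            break
        j = min(c_idx, key=lambda c: abs(ps[t] - ps[c]))  # nearest control
        if abs(ps[t] - ps[j]) <= caliper:
            pairs.append((t, j))
            c_idx.remove(j)  # match without replacement
    return pairs, ps
```

After matching, covariate balance between the paired groups would normally be checked (for example with standardized mean differences), which corresponds to the descriptive evaluation graphs the module produces.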
AGAMA: Action-based galaxy modeling framework
NASA Astrophysics Data System (ADS)
Vasiliev, Eugene
2018-05-01
The AGAMA library models galaxies. It computes gravitational potential and forces, performs orbit integration and analysis, and can convert between position/velocity and action/angle coordinates. It offers a framework for finding best-fit parameters of a model from data and self-consistent multi-component galaxy models, and contains useful auxiliary utilities such as various mathematical routines. The core of the library is written in C++, and there are Python and Fortran interfaces. AGAMA may be used as a plugin for the stellar-dynamical software packages galpy (ascl:1411.008), AMUSE (ascl:1107.007), and NEMO (ascl:1010.051).
Modular power converter having fluid cooled support
Beihoff, Bruce C.; Radosevich, Lawrence D.; Meyer, Andreas A.; Gollhardt, Neil; Kannenberg, Daniel G.
2005-09-06
A support may receive one or more power electronic circuits. The support may aid in removing heat from the circuits through fluid circulating through the support. The support, in conjunction with other packaging features may form a shield from both external EMI/RFI and from interference generated by operation of the power electronic circuits. Features may be provided to permit and enhance connection of the circuitry to external circuitry, such as improved terminal configurations. Modular units may be assembled that may be coupled to electronic circuitry via plug-in arrangements or through interface with a backplane or similar mounting and interconnecting structures.
Modular power converter having fluid cooled support
Beihoff, Bruce C.; Radosevich, Lawrence D.; Meyer, Andreas A.; Gollhardt, Neil; Kannenberg, Daniel G.
2005-12-06
A support may receive one or more power electronic circuits. The support may aid in removing heat from the circuits through fluid circulating through the support. The support, in conjunction with other packaging features may form a shield from both external EMI/RFI and from interference generated by operation of the power electronic circuits. Features may be provided to permit and enhance connection of the circuitry to external circuitry, such as improved terminal configurations. Modular units may be assembled that may be coupled to electronic circuitry via plug-in arrangements or through interface with a backplane or similar mounting and interconnecting structures.
Compact fluid cooled power converter supporting multiple circuit boards
Radosevich, Lawrence D.; Meyer, Andreas A.; Beihoff, Bruce C.; Kannenberg, Daniel G.
2005-03-08
A support may receive one or more power electronic circuits. The support may aid in removing heat from the circuits through fluid circulating through the support. The support, in conjunction with other packaging features may form a shield from both external EMI/RFI and from interference generated by operation of the power electronic circuits. Features may be provided to permit and enhance connection of the circuitry to external circuitry, such as improved terminal configurations. Modular units may be assembled that may be coupled to electronic circuitry via plug-in arrangements or through interface with a backplane or similar mounting and interconnecting structures.
SBEToolbox: A Matlab Toolbox for Biological Network Analysis
Konganti, Kranti; Wang, Gang; Yang, Ence; Cai, James J.
2013-01-01
We present SBEToolbox (Systems Biology and Evolution Toolbox), an open-source Matlab toolbox for biological network analysis. It takes a network file as input, calculates a variety of centralities and topological metrics, clusters nodes into modules, and displays the network using different graph layout algorithms. Straightforward implementation and the inclusion of high-level functions allow the functionality to be easily extended or tailored through developing custom plugins. SBEGUI, a menu-driven graphical user interface (GUI) of SBEToolbox, enables easy access to various network and graph algorithms for programmers and non-programmers alike. All source code and sample data are freely available at https://github.com/biocoder/SBEToolbox/releases. PMID:24027418
SBEToolbox: A Matlab Toolbox for Biological Network Analysis.
Konganti, Kranti; Wang, Gang; Yang, Ence; Cai, James J
2013-01-01
We present SBEToolbox (Systems Biology and Evolution Toolbox), an open-source Matlab toolbox for biological network analysis. It takes a network file as input, calculates a variety of centralities and topological metrics, clusters nodes into modules, and displays the network using different graph layout algorithms. Straightforward implementation and the inclusion of high-level functions allow the functionality to be easily extended or tailored through developing custom plugins. SBEGUI, a menu-driven graphical user interface (GUI) of SBEToolbox, enables easy access to various network and graph algorithms for programmers and non-programmers alike. All source code and sample data are freely available at https://github.com/biocoder/SBEToolbox/releases.
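As an illustration of the kind of topological metrics such a toolbox computes, the snippet below derives degree and betweenness centrality for a small test network. It uses the Python networkx package purely for demonstration, since SBEToolbox itself is implemented in Matlab.

```python
import networkx as nx

# Zachary's karate club graph as a small stand-in for an imported network file.
g = nx.karate_club_graph()

deg = nx.degree_centrality(g)        # fraction of nodes each node touches
btw = nx.betweenness_centrality(g)   # fraction of shortest paths through a node

hub = max(deg, key=deg.get)          # most connected node
broker = max(btw, key=btw.get)       # node most often on shortest paths
print(hub, broker)
```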
Odyne Plug-In Hybrid Electric Utility Truck Testing (NREL Transportation Research)
NREL is evaluating the in-service performance of about 120 plug-in hybrid electric utility trucks operated by a variety of companies.
SoilJ - An ImageJ plugin for semi-automatized image-processing of 3-D X-ray images of soil columns
NASA Astrophysics Data System (ADS)
Koestel, John
2016-04-01
3-D X-ray imaging is a formidable tool for quantifying soil structural properties, which are known to be extremely diverse. This diversity necessitates the collection of large sample sizes to adequately represent the spatial variability of soil structure at a specific sampling site. One important bottleneck of using X-ray imaging is, however, the large amount of time required by a trained specialist to process the image data, which makes it difficult to process larger numbers of samples. The software SoilJ aims at removing this bottleneck by automating most of the image processing steps needed to analyze image data of cylindrical soil columns. SoilJ is a plugin of the free Java-based image-processing software ImageJ. The plugin is designed to automatically process all images located within a designated folder. In a first step, SoilJ recognizes the outlines of the soil column, whereupon the column is rotated to an upright position and placed in the center of the canvas. Excess canvas is removed from the images. Then, SoilJ samples the grey values of the column material as well as the surrounding air in the Z direction. Assuming that the column material (mostly PVC or aluminium) exhibits a spatially constant density, these grey values serve as a proxy for the image illumination at a specific Z coordinate. Together with the grey values of the air, they are used to correct image illumination fluctuations, which often occur along the axis of rotation during image acquisition. SoilJ also includes an algorithm for beam-hardening artefact removal and extended image segmentation options. Finally, SoilJ integrates the morphology analysis plugins of BoneJ (Doube et al., 2010, BoneJ: Free and extensible bone image analysis in ImageJ. Bone 47: 1076-1079) and provides an ASCII file summarizing these measures for each investigated soil column. In the future it is planned to integrate SoilJ into FIJI, the maintained and updated edition of ImageJ with selected plugins.
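The illumination-correction step can be sketched as a per-slice rescaling: the median grey value of the column wall acts as an illumination proxy, the air grey value as a background anchor, and each Z slice is linearly rescaled so the wall value becomes constant along the column axis. The masks, target value and linear form below are illustrative assumptions, not SoilJ's actual implementation.

```python
import numpy as np

def normalize_illumination(stack, wall_mask, air_mask, target=20000.0):
    """Per-slice illumination correction for a (Z, Y, X) column image:
    rescale each slice so the wall grey value is constant along Z
    (a sketch of the idea only, not SoilJ's code).
    wall_mask / air_mask: 2-D boolean masks valid for every slice."""
    out = stack.astype(np.float32).copy()
    for z in range(stack.shape[0]):
        wall = np.median(stack[z][wall_mask])  # illumination proxy
        air = np.median(stack[z][air_mask])    # local background level
        scale = (target - air) / max(wall - air, 1e-6)
        out[z] = (out[z] - air) * scale + air  # anchor air, rescale wall
    return out
```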
Computational analysis on plug-in hybrid electric motorcycle chassis
NASA Astrophysics Data System (ADS)
Teoh, S. J.; Bakar, R. A.; Gan, L. M.
2013-12-01
The plug-in hybrid electric motorcycle (PHEM) is an alternative that promotes sustainability through lower emissions. However, the overall PHEM system packaging is constrained by the limited space in a motorcycle chassis. In this paper, a chassis based on the concept of a chopper is analysed for application in a PHEM. A three-dimensional (3D) model of the chassis is built with CAD software. The PHEM power-train components and drive-train mechanisms are integrated into the 3D model to ensure the chassis provides sufficient space. Besides that, a human dummy model is built into the 3D model to ensure the rider's ergonomics and comfort. The chassis 3D model then undergoes stress-strain simulation. The simulation predicts the stress distribution, displacement and factor of safety (FOS). The data are used to identify the critical points, thus indicating whether the chassis design is applicable or needs to be redesigned/modified to meet the required strength. Critical points are locations of highest stress, which might cause the chassis to fail; for a motorcycle chassis, these occur at the joints at the triple tree and the rear absorber bracket. In conclusion, the computational analysis provides the stress distribution and a guideline for developing a safe prototype chassis.
Modrák, Martin; Vohradský, Jiří
2018-04-13
Identifying regulons of sigma factors is a vital subtask of gene network inference. Integrating multiple sources of data is essential for correct identification of regulons and complete gene regulatory networks. Time series of expression data measured with microarrays or RNA-seq combined with static binding experiments (e.g., ChIP-seq) or literature mining may be used for inference of sigma factor regulatory networks. We introduce Genexpi: a tool to identify sigma factors by combining candidates obtained from ChIP experiments or literature mining with time-course gene expression data. While Genexpi can be used to infer other types of regulatory interactions, it was designed and validated on real biological data from bacterial regulons. In this paper, we put primary focus on CyGenexpi: a plugin integrating Genexpi with the Cytoscape software for ease of use. As a part of this effort, a plugin for handling time series data in Cytoscape called CyDataseries has been developed and made available. Genexpi is also available as a standalone command line tool and an R package. Genexpi is a useful part of gene network inference toolbox. It provides meaningful information about the composition of regulons and delivers biologically interpretable results.
Iris: Constructing and Analyzing Spectral Energy Distributions with the Virtual Observatory
NASA Astrophysics Data System (ADS)
Laurino, O.; Budynkiewicz, J.; Busko, I.; Cresitello-Dittmar, M.; D'Abrusco, R.; Doe, S.; Evans, J.; Pevunova, O.
2014-05-01
We present Iris 2.0, the latest release of the Virtual Astronomical Observatory application for building and analyzing Spectral Energy Distributions (SEDs). With Iris, users may read in and display SEDs, inspect and edit any selection of SED data, fit models to SEDs in arbitrary spectral ranges, and calculate confidence limits on best-fit parameters. SED data may be loaded into the application from VOTable and FITS files compliant with the International Virtual Observatory Alliance interoperable data models, or retrieved directly from NED or the Italian Space Agency Science Data Center; data in non-standard formats may also be converted within the application. Users may seamlessly exchange data between Iris and other Virtual Observatory tools using the Simple Application Messaging Protocol. Iris 2.0 also provides a tool for redshifting, interpolating, and measuring integrated fluxes, and allows simple aperture corrections for individual points and SED segments. Custom Python functions, template models and template libraries may be imported into Iris for fitting SEDs. Iris may be extended through Java plugins; users can install third-party packages, or develop their own plugin using Iris' Software Development Kit. Iris 2.0 is available for Linux and Mac OS X systems.
Open Ephys: an open-source, plugin-based platform for multichannel electrophysiology.
Siegle, Joshua H; López, Aarón Cuevas; Patel, Yogi A; Abramov, Kirill; Ohayon, Shay; Voigts, Jakob
2017-08-01
Closed-loop experiments, in which causal interventions are conditioned on the state of the system under investigation, have become increasingly common in neuroscience. Such experiments can have a high degree of explanatory power, but they require a precise implementation that can be difficult to replicate across laboratories. We sought to overcome this limitation by building open-source software that makes it easier to develop and share algorithms for closed-loop control. We created the Open Ephys GUI, an open-source platform for multichannel electrophysiology experiments. In addition to the standard 'open-loop' visualization and recording functionality, the GUI also includes modules for delivering feedback in response to events detected in the incoming data stream. Importantly, these modules can be built and shared as plugins, which makes it possible for users to extend the functionality of the GUI through a simple API, without having to understand the inner workings of the entire application. In combination with low-cost, open-source hardware for amplifying and digitizing neural signals, the GUI has been used for closed-loop experiments that perturb the hippocampal theta rhythm in a phase-specific manner. The Open Ephys GUI is the first widely used application for multichannel electrophysiology that leverages a plugin-based workflow. We hope that it will lower the barrier to entry for electrophysiologists who wish to incorporate real-time feedback into their research.
Open Ephys: an open-source, plugin-based platform for multichannel electrophysiology
NASA Astrophysics Data System (ADS)
Siegle, Joshua H.; Cuevas López, Aarón; Patel, Yogi A.; Abramov, Kirill; Ohayon, Shay; Voigts, Jakob
2017-08-01
Objective. Closed-loop experiments, in which causal interventions are conditioned on the state of the system under investigation, have become increasingly common in neuroscience. Such experiments can have a high degree of explanatory power, but they require a precise implementation that can be difficult to replicate across laboratories. We sought to overcome this limitation by building open-source software that makes it easier to develop and share algorithms for closed-loop control. Approach. We created the Open Ephys GUI, an open-source platform for multichannel electrophysiology experiments. In addition to the standard ‘open-loop’ visualization and recording functionality, the GUI also includes modules for delivering feedback in response to events detected in the incoming data stream. Importantly, these modules can be built and shared as plugins, which makes it possible for users to extend the functionality of the GUI through a simple API, without having to understand the inner workings of the entire application. Main results. In combination with low-cost, open-source hardware for amplifying and digitizing neural signals, the GUI has been used for closed-loop experiments that perturb the hippocampal theta rhythm in a phase-specific manner. Significance. The Open Ephys GUI is the first widely used application for multichannel electrophysiology that leverages a plugin-based workflow. We hope that it will lower the barrier to entry for electrophysiologists who wish to incorporate real-time feedback into their research.
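A toy illustration of the closed-loop idea, reduced to a threshold detector that fires a feedback callback on upward crossings of an incoming sample stream. The real GUI processes data block-wise through C++ plugins, so every name and the callback signature here are simplifications, not the Open Ephys API.

```python
def closed_loop_trigger(stream, threshold, on_event):
    """Toy closed-loop module: call a feedback callback whenever the
    incoming signal crosses the threshold from below (illustrative only;
    the real application delivers feedback via compiled plugins)."""
    prev = float('-inf')
    for t, sample in enumerate(stream):
        if prev < threshold <= sample:   # rising edge through threshold
            on_event(t)                  # e.g. trigger a stimulus pulse
        prev = sample
```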
Browndye: A Software Package for Brownian Dynamics
McCammon, J. Andrew
2010-01-01
A new software package, Browndye, is presented for simulating the diffusional encounter of two large biological molecules. It can be used to estimate second-order rate constants and encounter probabilities, and to explore reaction trajectories. Browndye builds upon previous knowledge and algorithms from software packages such as UHBD, SDA, and Macrodox, while implementing algorithms that scale to larger systems. PMID:21132109
Electrical power converter method and system employing multiple output converters
Beihoff, Bruce C [Wauwatosa, WI; Radosevich, Lawrence D [Muskego, WI; Meyer, Andreas A [Richmond Heights, OH; Gollhardt, Neil [Fox Point, WI; Kannenberg, Daniel G [Waukesha, WI
2007-05-01
A support may receive one or more power electronic circuits. The support may aid in removing heat from the circuits through fluid circulating through the support. The support, in conjunction with other packaging features may form a shield from both external EMI/RFI and from interference generated by operation of the power electronic circuits. Features may be provided to permit and enhance connection of the circuitry to external circuitry, such as improved terminal configurations. Modular units may be assembled that may be coupled to electronic circuitry via plug-in arrangements or through interface with a backplane or similar mounting and interconnecting structures.
Fluid cooled vehicle drive module
Beihoff, Bruce C.; Radosevich, Lawrence D.; Meyer, Andreas A.; Gollhardt, Neil; Kannenberg, Daniel G.
2005-11-15
An electric vehicle drive includes a support may receive one or more power electronic circuits. The support may aid in removing heat from the circuits through fluid circulating through the support. The support, in conjunction with other packaging features may form a shield from both external EM/RFI and from interference generated by operation of the power electronic circuits. Features may be provided to permit and enhance connection of the circuitry to external circuitry, such as improved terminal configurations. Modular units may be assembled that may be coupled to electronic circuitry via plug-in arrangements or through interface with a backplane or similar mounting and interconnecting structures.
Electrical power converter method and system employing multiple-output converters
Beihoff, Bruce C.; Radosevich, Lawrence D.; Meyer, Andreas A.; Gollhardt, Neil; Kannenberg, Daniel G.
2006-03-21
A support may receive one or more power electronic circuits. The support may aid in removing heat from the circuits through fluid circulating through the support. The support, in conjunction with other packaging features may form a shield from both external EMI/RFI and from interference generated by operation of the power electronic circuits. Features may be provided to permit and enhance connection of the circuitry to external circuitry, such as improved terminal configurations. Modular units may be assembled that may be coupled to electronic circuitry via plug-in arrangements or through interface with a backplane or similar mounting and interconnecting structures.
The Perseus computational platform for comprehensive analysis of (prote)omics data.
Tyanova, Stefka; Temu, Tikira; Sinitcyn, Pavel; Carlson, Arthur; Hein, Marco Y; Geiger, Tamar; Mann, Matthias; Cox, Jürgen
2016-09-01
A main bottleneck in proteomics is the downstream biological analysis of highly multivariate quantitative protein abundance data generated using mass-spectrometry-based analysis. We developed the Perseus software platform (http://www.perseus-framework.org) to support biological and biomedical researchers in interpreting protein quantification, interaction and post-translational modification data. Perseus contains a comprehensive portfolio of statistical tools for high-dimensional omics data analysis covering normalization, pattern recognition, time-series analysis, cross-omics comparisons and multiple-hypothesis testing. A machine learning module supports the classification and validation of patient groups for diagnosis and prognosis, and it also detects predictive protein signatures. Central to Perseus is a user-friendly, interactive workflow environment that provides complete documentation of computational methods used in a publication. All activities in Perseus are realized as plugins, and users can extend the software by programming their own, which can be shared through a plugin store. We anticipate that Perseus's arsenal of algorithms and its intuitive usability will empower interdisciplinary analysis of complex large data sets.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chertkov, Michael; Turitsyn, Konstantin; Sulc, Petr
The anticipated increase in the number of plug-in electric vehicles (EV) will put additional strain on electrical distribution circuits. Many control schemes have been proposed to control EV charging. Here, we develop control algorithms based on randomized EV charging start times and simple one-way broadcast communication allowing for a time delay between communication events. Using arguments from queuing theory and statistical analysis, we seek to maximize the utilization of excess distribution circuit capacity while keeping the probability of a circuit overload negligible.
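A Monte Carlo toy model conveys the flavour of the analysis: each EV independently decides to charge with some probability and picks a random start slot, and the overload probability is the fraction of trials in which the aggregate load exceeds the circuit capacity. All parameter values and the fixed-duration charging model below are illustrative assumptions, not the authors' scheme.

```python
import numpy as np

def overload_probability(n_evs=50, p_start=0.8, charge_kw=6.6, duration=16,
                         capacity_kw=250.0, slots=96, trials=5000, rng=None):
    """Estimate the probability that randomized charging start times ever
    overload the circuit (toy model; parameters are hypothetical).
    Time is divided into 'slots' 15-minute intervals; each active EV
    charges at charge_kw for 'duration' consecutive slots."""
    rng = rng or np.random.default_rng(0)
    overloads = 0
    for _ in range(trials):
        load = np.zeros(slots)
        active = rng.random(n_evs) < p_start           # who charges tonight
        for s in rng.integers(0, slots, size=n_evs)[active]:
            load[s:s + duration] += charge_kw          # fixed-length session
        if load.max() > capacity_kw:
            overloads += 1
    return overloads / trials
```

Sweeping p_start or n_evs in such a model reproduces the basic queuing-theory trade-off the abstract describes: utilization of spare capacity rises until the overload probability stops being negligible.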
Mousetrap: An integrated, open-source mouse-tracking package.
Kieslich, Pascal J; Henninger, Felix
2017-10-01
Mouse-tracking - the analysis of mouse movements in computerized experiments - is becoming increasingly popular in the cognitive sciences. Mouse movements are taken as an indicator of commitment to or conflict between choice options during the decision process. Using mouse-tracking, researchers have gained insight into the temporal development of cognitive processes across a growing number of psychological domains. In the current article, we present software that offers easy and convenient means of recording and analyzing mouse movements in computerized laboratory experiments. In particular, we introduce and demonstrate the mousetrap plugin that adds mouse-tracking to OpenSesame, a popular general-purpose graphical experiment builder. By integrating with this existing experimental software, mousetrap allows for the creation of mouse-tracking studies through a graphical interface, without requiring programming skills. Thus, researchers can benefit from the core features of a validated software package and the many extensions available for it (e.g., the integration with auxiliary hardware such as eye-tracking, or the support of interactive experiments). In addition, the recorded data can be imported directly into the statistical programming language R using the mousetrap package, which greatly facilitates analysis. Mousetrap is cross-platform, open-source and available free of charge from https://github.com/pascalkieslich/mousetrap-os .
Schmitter, Daniel; Wachowicz, Paulina; Sage, Daniel; Chasapi, Anastasia; Xenarios, Ioannis; Simanis; Unser, Michael
2013-01-01
The yeast Schizosaccharomyces pombe is frequently used as a model for studying the cell cycle. The cells are rod-shaped and divide by medial fission. The process of cell division, or cytokinesis, is controlled by a network of signaling proteins called the Septation Initiation Network (SIN); SIN proteins associate with the spindle pole bodies (SPBs) during nuclear division (mitosis). Some SIN proteins associate with both SPBs early in mitosis, and then display strongly asymmetric signal intensity at the SPBs in late mitosis, just before cytokinesis. This asymmetry is thought to be important for correct regulation of SIN signaling, and for the coordination of cytokinesis and mitosis. In order to study the dynamics of organelles or large protein complexes such as the SPB, which have been labeled with a fluorescent protein tag in living cells, a number of image analysis problems must be solved: the cell outline must be detected automatically, and the position and signal intensity associated with the structures of interest within the cell must be determined. We present a new 2D and 3D image analysis system that permits versatile and robust analysis of motile, fluorescently labeled structures in rod-shaped cells. We have designed an image analysis system that we have implemented as a user-friendly software package allowing fast and robust image analysis of large numbers of rod-shaped cells. We have developed new robust algorithms, which we combined with existing methodologies to facilitate fast and accurate analysis. Our software permits the detection and segmentation of rod-shaped cells in either static or dynamic (i.e. time-lapse) multi-channel images. It enables tracking of two structures (for example SPBs) in two different image channels. For 2D or 3D static images, the locations of the structures are identified, and then intensity values are extracted together with several quantitative parameters, such as length, width, cell orientation, background fluorescence and the distance between the structures of interest. Furthermore, two kinds of kymographs of the tracked structures can be established, one representing the migration with respect to their relative position, the other representing their individual trajectories inside the cell. This software package, called "RodCellJ", allowed us to analyze a large number of S. pombe cells to understand the rules that govern SIN protein asymmetry. "RodCellJ" is freely available to the community as a package of several ImageJ plugins for simultaneously analyzing the behavior of large numbers of rod-shaped cells in an extensive manner. The integration of different image-processing techniques in a single package, as well as the development of novel algorithms, not only speeds up the analysis compared with existing tools, but also provides higher accuracy. Its utility was demonstrated on both 2D and 3D, static and dynamic images to study the septation initiation network of the yeast Schizosaccharomyces pombe. More generally, it can be used in any biological context where fluorescently labeled structures need to be analyzed in rod-shaped cells. RodCellJ is freely available under http://bigwww.epfl.ch/algorithms.html.
NASA Astrophysics Data System (ADS)
Hayrapetyan, David B.; Hovhannisyan, Levon; Mantashyan, Paytsar A.
2013-04-01
The analysis of complex spectra is a topical problem in modern science. This work is devoted to the creation of a software package which reads spectra in different formats, possesses a dynamic knowledge database and a self-learning mechanism, and performs automated analysis of spectral composition based on the knowledge database by applying certain algorithms. Hyper-spherical random search algorithms, gradient algorithms and genetic search algorithms are used as the search methods in the software package. The analysis of Raman and IR spectra of diamond-like carbon (DLC) samples was performed with the developed program. After processing the data, the program immediately displays all the calculated parameters of the DLC.
Alternative Fuels Data Center: Developing Infrastructure to Charge Plug-In Electric Vehicles
Neutrino oscillation parameter sampling with MonteCUBES
NASA Astrophysics Data System (ADS)
Blennow, Mattias; Fernandez-Martinez, Enrique
2010-01-01
We present MonteCUBES ("Monte Carlo Utility Based Experiment Simulator"), a software package designed to sample the neutrino oscillation parameter space through Markov Chain Monte Carlo algorithms. MonteCUBES makes use of the GLoBES software so that the existing experiment definitions for GLoBES, describing long baseline and reactor experiments, can be used with MonteCUBES. MonteCUBES consists of two main parts: the first is a C library, written as a plug-in for GLoBES, implementing the Markov Chain Monte Carlo algorithm to sample the parameter space. The second part is a user-friendly graphical Matlab interface to easily read, analyze, plot and export the results of the parameter space sampling.
Program summary
Program title: MonteCUBES (Monte Carlo Utility Based Experiment Simulator)
Catalogue identifier: AEFJ_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEFJ_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: GNU General Public Licence
No. of lines in distributed program, including test data, etc.: 69 634
No. of bytes in distributed program, including test data, etc.: 3 980 776
Distribution format: tar.gz
Programming language: C
Computer: MonteCUBES builds and installs on 32 bit and 64 bit Linux systems where GLoBES is installed
Operating system: 32 bit and 64 bit Linux
RAM: Typically a few MBs
Classification: 11.1
External routines: GLoBES [1,2] and routines/libraries used by GLoBES
Subprograms used: Cat Id ADZI_v1_0, Title GLoBES, Reference CPC 177 (2007) 439
Nature of problem: Since neutrino masses do not appear in the standard model of particle physics, many models of neutrino masses also induce other types of new physics, which could affect the outcome of neutrino oscillation experiments. In general, these new physics imply high-dimensional parameter spaces that are difficult to explore using classical methods such as multi-dimensional projections and minimizations, such as those used in GLoBES [1,2].
Solution method: MonteCUBES is written as a plug-in to the GLoBES software [1,2] and provides the necessary methods to perform Markov Chain Monte Carlo sampling of the parameter space. This allows an efficient sampling of the parameter space and has a complexity which does not grow exponentially with the parameter space dimension. The integration of the MonteCUBES package with the GLoBES software makes sure that the experimental definitions already in use by the community can also be used with MonteCUBES, while also lowering the learning threshold for users who already know GLoBES.
Additional comments: A Matlab GUI for interpretation of results is included in the distribution.
Running time: The typical running time varies depending on the dimensionality of the parameter space, the complexity of the experiment, and how well the parameter space should be sampled. The running time for our simulations [3] with 15 free parameters at a Neutrino Factory with O(10) samples varied from a few hours to tens of hours.
References:
[1] P. Huber, M. Lindner, W. Winter, Comput. Phys. Comm. 167 (2005) 195, hep-ph/0407333.
[2] P. Huber, J. Kopp, M. Lindner, M. Rolinec, W. Winter, Comput. Phys. Comm. 177 (2007) 432, hep-ph/0701187.
[3] S. Antusch, M. Blennow, E. Fernandez-Martinez, J. Lopez-Pavon, arXiv:0903.3986 [hep-ph].
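The sampling engine at the core of such a package is a Markov Chain Monte Carlo walk over the oscillation parameters. The generic Metropolis sketch below shows the accept/reject step; MonteCUBES itself is a C plug-in to GLoBES, so the Python form and the Gaussian proposal are illustrative only.

```python
import numpy as np

def metropolis(log_post, theta0, steps=50000, prop_scale=0.1, rng=None):
    """Plain Metropolis sampler over a parameter vector (a generic sketch
    of the MCMC idea, not MonteCUBES's implementation).
    log_post(theta): log of the posterior density up to a constant."""
    rng = rng or np.random.default_rng(1)
    theta = np.asarray(theta0, dtype=float)
    lp = log_post(theta)
    chain = []
    for _ in range(steps):
        cand = theta + prop_scale * rng.standard_normal(theta.size)
        lp_cand = log_post(cand)
        # Accept with probability min(1, posterior ratio).
        if np.log(rng.random()) < lp_cand - lp:
            theta, lp = cand, lp_cand
        chain.append(theta.copy())
    return np.array(chain)
```

Because the chain only ever evaluates the posterior locally, the cost per step does not grow exponentially with dimension, which is the property the abstract emphasizes for high-dimensional new-physics parameter spaces.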
Analysis of counting data: Development of the SATLAS Python package
NASA Astrophysics Data System (ADS)
Gins, W.; de Groote, R. P.; Bissell, M. L.; Granados Buitrago, C.; Ferrer, R.; Lynch, K. M.; Neyens, G.; Sels, S.
2018-01-01
For the analysis of low-statistics counting experiments, a traditional nonlinear least squares minimization routine may not always provide correct parameter and uncertainty estimates due to the assumptions inherent in the algorithm(s). In response to this, a user-friendly Python package (SATLAS) was written to provide an easy interface between the data and a variety of minimization algorithms which are suited for analyzing low- as well as high-statistics data. The advantage of this package is that it allows the user to define their own model function and then compare different minimization routines to determine the optimal parameter values and their respective (correlated) errors. Experimental validation of the different approaches in the package is done through analysis of hyperfine structure data of 203Fr gathered by the CRIS experiment at ISOLDE, CERN.
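The key point for low-statistics data is to replace least squares with the Poisson likelihood. A minimal sketch of such a fit follows; the model interface and starting values are generic assumptions, not SATLAS's actual API.

```python
import numpy as np
from scipy.optimize import minimize

def fit_counts(model, x, counts, p0):
    """Maximum-likelihood fit for Poisson-distributed counts: minimize the
    negative log-likelihood instead of chi-square (a sketch of the generic
    idea behind low-statistics fitting, not the SATLAS interface).
    model(x, p): expected counts at positions x for parameters p."""
    def nll(p):
        mu = np.clip(model(x, p), 1e-12, None)   # expected counts must be > 0
        return np.sum(mu - counts * np.log(mu))  # Poisson NLL up to a constant
    return minimize(nll, p0, method='Nelder-Mead')
```

For large counts this criterion converges to the chi-square result, but for bins with few or zero counts it avoids the bias that a Gaussian error assumption introduces, which is exactly the regime the abstract targets.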
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kreutzer, Cory J.; Rugh, John; Tomerlin, Jeff
Increased market penetration of electric drive vehicles (EDVs) requires overcoming a number of hurdles, including limited vehicle range and the elevated cost in comparison to conventional vehicles. Climate control loads have a significant impact on range, cutting it by over 50% in both cooling and heating conditions. To minimize the impact of climate control on EDV range, the National Renewable Energy Laboratory has partnered with Hyundai America and key industry partners to quantify the performance of thermal load reduction technologies on a Hyundai Sonata plug-in hybrid electric vehicle. Technologies that impact vehicle cabin heating in cold weather conditions and cabin cooling in warm weather conditions were evaluated. Tests included thermal transient and steady-state periods for all technologies, including the development of a new test methodology to evaluate the performance of occupant thermal conditioning. Heated surfaces demonstrated significant reductions in steady-state heating energy use, in the range of 29%-59%. Solar control glass packages demonstrated significant reductions in energy use for both transient and steady-state cooling, with up to a 42% reduction in transient and a 12.8% reduction in steady-state energy use for the packages evaluated. Technologies that demonstrated significant climate control load reduction were selected for incorporation into a complete thermal load reduction package. The complete package is set to be evaluated in the second phase of the ongoing project.
Intelligent emission-sensitive routing for plugin hybrid electric vehicles.
Sun, Zhonghao; Zhou, Xingshe
2016-01-01
The existing transportation sector creates heavy environmental impacts and is a prime cause of current climate change. The need to reduce emissions from this sector has stimulated efforts to speed up the application of electric vehicles (EVs). A subset of EVs, called plug-in hybrid electric vehicles (PHEVs), back up their batteries with a combustion engine, which gives PHEVs a driving range comparable to that of conventional vehicles. However, this hybridization comes at the cost of higher emissions than all-electric vehicles. This paper studies the routing problem for PHEVs to minimize emissions. Existing shortest-path based algorithms cannot be applied to this problem because of several new challenges: (1) an optimal route may contain circles caused by detours for recharging; (2) emissions of PHEVs depend not only on the driving distance, but also on the terrain and the state of charge (SOC) of the batteries; (3) batteries can harvest energy through regenerative braking, which gives some road segments negative energy consumption. To address these challenges, this paper proposes a green navigation algorithm (GNA) which finds the optimal strategies: where to go and where to recharge. GNA discretizes the SOC, which makes the PHEV routing problem satisfy the principle of optimality. Finally, GNA adopts dynamic programming to solve the problem. We evaluate GNA using synthetic maps generated by Delaunay triangulation. The results show that GNA can save more than 10% energy and reduce emissions by 10% compared to the shortest-path algorithm. We also observe that PHEVs with a battery capacity of 10-15 kWh detour the most, and hardly detour at all when the capacity exceeds 30 kWh. This observation gives some insights for developing PHEVs.
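Discretizing the SOC restores the principle of optimality because the search state becomes the pair (node, SOC level), so a route that loops back through a charging station is no longer a cycle in the expanded state graph. Below is a compact sketch of this state expansion, assuming non-negative edge emissions so a Dijkstra-style label-setting search applies; the graph layout and quantization are illustrative assumptions, not GNA's code.

```python
import heapq

def green_route(graph, start, goal, soc0_kwh, battery_kwh=12.0, levels=24):
    """Minimum-emission routing over (node, discretized SOC) states.
    graph: {node: [(neighbour, emission_g, delta_soc_kwh), ...]}, where
    delta_soc_kwh < 0 means consumption and > 0 means regeneration or
    recharging. Emissions are assumed non-negative (sketch only)."""
    step = battery_kwh / levels
    quant = lambda kwh: int(round(min(max(kwh, 0.0), battery_kwh) / step))
    heap = [(0.0, start, quant(soc0_kwh))]
    done = set()
    while heap:
        cost, node, q = heapq.heappop(heap)
        if node == goal:
            return cost                     # minimum total emissions
        if (node, q) in done:
            continue
        done.add((node, q))
        for nxt, emis, dsoc in graph.get(node, []):
            kwh = q * step + dsoc
            if kwh < 0:                     # would deplete the battery
                continue
            heapq.heappush(heap, (cost + emis, nxt, quant(kwh)))
    return None                             # goal unreachable
```

Finer SOC grids trade runtime for accuracy; the state space grows only linearly in the number of levels, which is what makes the dynamic-programming formulation tractable.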
FERN - a Java framework for stochastic simulation and evaluation of reaction networks.
Erhard, Florian; Friedel, Caroline C; Zimmer, Ralf
2008-08-29
Stochastic simulation can be used to illustrate the development of biological systems over time and the stochastic nature of these processes. Currently available programs for stochastic simulation, however, are limited in that they either (a) do not provide the most efficient simulation algorithms and are difficult to extend, (b) cannot be easily integrated into other applications, or (c) do not allow the user to monitor and intervene in the simulation process in an easy and intuitive way. Thus, in order to use stochastic simulation in innovative high-level modeling and analysis approaches, more flexible tools are necessary. In this article, we present FERN (Framework for Evaluation of Reaction Networks), a Java framework for the efficient simulation of chemical reaction networks. FERN is subdivided into three layers for network representation, simulation and visualization of the simulation results, each of which can be easily extended. It provides efficient and accurate state-of-the-art stochastic simulation algorithms for well-mixed chemical systems and a powerful observer system, which makes it possible to track and control the simulation progress on every level. To illustrate how FERN can be easily integrated into other systems biology applications, plugins to Cytoscape and CellDesigner are included. These plugins make it possible to run simulations and to observe the simulation progress in a reaction network in real-time from within the Cytoscape or CellDesigner environment. FERN addresses shortcomings of currently available stochastic simulation programs in several ways. First, it provides a broad range of efficient and accurate algorithms for both exact and approximate stochastic simulation, and a simple interface for extending to new algorithms. FERN's implementations are considerably faster than the C implementations of gillespie2 or the Java implementations of ISBJava. Second, it can be used in a straightforward way both as a stand-alone program and within new systems biology applications. Finally, complex scenarios requiring intervention during the simulation progress can be modelled easily with FERN.
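The exact-simulation workhorse in such frameworks is Gillespie's direct method: draw an exponential waiting time from the total propensity, pick the firing reaction proportionally to its propensity, apply its stoichiometric change, and repeat. The Python sketch below shows the classic algorithm generically; FERN itself is Java and offers several exact and approximate variants, so the interfaces here are illustrative.

```python
import numpy as np

def gillespie(x0, stoich, rates, t_max, rng=None):
    """Gillespie's direct-method stochastic simulation algorithm (generic
    sketch). stoich: (n_reactions, n_species) state-change matrix;
    rates(x): propensity vector for state x."""
    rng = rng or np.random.default_rng(42)
    t, x = 0.0, np.array(x0, dtype=float)
    times, states = [t], [x.copy()]
    while t < t_max:
        a = rates(x)
        a0 = a.sum()
        if a0 <= 0:
            break                              # no reaction can fire
        t += rng.exponential(1.0 / a0)         # time to the next reaction
        r = rng.choice(len(a), p=a / a0)       # which reaction fires
        x += stoich[r]                         # apply the state change
        times.append(t)
        states.append(x.copy())
    return np.array(times), np.array(states)
```

An observer system like FERN's would simply be a callback invoked inside the loop, which is what allows monitoring and intervention at every simulation step.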
Optimal segmentation and packaging process
Kostelnik, Kevin M.; Meservey, Richard H.; Landon, Mark D.
1999-01-01
A process for improving packaging efficiency uses three-dimensional, computer-simulated models with various optimization algorithms to determine the optimal segmentation process and packaging configurations based on constraints including container limitations. The present invention is applied to a process for decontaminating, decommissioning (D&D), and remediating a nuclear facility involving the segmentation and packaging of contaminated items in waste containers in order to minimize the number of cuts, maximize packaging density, and reduce worker radiation exposure. A three-dimensional, computer-simulated facility model of the contaminated items is created. The contaminated items are differentiated. The optimal location, orientation and sequence of the segmentation and packaging of the contaminated items is determined using the simulated model, the algorithms, and various constraints including container limitations. The cut locations and orientations are transposed to the simulated model. The contaminated items are then actually segmented and packaged. The segmentation and packaging may be simulated beforehand. In addition, the contaminated items may be cataloged and recorded.
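For the packaging side of such a problem, a classic baseline is the first-fit-decreasing heuristic shown below. It is a generic illustration of container-limited packing, not the patented segmentation-and-packaging process.

```python
def first_fit_decreasing(items, capacity):
    """First-fit-decreasing bin packing: sort item sizes descending and put
    each into the first container with enough remaining capacity
    (a generic packing baseline, not the patented process)."""
    bins = []  # each bin: [remaining_capacity, [item_sizes]]
    for size in sorted(items, reverse=True):
        for b in bins:
            if b[0] >= size:
                b[0] -= size
                b[1].append(size)
                break
        else:
            bins.append([capacity - size, [size]])  # open a new container
    return [contents for _, contents in bins]

# Example: pack segments of given volumes into containers of capacity 10.
print(first_fit_decreasing([7, 5, 4, 3, 2, 2], capacity=10))
```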
Tests of a Semi-Analytical Case 1 and Gelbstoff Case 2 SeaWiFS Algorithm with a Global Data Set
NASA Technical Reports Server (NTRS)
Carder, Kendall L.; Hawes, Steve K.; Lee, Zhongping
1997-01-01
A semi-analytical algorithm was tested with a total of 733 points of either unpackaged or packaged-pigment data, with corresponding algorithm parameters for each data type. The 'unpackaged' type consisted of data sets that were generally consistent with the Case 1 CZCS algorithm and other well-calibrated data sets. The 'packaged' type consisted of data sets apparently containing somewhat more packaged pigments, requiring modification of the absorption parameters of the model consistent with the CalCOFI study area. This resulted in two equally divided data sets. A more thorough scrutiny of these and other data sets using a semi-analytical model requires improved knowledge of the phytoplankton and gelbstoff of the specific environment studied. Since the semi-analytical algorithm is dependent upon 4 spectral channels including the 412 nm channel, while most other algorithms are not, a means of testing data sets for consistency was sought. A numerical filter was developed to classify data sets into the above classes. The filter uses reflectance ratios, which can be determined from space. The sensitivity of such numerical filters to measurement errors resulting from atmospheric correction and sensor noise requires further study. The semi-analytical algorithm performed superbly on each of the data sets after classification, resulting in RMS1 errors of 0.107 and 0.121, respectively, for the unpackaged and packaged data-set classes, with little bias and slopes near 1.0. In combination, the RMS1 performance was 0.114. While these numbers appear rather sterling, one must bear in mind what misclassification does to the results. Using an average or compromise parameterization on the modified global data set yielded an RMS1 error of 0.171, while using the unpackaged parameterization on the global evaluation data set yielded an RMS1 error of 0.284. So, without classification, the algorithm performs better globally using the average parameters than it does using the unpackaged parameters. Finally, the effects of even more extreme pigment packaging must be examined in order to improve algorithm performance at high latitudes. Note, however, that the North Sea and Mississippi River plume studies contributed data to the packaged and unpackaged classes, respectively, with little effect on algorithm performance. This suggests that gelbstoff-rich Case 2 waters do not seriously degrade the performance of the semi-analytical algorithm.
The software and algorithms for hyperspectral data processing
NASA Astrophysics Data System (ADS)
Shyrayeva, Anhelina; Martinov, Anton; Ivanov, Victor; Katkovsky, Leonid
2017-04-01
Hyperspectral remote sensing is widely used for collecting and processing information about the Earth's surface objects. Hyperspectral data are combined to form a three-dimensional (x, y, λ) data cube. The Department of Aerospace Research of the Institute of Applied Physical Problems of the Belarusian State University presents a general model of software for hyperspectral image data analysis and processing. The software runs in a Windows XP/7/8/8.1/10 environment on any personal computer. The complex has been written in C++ using the Qt framework and OpenGL for graphical data visualization. The software has a flexible structure that consists of a set of independent plugins. Each plugin is compiled as a Qt plugin and represents a Windows dynamic library (dll). Plugins can be categorized in terms of data reading types, data visualization (3D, 2D, 1D) and data processing. The software has various built-in functions for statistical and mathematical analysis and signal processing, such as a direct smoothing function for moving averages, the Savitzky-Golay smoothing technique, RGB correction, histogram transformation, and atmospheric correction. The software provides two of the authors' engineering techniques for the solution of the atmospheric correction problem: an iterative method for refining spectral albedo parameters using Libradtran, and an analytical least-squares method. The main advantages of these methods are a high processing rate (several minutes for 1 GB of data) and a low relative error in albedo retrieval (less than 15%). The software also supports work with spectral libraries, region of interest (ROI) selection, and spectral analysis such as cluster-type image classification and automatic comparison of hypercube spectra against similar ones from spectral libraries by a similarity criterion, and vice versa. The software deals with different kinds of spectral information in order to identify and distinguish spectrally unique materials. The following advantages should also be noted: fast and low-memory hypercube manipulation features, a user-friendly interface, modularity, and expandability.
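Two of the listed smoothing options, the moving average and the Savitzky-Golay filter, are readily illustrated on a single pixel's spectrum. The snippet uses SciPy for demonstration, with random data as a stand-in for a real (rows, columns, bands) hypercube; all array shapes and filter parameters are illustrative.

```python
import numpy as np
from scipy.signal import savgol_filter

# Placeholder hypercube: 64 x 64 pixels, 200 spectral bands.
cube = np.random.rand(64, 64, 200)
spectrum = cube[10, 20, :]                  # one pixel's spectrum

# Savitzky-Golay smoothing: fit a cubic in an 11-band sliding window.
smooth_sg = savgol_filter(spectrum, window_length=11, polyorder=3)

# Simple moving average over 5 bands, for comparison.
smooth_ma = np.convolve(spectrum, np.ones(5) / 5, mode='same')
```

Savitzky-Golay smoothing is usually preferred for spectra because the local polynomial fit preserves peak heights and widths better than a plain moving average.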
A Highly Functional Decision Paradigm Based on Nonlinear Adaptive Genetic Algorithm
1997-10-07
Subject terms: Network Topology Optimization, Mathlink, Mathematica Plug-In, GA Route Optimizer, DSP. (The remainder of this scanned report abstract is unrecoverable OCR output.)
Alternative Fuels Data Center: Charging Plug-In Electric Vehicles in Public
Plug-In Hybrid Electric Vehicle Basics (NREL)
A plug-in hybrid electric vehicle has a larger battery pack than the one in a standard hybrid electric vehicle, and it can also fuel from its onboard tank, which provides a driving range (the distance a vehicle can travel).
Mustafin, Zakhar Sergeevich; Lashin, Sergey Alexandrovich; Matushkin, Yury Georgievich; Gunbin, Konstantin Vladimirovich; Afonnikov, Dmitry Arkadievich
2017-01-27
There are many available software tools for the visualization and analysis of biological networks. Among them, Cytoscape (http://cytoscape.org/) is one of the most comprehensive packages, with many plugins and applications which extend its functionality by providing analysis of protein-protein interaction, gene regulatory and gene co-expression networks, as well as metabolic, signaling, neural and ecological-type networks including food webs and community networks. Nevertheless, only three plugins tagged 'network evolution' are found in the Cytoscape official app store and in the literature. We have developed a new Cytoscape 3.0 application, Orthoscape, aimed at facilitating the evolutionary analysis of gene networks and visualizing the results. Orthoscape aids in the analysis of evolutionary information available for gene sets and networks by highlighting: (1) the orthology relationships between genes; (2) the evolutionary origin of gene network components; (3) the evolutionary pressure mode (diversifying or stabilizing, negative or positive selection) of orthologous groups in general and/or in branch-oriented mode. The distinctive feature of Orthoscape is the ability to control all data analysis steps via a user-friendly interface. Orthoscape allows its users to analyze gene networks or separate gene sets in the context of evolution. At each step of data analysis, Orthoscape also provides convenient visualization and data manipulation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stevens, K; Huang, T; Buttler, D
We present the C-Cat Wordnet package, an open source library for using and modifying Wordnet. The package includes four key features: an API for modifying Synsets; implementations of standard similarity metrics; implementations of well-known Word Sense Disambiguation algorithms; and an implementation of the Castanet algorithm. The library is easily extendible and usable in many runtime environments. We demonstrate its use on two standard Word Sense Disambiguation tasks and apply the Castanet algorithm to a corpus.
One-dimensional swarm algorithm packaging
NASA Astrophysics Data System (ADS)
Lebedev, Boris K.; Lebedev, Oleg B.; Lebedeva, Ekaterina O.
2018-05-01
The paper considers an algorithm for solving the one-dimensional packing problem based on an adaptive behavior model of an ant colony. The key choice in the development of the ant algorithm is the representation (interpretation) of the solution. The structure of the solution search graph, the procedure for finding solutions on the graph, and the methods of deposition and evaporation of pheromone are described. Unlike the canonical ant-algorithm paradigm, an ant on the solution search graph generates sets of elements distributed across blocks. Experimental studies were conducted on an IBM PC. Compared with existing algorithms, the results are improved.
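As background for readers unfamiliar with the paradigm, the sketch below shows a generic ant-colony construction for one-dimensional packing. It is illustrative only: the paper's own graph representation (ants generating sets of elements distributed across blocks) differs, and the parameters alpha, rho, n_ants, and n_iter are hypothetical choices.

```python
# Generic ant-colony sketch for one-dimensional packing (illustrative only;
# the paper's graph representation differs). tau[i][j] is the pheromone on
# the pair of items i and j.
import random

def ant_pack(sizes, capacity, tau, alpha=1.0):
    """One ant constructs a complete packing, bin by bin."""
    remaining = list(range(len(sizes)))
    random.shuffle(remaining)
    bins = []
    while remaining:
        seed = remaining.pop(0)              # every bin starts with one item
        load, content = sizes[seed], [seed]
        for i in list(remaining):
            if load + sizes[i] <= capacity:
                # Attractiveness: mean pheromone between i and the bin's items.
                attract = (sum(tau[i][j] for j in content) / len(content)) ** alpha
                if random.random() < attract / (attract + 1.0):
                    content.append(i)
                    load += sizes[i]
                    remaining.remove(i)
        bins.append(content)
    return bins

def aco_pack(sizes, capacity, n_ants=20, n_iter=50, rho=0.1):
    n = len(sizes)
    tau = [[1.0] * n for _ in range(n)]
    best = None
    for _ in range(n_iter):
        for _ in range(n_ants):
            bins = ant_pack(sizes, capacity, tau)
            if best is None or len(bins) < len(best):
                best = bins
        for row in tau:                       # pheromone evaporation
            for j in range(n):
                row[j] *= 1.0 - rho
        for content in best:                  # deposit on co-packed item pairs
            for i in content:
                for j in content:
                    if i != j:
                        tau[i][j] += 1.0 / len(best)
    return best

print(aco_pack([0.5, 0.7, 0.5, 0.2, 0.4, 0.2, 0.5], capacity=1.0))
```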
A method for feature selection of APT samples based on entropy
NASA Astrophysics Data System (ADS)
Du, Zhenyu; Li, Yihong; Hu, Jinsong
2018-05-01
By deeply studying known APT attack events, this paper proposes a feature selection method for APT samples and a logic expression generation algorithm, IOCG (Indicator of Compromise Generate). The algorithm automatically generates machine-readable IOCs (Indicators of Compromise), overcoming the limitations of existing IOCs, whose logical relationships are fixed, whose number of logical items cannot change, and which are large in scale and cannot be generated from samples. At the same time, it reduces the processing time wasted on redundant and useless APT samples, improves the sharing rate of analysis information, and supports an active response to a complex and volatile APT attack situation. The samples were divided into an experimental set and a training set, and the algorithm was used to generate logical expressions for the training set with the IOC_Aware plug-in; the generated expressions were compared both in form and in detection results. The experimental results show that the algorithm is effective and can improve detection.
Advantages of GPU technology in DFT calculations of intercalated graphene
NASA Astrophysics Data System (ADS)
Pešić, J.; Gajić, R.
2014-09-01
Over the past few years, the expansion of general-purpose graphics-processing unit (GPGPU) technology has had a great impact on computational science. GPGPU is the utilization of a graphics-processing unit (GPU) to perform calculations in applications usually handled by the central processing unit (CPU). Use of GPGPUs as a way to increase computational power in the material sciences has significantly decreased computational costs in already highly demanding calculations. The level of acceleration and parallelization depends on the problem itself. Some problems can benefit from GPU acceleration and parallelization, such as the finite-difference time-domain (FDTD) algorithm and density-functional theory (DFT), while others cannot take advantage of these modern technologies. A number of GPU-supported applications have emerged in the past several years (www.nvidia.com/object/gpu-applications.html). Quantum ESPRESSO (QE) is an integrated suite of open source computer codes for electronic-structure calculations and materials modeling at the nano-scale, based on DFT, a plane-wave basis, and a pseudopotential approach. Since version 5.0, GPU support has been implemented as a plug-in component for the standard QE packages, allowing the capabilities of Nvidia GPU graphics cards to be exploited (www.qe-forge.org/gf/proj). In this study, we have examined the impact of GPU acceleration and parallelization on the numerical performance of DFT calculations. Graphene has been attracting attention worldwide and has already shown some remarkable properties. We have studied intercalated graphene using the QE package PHonon, which employs the GPU. The term 'intercalation' refers to a process whereby foreign adatoms are inserted onto a graphene lattice. In addition, by intercalating different atoms between graphene layers, it is possible to tune their physical properties. Our experiments have shown there are benefits from using GPUs, and we reached an acceleration of several times compared to standard CPU calculations.
Danaci, Hasan Fehmi; Cetin-Atalay, Rengul; Atalay, Volkan
2018-03-26
Visualizing large-scale data produced by high-throughput experiments as a biological graph leads to better understanding and analysis. This study describes a customized force-directed layout algorithm, EClerize, for biological graphs that represent pathways in which the nodes are associated with Enzyme Commission (EC) attributes. Nodes with the same EC class numbers are treated as members of the same cluster. Positions of nodes are then determined based on both biological similarity and connection structure. EClerize minimizes the intra-cluster distance, that is, the distance between nodes of the same EC cluster, and maximizes the inter-cluster distance, that is, the distance between two distinct EC clusters. EClerize is tested on a number of biological pathways and the improvement over the original algorithm is presented. EClerize is available as a plug-in to Cytoscape ( http://apps.cytoscape.org/apps/eclerize ).
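The general idea of a cluster-aware force-directed layout can be sketched as standard repulsion and spring forces plus an extra attraction between nodes sharing an EC class. The following is illustrative only, not the EClerize implementation:

```python
# Sketch of a cluster-aware force-directed layout: Fruchterman-Reingold-style
# repulsion and edge springs, plus an extra pull between nodes that share an
# EC class. Illustrative of the general idea, not the EClerize implementation.
import numpy as np

def cluster_layout(adj, ec_class, n_iter=200, k=0.1, cluster_pull=0.05):
    n = len(adj)
    pos = np.random.default_rng(1).standard_normal((n, 2))
    for _ in range(n_iter):
        disp = np.zeros((n, 2))
        for i in range(n):
            delta = pos[i] - pos
            dist = np.linalg.norm(delta, axis=1) + 1e-9
            # All-pairs repulsion.
            disp[i] += (delta / dist[:, None] * (k * k / dist)[:, None]).sum(axis=0)
            for j in range(n):
                if adj[i][j]:                          # spring force along edges
                    disp[i] -= delta[j] / dist[j] * (dist[j] ** 2 / k)
                if i != j and ec_class[i] == ec_class[j]:
                    disp[i] -= delta[j] * cluster_pull  # intra-EC-cluster pull
        step = np.linalg.norm(disp, axis=1, keepdims=True) + 1e-9
        pos += 0.01 * disp / step
    return pos

adj = [[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]]
print(cluster_layout(adj, ec_class=["1.1", "1.1", "2.7", "2.7"]))
```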
ePMV embeds molecular modeling into professional animation software environments.
Johnson, Graham T; Autin, Ludovic; Goodsell, David S; Sanner, Michel F; Olson, Arthur J
2011-03-09
Increasingly complex research has made it more difficult to prepare data for publication, education, and outreach. Many scientists must also wade through black-box code to interface computational algorithms from diverse sources to supplement their bench work. To reduce these barriers we have developed an open-source plug-in, embedded Python Molecular Viewer (ePMV), that runs molecular modeling software directly inside of professional 3D animation applications (hosts) to provide simultaneous access to the capabilities of these newly connected systems. Uniting host and scientific algorithms into a single interface allows users from varied backgrounds to assemble professional quality visuals and to perform computational experiments with relative ease. By enabling easy exchange of algorithms, ePMV can facilitate interdisciplinary research, smooth communication between broadly diverse specialties, and provide a common platform to frame and visualize the increasingly detailed intersection(s) of cellular and molecular biology. Copyright © 2011 Elsevier Ltd. All rights reserved.
Composing problem solvers for simulation experimentation: a case study on steady state estimation.
Leye, Stefan; Ewald, Roland; Uhrmacher, Adelinde M
2014-01-01
Simulation experiments involve various sub-tasks, e.g., parameter optimization, simulation execution, or output data analysis. Many algorithms can be applied to such tasks, but their performance depends on the given problem. Steady state estimation in systems biology is a typical example for this: several estimators have been proposed, each with its own (dis-)advantages. Experimenters, therefore, must choose from the available options, even though they may not be aware of the consequences. To support those users, we propose a general scheme to aggregate such algorithms into so-called synthetic problem solvers, which exploit algorithm differences to improve overall performance. Our approach subsumes various aggregation mechanisms, supports automatic configuration from training data (e.g., via ensemble learning or portfolio selection), and extends the plugin system of the open source modeling and simulation framework James II. We show the benefits of our approach by applying it to steady state estimation for cell-biological models.
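A minimal sketch of the aggregation idea follows, assuming a simple per-problem selection policy. All names and the feature function are hypothetical; this is not the James II plugin API.

```python
# Minimal "synthetic problem solver" sketch: wrap several candidate
# algorithms and pick one per problem from recorded training performance.

def make_portfolio_solver(candidates, training_runs, features):
    """candidates: {name: callable}; training_runs: list of
    (problem, candidate_name, observed_error) tuples."""
    def solve(problem):
        key = features(problem)
        scores = {}
        for prob, name, err in training_runs:
            if features(prob) == key:            # problems "similar" to this one
                scores.setdefault(name, []).append(err)
        if scores:                               # lowest mean training error wins
            best = min(scores, key=lambda n: sum(scores[n]) / len(scores[n]))
        else:                                    # no data: fall back to a default
            best = next(iter(candidates))
        return candidates[best](problem)
    return solve

# Usage: two toy "steady state estimators" scored on past runs.
candidates = {"mser": lambda p: "mser-result", "schruben": lambda p: "schruben-result"}
training = [("stiff", "mser", 0.9), ("stiff", "schruben", 0.2)]
solver = make_portfolio_solver(candidates, training, features=lambda p: p)
print(solver("stiff"))   # -> schruben-result
```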
Alternative Fuels Data Center: Los Angeles Sets the Stage for Plug-In Electric Vehicles
Electric and Plug-In Hybrid Electric Fleet Vehicle Testing | Transportation Research | NREL
How Electric and Plug-In Hybrid Electric Vehicles Work: EVs use batteries to store the electric energy that powers the motor.
Gas flow calculation method of a ramjet engine
NASA Astrophysics Data System (ADS)
Kostyushin, Kirill; Kagenov, Anuar; Eremin, Ivan; Zhiltsov, Konstantin; Shuvarikov, Vladimir
2017-11-01
This study presents a calculation methodology for the gas dynamics equations in a ramjet engine. The algorithm is based on Godunov's scheme. To realize the calculation algorithm, a data storage scheme is proposed that does not depend on mesh topology and allows the use of computational meshes with an arbitrary number of cell faces. An algorithm for building a block-structured grid is given. The calculation algorithm is implemented in the software package "FlashFlow". The package is verified on calculations of simple air intake configurations and scramjet models.
Optimal segmentation and packaging process
Kostelnik, K.M.; Meservey, R.H.; Landon, M.D.
1999-08-10
A process for improving packaging efficiency uses three-dimensional, computer-simulated models with various optimization algorithms to determine the optimal segmentation process and packaging configurations based on constraints including container limitations. The present invention is applied to a process for decontaminating, decommissioning (D and D), and remediating a nuclear facility involving the segmentation and packaging of contaminated items in waste containers in order to minimize the number of cuts, maximize packaging density, and reduce worker radiation exposure. A three-dimensional, computer-simulated facility model of the contaminated items is created. The contaminated items are differentiated. The optimal location, orientation, and sequence of the segmentation and packaging of the contaminated items are determined using the simulated model, the algorithms, and various constraints including container limitations. The cut locations and orientations are transposed to the simulated model. The contaminated items are then actually segmented and packaged. The segmentation and packaging may be simulated beforehand. In addition, the contaminated items may be cataloged and recorded. 3 figs.
Untwisting the Caenorhabditis elegans embryo.
Christensen, Ryan Patrick; Bokinsky, Alexandra; Santella, Anthony; Wu, Yicong; Marquina-Solis, Javier; Guo, Min; Kovacevic, Ismar; Kumar, Abhishek; Winter, Peter W; Tashakkori, Nicole; McCreedy, Evan; Liu, Huafeng; McAuliffe, Matthew; Mohler, William; Colón-Ramos, Daniel A; Bao, Zhirong; Shroff, Hari
2015-12-03
The nematode Caenorhabditis elegans possesses a simple embryonic nervous system with few enough neurons that the growth of each cell could be followed to provide a systems-level view of development. However, studies of single cell development have largely been conducted in fixed or pre-twitching live embryos, because of technical difficulties associated with embryo movement in late embryogenesis. We present open-source untwisting and annotation software (http://mipav.cit.nih.gov/plugin_jws/mipav_worm_plugin.php) that allows the investigation of neurodevelopmental events in late embryogenesis and apply it to track the 3D positions of seam cell nuclei, neurons, and neurites in multiple elongating embryos. We also provide a tutorial describing how to use the software (Supplementary file 1) and a detailed description of the untwisting algorithm (Appendix). The detailed positional information we obtained enabled us to develop a composite model showing movement of these cells and neurites in an 'average' worm embryo. The untwisting and cell tracking capabilities of our method provide a foundation on which to catalog C. elegans neurodevelopment, allowing interrogation of developmental events in previously inaccessible periods of embryogenesis.
Cognitive Foundry v. 3.0 (OSS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Basilico, Justin; Dixon, Kevin; McClain, Jonathan
2009-11-18
The Cognitive Foundry is a unified collection of tools designed for research and applications that use cognitive modeling, machine learning, or pattern recognition. The software library contains design patterns, interface definitions, and default implementations of reusable software components and algorithms designed to support a wide variety of research and development needs. The library contains three main software packages: the Common package that contains basic utilities and linear algebraic methods, the Cognitive Framework package that contains tools to assist in implementing and analyzing theories of cognition, and the Machine Learning package that provides general algorithms and methods for populating Cognitive Framework components from domain-relevant data.
Battery algorithm verification and development using hardware-in-the-loop testing
NASA Astrophysics Data System (ADS)
He, Yongsheng; Liu, Wei; Koch, Brian J.
Battery algorithms play a vital role in hybrid electric vehicles (HEVs), plug-in hybrid electric vehicles (PHEVs), extended-range electric vehicles (EREVs), and electric vehicles (EVs). The energy management of hybrid and electric propulsion systems needs to rely on accurate information on the state of the battery in order to determine the optimal electric drive without abusing the battery. In this study, a cell-level hardware-in-the-loop (HIL) system is used to verify and develop state of charge (SOC) and power capability predictions of embedded battery algorithms for various vehicle applications. Two different batteries were selected as representative examples to illustrate the battery algorithm verification and development procedure. One is a lithium-ion battery with a conventional metal oxide cathode, a power battery for HEV applications. The other is a lithium-ion battery with an iron phosphate (LiFePO4) cathode, an energy battery for applications in PHEVs, EREVs, and EVs. The battery cell HIL testing provided valuable data and critical guidance to evaluate the accuracy of the developed battery algorithms, to accelerate future battery algorithm development and improvement, and to reduce hybrid/electric vehicle system development time and costs.
A clustering algorithm for determining community structure in complex networks
NASA Astrophysics Data System (ADS)
Jin, Hong; Yu, Wei; Li, ShiJun
2018-02-01
Clustering algorithms are attractive for the task of community detection in complex networks. DENCLUE is a representative density-based clustering algorithm with a firm mathematical basis and good clustering properties, allowing for arbitrarily shaped clusters in high-dimensional datasets. However, this method cannot be directly applied to community discovery due to its inability to deal with network data. Moreover, it requires a careful selection of the density parameter and the noise threshold. To solve these issues, a new community detection method is proposed in this paper. First, we use a spectral analysis technique to map the network data into a low-dimensional Euclidean space which preserves node structural characteristics. Then, DENCLUE is applied to detect the communities in the network. A mathematical method named the Sheather-Jones plug-in is chosen to select the density parameter, which can describe the intrinsic clustering structure accurately. Moreover, every node in the network is meaningful, so there are no noise nodes and the noise threshold can be ignored. We test our algorithm on both benchmark and real-life networks, and the results demonstrate the effectiveness of our algorithm compared with other popular density-based clustering algorithms adapted to community detection.
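The first step described above, mapping the network into a low-dimensional Euclidean space by spectral analysis, can be sketched generically as follows (a minimal sketch using the normalized graph Laplacian; the paper's exact construction may differ):

```python
# Minimal spectral embedding sketch: eigenvectors of the normalized graph
# Laplacian give low-dimensional coordinates that preserve structure.
# Density-based clustering (e.g. DENCLUE) would then run on Y.
import numpy as np

def spectral_embedding(A, dim=2):
    d = A.sum(axis=1)
    d_inv_sqrt = np.zeros_like(d)
    d_inv_sqrt[d > 0] = d[d > 0] ** -0.5
    L = np.eye(len(A)) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]
    vals, vecs = np.linalg.eigh(L)        # eigenvalues in ascending order
    return vecs[:, 1:dim + 1]             # skip the trivial constant eigenvector

A = np.array([[0, 1, 1, 0, 0],
              [1, 0, 1, 0, 0],
              [1, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [0, 0, 0, 1, 0]], dtype=float)
Y = spectral_embedding(A)                 # node coordinates for clustering
print(Y)
```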
Solernou, Albert; Hanson, Benjamin S; Richardson, Robin A; Welch, Robert; Read, Daniel J; Harlen, Oliver G; Harris, Sarah A
2018-03-01
Fluctuating Finite Element Analysis (FFEA) is a software package designed to perform continuum mechanics simulations of proteins and other globular macromolecules. It combines conventional finite element methods with stochastic thermal noise, and is appropriate for simulations of large proteins and protein complexes at the mesoscale (length-scales in the range of 5 nm to 1 μm), where there is currently a paucity of modelling tools. It requires 3D volumetric information as input, which can be low resolution structural information such as cryo-electron tomography (cryo-ET) maps or much higher resolution atomistic co-ordinates from which volumetric information can be extracted. In this article we introduce our open source software package for performing FFEA simulations which we have released under a GPLv3 license. The software package includes a C++ implementation of FFEA, together with tools to assist the user to set up the system from Electron Microscopy Data Bank (EMDB) or Protein Data Bank (PDB) data files. We also provide a PyMOL plugin to perform basic visualisation and additional Python tools for the analysis of FFEA simulation trajectories. This manuscript provides a basic background to the FFEA method, describing the implementation of the core mechanical model and how intermolecular interactions and the solvent environment are included within this framework. We provide prospective FFEA users with a practical overview of how to set up an FFEA simulation with reference to our publicly available online tutorials and manuals that accompany this first release of the package.
pubmed.mineR: an R package with text-mining algorithms to analyse PubMed abstracts.
Rani, Jyoti; Shah, A B Rauf; Ramachandran, Srinivasan
2015-10-01
The PubMed literature database is a valuable source of information for scientific research. It is rich in biomedical literature, with more than 24 million citations. Data-mining of such voluminous literature is a challenging task. Although several text-mining algorithms have been developed in recent years with a focus on data visualization, they have limitations: they are slow, rigid, and not available as open source. We have developed an R package, pubmed.mineR, wherein we have combined the advantages of existing algorithms, overcome their limitations, and offer user flexibility and links with other packages in Bioconductor and the Comprehensive R Archive Network (CRAN) in order to expand the user capabilities for executing multifaceted approaches. Three case studies are presented, namely 'Evolving role of diabetes educators', 'Cancer risk assessment' and 'Dynamic concepts on disease and comorbidity', to illustrate the use of pubmed.mineR. The package generally runs fast, with small elapsed times on regular workstations, even on large corpora and with compute-intensive functions. pubmed.mineR is available at http://cran.r-project.org/web/packages/pubmed.mineR.
GenIce: Hydrogen-Disordered Ice Generator.
Matsumoto, Masakazu; Yagasaki, Takuma; Tanaka, Hideki
2018-01-05
GenIce is an efficient and user-friendly tool to generate hydrogen-disordered ice structures. It makes ice and clathrate hydrate structures in various file formats. More than 100 kinds of structures are preset. Users can install their own crystal structures, guest molecules, and file formats as plugins. The algorithm certifies that the generated structures are completely randomized hydrogen-disordered networks obeying the ice rule with zero net polarization. © 2017 The Authors. Journal of Computational Chemistry Published by Wiley Periodicals, Inc.
Smart procurement of naturally generated energy (SPONGE) for PHEVs
NASA Astrophysics Data System (ADS)
Gu, Yingqi; Häusler, Florian; Griggs, Wynita; Crisostomi, Emanuele; Shorten, Robert
2016-07-01
In this paper, we propose a new engine management system for hybrid vehicles to enable energy providers and car manufacturers to provide new services. Energy forecasts are used to collaboratively orchestrate the behaviour of the engine management systems of a fleet of plug-in hybrid electric vehicles (PHEVs) to absorb oncoming energy in a smart manner. Cooperative algorithms are suggested to manage the energy absorption optimally for a fleet of vehicles, and the mobility simulator SUMO (Simulation of Urban MObility) is used to demonstrate the efficacy of the proposed idea.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spotz, William F.
PyTrilinos is a set of Python interfaces to compiled Trilinos packages. This collection supports serial and parallel dense linear algebra, serial and parallel sparse linear algebra, direct and iterative linear solution techniques, algebraic and multilevel preconditioners, nonlinear solvers and continuation algorithms, eigensolvers and partitioning algorithms. Also included are a variety of related utility functions and classes, including distributed I/O, coloring algorithms and matrix generation. PyTrilinos vector objects are compatible with the popular NumPy Python package. As a Python front end to compiled libraries, PyTrilinos takes advantage of the flexibility and ease of use of Python, and the efficiency of the underlying C++, C and Fortran numerical kernels. This paper covers recent, previously unpublished advances in the PyTrilinos package.
Multi-particle phase space integration with arbitrary set of singularities in CompHEP
NASA Astrophysics Data System (ADS)
Kovalenko, D. N.; Pukhov, A. E.
1997-02-01
We describe an algorithm for multi-particle phase space integration for collision and decay processes, realized in the CompHEP package version 3.2. In the framework of this algorithm it is possible to regularize an arbitrary set of singularities caused by virtual particle propagators. The algorithm is based on the method of recursive representation of kinematics and on the multichannel Monte Carlo approach. The CompHEP package is available on the WWW: http://theory.npi.msu.su/pukhov/comphep.html
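For reference, the multichannel Monte Carlo approach mentioned here has a compact textbook form (generic notation, not CompHEP's): the singular structure of the integrand $g$ is covered by a weighted sum of channel densities,

$$ f(x) = \sum_i \alpha_i f_i(x), \qquad \alpha_i \ge 0, \quad \sum_i \alpha_i = 1, $$

where each $f_i$ is chosen to flatten one propagator singularity. Sampling $x \sim f$ and averaging the weight $g(x)/f(x)$ gives an unbiased estimate of $\int g(x)\,\mathrm{d}x$, and the $\alpha_i$ can be adapted iteratively to reduce the variance.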
rTANDEM, an R/Bioconductor package for MS/MS protein identification.
Fournier, Frédéric; Joly Beauparlant, Charles; Paradis, René; Droit, Arnaud
2014-08-01
rTANDEM is an R/Bioconductor package that interfaces the X!Tandem protein identification algorithm. The package can run the multi-threaded algorithm on proteomic data files directly from R. It also provides functions to convert search parameters and results to/from R, as well as functions to manipulate parameters and automate searches. An associated R package, shinyTANDEM, provides a web-based graphical interface to visualize and interpret the results. Together, those two packages form an entry point for a general MS/MS-based proteomic pipeline in R/Bioconductor. rTANDEM and shinyTANDEM are distributed in R/Bioconductor, http://bioconductor.org/packages/release/bioc/. The packages are under open licenses (GPL-3 and Artistic-1.0). frederic.fournier@crchuq.ulaval.ca or arnaud.droit@crchuq.ulaval.ca Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Parallel Eclipse Project Checkout
NASA Technical Reports Server (NTRS)
Crockett, Thomas M.; Joswig, Joseph C.; Shams, Khawaja S.; Powell, Mark W.; Bachmann, Andrew G.
2011-01-01
Parallel Eclipse Project Checkout (PEPC) is a program written to leverage parallelism and to automate the checkout process of plug-ins created in Eclipse RCP (Rich Client Platform). Eclipse plug-ins can be aggregated in a feature project. This innovation digests a feature description (an XML file) and automatically checks out all of the plug-ins listed in the feature. This resolves the issue of manually checking out each plug-in required to work on the project. To minimize the time necessary to check out the plug-ins, this program makes the plug-in checkouts parallel. After parsing the feature, a checkout request is issued for each plug-in in the feature. These requests are handled by a thread pool with a configurable number of threads. By checking out the plug-ins in parallel, the checkout process is streamlined before getting started on the project. For instance, projects that took 30 minutes to check out now take less than 5 minutes. The effect is especially clear on a Mac, which has a network monitor displaying bandwidth use. When running the client from a developer's home, the checkout process now saturates the bandwidth in order to get all the plug-ins checked out as fast as possible. For comparison, a checkout process that ranged from 8-200 Kbps from a developer's home is now able to saturate a pipe of 1.3 Mbps, resulting in significantly faster checkouts. The Eclipse IDE (integrated development environment) tries to build a project as soon as it is downloaded. As part of another optimization, this innovation programmatically tells Eclipse to stop building while checkouts are happening, which dramatically reduces lock contention and enables plug-ins to continue downloading until all of them finish. Furthermore, the software re-enables automatic building, and forces Eclipse to do a clean build once it finishes checking out all of the plug-ins. This software is fully generic and does not contain any NASA-specific code. It can be applied to any Eclipse-based repository with a similar structure. It can also apply build parameters and preferences automatically at the end of the checkout.
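PEPC itself is Java/Eclipse code; its core pattern (parse the feature XML, hand each plug-in to a fixed-size thread pool) can be sketched in Python as follows. The XML layout and the git-based checkout helper are hypothetical stand-ins for Eclipse's team providers:

```python
# Sketch of PEPC's core pattern (PEPC itself is Java/Eclipse code): parse a
# feature XML listing plug-ins, then check each one out via a thread pool.
from concurrent.futures import ThreadPoolExecutor
import subprocess
import xml.etree.ElementTree as ET

def checkout_plugin(plugin_id, repo_url):
    # Hypothetical stand-in for an Eclipse team-provider checkout.
    subprocess.run(["git", "clone", f"{repo_url}/{plugin_id}.git"], check=True)

def checkout_feature(feature_xml, repo_url, workers=8):
    # Collect every <plugin id="..."/> element from the feature description.
    plugins = [p.get("id") for p in ET.parse(feature_xml).getroot().iter("plugin")]
    # A fixed-size pool bounds concurrency; `workers` is the configurable knob.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = [pool.submit(checkout_plugin, p, repo_url) for p in plugins]
        for f in futures:
            f.result()   # re-raise any checkout failure
```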
GillesPy: A Python Package for Stochastic Model Building and Simulation.
Abel, John H; Drawert, Brian; Hellander, Andreas; Petzold, Linda R
2016-09-01
GillesPy is an open-source Python package for model construction and simulation of stochastic biochemical systems. GillesPy consists of a Python framework for model building and an interface to the StochKit2 suite of efficient simulation algorithms based on the Gillespie stochastic simulation algorithms (SSA). To enable intuitive model construction and seamless integration into the scientific Python stack, we present an easy to understand, action-oriented programming interface. Here, we describe the components of this package and provide a detailed example relevant to the computational biology community.
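GillesPy delegates simulation to StochKit2's solvers; as background, the Gillespie direct-method SSA that those solvers implement fits in a few lines. This is a generic sketch for a single irreversible dimerization reaction, not the GillesPy API:

```python
# Background sketch of the Gillespie direct-method SSA underlying StochKit2's
# solvers (not the GillesPy API): one irreversible dimerization, A + A -> B.
import math
import random

def ssa(x_a=100, x_b=0, k=0.005, t_end=10.0):
    t, trace = 0.0, [(0.0, x_a, x_b)]
    while t < t_end and x_a >= 2:
        a = k * x_a * (x_a - 1) / 2            # propensity of A + A -> B
        u = 1.0 - random.random()              # uniform in (0, 1]
        t += -math.log(u) / a                  # exponential waiting time
        x_a -= 2                               # fire the reaction
        x_b += 1
        trace.append((t, x_a, x_b))
    return trace

print(ssa()[-1])    # final (time, #A, #B)
```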
gkmSVM: an R package for gapped-kmer SVM
Ghandi, Mahmoud; Mohammad-Noori, Morteza; Ghareghani, Narges; Lee, Dongwon; Garraway, Levi; Beer, Michael A.
2016-01-01
Summary: We present a new R package for training gapped-kmer SVM classifiers for DNA and protein sequences. We describe an improved algorithm for kernel matrix calculation that speeds run time by about 2 to 5-fold over our original gkmSVM algorithm. This package supports several sequence kernels, including: gkmSVM, kmer-SVM, mismatch kernel and wildcard kernel. Availability and Implementation: gkmSVM package is freely available through the Comprehensive R Archive Network (CRAN), for Linux, Mac OS and Windows platforms. The C++ implementation is available at www.beerlab.org/gkmsvm Contact: mghandi@gmail.com or mbeer@jhu.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27153639
Air Markets Program Data (AMPD)
The Air Markets Program Data tool allows users to search EPA data to answer scientific, general, policy, and regulatory questions about industry emissions. Air Markets Program Data (AMPD) is a web-based application that allows users easy access to both current and historical data collected as part of EPA's emissions trading programs. This site allows you to create and view reports and to download emissions data for further analysis. AMPD provides a query tool so users can create custom queries of industry source emissions data, allowance data, compliance data, and facility attributes. In addition, AMPD provides interactive maps, charts, reports, and pre-packaged datasets. AMPD does not require any additional software, plug-ins, or security controls and can be accessed using a standard web browser.
Open source software to control Bioflo bioreactors.
Burdge, David A; Libourel, Igor G L
2014-01-01
Bioreactors are designed to support highly controlled environments for growth of tissues, cell cultures or microbial cultures. A variety of bioreactors are commercially available, often including sophisticated software to enhance the functionality of the bioreactor. However, experiments that the bioreactor hardware can support, but that were not envisioned during the software design, cannot be performed without developing custom software. In addition, support for third party or custom designed auxiliary hardware is often sparse or absent. This work presents flexible open source freeware for the control of bioreactors of the Bioflo product family. The functionality of the software includes setpoint control, data logging, and protocol execution. Auxiliary hardware can be easily integrated and controlled through an integrated plugin interface without altering existing software. Simple experimental protocols can be entered as a CSV scripting file, and a Python-based protocol execution model is included for more demanding conditional experimental control. The software was designed to be a more flexible and free open source alternative to the commercially available solution. The source code and various auxiliary hardware plugins are publicly available for download from https://github.com/LibourelLab/BiofloSoftware. In addition to the source code, the software was compiled and packaged as a self-installing file for 32- and 64-bit Windows operating systems. The compiled software can control a Bioflo system and does not require the installation of LabVIEW.
Bokulich, Nicholas A; Kaehler, Benjamin D; Rideout, Jai Ram; Dillon, Matthew; Bolyen, Evan; Knight, Rob; Huttley, Gavin A; Gregory Caporaso, J
2018-05-17
Taxonomic classification of marker-gene sequences is an important step in microbiome analysis. We present q2-feature-classifier ( https://github.com/qiime2/q2-feature-classifier ), a QIIME 2 plugin containing several novel machine-learning and alignment-based methods for taxonomy classification. We evaluated and optimized several commonly used classification methods implemented in QIIME 1 (RDP, BLAST, UCLUST, and SortMeRNA) and several new methods implemented in QIIME 2 (a scikit-learn naive Bayes machine-learning classifier, and alignment-based taxonomy consensus methods based on VSEARCH, and BLAST+) for classification of bacterial 16S rRNA and fungal ITS marker-gene amplicon sequence data. The naive-Bayes, BLAST+-based, and VSEARCH-based classifiers implemented in QIIME 2 meet or exceed the species-level accuracy of other commonly used methods designed for classification of marker gene sequences that were evaluated in this work. These evaluations, based on 19 mock communities and error-free sequence simulations, including classification of simulated "novel" marker-gene sequences, are available in our extensible benchmarking framework, tax-credit ( https://github.com/caporaso-lab/tax-credit-data ). Our results illustrate the importance of parameter tuning for optimizing classifier performance, and we make recommendations regarding parameter choices for these classifiers under a range of standard operating conditions. q2-feature-classifier and tax-credit are both free, open-source, BSD-licensed packages available on GitHub.
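The scikit-learn naive Bayes method mentioned above classifies sequences from their k-mer profiles. The following reduced sketch illustrates that idea with scikit-learn on toy data; the sequences, taxa, and k = 4 are placeholders, not the plugin's defaults or its actual pipeline:

```python
# Reduced illustration of the idea behind the q2-feature-classifier naive
# Bayes method: represent reads as k-mer counts and fit a multinomial NB.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

train_seqs = ["ACGTACGTGGCA", "ACGTACGTGGAA", "TTGACCGGTTAA", "TTGACCGGTCAA"]
train_taxa = ["g__Escherichia", "g__Escherichia", "g__Bacillus", "g__Bacillus"]

clf = make_pipeline(
    CountVectorizer(analyzer="char", ngram_range=(4, 4)),  # overlapping 4-mers
    MultinomialNB(),
)
clf.fit(train_seqs, train_taxa)
print(clf.predict(["ACGTACGTGGTA"]))   # -> ['g__Escherichia']
```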
System and method for charging a plug-in electric vehicle
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bassham, Marjorie A.; Spigno, Jr., Ciro A.; Muller, Brett T.
2017-05-02
A charging system and method that may be used to automatically apply customized charging settings to a plug-in electric vehicle, where application of the settings is based on the vehicle's location. According to an exemplary embodiment, a user may establish and save a separate charging profile with certain customized charging settings for each geographic location where they plan to charge their plug-in electric vehicle. Whenever the plug-in electric vehicle enters a new geographic area, the charging method may automatically apply the charging profile that corresponds to that area. Thus, the user does not have to manually change or manipulate the charging settings every time they charge the plug-in electric vehicle in a new location.
National Plug-In Electric Vehicle Infrastructure Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wood, Eric; Rames, Clement; Muratori, Matteo
This report addresses the fundamental question of how much plug-in electric vehicle (PEV) charging infrastructure—also known as electric vehicle supply equipment (EVSE)—is needed in the United States to support both plug-in hybrid electric vehicles (PHEVs) and battery electric vehicles (BEVs).
PathwayAccess: CellDesigner plugins for pathway databases.
Van Hemert, John L; Dickerson, Julie A
2010-09-15
CellDesigner provides a user-friendly interface for graphical biochemical pathway description. Many pathway databases are not directly exportable to CellDesigner models. PathwayAccess is an extensible suite of CellDesigner plugins, which connect CellDesigner directly to pathway databases using respective Java application programming interfaces. The process is streamlined for creating new PathwayAccess plugins for specific pathway databases. Three PathwayAccess plugins, MetNetAccess, BioCycAccess and ReactomeAccess, directly connect CellDesigner to the pathway databases MetNetDB, BioCyc and Reactome. PathwayAccess plugins enable CellDesigner users to expose pathway data to analytical CellDesigner functions, curate their pathway databases and visually integrate pathway data from different databases using standard Systems Biology Markup Language and Systems Biology Graphical Notation. Implemented in Java, PathwayAccess plugins run with CellDesigner version 4.0.1 and were tested on Ubuntu Linux, Windows XP and 7, and MacOSX. Source code, binaries, documentation and video walkthroughs are freely available at http://vrac.iastate.edu/~jlv.
NASA Astrophysics Data System (ADS)
De Vecchi, Daniele; Dell'Acqua, Fabio
2016-04-01
The EU FP7 MARSITE project aims at assessing the "state of the art" of seismic risk evaluation and management at European level, as a starting point to move a "step forward" towards new concepts of risk mitigation and management by long-term monitoring activities carried out both on land and at sea. Spaceborne Earth Observation (EO) is one of the means through which MARSITE is accomplishing this commitment, whose importance is growing as a consequence of the operational unfolding of the Copernicus initiative. Sentinel-2 data, with its open-data policy, represents an unprecedented opportunity to access global spaceborne multispectral data for various purposes including risk monitoring. In the framework of the EU FP7 projects MARSITE, RASOR and SENSUM, our group has developed a suite of geospatial software tools to automatically extract risk-related features from EO data, especially on the exposure and vulnerability side of the "risk equation" [1]. These are for example the extension of a built-up area or the distribution of building density. These tools are available open-source as QGIS plug-ins [2] and their source code can be freely downloaded from GitHub [3]. A test case on the risk-prone megacity of Istanbul has been set up, and preliminary results will be presented in this paper. The output of the algorithms can be incorporated into a risk modeling process, whose output is very useful to stakeholders and decision makers who intend to assess and mitigate the risk level across the giant urban agglomerate. Keywords - Remote Sensing, Copernicus, Istanbul megacity, seismic risk, multi-risk, exposure, open-source References [1] Harb, M.M.; De Vecchi, D.; Dell'Acqua, F., "Physical Vulnerability Proxies from Remote Sensing: Reviewing, Implementing and Disseminating Selected Techniques," Geoscience and Remote Sensing Magazine, IEEE, vol.3, no.1, pp.20-33, March 2015. doi: 10.1109/MGRS.2015.2398672 [2] SENSUM QGIS plugin, 2016, available online at: https://plugins.qgis.org/plugins/sensum_eo_tools/ [3] SENSUM QGIS code repository, 2016, available online at: https://github.com/SENSUM-project/sensum_rs_qgis
Amesos2 and Belos: Direct and Iterative Solvers for Large Sparse Linear Systems
Bavier, Eric; Hoemmen, Mark; Rajamanickam, Sivasankaran; ...
2012-01-01
Solvers for large sparse linear systems come in two categories: direct and iterative. Amesos2, a package in the Trilinos software project, provides direct methods, and Belos, another Trilinos package, provides iterative methods. Amesos2 offers a common interface to many different sparse matrix factorization codes, and can handle any implementation of sparse matrices and vectors, via an easy-to-extend C++ traits interface. It can also factor matrices whose entries have arbitrary "Scalar" type, enabling extended-precision and mixed-precision algorithms. Belos includes many different iterative methods for solving large sparse linear systems and least-squares problems. Unlike competing iterative solver libraries, Belos completely decouples the algorithms from the implementations of the underlying linear algebra objects. This lets Belos exploit the latest hardware without changes to the code. Belos favors algorithms that solve higher-level problems, such as multiple simultaneous linear systems and sequences of related linear systems, faster than standard algorithms. The package also supports extended-precision and mixed-precision algorithms. Together, Amesos2 and Belos form a complete suite of sparse linear solvers.
Algorithmic transformation of multi-loop master integrals to a canonical basis with CANONICA
NASA Astrophysics Data System (ADS)
Meyer, Christoph
2018-01-01
The integration of differential equations of Feynman integrals can be greatly facilitated by using a canonical basis. This paper presents the Mathematica package CANONICA, which implements a recently developed algorithm to automate the transformation to a canonical basis. This represents the first publicly available implementation suitable for differential equations depending on multiple scales. In addition to the presentation of the package, this paper extends the description of some aspects of the algorithm, including a proof of the uniqueness of canonical forms up to constant transformations.
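For readers unfamiliar with the term: a canonical basis, in the sense standard in this literature, is one in which the dimensional regulator $\varepsilon$ factorizes from the kinematic dependence, so that the system takes the form

$$ \mathrm{d}\,\vec f(x,\varepsilon) = \varepsilon\,\bigl(\mathrm{d}\tilde A(x)\bigr)\,\vec f(x,\varepsilon), \qquad \tilde A(x) = \sum_k A_k \log \alpha_k(x), $$

with constant matrices $A_k$. CANONICA's task is then to find a transformation $\vec f = T(x,\varepsilon)\,\vec f^{\,\prime}$ bringing a given system $\partial_x \vec f = M(x,\varepsilon)\,\vec f$ into this form. (This is the generic textbook statement, not notation taken from the paper itself.)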
78 FR 70395 - Buy America Waiver Notification
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-25
...--light, medium, and heavy duty plug-in battery electric and compressed natural gas vehicles by Chicago..., medium, and heavy duty plug-in battery electric and compressed natural gas vehicles by Chicago DOT. In...--light, medium, and heavy duty plug-in battery electric and compressed natural gas vehicles ( http://www...
Algorithms and Sensors for Small Robot Path Following
NASA Technical Reports Server (NTRS)
Hogg, Robert W.; Rankin, Arturo L.; Roumeliotis, Stergios I.; McHenry, Michael C.; Helmick, Daniel M.; Bergh, Charles F.; Matthies, Larry
2002-01-01
Tracked mobile robots in the 20 kg size class are under development for applications in urban reconnaissance. For efficient deployment, it is desirable for teams of robots to be able to automatically execute path following behaviors, with one or more followers tracking the path taken by a leader. The key challenges to enabling such a capability are (1) to develop sensor packages for such small robots that can accurately determine the path of the leader and (2) to develop path following algorithms for the subsequent robots. To date, we have integrated gyros, accelerometers, compass/inclinometers, odometry, and differential GPS into an effective sensing package. This paper describes the sensor package, sensor processing algorithm, and path tracking algorithm we have developed for the leader/follower problem in small robots and shows the results of performance characterization of the system. We also document pragmatic lessons learned about design, construction, and electromagnetic interference issues particular to the performance of state sensors on small robots.
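The paper's own tracking algorithm is not reproduced here; as an illustration of the follower side of the problem, the sketch below implements pure pursuit, a standard path tracking law, toward a recorded leader path. The lookahead distance is a hypothetical parameter:

```python
# Generic follower sketch: pure-pursuit steering toward a recorded leader
# path. Illustrative of leader/follower tracking, not the paper's algorithm.
import math

def pure_pursuit(pose, path, lookahead=0.5):
    """pose = (x, y, heading); path = list of (x, y) leader waypoints.
    Returns a curvature command for the follower."""
    x, y, theta = pose
    # Pick the first waypoint at least `lookahead` metres away.
    goal = next((p for p in path
                 if math.hypot(p[0] - x, p[1] - y) >= lookahead), path[-1])
    # Transform the goal point into the robot frame.
    dx, dy = goal[0] - x, goal[1] - y
    gx = math.cos(-theta) * dx - math.sin(-theta) * dy
    gy = math.sin(-theta) * dx + math.cos(-theta) * dy
    ld = math.hypot(gx, gy)
    return 2.0 * gy / (ld * ld)   # curvature = 2 * lateral offset / distance^2

print(pure_pursuit((0.0, 0.0, 0.0), [(0.3, 0.0), (0.6, 0.1), (1.0, 0.3)]))
```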
R-Based Software for the Integration of Pathway Data into Bioinformatic Algorithms
Kramer, Frank; Bayerlová, Michaela; Beißbarth, Tim
2014-01-01
Putting new findings into the context of available literature knowledge is one approach to deal with the surge of high-throughput data results. Furthermore, prior knowledge can increase the performance and stability of bioinformatic algorithms, for example, methods for network reconstruction. In this review, we examine software packages for the statistical computing framework R, which enable the integration of pathway data for further bioinformatic analyses. Different approaches to integrate and visualize pathway data are identified and packages are stratified concerning their features according to a number of different aspects: data import strategies, the extent of available data, dependencies on external tools, integration with further analysis steps and visualization options are considered. A total of 12 packages integrating pathway data are reviewed in this manuscript. These are supplemented by five R-specific packages for visualization and six connector packages, which provide access to external tools. PMID:24833336
Lan, Hongzhi; Updegrove, Adam; Wilson, Nathan M; Maher, Gabriel D; Shadden, Shawn C; Marsden, Alison L
2018-02-01
Patient-specific simulation plays an important role in cardiovascular disease research, diagnosis, surgical planning and medical device design, as well as education in cardiovascular biomechanics. SimVascular is an open-source software package encompassing an entire cardiovascular modeling and simulation pipeline from image segmentation, three-dimensional (3D) solid modeling, and mesh generation, to patient-specific simulation and analysis. SimVascular is widely used for cardiovascular basic science and clinical research as well as education, following increased adoption by users and development of a GATEWAY web portal to facilitate educational access. Initial efforts of the project focused on replacing commercial packages with open-source alternatives and adding increased functionality for multiscale modeling, fluid-structure interaction (FSI), and solid modeling operations. In this paper, we introduce a major SimVascular (SV) release that includes a new graphical user interface (GUI) designed to improve user experience. Additional improvements include enhanced data/project management, interactive tools to facilitate user interaction, new boundary condition (BC) functionality, a plug-in mechanism to increase modularity, a new 3D segmentation tool, and new computer-aided design (CAD)-based solid modeling capabilities. Here, we focus on major changes to the software platform and outline features added in this new release. We also briefly describe our recent experiences using SimVascular in the classroom for bioengineering education.
Integrated thermal and energy management of plug-in hybrid electric vehicles
NASA Astrophysics Data System (ADS)
Shams-Zahraei, Mojtaba; Kouzani, Abbas Z.; Kutter, Steffen; Bäker, Bernard
2012-10-01
In plug-in hybrid electric vehicles (PHEVs), the engine temperature declines due to reduced engine load and extended engine-off periods. It is proven that engine efficiency and emissions depend on the engine temperature. Temperature also influences the vehicle air-conditioner and cabin heater loads. In particular, while the engine is cold, the power demand of the cabin heater needs to be provided by the batteries instead of by the waste heat of the engine coolant. The existing energy management strategies (EMS) of PHEVs focus on improving fuel efficiency based on hot-engine characteristics, neglecting the effect of temperature on engine performance and vehicle power demand. This paper presents a new EMS incorporating an engine thermal management method which derives globally optimal battery charge depletion trajectories. A dynamic programming-based algorithm is developed to enforce the charge depletion boundaries while optimizing a fuel consumption cost function by controlling the engine power. The optimal control problem formulates the cost function based on two state variables: battery charge and engine internal temperature. Simulation results demonstrate that temperature and the cabin heater/air-conditioner power demand can significantly influence the optimal solution for the EMS, and accordingly the fuel efficiency and emissions of PHEVs.
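A highly simplified sketch of the dynamic-programming formulation follows: a single SOC state and engine power as the control, minimizing fuel subject to a terminal charge-depletion constraint. The paper's formulation adds engine internal temperature as a second state variable; every number below is a hypothetical toy value:

```python
# Toy backward-induction DP over a discretized SOC grid: choose engine power
# at each step to minimize fuel while meeting a terminal SOC constraint.
# The paper adds engine temperature as a second state; numbers are toys.
import math

SOC_GRID = [round(0.30 + 0.01 * i, 2) for i in range(41)]   # 0.30 .. 0.70
ENGINE_POWER = [0.0, 10.0, 20.0, 30.0]                       # kW choices
DEMAND = [15.0] * 20                                         # kW per time step

def fuel_rate(p_engine):              # toy fuel map (g/s), zero when engine off
    return 0.0 if p_engine == 0 else 0.5 + 0.08 * p_engine

def next_soc(soc, p_batt):            # toy battery model, clamped to the grid
    return round(min(0.70, max(0.30, soc - 0.002 * p_batt)), 2)

def solve(soc0=0.65, soc_final=0.35):
    # Terminal cost: infeasible (inf) if the final SOC undershoots the target.
    cost = {s: (0.0 if s >= soc_final else math.inf) for s in SOC_GRID}
    for demand in reversed(DEMAND):                  # backward induction
        new_cost = {}
        for s in SOC_GRID:
            best = math.inf
            for pe in ENGINE_POWER:
                s2 = next_soc(s, demand - pe)        # battery covers the rest
                best = min(best, fuel_rate(pe) + cost[s2])
            new_cost[s] = best
        cost = new_cost
    return cost[soc0]

print(solve())    # minimal total fuel along the optimal depletion trajectory
```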
NASA Astrophysics Data System (ADS)
de Leon, Nathalie Pulmones
2011-12-01
With the increasing interest in green technologies in transportation, plug-in hybrid electric vehicles (PHEVs) have proven to be the best short-term solution to minimize greenhouse gas emissions. Despite such interest, conventional vehicle drivers are still reluctant to use such a new technology, mainly because of the long duration (4-8 hours) required to charge PHEV batteries with the currently existing Level I and II chargers. For this reason, Level III fast-charging stations capable of reducing the charging duration to 10-15 minutes are being considered. The present thesis focuses on the design of a fast-charging station that uses, in addition to the electrical grid, two stationary energy storage devices: a flywheel energy storage system and a supercapacitor. The power electronic converters used to interface the energy sources with the charging station are designed. The design also focuses on the energy management that will minimize the PHEV battery charging duration as well as the duration required to recharge the energy storage devices. For this reason, an algorithm that minimizes durations, along with its mathematical formulation, is proposed, and its application in a fast-charging environment is illustrated by means of two scenarios.
Plug-In Tutor Agents: Still Pluggin'
ERIC Educational Resources Information Center
Ritter, Steven
2016-01-01
"An Architecture for Plug-in Tutor Agents" (Ritter and Koedinger 1996) proposed a software architecture designed around the idea that tutors could be built as plug-ins for existing software applications. Looking back on the paper now, we can see that certain assumptions about the future of software architecture did not come to be, making…
Smith and Navistar Electric and Plug-In Hybrid Vehicle Testing
This project evaluates plug-in hybrid electric drive systems in medium-duty trucks operating in fleet service across the nation, operated by a variety of companies in diverse climates. U.S. companies agreeing to participate in this evaluation project received funding from the
Electrification of the transportation sector offers limited country-wide greenhouse gas reductions
NASA Astrophysics Data System (ADS)
Meinrenken, Christoph J.; Lackner, Klaus S.
2014-03-01
Compared with conventional propulsion, plugin and hybrid vehicles may offer reductions in greenhouse gas (GHG) emissions, regional air/noise pollution, petroleum dependence, and ownership cost. Comparing only plugins and hybrids amongst themselves, and focusing on GHG, relative merits of different options have been shown to be more nuanced, depending on grid-carbon-intensity, range and thus battery manufacturing and weight, and trip patterns. We present a life-cycle framework to compare GHG emissions for three drivetrains (plugin-electricity-only, gasoline-only-hybrid, and plugin-hybrid) across driving ranges and grid-carbon-intensities, for passenger cars, vans, buses, or trucks (well-to-wheel plus storage manufacturing). Parameter and model uncertainties are quantified via sensitivity analyses. We find that owing to the interplay of range, GHG/km, and portions of country-wide kms accessible to electrification, GHG reductions achievable from plugins (whether electricity-only or hybrids) are limited even when assuming low-carbon future grids. Furthermore, for policy makers considering GHG from electricity and transportation sectors combined, plugin technology may in fact increase GHG compared to gasoline-only-hybrids, regardless of grid-carbon-intensity.
Advanced Optimal Extraction for the Spitzer/IRS
NASA Astrophysics Data System (ADS)
Lebouteiller, V.; Bernard-Salas, J.; Sloan, G. C.; Barry, D. J.
2010-02-01
We present new advances in the spectral extraction of pointlike sources adapted to the Infrared Spectrograph (IRS) on board the Spitzer Space Telescope. For the first time, we created a supersampled point-spread function of the low-resolution modules. We describe how to use the point-spread function to perform optimal extraction of a single source and of multiple sources within the slit. We also examine the case of the optimal extraction of one or several sources with a complex background. The new algorithms are gathered in a plug-in called AdOpt which is part of the SMART data analysis software.
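The PSF-weighted scheme described above follows the same logic as classic optimal extraction. As an illustration only (not the AdOpt code, whose supersampled PSF handling is specific to the IRS), a minimal numpy sketch of inverse-variance, profile-weighted extraction could look like this:

```python
import numpy as np

def optimal_extract(data, var, psf):
    """Profile-weighted (Horne-style) optimal extraction of a point source.

    data, var : 2-D arrays (wavelength x spatial) of pixel values and variances.
    psf       : 2-D array of the normalised spatial profile (each row sums to 1).
    """
    weights = psf / var                                  # inverse-variance profile weights
    flux = (weights * data).sum(axis=1) / (weights * psf).sum(axis=1)
    flux_var = 1.0 / (psf**2 / var).sum(axis=1)          # variance of the estimate
    return flux, flux_var
```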
Electronic-type vacuum gauges with replaceable elements
Edwards, Jr., David
1984-01-01
In electronic devices for measuring pressures in vacuum systems, the metal elements which undergo thermal deterioration are made readily replaceable by making them parts of a simple plug-in unit. Thus, in ionization gauges, the filament and grid or electron collector are mounted on the novel plug-in unit. In thermocouple pressure gauges, the heater and attached thermocouple are mounted on the plug-in unit. Plug-in units have been designed to function, alternatively, as ionization gauge and as thermocouple gauge, thus providing new gauges capable of measuring broader pressure ranges than is possible with either an ionization gauge or a thermocouple gauge.
LST CGM Generator and Viewer Final Report CRADA No. TSB-1558-98
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vickers, Don; Larson, Don
The purpose of this project was to jointly develop and test a software plug-in that would convert native Pro /ENGINEER digital engineering drawings to Computer Graphics Metafile (CGM) format. If it was not feasible to convert the Pro/ENGINEER files, we planned to develop and test a similar conversion of native AutoCAD engineering drawings to CGM. CGM viewer plug-ins were developed as needed. There were four main tasks in this project: 1. Requirements for CGM Plug-in 2. Product Evaluation 3. Product Development Feasibility Study 4. Developing a "Plug-In" Application.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shishlo, Andrei P; Chu, Paul; Pelaia II, Tom
A data plotting package residing in the XAL tools set is presented. This package is based on Java Swing and therefore has the same portability as Java itself. The data types for charts, bar-charts, and color-surface plots are described. The algorithms, performance, interactive capabilities, limitations, and best usage practices of this plotting package are discussed.
Development and application of GIS-based PRISM integration through a plugin approach
NASA Astrophysics Data System (ADS)
Lee, Woo-Seop; Chun, Jong Ahn; Kang, Kwangmin
2014-05-01
A PRISM (Parameter-elevation Regressions on Independent Slopes Model) QGIS plugin was developed on the Quantum GIS platform in this study. The plugin provides user-friendly graphical user interfaces (GUIs) through which users can generate high-resolution (1 km × 1 km) gridded meteorological data with ease. The software is designed to run on a personal computer, so it requires neither internet access nor a sophisticated computer system. The proposed PRISM QGIS plugin is a hybrid statistical-geographic model system that combines coarse-resolution datasets (APHRODITE datasets in this study) with digital elevation data to generate fine-resolution gridded precipitation. To validate the performance of the software, the Prek Thnot River Basin in Kandal, Cambodia was selected for application. Error measures, namely RMSE (Root Mean Square Error) and MAPE (Mean Absolute Percentage Error), were used to evaluate the performance of the developed plugin; the evaluation yielded an RMSE of 2.76 mm and a MAPE of 4.2%. Overall, the statistical analysis shows promising outputs, suggesting that the plugin can be used to generate high-resolution precipitation datasets for hydrological and climatological studies in watersheds where observed weather data are limited.
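Since the tool is delivered as a QGIS plugin, the general shape of such a plugin is worth sketching. The following minimal skeleton uses the standard QGIS plugin hooks (classFactory, initGui, unload); the class, action, and message names are invented for illustration and are not taken from the PRISM plugin itself:

```python
# __init__.py of a minimal, hypothetical QGIS plugin
from qgis.PyQt.QtWidgets import QAction

class PrismGridPlugin:
    def __init__(self, iface):
        self.iface = iface                       # handle to the running QGIS instance

    def initGui(self):
        # called by QGIS on load: add a toolbar action that triggers the tool
        self.action = QAction("Generate PRISM grid", self.iface.mainWindow())
        self.action.triggered.connect(self.run)
        self.iface.addToolBarIcon(self.action)

    def unload(self):
        # called by QGIS on unload: remove what initGui added
        self.iface.removeToolBarIcon(self.action)

    def run(self):
        # the actual downscaling work would be launched from here
        self.iface.messageBar().pushMessage("PRISM", "grid generation started")

def classFactory(iface):
    # QGIS entry point: must return the plugin instance
    return PrismGridPlugin(iface)
```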
Alternative Fuels Data Center: Maintenance and Safety of Hybrid and Plug-In Electric Vehicles
Alternative Fuels Data Center: North Carolina Airport Advances With Plug-In Electric Buses
A North Carolina airport serves passengers with plug-in hybrid electric buses.
Alternative Fuels Data Center: Hybrid and Plug-In Electric Vehicle Emissions Data Sources and Assumptions
JPEG2000 and dissemination of cultural heritage over the Internet.
Politou, Eugenia A; Pavlidis, George P; Chamzas, Christodoulos
2004-03-01
By applying the latest technologies in image compression for managing the storage of massive image data within cultural heritage databases, and by exploiting the universality of the Internet, we are now able not only to effectively digitize, record and preserve, but also to promote the dissemination of cultural heritage. In this work we present an application of the latest image compression standard, JPEG2000, to managing and browsing image databases, focusing on the image transmission aspect rather than database management and indexing. We combine JPEG2000 image compression with client-server socket connections and a client browser plug-in, so as to provide an all-in-one package for remote browsing of JPEG2000 compressed image databases, suitable for the effective dissemination of cultural heritage.
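The progressive-access property that makes JPEG2000 attractive for remote browsing can be sketched in Python with the glymur library; the file name below is hypothetical. Reading with a power-of-two slice step decodes only a lower-resolution layer, which is the behaviour a thumbnail-first image browser exploits:

```python
import glymur

jp2 = glymur.Jp2k("artifact.jp2")   # hypothetical JPEG2000-compressed image

thumb = jp2[::8, ::8]               # decode a cheap low-resolution preview
full = jp2[:]                       # decode the full-resolution image on demand
print(thumb.shape, full.shape)
```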
BCRA is an R package that projects absolute risk of invasive breast cancer according to NCI’s Breast Cancer Risk Assessment Tool (BCRAT) algorithm for specified race/ethnic groups and age intervals.
A fast hidden line algorithm with contour option. M.S. Thesis
NASA Technical Reports Server (NTRS)
Thue, R. E.
1984-01-01
The JonesD algorithm was modified to allow the processing of N-sided elements and implemented in conjunction with a 3-D contour generation algorithm. The total hidden line and contour subsystem is implemented in the MOVIE.BYU Display package, and is compared to the subsystems already existing in the MOVIE.BYU package. The comparison reveals that the modified JonesD hidden line and contour subsystem yields substantial processing time savings, when processing moderate sized models comprised of 1000 elements or less. There are, however, some limitations to the modified JonesD subsystem.
PathVisio-Faceted Search: an exploration tool for multi-dimensional navigation of large pathways
Fried, Jake Y.; Luna, Augustin
2013-01-01
Purpose: The PathVisio-Faceted Search plugin helps users explore and understand complex pathways by overlaying experimental data and data from webservices, such as Ensembl BioMart, onto diagrams drawn using formalized notations in PathVisio. The plugin then provides a filtering mechanism, known as a faceted search, to find and highlight diagram nodes (e.g. genes and proteins) of interest based on imported data. The tool additionally provides a flexible scripting mechanism to handle complex queries. Availability: The PathVisio-Faceted Search plugin is compatible with PathVisio 3.0 and above. PathVisio is compatible with Windows, Mac OS X and Linux. The plugin, documentation, example diagrams and Groovy scripts are available at http://PathVisio.org/wiki/PathVisioFacetedSearchHelp. The plugin is free, open-source and licensed by the Apache 2.0 License. Contact: augustin@mail.nih.gov or jakeyfried@gmail.com PMID:23547033
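The faceted-search idea, filtering diagram nodes by several imported attributes at once, can be illustrated independently of PathVisio. A minimal sketch with invented node attributes:

```python
# hypothetical pathway nodes annotated with imported data
nodes = [
    {"id": "TP53", "type": "gene", "significant": True},
    {"id": "MDM2", "type": "gene", "significant": False},
    {"id": "ATP",  "type": "metabolite", "significant": False},
]

# one set of accepted values per facet; a node must satisfy every facet
facets = {"type": {"gene"}, "significant": {True}}

def matches(node, facets):
    return all(node.get(key) in allowed for key, allowed in facets.items())

highlighted = [n["id"] for n in nodes if matches(n, facets)]
print(highlighted)   # ['TP53']
```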
NASA Astrophysics Data System (ADS)
Niu, Yingli; Li, Wenqiang; Peng, Qian; Geng, Hua; Yi, Yuanping; Wang, Linjun; Nan, Guangjun; Wang, Dong; Shuai, Zhigang
2018-04-01
MOlecular MAterials Property Prediction Package (MOMAP) is a software toolkit for molecular materials property prediction. It focuses on luminescent properties and charge mobility properties. This article contains a brief descriptive introduction of key features, theoretical models and algorithms of the software, together with examples that illustrate the performance. First, we present the theoretical models and algorithms for molecular luminescent properties calculation, which includes the excited-state radiative/non-radiative decay rate constant and the optical spectra. Then, a multi-scale simulation approach and its algorithm for the molecular charge mobility are described. This approach is based on hopping model and combines with Kinetic Monte Carlo and molecular dynamics simulations, and it is especially applicable for describing a large category of organic semiconductors, whose inter-molecular electronic coupling is much smaller than intra-molecular charge reorganisation energy.
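The hopping model mentioned for charge mobility typically rests on Marcus-type transfer rates sampled with kinetic Monte Carlo. The sketch below shows that combination generically; it is not MOMAP code, and the electronic coupling, reorganisation energy, and driving force fed in are assumptions of the example:

```python
import numpy as np

HBAR = 6.582119569e-16   # reduced Planck constant, eV*s
KB = 8.617333262e-5      # Boltzmann constant, eV/K

def marcus_rate(V, lam, dG, T=300.0):
    """Marcus hopping rate (1/s); V, lam, dG in eV."""
    return (2 * np.pi / HBAR) * V**2 / np.sqrt(4 * np.pi * lam * KB * T) \
        * np.exp(-(dG + lam)**2 / (4 * lam * KB * T))

def kmc_step(rates, rng):
    """One kinetic Monte Carlo step: choose a hop and advance the clock."""
    total = rates.sum()
    hop = rng.choice(len(rates), p=rates / total)
    dt = -np.log(rng.random()) / total
    return hop, dt

rng = np.random.default_rng(0)
rates = np.array([marcus_rate(0.01, 0.2, 0.0), marcus_rate(0.005, 0.2, 0.05)])
print(kmc_step(rates, rng))
```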
The Ettention software package.
Dahmen, Tim; Marsalek, Lukas; Marniok, Nico; Turoňová, Beata; Bogachev, Sviatoslav; Trampert, Patrick; Nickels, Stefan; Slusallek, Philipp
2016-02-01
We present a novel software package for the problem "reconstruction from projections" in electron microscopy. The Ettention framework consists of a set of modular building-blocks for tomographic reconstruction algorithms. The well-known block iterative reconstruction method based on Kaczmarz algorithm is implemented using these building-blocks, including adaptations specific to electron tomography. Ettention simultaneously features (1) a modular, object-oriented software design, (2) optimized access to high-performance computing (HPC) platforms such as graphic processing units (GPU) or many-core architectures like Xeon Phi, and (3) accessibility to microscopy end-users via integration in the IMOD package and eTomo user interface. We also provide developers with a clean and well-structured application programming interface (API) that allows for extending the software easily and thus makes it an ideal platform for algorithmic research while hiding most of the technical details of high-performance computing. Copyright © 2015 Elsevier B.V. All rights reserved.
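The block-iterative method in Ettention builds on the Kaczmarz update, which is compact enough to sketch. This is the textbook single-row scheme, not Ettention's optimized GPU implementation:

```python
import numpy as np

def kaczmarz(A, b, n_sweeps=10, relax=1.0):
    """Plain Kaczmarz iteration for A x = b (one row = one projection ray)."""
    x = np.zeros(A.shape[1])
    for _ in range(n_sweeps):
        for i in range(A.shape[0]):
            a = A[i]
            # project x onto the hyperplane defined by row i
            x += relax * (b[i] - a @ x) / (a @ a) * a
    return x
```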
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Renke; Jin, Shuangshuang; Chen, Yousu
This paper presents a faster-than-real-time dynamic simulation software package that is designed for large-size power system dynamic simulation. It was developed on the GridPACKTM high-performance computing (HPC) framework. The key features of the developed software package include (1) faster-than-real-time dynamic simulation for a WECC system (17,000 buses) with different types of detailed generator, controller, and relay dynamic models, (2) a decoupled parallel dynamic simulation algorithm with optimized computation architecture to better leverage HPC resources and technologies, (3) options for HPC-based linear and iterative solvers, (4) hidden HPC details, such as data communication and distribution, to enable development centered on mathematical models and algorithms rather than on computational details for power system researchers, and (5) easy integration of new dynamic models and related algorithms into the software package.
Hybrid and Plug-in Electric Vehicles
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2014-05-20
Hybrid and plug-in electric vehicles use electricity either as their primary fuel or to improve the efficiency of conventional vehicle designs. This new generation of vehicles, often called electric drive vehicles, can be divided into three categories: hybrid electric vehicles (HEVs), plug-in hybrid electric vehicles(PHEVs), and all-electric vehicles (EVs). Together, they have great potential to reduce U.S. petroleum use.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bauer, Travis
2007-01-26
Toba is an extensible personal information retrieval system. It supports various plugins which the user uses to create and store bits of information. It comes configured to store meeting notes, task items, issues, and business development opportunities. Plugins could be written to support almost any kind of digital information, so with the right plugins Toba could become a full-fledged contact manager, project management application, programmer's toolkit, or almost any other type of data storage/search/retrieval application imaginable. Toba comes with a built-in command line interface and, via a plugin, a full scripting language (Jython). The information stored can be searched by keyword or through SQL queries.
Electronic-type vacuum gauges with replaceable elements
Edwards, D. Jr.
1984-09-18
In electronic devices for measuring pressures in vacuum systems, the metal elements which undergo thermal deterioration are made readily replaceable by making them parts of a simple plug-in unit. Thus, in ionization gauges, the filament and grid or electron collector are mounted on the novel plug-in unit. In thermocouple pressure gauges, the heater and attached thermocouple are mounted on the plug-in unit. Plug-in units have been designed to function, alternatively, as ionization gauge and as thermocouple gauge, thus providing new gauges capable of measuring broader pressure ranges than is possible with either an ionization gauge or a thermocouple gauge. 5 figs.
Integrand-level reduction of loop amplitudes by computational algebraic geometry methods
NASA Astrophysics Data System (ADS)
Zhang, Yang
2012-09-01
We present an algorithm for the integrand-level reduction of multi-loop amplitudes of renormalizable field theories, based on computational algebraic geometry. This algorithm uses (1) the Gröbner basis method to determine the basis for integrand-level reduction, and (2) the primary decomposition of an ideal to classify all inequivalent solutions of unitarity cuts. The resulting basis and cut solutions can be used to reconstruct the integrand from unitarity cuts, via polynomial fitting techniques. The basis determination part of the algorithm has been implemented in the Mathematica package BasisDet. The primary decomposition part can be readily carried out by algebraic geometry software, with the output of the package BasisDet. The algorithm works in both D = 4 and D = 4 - 2ɛ dimensions, and we present some two- and three-loop examples of applications of this algorithm.
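Step (1), determining a basis via Gröbner bases, can be tried on a toy ideal with sympy, here standing in for the Mathematica package BasisDet; the polynomials below are invented for illustration:

```python
from sympy import groebner, symbols

x, y, z = symbols("x y z")
# a toy ideal standing in for on-shell cut equations
G = groebner([x*y - z**2, x**2 - y*z], x, y, z, order="lex")
print(G.exprs)   # the Groebner basis polynomials
```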
TNA4OptFlux – a software tool for the analysis of strain optimization strategies
2013-01-01
Background Rational approaches for Metabolic Engineering (ME) deal with the identification of modifications that improve the microbes’ production capabilities of target compounds. One of the major challenges created by strain optimization algorithms used in these ME problems is the interpretation of the changes that lead to a given overproduction. Often, a single gene knockout induces changes in the fluxes of several reactions, as compared with the wild-type, and it is therefore difficult to evaluate the physiological differences of the in silico mutant. This is aggravated by the fact that genome-scale models per se are difficult to visualize, given the high number of reactions and metabolites involved. Findings We introduce a software tool, the Topological Network Analysis for OptFlux (TNA4OptFlux), a plug-in which adds to the open-source ME platform OptFlux the capability of creating and performing topological analysis over metabolic networks. One of the tool’s major advantages is the possibility of using these tools in the analysis and comparison of simulated phenotypes, namely those coming from the results of strain optimization algorithms. We illustrate the capabilities of the tool by using it to aid the interpretation of two E. coli strains designed in OptFlux for the overproduction of succinate and glycine. Conclusions Besides adding new functionalities to the OptFlux software tool regarding topological analysis, TNA4OptFlux methods greatly facilitate the interpretation of non-intuitive ME strategies by automating the comparison between perturbed and non-perturbed metabolic networks. The plug-in is available on the web site http://www.optflux.org, together with extensive documentation. PMID:23641878
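The kind of topological comparison TNA4OptFlux automates, a perturbed network against the wild-type, can be sketched generically with networkx; the metric choice here is an assumption of the example, not the plug-in's actual method:

```python
import networkx as nx

def rank_topology_changes(wild, mutant):
    """Rank nodes by how much their betweenness centrality changes
    between the wild-type and the knockout (mutant) network."""
    bc_wild = nx.betweenness_centrality(wild)
    bc_mut = nx.betweenness_centrality(mutant)
    delta = {n: bc_mut.get(n, 0.0) - bc_wild[n] for n in wild.nodes}
    return sorted(delta.items(), key=lambda kv: abs(kv[1]), reverse=True)
```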
Sokol, Serguei; Millard, Pierre; Portais, Jean-Charles
2012-03-01
The problem of stationary metabolic flux analysis based on isotope labelling experiments first appeared in the early 1950s and was basically solved in the early 2000s. Several algorithms and software packages are available for this problem. However, the generic stochastic algorithms (simulated annealing or evolutionary algorithms) currently used in this software require a lot of time to achieve acceptable precision. For deterministic algorithms, a common drawback is the lack of convergence stability for ill-conditioned systems or when started from a random point. In this article, we present a new deterministic algorithm with significantly increased numerical stability and accuracy of flux estimation compared with commonly used algorithms. It requires relatively short CPU time (from several seconds to several minutes with a standard PC architecture) to estimate fluxes in the central carbon metabolism network of Escherichia coli. The software package influx_s implementing this algorithm is distributed under an open-source licence at http://metasys.insa-toulouse.fr/software/influx/. Supplementary data are available at Bioinformatics online.
DR-Integrator: a new analytic tool for integrating DNA copy number and gene expression data.
Salari, Keyan; Tibshirani, Robert; Pollack, Jonathan R
2010-02-01
DNA copy number alterations (CNA) frequently underlie gene expression changes by increasing or decreasing gene dosage. However, only a subset of genes with altered dosage exhibit concordant changes in gene expression. This subset is likely to be enriched for oncogenes and tumor suppressor genes, and can be identified by integrating these two layers of genome-scale data. We introduce DNA/RNA-Integrator (DR-Integrator), a statistical software tool to perform integrative analyses on paired DNA copy number and gene expression data. DR-Integrator identifies genes with significant correlations between DNA copy number and gene expression, and implements a supervised analysis that captures genes with significant alterations in both DNA copy number and gene expression between two sample classes. DR-Integrator is freely available for non-commercial use from the Pollack Lab at http://pollacklab.stanford.edu/ and can be downloaded as a plug-in application to Microsoft Excel and as a package for the R statistical computing environment. The R package is available under the name 'DRI' at http://cran.r-project.org/. An example analysis using DR-Integrator is included as supplemental material. Supplementary data are available at Bioinformatics online.
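The core integrative step, correlating copy number with expression gene by gene, is easy to sketch; this mirrors the idea rather than DR-Integrator's exact statistics:

```python
import numpy as np
from scipy.stats import pearsonr

def cna_expression_correlation(copy_number, expression):
    """Per-gene Pearson correlation between matched copy-number and
    expression matrices (both genes x samples, same ordering)."""
    return np.array([pearsonr(cn, ex)[0]
                     for cn, ex in zip(copy_number, expression)])
```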
ImgLib2--generic image processing in Java.
Pietzsch, Tobias; Preibisch, Stephan; Tomancák, Pavel; Saalfeld, Stephan
2012-11-15
ImgLib2 is an open-source Java library for n-dimensional data representation and manipulation with focus on image processing. It aims at minimizing code duplication by cleanly separating pixel-algebra, data access and data representation in memory. Algorithms can be implemented for classes of pixel types and generic access patterns by which they become independent of the specific dimensionality, pixel type and data representation. ImgLib2 illustrates that an elegant high-level programming interface can be achieved without sacrificing performance. It provides efficient implementations of common data types, storage layouts and algorithms. It is the data model underlying ImageJ2, the KNIME Image Processing toolbox and an increasing number of Fiji-Plugins. ImgLib2 is licensed under BSD. Documentation and source code are available at http://imglib2.net and in a public repository at https://github.com/imagej/imglib. Supplementary data are available at Bioinformatics online. Contact: saalfeld@mpi-cbg.de
DiversePathsJ: diverse shortest paths for bioimage analysis.
Uhlmann, Virginie; Haubold, Carsten; Hamprecht, Fred A; Unser, Michael
2018-02-01
We introduce a formulation for the general task of finding diverse shortest paths between two end-points. Our approach is not linked to a specific biological problem and can be applied to a large variety of images thanks to its generic implementation as a user-friendly ImageJ/Fiji plugin. It relies on the introduction of additional layers in a Viterbi path graph, which requires slight modifications to the standard Viterbi algorithm rules. This layered graph construction allows for the specification of various constraints imposing diversity between solutions. The software allows obtaining a collection of diverse shortest paths under some user-defined constraints through a convenient and user-friendly interface. It can be used alone or be integrated into larger image analysis pipelines. Availability: http://bigwww.epfl.ch/algorithms/diversepathsj. Contact: michael.unser@epfl.ch or fred.hamprecht@iwr.uni-heidelberg.de. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.
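The underlying machinery, a Viterbi-style dynamic program over an image graph, can be sketched without the diversity layers. The minimal version below finds one minimal-cost left-to-right path; the plugin's layered construction generalizes this to several mutually diverse paths:

```python
import numpy as np

def viterbi_path(cost):
    """Minimal-cost left-to-right path through a 2-D cost image,
    moving to one of the three neighbouring rows at each step."""
    h, w = cost.shape
    acc = cost.copy()
    back = np.zeros((h, w), dtype=int)
    for j in range(1, w):
        for i in range(h):
            lo, hi = max(0, i - 1), min(h, i + 2)
            k = lo + int(np.argmin(acc[lo:hi, j - 1]))
            back[i, j] = k
            acc[i, j] += acc[k, j - 1]
    path = [int(np.argmin(acc[:, -1]))]
    for j in range(w - 1, 0, -1):
        path.append(back[path[-1], j])
    return path[::-1]   # row index of the path in each column
```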
ZFP compression plugin (filter) for HDF5
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, Mark C.
H5Z-ZFP is a compression plugin (filter) for the HDF5 library based upon the ZFP-0.5.0 compression library. It supports 4- or 8-byte integer or floating point HDF5 datasets of any dimension, but partitioned in 1-, 2-, or 3-dimensional chunks. It supports ZFP's four fundamental modes of operation: rate, precision, accuracy, or expert. It is a lossy compression plugin.
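From Python, an HDF5 filter of this kind is engaged at dataset-creation time. The sketch below uses the built-in gzip filter to show the mechanics with h5py; a registered third-party filter such as H5Z-ZFP is selected the same way by passing its registered filter ID (omitted here; see the plugin documentation) as the compression argument, provided the plugin library is on the HDF5 plugin path:

```python
import h5py
import numpy as np

data = np.random.rand(64, 64, 64)
with h5py.File("demo.h5", "w") as f:
    # chunking is required for filtered datasets; H5Z-ZFP expects 1-3D chunks
    f.create_dataset("field", data=data, chunks=(16, 16, 16),
                     compression="gzip", compression_opts=4)
```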
NREL Research Determines Integration of Plug-in Electric Vehicles Should Play a Big Role in Future Electric System Planning
Muratori, a transportation and energy systems engineer at NREL, is the author of the new Nature Energy paper "Impact of Uncoordinated Plug-in Electric Vehicle…" on which this finding is based.
Sensitivity Analysis for Probabilistic Neural Network Structure Reduction.
Kowalski, Piotr A; Kusy, Maciej
2018-05-01
In this paper, we propose the use of local sensitivity analysis (LSA) for the structure simplification of the probabilistic neural network (PNN). Three algorithms are introduced. The first algorithm applies LSA to the PNN input layer reduction by selecting significant features of input patterns. The second algorithm utilizes LSA to remove redundant pattern neurons of the network. The third algorithm combines the first two and shows how they can work together. A PNN with a product kernel estimator is used, where each multiplicand computes a one-dimensional Cauchy function; the smoothing parameter is therefore calculated separately for each dimension by means of the plug-in method. The classification qualities of the reduced and full structure PNN are compared. Furthermore, we evaluate the performance of PNN for which global sensitivity analysis (GSA) and the common reduction methods are applied, both in the input layer and the pattern layer. The models are tested on the classification problems of eight repository data sets. A 10-fold cross-validation procedure is used to determine the prediction ability of the networks. Based on the obtained results, it is shown that LSA can be used as an alternative PNN reduction approach.
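The pattern layer described here, a product of one-dimensional Cauchy kernels with a separate smoothing parameter per dimension, can be sketched in numpy. The data and bandwidths are placeholders; the plug-in method for choosing h is not reproduced:

```python
import numpy as np

def cauchy_product_kernel(x, patterns, h):
    """Product of 1-D Cauchy kernels, one bandwidth per dimension."""
    u = (x - patterns) / h                            # (n_patterns, n_dims)
    return np.prod(1.0 / (np.pi * (1.0 + u**2)), axis=1)

def pnn_classify(x, class_patterns, h):
    """Assign x to the class with the largest average pattern-layer activation."""
    scores = [cauchy_product_kernel(x, P, h).mean() for P in class_patterns]
    return int(np.argmax(scores))
```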
Cloud cover estimation optical package: New facility, algorithms and techniques
NASA Astrophysics Data System (ADS)
Krinitskiy, Mikhail
2017-02-01
Short- and long-wave radiation is an important component of the surface heat budget over sea and land. Estimating it requires accurate observations of the cloud cover. While cloud cover is massively observed visually, building accurate parameterizations also requires quantifying it with precise instrumental measurements. Major disadvantages of most existing cloud cameras are associated with their complicated design and the inaccuracy of post-processing algorithms, which typically result in uncertainties of 20% to 30% in camera-based estimates of cloud cover. The accuracy of these types of algorithms in terms of true scoring compared to human-observed values is typically less than 10%. We developed a new-generation package for cloud cover estimation, which provides much more accurate results and also allows for measuring additional characteristics. A new algorithm, SAIL GrIx, based on a routine approach, was also developed for this package. It uses a synthetic controlling index (the "grayness rate index"), which suppresses the background sunburn effect. This makes it possible to increase the reliability of the detection of optically thin clouds. The accuracy of this algorithm in terms of true scoring became 30%. To increase the cloud cover estimation accuracy further, we used another approach, SAIL GrIx ML, an algorithm that applies machine learning along with other signal processing techniques. The sun disk condition appears to be a strong feature in this kind of model. An artificial neural network model demonstrates the best quality; its accuracy in terms of true scoring increases up to 95.5%. Application of the new algorithm let us modify the design of the optical sensing package and avoid the use of solar trackers. This made the design of the cloud camera much more compact. The new cloud camera has already been tested in several missions across the Atlantic and Indian oceans on board IORAS research vessels.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Keefer, Donald A.; Shaffer, Eric G.; Storsved, Brynne
A free software application, RVA, has been developed as a plugin to the US DOE-funded ParaView visualization package, to provide support in the visualization and analysis of complex reservoirs being managed using multi-fluid EOR techniques. RVA, for Reservoir Visualization and Analysis, was developed as an open-source plugin to the 64 bit Windows version of ParaView 3.14. RVA was developed at the University of Illinois at Urbana-Champaign, with contributions from the Illinois State Geological Survey, Department of Computer Science and National Center for Supercomputing Applications. RVA was designed to utilize and enhance the state-of-the-art visualization capabilities within ParaView, readily allowing joint visualization of geologic framework and reservoir fluid simulation model results. Particular emphasis was placed on enabling visualization and analysis of simulation results highlighting multiple fluid phases, multiple properties for each fluid phase (including flow lines), multiple geologic models and multiple time steps. Additional advanced functionality was provided through the development of custom code to implement data mining capabilities. The built-in functionality of ParaView provides the capacity to process and visualize data sets ranging from small models on local desktop systems to extremely large models created and stored on remote supercomputers. The RVA plugin that we developed and the associated User Manual provide improved functionality through new software tools, and instruction in the use of ParaView-RVA, targeted to petroleum engineers and geologists in industry and research. The RVA web site (http://rva.cs.illinois.edu) provides an overview of functions, and the development web site (https://github.com/shaffer1/RVA) provides ready access to the source code, compiled binaries, user manual, and a suite of demonstration data sets. Key functionality has been included to support a range of reservoirs visualization and analysis needs, including: sophisticated connectivity analysis, cross sections through simulation results between selected wells, simplified volumetric calculations, global vertical exaggeration adjustments, ingestion of UTChem simulation results, ingestion of Isatis geostatistical framework models, interrogation of joint geologic and reservoir modeling results, joint visualization and analysis of well history files, location-targeted visualization, advanced correlation analysis, visualization of flow paths, and creation of static images and animations highlighting targeted reservoir features.
RVA: A Plugin for ParaView 3.14
DOE Office of Scientific and Technical Information (OSTI.GOV)
2015-09-04
RVA is a plugin developed for the 64-bit Windows version of the ParaView 3.14 visualization package. RVA is designed to provide support in the visualization and analysis of complex reservoirs being managed using multi-fluid EOR techniques. RVA, for Reservoir Visualization and Analysis, was developed at the University of Illinois at Urbana-Champaign, with contributions from the Illinois State Geological Survey, Department of Computer Science and National Center for Supercomputing Applications. RVA was designed to utilize and enhance the state-of-the-art visualization capabilities within ParaView, readily allowing joint visualization of geologic framework and reservoir fluid simulation model results. Particular emphasis was placed on enabling visualization and analysis of simulation results highlighting multiple fluid phases, multiple properties for each fluid phase (including flow lines), multiple geologic models and multiple time steps. Additional advanced functionality was provided through the development of custom code to implement data mining capabilities. The built-in functionality of ParaView provides the capacity to process and visualize data sets ranging from small models on local desktop systems to extremely large models created and stored on remote supercomputers. The RVA plugin that we developed and the associated User Manual provide improved functionality through new software tools, and instruction in the use of ParaView-RVA, targeted to petroleum engineers and geologists in industry and research. The RVA web site (http://rva.cs.illinois.edu) provides an overview of functions, and the development web site (https://github.com/shaffer1/RVA) provides ready access to the source code, compiled binaries, user manual, and a suite of demonstration data sets. Key functionality has been included to support a range of reservoirs visualization and analysis needs, including: sophisticated connectivity analysis, cross sections through simulation results between selected wells, simplified volumetric calculations, global vertical exaggeration adjustments, ingestion of UTChem simulation results, ingestion of Isatis geostatistical framework models, interrogation of joint geologic and reservoir modeling results, joint visualization and analysis of well history files, location-targeted visualization, advanced correlation analysis, visualization of flow paths, and creation of static images and animations highlighting targeted reservoir features.
GIS Application System Design Applied to Information Monitoring
NASA Astrophysics Data System (ADS)
Qun, Zhou; Yujin, Yuan; Yuena, Kang
A natural environment information management system involves on-line instrument monitoring, data communications, database establishment, information management software development and so on. Its core lies in collecting effective and reliable environmental information, increasing the utilization rate and sharing degree of environmental information through advanced information technology, and providing a timely and scientific foundation for environmental monitoring and management to the greatest extent possible. This thesis adopts C# plug-in application development and uses a complete set of embedded GIS component and tool libraries provided by GIS Engine to build the core of a plug-in GIS application framework, namely the design and implementation of the framework host program and each functional plug-in, as well as of the plug-in GIS application framework platform. The thesis takes advantage of the dynamic plug-in loading configuration technique, quickly establishes a GIS application through visualized component collaborative modeling, and realizes GIS application integration. The developed platform is applicable to any GIS-related application integration (on the ESRI platform) and can serve as a base development platform for GIS application development.
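The dynamic plug-in loading pattern described here, a host program that discovers and registers functional plug-ins at startup, translates to a few lines in any language. A hypothetical Python sketch, with invented module names and an assumed register(host) contract:

```python
import importlib

# hypothetical registry of functional plugins, one module per feature
PLUGIN_MODULES = ["monitor.plugins.map_view", "monitor.plugins.alarm_panel"]

def load_plugins(host):
    """Host program: import each plugin module and let it register itself."""
    for name in PLUGIN_MODULES:
        module = importlib.import_module(name)
        module.register(host)   # each plugin is assumed to expose register(host)
```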
NASA Astrophysics Data System (ADS)
Herrmann, Matthias
2014-06-01
Nowadays, a large number of different electrochemical energy storage systems are known. In the last two decades their development was strongly driven by a continuously growing market of portable electronic devices (e.g. cellular phones, laptop computers, camcorders, cameras, tools). Intensive efforts are currently under way to develop systems for the automotive industry within the framework of electrically propelled mobility (e.g. hybrid electric vehicles, plug-in hybrid electric vehicles, full electric vehicles) and also for the energy storage market (e.g. electrical grid stability, renewable energies). Besides the different systems (cell chemistries), electrochemical cells and batteries were developed and are offered in many shapes, sizes and designs in order to meet the performance and design requirements of widespread applications. Proper packaging is thereby one important technological step for designing optimum, reliable and safe batteries for operation. In this contribution, current packaging approaches for cells and batteries, together with the corresponding materials, are discussed. The focus is on rechargeable systems for industrial applications (i.e. alkaline systems, lithium-ion, lead-acid). In principle, four different cell types (shapes) can be identified: button, cylindrical, prismatic and pouch. Cell size can either be in accordance with international (e.g. International Electrotechnical Commission, IEC) or other standards or can meet application-specific dimensions. Since the cell housing or container, terminals and, if necessary, safety installations are inactive (non-reactive) materials that reduce the energy density of the battery, the development of low-weight packages is a challenging task. In addition, other requirements have to be fulfilled: mechanical stability and durability, sealing (e.g. a high permeation barrier against humidity for lithium-ion technology), high packing efficiency, possible installation of safety devices (current interrupt device, valve, etc.), chemical inertness, cost issues, and others. Finally, proper cell design has to be considered for effective thermal management (i.e. cooling and heating) of battery packs.
In-use measurement of activity, energy use, and emissions of a plug-in hybrid electric vehicle.
Graver, Brandon M; Frey, H Christopher; Choi, Hyung-Wook
2011-10-15
Plug-in hybrid electric vehicles (PHEVs) could reduce transportation air emissions and energy use. However, a method is needed for estimating on-road emissions of PHEVs. To develop a framework for quantifying microscale energy use and emissions (EU&E), measurements were conducted on a Toyota Prius retrofitted with a plug-in battery system on eight routes. Measurements were made using the following: (1) a data logger for the hybrid control system; (2) a portable emissions measurement system; and (3) a global positioning system with barometric altimeter. Trends in EU&E are estimated based on vehicle specific power. Energy economy is quantified based on gasoline consumed by the engine and grid energy consumed by the plug-in battery. Emissions from electricity consumption are estimated based on the power generation mix. Fuel use is approximately 30% lower during plug-in battery use. Grid emissions were higher for CO₂, NO(x), SO₂, and PM compared to tailpipe emissions but lower for CO and hydrocarbons. EU&E depends on engine and plug-in battery operation. The use of two energy sources must be addressed in characterizing fuel economy; overall energy economy is 11% lower if including grid energy use than accounting only for fuel consumption.
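Trends in EU&E are binned by vehicle specific power (VSP). For orientation, a commonly used light-duty VSP parameterization is sketched below; the coefficient set is an assumption of the example and is not quoted from this paper:

```python
def vehicle_specific_power(v, a, grade=0.0):
    """VSP in kW/tonne for a light-duty vehicle: v in m/s, a in m/s^2,
    grade as a fraction (coefficients are the commonly cited generic set)."""
    return v * (1.1 * a + 9.81 * grade + 0.132) + 0.000302 * v**3

print(vehicle_specific_power(v=20.0, a=0.5))   # cruising with mild acceleration
```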
A Shifted Block Lanczos Algorithm 1: The Block Recurrence
NASA Technical Reports Server (NTRS)
Grimes, Roger G.; Lewis, John G.; Simon, Horst D.
1990-01-01
In this paper we describe a block Lanczos algorithm that is used as the key building block of a software package for the extraction of eigenvalues and eigenvectors of large sparse symmetric generalized eigenproblems. The software package comprises: a version of the block Lanczos algorithm specialized for spectrally transformed eigenproblems; an adaptive strategy for choosing shifts, and efficient codes for factoring large sparse symmetric indefinite matrices. This paper describes the algorithmic details of our block Lanczos recurrence. This uses a novel combination of block generalizations of several features that have only been investigated independently in the past. In particular new forms of partial reorthogonalization, selective reorthogonalization and local reorthogonalization are used, as is a new algorithm for obtaining the M-orthogonal factorization of a matrix. The heuristic shifting strategy, the integration with sparse linear equation solvers and numerical experience with the code are described in a companion paper.
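For orientation, the plain single-vector Lanczos recurrence that the paper generalizes to blocks can be sketched in a few lines of numpy; the partial/selective reorthogonalization and shifting strategies of the package are deliberately omitted:

```python
import numpy as np

def lanczos_eigvals(A, m, seed=0):
    """Single-vector Lanczos with full reorthogonalisation: returns Ritz
    values (approximations to extreme eigenvalues of symmetric A)."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    Q = np.zeros((n, m))
    alpha, beta = np.zeros(m), np.zeros(m)
    q = rng.standard_normal(n)
    q /= np.linalg.norm(q)
    for j in range(m):
        Q[:, j] = q
        w = A @ q
        alpha[j] = q @ w
        w -= Q[:, :j + 1] @ (Q[:, :j + 1].T @ w)   # full reorthogonalisation
        beta[j] = np.linalg.norm(w)
        q = w / beta[j]
    T = np.diag(alpha) + np.diag(beta[:-1], 1) + np.diag(beta[:-1], -1)
    return np.linalg.eigvalsh(T)
```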
Ford Plug-In Project: Bringing PHEVs to Market Demonstration and Validation Project
DOE Office of Scientific and Technical Information (OSTI.GOV)
D'Annunzio, Julie; Slezak, Lee; Conley, John Jason
2014-03-26
This project is in support of our national goal to reduce our dependence on fossil fuels. By supporting efforts that contribute toward the successful mass production of plug-in hybrid electric vehicles, our nation's transportation-related fuel consumption can be offset with energy from the grid. Over four and a half years ago, when this project was originally initiated, plug-in electric vehicles were not readily available in the mass marketplace. Through the creation of a 21 unit plug-in hybrid vehicle fleet, this program was designed to demonstrate the feasibility of the technology and to help build cross-industry familiarity with the technology and the interface of this technology with the grid. Since then, however, plug-in vehicles have become increasingly more commonplace in the market. Ford, itself, now offers an all-electric vehicle and two plug-in hybrid vehicles in North America and has announced a third plug-in vehicle offering for Europe. Lessons learned from this project have helped in these production vehicle launches and are mentioned throughout this report. While the technology of plugging in a vehicle to charge a high voltage battery with energy from the grid is now in production, the ability for vehicle-to-grid or bi-directional energy flow was farther away than originally expected. Several technical, regulatory and potential safety issues prevented progressing the vehicle-to-grid energy flow (V2G) demonstration and, after a review with the DOE, V2G was removed from this demonstration project. Also proving challenging were communications between a plug-in vehicle and the grid or smart meter. While this project successfully demonstrated the vehicle to smart meter interface, cross-industry and regulatory work is still needed to define the vehicle-to-grid communication interface.
GDAL Enhancements for Interoperability with EOS Data (GEE)
NASA Astrophysics Data System (ADS)
Tisdale, B.
2015-12-01
Historically, Earth Observing Satellite (EOS) data products have been difficult to consume with GIS tools, whether commercial or open-source. This has resulted in a reduced acceptance of these data products by GIS and general user communities. Common problems and challenges experienced by these data users include difficulty when: Consuming data products from NASA Distributed Active Archive Centers (DAACs) that pre-date modern application software with commercial and open-source geospatial tools; Identifying an initial approach for developing a framework and plug-ins that interpret non-compliant data; Defining a methodology that is extensible across NASA Earth Observing System Data and Information System (EOSDIS), scientific communities, and GIS communities by enabling other data centers to construct their own plug-ins and adjust specific data products; and Promoting greater use of NASA Data and new analysis utilizing GIS tools. To address these challenges and make EOS data products more accessible and interpretable by GIS applications, a collaborative approach has been taken that includes the NASA Langley Atmospheric Science Data Center (ASDC), Esri, George Mason University (GMU), and the Hierarchical Data Format (HDF) Group to create a framework and plugins to be applied to the Geospatial Data Abstraction Library (GDAL). This framework and its plugins offer advantages of extensibility within NASA EOSDIS, permitting other data centers to construct their own plugins necessary to adjust their data products. In this session findings related to the framework and the development of GDAL plugins will be reviewed. Specifically, this session will offer a workshop to review documentation and training materials that have been generated for the purpose of guiding other NASA DAACs through the process of constructing plug-ins consistent with the framework, as well as a review of the certification process by which the plugins can be independently verified as properly converting the data to the format and content required for use in GIS software.
GDAL Enhancements for Interoperability with EOS Data
NASA Astrophysics Data System (ADS)
Tisdale, M.; Mathews, T. J.; Tisdale, B.; Sun, M.; Yang, C. P.; Lee, H.; Habermann, T.
2015-12-01
Historically, Earth Observing Satellite (EOS) data products have been difficult to consume with GIS tools, whether commercial or open-source. This has resulted in a reduced acceptance of these data products by GIS and general user communities. Common problems and challenges experienced by these data users include difficulty when: Consuming data products from NASA Distributed Active Archive Centers (DAACs) that pre-date modern application software with commercial and open-source geospatial tools; Identifying an initial approach for developing a framework and plug-ins that interpret non-compliant data; Defining a methodology that is extensible across NASA Earth Observing System Data and Information System (EOSDIS), scientific communities, and GIS communities by enabling other data centers to construct their own plug-ins and adjust specific data products; and Promoting greater use of NASA Data and new analysis utilizing GIS tools. To address these challenges and to make EOS data products more accessible and interpretable by GIS applications, a collaborative approach has been taken that includes the NASA Langley Atmospheric Science Data Center (ASDC), Esri, George Mason University (GMU), and the Hierarchical Data Format (HDF) Group to create a framework and plugins to be applied to the Geospatial Data Abstraction Library (GDAL). This framework and its plugins offer advantages of extensibility within NASA EOSDIS, permitting other data centers to construct their own plugins necessary to adjust their data products. In this session findings related to the framework and the development of GDAL plugins will be reviewed. Specifically, this session will offer a workshop to review documentation and training materials that have been generated for the purpose of guiding other NASA DAACs through the process of constructing plug-ins consistent with the framework, as well as a review of the certification process by which the plugins can be independently verified as properly converting the data to the format and content required for use in GIS software.
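From the user's side, a GDAL driver or plugin that understands an EOS product exposes it through the ordinary GDAL API. A minimal sketch with the GDAL Python bindings; the file name is hypothetical and the subdataset layout depends on the product:

```python
from osgeo import gdal

ds = gdal.Open("example_eos_granule.hdf")    # hypothetical EOS granule
for name, desc in ds.GetSubDatasets():       # one subdataset per science variable
    print(name, "->", desc)

sub = gdal.Open(ds.GetSubDatasets()[0][0])   # open the first variable
array = sub.ReadAsArray()                    # science data as a numpy array
print(sub.GetGeoTransform())                 # georeferencing supplied by the driver
```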
OptFlux: an open-source software platform for in silico metabolic engineering.
Rocha, Isabel; Maia, Paulo; Evangelista, Pedro; Vilaça, Paulo; Soares, Simão; Pinto, José P; Nielsen, Jens; Patil, Kiran R; Ferreira, Eugénio C; Rocha, Miguel
2010-04-19
Over the last few years a number of methods have been proposed for the phenotype simulation of microorganisms under different environmental and genetic conditions. These have been used as the basis to support the discovery of successful genetic modifications of the microbial metabolism to address industrial goals. However, the use of these methods has been restricted to bioinformaticians or other expert researchers. The main aim of this work is, therefore, to provide a user-friendly computational tool for Metabolic Engineering applications. OptFlux is an open-source and modular software aimed at being the reference computational application in the field. It is the first tool to incorporate strain optimization tasks, i.e., the identification of Metabolic Engineering targets, using Evolutionary Algorithms/Simulated Annealing metaheuristics or the previously proposed OptKnock algorithm. It also allows the use of stoichiometric metabolic models for (i) phenotype simulation of both wild-type and mutant organisms, using the methods of Flux Balance Analysis, Minimization of Metabolic Adjustment or Regulatory on/off Minimization of Metabolic flux changes, (ii) Metabolic Flux Analysis, computing the admissible flux space given a set of measured fluxes, and (iii) pathway analysis through the calculation of Elementary Flux Modes. OptFlux also contemplates several methods for model simplification and other pre-processing operations aimed at reducing the search space for optimization algorithms. The software supports importing/exporting to several flat file formats and it is compatible with the SBML standard. OptFlux has a visualization module that allows the analysis of the model structure that is compatible with the layout information of Cell Designer, allowing the superimposition of simulation results with the model graph. The OptFlux software is freely available, together with documentation and other resources, thus bridging the gap from research in strain optimization algorithms and the final users. It is a valuable platform for researchers in the field that have available a number of useful tools. Its open-source nature invites contributions by all those interested in making their methods available for the community. Given its plug-in based architecture it can be extended with new functionalities. Currently, several plug-ins are being developed, including network topology analysis tools and the integration with Boolean network based regulatory models.
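The phenotype simulation at the heart of such platforms, Flux Balance Analysis, is a linear program: maximize an objective flux subject to the steady-state balance S v = 0 and flux bounds. A toy sketch with scipy; the four-reaction network is invented for illustration:

```python
import numpy as np
from scipy.optimize import linprog

# toy stoichiometry: r0 imports A, r1 converts A->B, r2 exports B, r3 drains B to biomass
S = np.array([[1, -1,  0,  0],    # metabolite A
              [0,  1, -1, -1]])   # metabolite B
bounds = [(0, 10)] * 4
c = np.zeros(4)
c[3] = -1.0                       # linprog minimises, so negate the biomass objective

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
print("optimal biomass flux:", -res.fun)
```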
OptFlux: an open-source software platform for in silico metabolic engineering
2010-01-01
Background Over the last few years a number of methods have been proposed for the phenotype simulation of microorganisms under different environmental and genetic conditions. These have been used as the basis to support the discovery of successful genetic modifications of the microbial metabolism to address industrial goals. However, the use of these methods has been restricted to bioinformaticians or other expert researchers. The main aim of this work is, therefore, to provide a user-friendly computational tool for Metabolic Engineering applications. Results OptFlux is an open-source and modular software aimed at being the reference computational application in the field. It is the first tool to incorporate strain optimization tasks, i.e., the identification of Metabolic Engineering targets, using Evolutionary Algorithms/Simulated Annealing metaheuristics or the previously proposed OptKnock algorithm. It also allows the use of stoichiometric metabolic models for (i) phenotype simulation of both wild-type and mutant organisms, using the methods of Flux Balance Analysis, Minimization of Metabolic Adjustment or Regulatory on/off Minimization of Metabolic flux changes, (ii) Metabolic Flux Analysis, computing the admissible flux space given a set of measured fluxes, and (iii) pathway analysis through the calculation of Elementary Flux Modes. OptFlux also contemplates several methods for model simplification and other pre-processing operations aimed at reducing the search space for optimization algorithms. The software supports importing/exporting to several flat file formats and it is compatible with the SBML standard. OptFlux has a visualization module that allows the analysis of the model structure that is compatible with the layout information of Cell Designer, allowing the superimposition of simulation results with the model graph. Conclusions The OptFlux software is freely available, together with documentation and other resources, thus bridging the gap from research in strain optimization algorithms and the final users. It is a valuable platform for researchers in the field that have available a number of useful tools. Its open-source nature invites contributions by all those interested in making their methods available for the community. Given its plug-in based architecture it can be extended with new functionalities. Currently, several plug-ins are being developed, including network topology analysis tools and the integration with Boolean network based regulatory models. PMID:20403172
You, Yun-Wen; Chang, Hsun-Yun; Liao, Hua-Yang; Kao, Wei-Lun; Yen, Guo-Ji; Chang, Chi-Jen; Tsai, Meng-Hung; Shyue, Jing-Jong
2012-10-01
Based on a scanning electron microscope operated at 30 kV with a homemade specimen holder and a multiangle solid-state detector behind the sample, low-kV scanning transmission electron microscopy (STEM) is presented with subsequent electron tomography for three-dimensional (3D) volume structure. Because of the low acceleration voltage, the stronger electron-atom scattering leads to a stronger contrast in the resulting image than standard TEM, especially for light elements. Furthermore, the low-kV STEM yields less radiation damage to the specimen, hence the structure can be preserved. In this work, two-dimensional STEM images of a 1-μm-thick cell section with projection angles between ±50° were collected, and the 3D volume structure was reconstructed using the simultaneous iterative reconstructive technique algorithm with the TomoJ plugin for ImageJ, which are both public domain software. Furthermore, the cross-sectional structure was obtained with the Volume Viewer plugin in ImageJ. Although the tilting angle is constrained and limits the resulting structural resolution, slicing the reconstructed volume generated the depth profile of the thick specimen with sufficient resolution to examine cellular uptake of Au nanoparticles, and the final position of these nanoparticles inside the cell was imaged.
Use of the NetBeans Platform for NASA Robotic Conjunction Assessment Risk Analysis
NASA Technical Reports Server (NTRS)
Sabey, Nickolas J.
2014-01-01
The latest Java and JavaFX technologies are very attractive software platforms for customers involved in space mission operations such as those of NASA and the US Air Force. For NASA Robotic Conjunction Assessment Risk Analysis (CARA), the NetBeans platform provided an environment in which scalable software solutions could be developed quickly and efficiently. Both Java 8 and the NetBeans platform are in the process of simplifying CARA development in secure environments by providing a significant amount of capability in a single accredited package, where accreditation alone can account for 6-8 months for each library or software application. Capabilities either in use or being investigated by CARA include: 2D and 3D displays with JavaFX, parallelization with the new Streams API, and scalability through the NetBeans plugin architecture.
Variations in algorithm implementation among quantitative texture analysis software packages
NASA Astrophysics Data System (ADS)
Foy, Joseph J.; Mitta, Prerana; Nowosatka, Lauren R.; Mendel, Kayla R.; Li, Hui; Giger, Maryellen L.; Al-Hallaq, Hania; Armato, Samuel G.
2018-02-01
Open-source texture analysis software allows for the advancement of radiomics research. Variations in texture features, however, result from discrepancies in algorithm implementation. Anatomically matched regions of interest (ROIs) that captured normal breast parenchyma were placed in the magnetic resonance images (MRI) of 20 patients at two time points. Six first-order features and six gray-level co-occurrence matrix (GLCM) features were calculated for each ROI using four texture analysis packages. Features were extracted using package-specific default GLCM parameters and using GLCM parameters modified to yield the greatest consistency among packages. Relative change in the value of each feature between time points was calculated for each ROI. Distributions of relative feature value differences were compared across packages. Absolute agreement among feature values was quantified by the intra-class correlation coefficient. Among first-order features, significant differences were found for max, range, and mean, and only kurtosis showed poor agreement. All six second-order features showed significant differences using package-specific default GLCM parameters, and five second-order features showed poor agreement; with modified GLCM parameters, no significant differences among second-order features were found, and all second-order features showed poor agreement. While relative texture change discrepancies existed across packages, these differences were not significant when consistent parameters were used.
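The parameter sensitivity reported here is easy to probe, since most packages expose the same GLCM knobs: distance, angle, number of grey levels, symmetry and normalisation. A sketch with scikit-image making those parameters explicit (the ROI is synthetic; in older scikit-image releases the functions are spelled greycomatrix/greycoprops):

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

roi = (np.random.rand(32, 32) * 8).astype(np.uint8)   # synthetic ROI, 8 grey levels

glcm = graycomatrix(roi, distances=[1],
                    angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                    levels=8, symmetric=True, normed=True)
print("contrast:", graycoprops(glcm, "contrast").mean())
print("homogeneity:", graycoprops(glcm, "homogeneity").mean())
```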
Multi-threaded integration of HTC-Vive and MeVisLab
NASA Astrophysics Data System (ADS)
Gunacker, Simon; Gall, Markus; Schmalstieg, Dieter; Egger, Jan
2018-03-01
This work presents how Virtual Reality (VR) can easily be integrated into medical applications via a plugin for a medical image processing framework called MeVisLab. A multi-threaded plugin has been developed using OpenVR, a VR library that can be used for developing vendor and platform independent VR applications. The plugin is tested using the HTC Vive, a head-mounted display developed by HTC and Valve Corporation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
This is a Spanish-language brochure about hybrid and plug-in electric vehicles, which use electricity as their primary fuel or to improve the efficiency of conventional vehicle designs. These vehicles can be divided into three categories: hybrid electric vehicles (HEVs), plug-in hybrid electric vehicles (PHEVs), all-electric vehicles (EVs). Together, they have great potential to cut U.S. petroleum use and vehicle emissions.
City of Las Vegas Plug-in Hybrid Electric Vehicle Demonstration Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2013-12-31
The City of Las Vegas was awarded Department of Energy (DOE) project funding in 2009 for the City of Las Vegas Plug-in Hybrid Electric Vehicle Demonstration Program. This project allowed the City of Las Vegas to purchase electric and plug-in hybrid electric vehicles and associated electric vehicle charging infrastructure. The City anticipated that the electric vehicles would have lower overall operating costs, and emissions similar to those of traditional and hybrid vehicles.
Molecular Isotopic Distribution Analysis (MIDAs) with Adjustable Mass Accuracy
NASA Astrophysics Data System (ADS)
Alves, Gelio; Ogurtsov, Aleksey Y.; Yu, Yi-Kuo
2014-01-01
In this paper, we present Molecular Isotopic Distribution Analysis (MIDAs), a new software tool designed to compute molecular isotopic distributions with adjustable accuracies. MIDAs offers two algorithms, one polynomial-based and one Fourier-transform-based, both of which compute molecular isotopic distributions accurately and efficiently. The polynomial-based algorithm contains few novel aspects, whereas the Fourier-transform-based algorithm consists mainly of improvements to other existing Fourier-transform-based algorithms. We have benchmarked the performance of the two algorithms implemented in MIDAs with that of eight software packages (BRAIN, Emass, Mercury, Mercury5, NeutronCluster, Qmass, JFC, IC) using a consensus set of benchmark molecules. Under the proposed evaluation criteria, MIDAs's algorithms, JFC, and Emass compute with comparable accuracy the coarse-grained (low-resolution) isotopic distributions and are more accurate than the other software packages. For fine-grained isotopic distributions, we compared IC, MIDAs's polynomial algorithm, and MIDAs's Fourier transform algorithm. Among the three, IC and MIDAs's polynomial algorithm compute isotopic distributions that better resemble their corresponding exact fine-grained (high-resolution) isotopic distributions. MIDAs can be accessed freely through a user-friendly web-interface at http://www.ncbi.nlm.nih.gov/CBBresearch/Yu/midas/index.html.
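The polynomial method reduces, for coarse-grained (nominal-mass) distributions, to repeated convolution of per-element abundance vectors, which is simple to sketch in numpy. The abundances are standard tabulated values; the example formula is arbitrary:

```python
import numpy as np

carbon = np.array([0.9893, 0.0107])         # 12C, 13C natural abundances
hydrogen = np.array([0.999885, 0.000115])   # 1H, 2H

def isotopic_distribution(composition):
    """Coarse-grained isotopic distribution by repeated polynomial
    convolution; composition is a list of (abundance_vector, atom_count)."""
    dist = np.array([1.0])
    for abundances, count in composition:
        for _ in range(count):
            dist = np.convolve(dist, abundances)
    return dist

print(isotopic_distribution([(carbon, 10), (hydrogen, 16)])[:4])  # C10H16
```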
Molecular Isotopic Distribution Analysis (MIDAs) with adjustable mass accuracy.
Alves, Gelio; Ogurtsov, Aleksey Y; Yu, Yi-Kuo
2014-01-01
In this paper, we present Molecular Isotopic Distribution Analysis (MIDAs), a new software tool designed to compute molecular isotopic distributions with adjustable accuracies. MIDAs offers two algorithms, one polynomial-based and one Fourier-transform-based, both of which compute molecular isotopic distributions accurately and efficiently. The polynomial-based algorithm contains a few novel aspects, whereas the Fourier-transform-based algorithm consists mainly of improvements to existing Fourier-transform-based algorithms. We have benchmarked the performance of the two algorithms implemented in MIDAs against that of eight software packages (BRAIN, Emass, Mercury, Mercury5, NeutronCluster, Qmass, JFC, IC) using a consensus set of benchmark molecules. Under the proposed evaluation criteria, MIDAs's algorithms, JFC, and Emass compute the coarse-grained (low-resolution) isotopic distributions with comparable accuracy and are more accurate than the other software packages. For fine-grained isotopic distributions, we compared IC, MIDAs's polynomial algorithm, and MIDAs's Fourier-transform algorithm. Among the three, IC and MIDAs's polynomial algorithm compute isotopic distributions that better resemble their corresponding exact fine-grained (high-resolution) isotopic distributions. MIDAs can be accessed freely through a user-friendly web interface at http://www.ncbi.nlm.nih.gov/CBBresearch/Yu/midas/index.html.
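A minimal sketch of the polynomial idea behind the coarse-grained distributions, assuming numpy: each element contributes an abundance polynomial in nominal mass, and the molecular distribution is their product. MIDAs's actual algorithms add pruning and Fourier-transform acceleration that this naive version omits.

```python
import numpy as np

# Natural isotope abundances, indexed by nominal-mass offset from the
# lightest isotope (values rounded; MIDAs uses full-precision tables).
ISOTOPES = {
    'C': [0.9893, 0.0107],
    'H': [0.999885, 0.000115],
    'O': [0.99757, 0.00038, 0.00205],
}

def coarse_isotopic_distribution(formula):
    """Coarse-grained isotopic distribution of a molecule, e.g.
    formula = {'C': 6, 'H': 12, 'O': 6} for glucose."""
    dist = np.array([1.0])
    for element, count in formula.items():
        for _ in range(count):          # polynomial product, one atom at a time
            dist = np.convolve(dist, ISOTOPES[element])
    return dist / dist.sum()

print(coarse_isotopic_distribution({'C': 6, 'H': 12, 'O': 6})[:4])
```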
Ckmeans.1d.dp: Optimal k-means Clustering in One Dimension by Dynamic Programming.
Wang, Haizhou; Song, Mingzhou
2011-12-01
The heuristic k-means algorithm, widely used for cluster analysis, does not guarantee optimality. We developed a dynamic programming algorithm for optimal one-dimensional clustering. The algorithm is implemented as an R package called Ckmeans.1d.dp. We demonstrate its advantage in optimality and runtime over the standard iterative k-means algorithm.
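A compact Python transcription of the dynamic program (the released package is R with a compiled core; this O(kn²) version is illustrative, and the package itself uses a faster formulation):

```python
import numpy as np

def ckmeans_1d(x, k):
    """Optimal 1-D k-means by dynamic programming (illustrative O(k n^2) version)."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    # prefix sums give O(1) within-cluster sum-of-squares queries
    s1 = np.concatenate([[0.0], np.cumsum(x)])
    s2 = np.concatenate([[0.0], np.cumsum(x * x)])

    def ssq(j, i):  # cost of one cluster covering x[j..i], 0-based inclusive
        seg_sum = s1[i + 1] - s1[j]
        return (s2[i + 1] - s2[j]) - seg_sum * seg_sum / (i - j + 1)

    D = np.full((k + 1, n + 1), np.inf)   # D[m][i]: best cost of first i points in m clusters
    B = np.zeros((k + 1, n + 1), dtype=int)
    D[0][0] = 0.0
    for m in range(1, k + 1):
        for i in range(m, n + 1):
            for j in range(m, i + 1):      # last cluster is x[j-1 .. i-1]
                c = D[m - 1][j - 1] + ssq(j - 1, i - 1)
                if c < D[m][i]:
                    D[m][i], B[m][i] = c, j - 1
    # backtrack cluster boundaries
    labels = np.empty(n, dtype=int)
    i = n
    for m in range(k, 0, -1):
        labels[B[m][i]:i] = m - 1
        i = B[m][i]
    return labels, D[k][n]
```

For example, `ckmeans_1d([1.0, 1.1, 5.0, 5.2, 9.0], 3)` returns the labels [0, 0, 1, 1, 2] together with the minimal within-cluster sum of squares, which the iterative heuristic is not guaranteed to find.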
TADtool: visual parameter identification for TAD-calling algorithms.
Kruse, Kai; Hug, Clemens B; Hernández-Rodríguez, Benjamín; Vaquerizas, Juan M
2016-10-15
Eukaryotic genomes are hierarchically organized into topologically associating domains (TADs). The computational identification of these domains and their associated properties critically depends on the choice of suitable parameters of TAD-calling algorithms. To reduce the element of trial-and-error in parameter selection, we have developed TADtool: an interactive plot to find robust TAD-calling parameters with immediate visual feedback. TADtool allows the direct export of TADs called with a chosen set of parameters for two of the most common TAD-calling algorithms: the directionality index and the insulation index. It can be used as an intuitive, standalone application or as a Python package for maximum flexibility. TADtool is available as a Python package from GitHub (https://github.com/vaquerizaslab/tadtool) or can be installed directly via PyPI, the Python package index (tadtool). Contact: kai.kruse@mpi-muenster.mpg.de, jmv@mpi-muenster.mpg.de. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
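A minimal sketch of the insulation-index idea that TADtool lets users tune, assuming numpy; the window size is precisely the kind of parameter the interactive plot is meant to calibrate. TADtool's real implementation adds normalization and boundary calling.

```python
import numpy as np

def insulation_index(contacts, window):
    """Insulation index of a Hi-C contact matrix: mean signal in a square
    window sliding along the diagonal. `window` is the size in bins."""
    n = contacts.shape[0]
    ins = np.full(n, np.nan)
    for i in range(window, n - window):
        # contacts between the `window` bins upstream and downstream of bin i
        ins[i] = contacts[i - window:i, i + 1:i + window + 1].mean()
    return ins
```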
NASA Astrophysics Data System (ADS)
Pierleoni, Arnaldo; Casagrande, Luca; Bellezza, Michele; Casadei, Stefano
2010-05-01
The need for increasingly complex geospatial algorithms dedicated to the management of water resources, the specific knowledge many of them require, and the need for dedicated computing machines have led to the necessity of centralizing and sharing the server applications and plugins developed. For this purpose, a Web Processing Service (WPS) has been developed that makes a range of geospatial analysis algorithms, geostatistics and remote sensing procedures available to users, who simply provide data and input parameters and download the results. The core of the system infrastructure is GRASS GIS, which acts as the computational engine, providing more than 350 analysis modules and the opportunity to create new, ad hoc procedures. The WPS was implemented using PyWPS, a Python package that is easy to manage and configure. All these instruments are managed by a daemon named "Arcibald", created specifically to order the incoming user requests. Since processes may already be running, the system queues new ones, registering each request and running it only when the previous calculations have been completed. Each geoprocess, however, carries an indicator of the resources needed to execute it, enabling geoprocesses that do not require excessive computing time to run in parallel. This assessment also takes into account the size of the input file provided. The WPS standard defines methods for accessing and running geoprocesses regardless of the client used; nevertheless, a graphical client was developed specifically for the project to access the resources. The client was built as a plugin for the QGIS software, which provides the most common tools for viewing and consulting geographically referenced data. The tool was tested using data from the bathymetric campaign at the Montedoglio Reservoir on the Tiber River, in order to generate a digital model of the reservoir bed. Starting from a text file containing the coordinates and depths of the points (previously treated statistically to remove inaccuracies), we used the QGIS plugin to connect to the web service and started a cross-validation process to obtain the parameters to be used for interpolation. This makes it possible to highlight the morphological variations of reservoir basins due to silting phenomena, and therefore to assess the actual capacity of the basin for a proper evaluation of the available water resource, a critical step for the subsequent management phase. In this case, since the procedure is very long (on the order of days), the system automatically chooses to send the results via email. Moreover, once the invoked procedures end, the system lets the user choose whether to share the data and results or to remove all traces of the calculation, because in some cases sensitive data and information are used and sharing them could violate privacy policies. The entire project is built exclusively with open-source software.
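The request-queueing behaviour attributed to "Arcibald" can be sketched in a few lines of Python (an illustration of the policy, not the daemon's actual code): quick geoprocesses share a small pool of parallel lanes, while expensive ones are serialized.

```python
import queue
import threading

cheap, expensive = queue.Queue(), queue.Queue()

def worker(q):
    while True:
        job = q.get()          # blocks until a request is registered
        try:
            job()              # e.g. invoke a GRASS module, then mail the results
        finally:
            q.task_done()

# a few parallel lanes for quick geoprocesses, a single lane for long ones
for _ in range(3):
    threading.Thread(target=worker, args=(cheap,), daemon=True).start()
threading.Thread(target=worker, args=(expensive,), daemon=True).start()

def submit(job, estimated_cost):
    """Route a request by its resource indicator (threshold is arbitrary here)."""
    (cheap if estimated_cost < 1.0 else expensive).put(job)
```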
A software package for evaluating the performance of a star sensor operation
NASA Astrophysics Data System (ADS)
Sarpotdar, Mayuresh; Mathew, Joice; Sreejith, A. G.; Nirmal, K.; Ambily, S.; Prakash, Ajin; Safonova, Margarita; Murthy, Jayant
2017-02-01
We have developed a low-cost, off-the-shelf component star sensor (StarSense) for use in minisatellites and CubeSats to determine the attitude of a satellite in orbit. StarSense is an imaging camera with a limiting magnitude of 6.5, which extracts information from star patterns it records in the images. The star sensor implements a centroiding algorithm to find the centroids of the stars in the image, a Geometric Voting algorithm for star pattern identification, and a QUEST algorithm for attitude quaternion calculation. Here, we describe a software package to evaluate the performance of these algorithms as a single star sensor operating system. We simulate the ideal case, where sky background and instrument errors are omitted, and a more realistic case, where noise and camera parameters are added to the simulated images. We evaluate performance parameters of the algorithms such as attitude accuracy, calculation time, required memory, star catalog size and sky coverage, and estimate the errors introduced by each algorithm. The software package is written for use in MATLAB. The testing is parametrized for different hardware parameters, such as the focal length of the imaging setup, the field of view (FOV) of the camera, angle measurement accuracy and distortion effects, and can therefore be applied to evaluate the performance of such algorithms in any star sensor. For its hardware implementation on our StarSense, we are currently porting the code to functions written in C, keeping in view its easy implementation on any star sensor electronics hardware.
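The centroiding stage is the simplest of the three algorithms and illustrates the flavor of the package; below is a generic intensity-weighted centroid in Python/numpy (the MATLAB original additionally thresholds and windows around detected stars):

```python
import numpy as np

def centroid(patch):
    """Intensity-weighted centroid of a small image patch around one star.
    Returns sub-pixel (x, y) coordinates within the patch."""
    patch = patch - patch.min()                 # crude background removal
    ys, xs = np.mgrid[:patch.shape[0], :patch.shape[1]]
    total = patch.sum()
    return (xs * patch).sum() / total, (ys * patch).sum() / total
```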
Plug-in nanoliter pneumatic liquid dispenser with nozzle design flexibility
Choi, In Ho; Kim, Hojin; Lee, Sanghyun; Baek, Seungbum; Kim, Joonwon
2015-01-01
This paper presents a novel plug-in nanoliter liquid dispensing system with a plug-and-play interface for simple and reversible, yet robust integration of the dispenser. A plug-in type dispenser was developed to facilitate assembly and disassembly with an actuating part through efficient modularization. The entire process for assembly and operation of the plug-in dispenser is performed via the plug-and-play interface in less than a minute without loss of dispensing quality. The minimum volume of droplets pneumatically dispensed using the plug-in dispenser was 124 nl with a coefficient of variation of 1.6%. The dispensed volume increased linearly with the nozzle size. Utilizing this linear relationship, two types of multinozzle dispensers consisting of six parallel channels (emerging from an inlet) and six nozzles were developed to demonstrate a novel strategy for volume gradient dispensing at a single operating condition. The droplet volume dispensed from each nozzle also increased linearly with nozzle size, demonstrating that nozzle size is a dominant factor on dispensed volume, even for multinozzle dispensing. Therefore, the proposed plug-in dispenser enables flexible design of nozzles and reversible integration to dispense droplets with different volumes, depending on the application. Furthermore, to demonstrate the practicality of the proposed dispensing system, we developed a pencil-type dispensing system as an alternative to a conventional pipette for rapid and reliable dispensing of minute volume droplets. PMID:26594263
Plug-in nanoliter pneumatic liquid dispenser with nozzle design flexibility.
Choi, In Ho; Kim, Hojin; Lee, Sanghyun; Baek, Seungbum; Kim, Joonwon
2015-11-01
This paper presents a novel plug-in nanoliter liquid dispensing system with a plug-and-play interface for simple and reversible, yet robust integration of the dispenser. A plug-in type dispenser was developed to facilitate assembly and disassembly with an actuating part through efficient modularization. The entire process for assembly and operation of the plug-in dispenser is performed via the plug-and-play interface in less than a minute without loss of dispensing quality. The minimum volume of droplets pneumatically dispensed using the plug-in dispenser was 124 nl with a coefficient of variation of 1.6%. The dispensed volume increased linearly with the nozzle size. Utilizing this linear relationship, two types of multinozzle dispensers consisting of six parallel channels (emerging from an inlet) and six nozzles were developed to demonstrate a novel strategy for volume gradient dispensing at a single operating condition. The droplet volume dispensed from each nozzle also increased linearly with nozzle size, demonstrating that nozzle size is a dominant factor on dispensed volume, even for multinozzle dispensing. Therefore, the proposed plug-in dispenser enables flexible design of nozzles and reversible integration to dispense droplets with different volumes, depending on the application. Furthermore, to demonstrate the practicality of the proposed dispensing system, we developed a pencil-type dispensing system as an alternative to a conventional pipette for rapid and reliable dispensing of minute volume droplets.
NASA Astrophysics Data System (ADS)
Pandey, Palak; Kunte, Pravin D.
2016-10-01
This study presents an easy, modular, user-friendly and flexible software package for processing Landsat 7 ETM+ and Landsat 8 OLI-TIRS data to estimate suspended particulate matter concentrations in coastal waters. The package includes 1) an algorithm developed using the freely downloadable SCILAB package, 2) ERDAS models for iterative processing of Landsat images, and 3) an ArcMap tool for plotting and map making. Using the SCILAB package, a module is written for geometric corrections, radiometric corrections and obtaining normalized water-leaving reflectance from Landsat 8 OLI-TIRS and Landsat 7 ETM+ data. Using the ERDAS models, a sequence of modules iteratively processes the Landsat images and estimates suspended particulate matter concentrations. The processed images are used to prepare suspended sediment concentration maps. The applicability of this software package is demonstrated by estimating and plotting seasonal suspended sediment concentration maps off the Bengal delta. The software is flexible enough to accommodate other remotely sensed data, such as Ocean Colour Monitor (OCM), Indian Remote Sensing (IRS) and MODIS data, by replacing a few parameters in the algorithm, for estimating suspended sediment concentration in coastal waters.
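The radiometric-correction step can be sketched as the standard Landsat 8 rescaling from digital numbers to top-of-atmosphere reflectance, here in Python rather than SCILAB; the multiplicative and additive coefficients below are typical values and would normally be read from the scene's MTL metadata file.

```python
import numpy as np

# Rescaling coefficients come from the scene's MTL metadata file; the
# constants below are typical Landsat 8 OLI values, shown for illustration.
REFLECTANCE_MULT = 2.0e-5     # REFLECTANCE_MULT_BAND_x
REFLECTANCE_ADD = -0.1        # REFLECTANCE_ADD_BAND_x

def toa_reflectance(dn, sun_elevation_deg):
    """Convert raw digital numbers to top-of-atmosphere reflectance,
    corrected for the solar elevation angle of the scene."""
    rho = REFLECTANCE_MULT * dn.astype(float) + REFLECTANCE_ADD
    return rho / np.sin(np.radians(sun_elevation_deg))
```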
A Skyline Plugin for Pathway-Centric Data Browsing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Degan, Michael G.; Ryadinskiy, Lillian; Fujimoto, Grant M.
For targeted proteomics to be broadly adopted in biological laboratories as a routine experimental protocol, wet-bench biologists must be able to approach SRM assay design in the same way they approach biological experimental design. Most often, biological hypotheses are envisioned in a set of protein interactions, networks and pathways. We present a plugin for the popular Skyline tool that presents public mass spectrometry data in a pathway-centric view to assist users in browsing available data and determining how to design quantitative experiments. Selected proteins and their underlying mass spectra are imported to Skyline for further assay design (transition selection). The same plugin can be used for hypothesis-driven DIA data analysis, again utilizing the pathway view to help narrow down the set of proteins which will be investigated. The plugin is backed by the PNNL Biodiversity Library, a corpus of 3 million peptides from >100 organisms, and the draft human proteome. Users can upload personal data to the plugin to use the pathway navigation prior to importing their own data into Skyline.
A Skyline Plugin for Pathway-Centric Data Browsing
NASA Astrophysics Data System (ADS)
Degan, Michael G.; Ryadinskiy, Lillian; Fujimoto, Grant M.; Wilkins, Christopher S.; Lichti, Cheryl F.; Payne, Samuel H.
2016-11-01
For targeted proteomics to be broadly adopted in biological laboratories as a routine experimental protocol, wet-bench biologists must be able to approach selected reaction monitoring (SRM) and parallel reaction monitoring (PRM) assay design in the same way they approach biological experimental design. Most often, biological hypotheses are envisioned in a set of protein interactions, networks, and pathways. We present a plugin for the popular Skyline tool that presents public mass spectrometry data in a pathway-centric view to assist users in browsing available data and determining how to design quantitative experiments. Selected proteins and their underlying mass spectra are imported to Skyline for further assay design (transition selection). The same plugin can be used for hypothesis-driven data-independent acquisition (DIA) data analysis, again utilizing the pathway view to help narrow down the set of proteins that will be investigated. The plugin is backed by the Pacific Northwest National Laboratory (PNNL) Biodiversity Library, a corpus of 3 million peptides from >100 organisms, and the draft human proteome. Users can upload personal data to the plugin to use the pathway navigation prior to importing their own data into Skyline.
NASA Astrophysics Data System (ADS)
Harris, A. T.; Ramachandran, R.; Maskey, M.
2013-12-01
The Exelis-developed IDL and ENVI software are ubiquitous tools in Earth science research environments. The IDL Workbench is used by the Earth science community for programming custom data analysis and visualization modules. ENVI is a software solution for processing and analyzing geospatial imagery that combines support for multiple Earth observation scientific data types (optical, thermal, multi-spectral, hyperspectral, SAR, LiDAR) with advanced image processing and analysis algorithms. The ENVI & IDL Services Engine (ESE) is an Earth science data processing engine that allows researchers to use open standards to rapidly create, publish and deploy advanced Earth science data analytics within any existing enterprise infrastructure. Although powerful in many ways, these tools lack collaborative features out of the box. Thus, as part of the NASA-funded project Collaborative Workbench to Accelerate Science Algorithm Development, researchers at the University of Alabama in Huntsville and Exelis have developed plugins that allow seamless research collaboration from within the IDL Workbench. Such additional features are possible because the IDL Workbench is built on the Eclipse Rich Client Platform (RCP), which allows custom plugins to be dropped in for extended functionality. Specific functionalities of the plugins include creating complex workflows based on IDL application source code, submitting workflows to be executed by ESE in the cloud, and sharing and cloning workflows among collaborators. All these functionalities are available to scientists without leaving their IDL Workbench. Because ESE can interoperate with any middleware, scientific programmers can readily string together IDL processing tasks (or tasks written in other languages like C++, Java or Python) to create complex workflows for deployment within their current enterprise architecture (e.g. ArcGIS Server, GeoServer, Apache ODE or SciFlo from JPL). Using the collaborative IDL Workbench, coupled with ESE for execution in the cloud, asynchronous workflows can be executed in batch mode on large data in the cloud. We envision that a scientist will initially develop a scientific workflow locally on a small set of data. Once tested, the scientist will deploy the workflow to the cloud for execution. Depending on the results, the scientist may share the workflow and results, allowing them to be stored in a community catalog and instantly loaded into the IDL Workbench of other scientists. Thereupon, scientists can clone and modify or execute the workflow with different input parameters. The Collaborative Workbench will provide a platform for collaboration in the cloud, helping Earth scientists solve big-data problems in the Earth and planetary sciences.
Facets : a Cloudcompare Plugin to Extract Geological Planes from Unstructured 3d Point Clouds
NASA Astrophysics Data System (ADS)
Dewez, T. J. B.; Girardeau-Montaut, D.; Allanic, C.; Rohmer, J.
2016-06-01
Geological planar facets (stratification, fault, joint…) are key features for unravelling the tectonic history of a rock outcrop or appreciating the stability of a hazardous rock cliff. Measuring their spatial attitude (dip and strike) is generally performed by hand with a compass/clinometer, which is time consuming, requires some degree of censoring (i.e. refusing to measure some features judged unimportant at the time), is not always possible for fractures higher up on the outcrop, and is somewhat hazardous. 3D virtual geological outcrops hold the potential to alleviate these issues, but a convenient software environment for efficiently segmenting massive 3D point clouds into individual planar facets was lacking. FACETS is a dedicated plugin within CloudCompare v2.6.2 (http://cloudcompare.org/) implemented to perform planar facet extraction, calculate dip and dip direction (i.e. the azimuth of steepest descent) and report the extracted data in interactive stereograms. Two algorithms perform the segmentation: Kd-Tree and Fast Marching. Both divide the point cloud into sub-cells, then compute elementary planar objects and aggregate them progressively, according to a planarity threshold, into polygons. The boundaries of the polygons are adjusted around the segmented points with a tension parameter, and the facet polygons can be exported as 3D polygon shapefiles for third-party GIS software, or simply as ASCII comma-separated files. One of the great features of FACETS is the capability to explore planar objects, and also 3D points with normals, with the stereogram tool. Poles can be readily displayed, queried and manually segmented interactively. The plugin blends seamlessly into CloudCompare to leverage all its other 3D point cloud manipulation features. A demonstration of the tool is presented to illustrate these different features. While designed for geological applications, FACETS could be applied more widely to any planar objects.
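Converting a facet's plane normal into dip and dip direction is a small, self-contained computation; below is a sketch in Python/numpy, under the assumption of an east-north-up coordinate frame (FACETS itself is a compiled CloudCompare plugin):

```python
import numpy as np

def dip_and_dip_direction(normal):
    """Dip (degrees from horizontal) and dip direction (azimuth of steepest
    descent, clockwise from north) from a plane normal in east-north-up."""
    nx, ny, nz = normal / np.linalg.norm(normal)
    if nz < 0:                      # force the normal to point upward
        nx, ny, nz = -nx, -ny, -nz
    dip = np.degrees(np.arccos(nz))
    # the horizontal projection (nx, ny) points downhill for an upward normal
    dip_dir = np.degrees(np.arctan2(nx, ny)) % 360.0
    return dip, dip_dir
```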
Estimating the Entropy of Binary Time Series: Methodology, Some Theory and a Simulation Study
NASA Astrophysics Data System (ADS)
Gao, Yun; Kontoyiannis, Ioannis; Bienenstock, Elie
2008-06-01
Partly motivated by entropy-estimation problems in neuroscience, we present a detailed and extensive comparison between some of the most popular and effective entropy estimation methods used in practice: The plug-in method, four different estimators based on the Lempel-Ziv (LZ) family of data compression algorithms, an estimator based on the Context-Tree Weighting (CTW) method, and the renewal entropy estimator. METHODOLOGY: Three new entropy estimators are introduced; two new LZ-based estimators, and the “renewal entropy estimator,” which is tailored to data generated by a binary renewal process. For two of the four LZ-based estimators, a bootstrap procedure is described for evaluating their standard error, and a practical rule of thumb is heuristically derived for selecting the values of their parameters in practice. THEORY: We prove that, unlike their earlier versions, the two new LZ-based estimators are universally consistent, that is, they converge to the entropy rate for every finite-valued, stationary and ergodic process. An effective method is derived for the accurate approximation of the entropy rate of a finite-state hidden Markov model (HMM) with known distribution. Heuristic calculations are presented and approximate formulas are derived for evaluating the bias and the standard error of each estimator. SIMULATION: All estimators are applied to a wide range of data generated by numerous different processes with varying degrees of dependence and memory. The main conclusions drawn from these experiments include: (i) For all estimators considered, the main source of error is the bias. (ii) The CTW method is repeatedly and consistently seen to provide the most accurate results. (iii) The performance of the LZ-based estimators is often comparable to that of the plug-in method. (iv) The main drawback of the plug-in method is its computational inefficiency; with small word-lengths it fails to detect longer-range structure in the data, and with longer word-lengths the empirical distribution is severely undersampled, leading to large biases.
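A minimal version of the plug-in method, assuming numpy: estimate the empirical distribution of overlapping words of a fixed length and divide the word entropy by the word length. The undersampling problem noted in conclusion (iv) is visible here directly, since the number of possible words grows as 2^word_len while the number of samples stays fixed.

```python
import numpy as np
from collections import Counter

def plugin_entropy_rate(bits, word_len):
    """Plug-in estimate of the entropy rate (bits/symbol) of a binary
    sequence from the empirical distribution of overlapping words."""
    words = [tuple(bits[i:i + word_len]) for i in range(len(bits) - word_len + 1)]
    counts = np.array(list(Counter(words).values()), dtype=float)
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum() / word_len
```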
Plug-in electric vehicles: future market conditions and adoption rates
2017-01-01
This report, the first of four Issues in Focus articles from the International Energy Outlook 2017, analyzes the effects of uncertainties in the adoption of plug-in electric vehicles (PEVs) on worldwide transportation energy consumption. Uncertainties surrounding consumer acceptance, vehicle cost, policies, and other market conditions could affect future adoption rates of plug-in electric vehicles. Two side cases are presented in this report that assume different levels of PEV adoption and result in different levels of worldwide transportation energy consumption.
NASA Technical Reports Server (NTRS)
Howard, Andrew B.; Ansar, Adnan I.; Litwin, Todd E.; Goldberg, Steven B.
2009-01-01
The JPL Robot Vision Library (JPLV) provides real-time robot vision algorithms for developers who are not vision specialists. The package includes algorithms for stereo ranging, visual odometry and unsurveyed camera calibration, and has unique support for very wide-angle lenses.
XDesign: an open-source software package for designing X-ray imaging phantoms and experiments.
Ching, Daniel J; Gürsoy, Doğa
2017-03-01
The development of new methods or utilization of current X-ray computed tomography methods is impeded by the substantial amount of expertise required to design an X-ray computed tomography experiment from beginning to end. In an attempt to make material models, data acquisition schemes and reconstruction algorithms more accessible to researchers lacking expertise in some of these areas, a software package is described here which can generate complex simulated phantoms and quantitatively evaluate new or existing data acquisition schemes and image reconstruction algorithms for targeted applications.
XDesign: An open-source software package for designing X-ray imaging phantoms and experiments
Ching, Daniel J.; Gürsoy, Doğa
2017-02-21
Here, the development of new methods or utilization of current X-ray computed tomography methods is impeded by the substantial amount of expertise required to design an X-ray computed tomography experiment from beginning to end. In an attempt to make material models, data acquisition schemes and reconstruction algorithms more accessible to researchers lacking expertise in some of these areas, a software package is described here which can generate complex simulated phantoms and quantitatively evaluate new or existing data acquisition schemes and image reconstruction algorithms for targeted applications.
Automated Functional Analysis of Astrocytes from Chronic Time-Lapse Calcium Imaging Data
Wang, Yinxue; Shi, Guilai; Miller, David J.; Wang, Yizhi; Wang, Congchao; Broussard, Gerard; Wang, Yue; Tian, Lin; Yu, Guoqiang
2017-01-01
Recent discoveries that astrocytes exert proactive regulatory effects on neural information processing and that they are deeply involved in normal brain development and disease pathology have stimulated broad interest in understanding astrocyte functional roles in brain circuits. Measuring astrocyte functional status is now technically feasible, due to recent advances in modern microscopy and ultrasensitive cell-type-specific genetically encoded Ca2+ indicators for chronic imaging. However, there is a big gap between the capability of generating large datasets via calcium imaging and the availability of sophisticated analytical tools for decoding astrocyte function. Current practice is essentially manual, which not only limits analysis throughput but also risks introducing bias and missing important information latent in complex, dynamic big data. Here, we report a suite of computational tools, called Functional AStrocyte Phenotyping (FASP), for automatically quantifying the functional status of astrocytes. Considering the complex nature of Ca2+ signaling in astrocytes and the low signal-to-noise ratio, FASP is designed with data-driven and probabilistic principles, to flexibly account for various patterns and to perform robustly with noisy data. In particular, FASP explicitly models signal propagation, which rules out the applicability of tools designed for other types of data. We demonstrate the effectiveness of FASP using extensive synthetic and real data sets. The findings by FASP were verified by manual inspection. FASP also detected signals that were missed by purely manual analysis but could be confirmed by more careful manual examination under the guidance of automatic analysis. All algorithms and the analysis pipeline are packaged into a plugin for Fiji (ImageJ), with the source code freely available online at https://github.com/VTcbil/FASP. PMID:28769780
Automated Functional Analysis of Astrocytes from Chronic Time-Lapse Calcium Imaging Data.
Wang, Yinxue; Shi, Guilai; Miller, David J; Wang, Yizhi; Wang, Congchao; Broussard, Gerard; Wang, Yue; Tian, Lin; Yu, Guoqiang
2017-01-01
Recent discoveries that astrocytes exert proactive regulatory effects on neural information processing and that they are deeply involved in normal brain development and disease pathology have stimulated broad interest in understanding astrocyte functional roles in brain circuits. Measuring astrocyte functional status is now technically feasible, due to recent advances in modern microscopy and ultrasensitive cell-type-specific genetically encoded Ca2+ indicators for chronic imaging. However, there is a big gap between the capability of generating large datasets via calcium imaging and the availability of sophisticated analytical tools for decoding astrocyte function. Current practice is essentially manual, which not only limits analysis throughput but also risks introducing bias and missing important information latent in complex, dynamic big data. Here, we report a suite of computational tools, called Functional AStrocyte Phenotyping (FASP), for automatically quantifying the functional status of astrocytes. Considering the complex nature of Ca2+ signaling in astrocytes and the low signal-to-noise ratio, FASP is designed with data-driven and probabilistic principles, to flexibly account for various patterns and to perform robustly with noisy data. In particular, FASP explicitly models signal propagation, which rules out the applicability of tools designed for other types of data. We demonstrate the effectiveness of FASP using extensive synthetic and real data sets. The findings by FASP were verified by manual inspection. FASP also detected signals that were missed by purely manual analysis but could be confirmed by more careful manual examination under the guidance of automatic analysis. All algorithms and the analysis pipeline are packaged into a plugin for Fiji (ImageJ), with the source code freely available online at https://github.com/VTcbil/FASP.
Jayaseelan, Kalai Vanii; Steinbeck, Christoph
2014-07-05
In metabolomics experiments, spectral fingerprints of metabolites with no known structural identity are detected routinely. Computer-assisted structure elucidation (CASE) has been used to determine the structural identities of unknown compounds. It is generally accepted that a single 1D NMR spectrum or mass spectrum is usually not sufficient to establish the identity of a hitherto unknown compound. When a suite of spectra from 1D and 2D NMR experiments supplemented with a molecular formula are available, the successful elucidation of the chemical structure for candidates with up to 30 heavy atoms has been reported previously by one of the authors. In high-throughput metabolomics, usually 1D NMR or mass spectrometry experiments alone are conducted for rapid analysis of samples. This method subsequently requires that the spectral patterns are analyzed automatically to quickly identify known and unknown structures. In this study, we investigated whether additional existing knowledge, such as the fact that the unknown compound is a natural product, can be used to improve the ranking of the correct structure in the result list after the structure elucidation process. To identify unknowns using as little spectroscopic information as possible, we implemented an evolutionary algorithm-based CASE mechanism to elucidate candidates in a fully automated fashion, with input of the molecular formula and 13C NMR spectrum of the isolated compound. We also tested how filters like natural product-likeness, a measure that calculates the similarity of the compounds to known natural product space, might enhance the performance and quality of the structure elucidation. The evolutionary algorithm is implemented within the SENECA package for CASE reported previously, and is available for free download under artistic license at http://sourceforge.net/projects/seneca/. The natural product-likeness calculator is incorporated as a plugin within SENECA and is available as a GUI client and command-line executable. Significant improvements in candidate ranking were demonstrated for 41 small test molecules when the CASE system was supplemented by a natural product-likeness filter. In spectroscopically underdetermined structure elucidation problems, natural product-likeness can contribute to a better ranking of the correct structure in the results list.
NASA Astrophysics Data System (ADS)
Gassmöller, Rene; Bangerth, Wolfgang
2016-04-01
Particle-in-cell methods have a long history and many applications in geodynamic modelling of mantle convection, lithospheric deformation and crustal dynamics. They are primarily used to track material information, the strain a material has undergone, the pressure-temperature history a certain material region has experienced, or the amount of volatiles or partial melt present in a region. However, their efficient parallel implementation - in particular combined with adaptive finite-element meshes - is complicated due to the complex communication patterns and frequent reassignment of particles to cells. Consequently, many current scientific software packages accomplish this efficient implementation by specifically designing particle methods for a single purpose, like the advection of scalar material properties that do not evolve over time (e.g., for chemical heterogeneities). Design choices for particle integration, data storage, and parallel communication are then optimized for this single purpose, making the code relatively rigid to changing requirements. Here, we present the implementation of a flexible, scalable and efficient particle-in-cell method for massively parallel finite-element codes with adaptively changing meshes. Using a modular plugin structure, we allow maximum flexibility of the generation of particles, the carried tracer properties, the advection and output algorithms, and the projection of properties to the finite-element mesh. We present scaling tests ranging up to tens of thousands of cores and tens of billions of particles. Additionally, we discuss efficient load-balancing strategies for particles in adaptive meshes with their strengths and weaknesses, local particle-transfer between parallel subdomains utilizing existing communication patterns from the finite element mesh, and the use of established parallel output algorithms like the HDF5 library. Finally, we show some relevant particle application cases, compare our implementation to a modern advection-field approach, and demonstrate under which conditions which method is more efficient. We implemented the presented methods in ASPECT (aspect.dealii.org), a freely available open-source community code for geodynamic simulations. The structure of the particle code is highly modular, and segregated from the PDE solver, and can thus be easily transferred to other programs, or adapted for various application cases.
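The core particle operation can be reduced to a few lines; below is a sketch of one second-order advection step, assuming numpy, where `velocity_at` is a hypothetical stand-in for evaluating the finite-element velocity solution. The parallel transfer, sorting and load balancing discussed above are exactly what this toy version omits.

```python
import numpy as np

def advect_particles(pos, velocity_at, dt):
    """One second-order (midpoint) advection step for tracer particles.
    `velocity_at(p)` evaluates the velocity field at an array of positions,
    standing in for the finite-element interpolation a solver provides."""
    mid = pos + 0.5 * dt * velocity_at(pos)
    return pos + dt * velocity_at(mid)

# usage with an analytic rigid-rotation field in place of a solver:
pos = np.random.rand(1000, 2)
pos = advect_particles(pos, lambda p: np.stack([-p[:, 1], p[:, 0]], axis=1), 1e-2)
```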
Nipype: a flexible, lightweight and extensible neuroimaging data processing framework in python.
Gorgolewski, Krzysztof; Burns, Christopher D; Madison, Cindee; Clark, Dav; Halchenko, Yaroslav O; Waskom, Michael L; Ghosh, Satrajit S
2011-01-01
Current neuroimaging software offer users an incredible opportunity to analyze their data in different ways, with different underlying assumptions. Several sophisticated software packages (e.g., AFNI, BrainVoyager, FSL, FreeSurfer, Nipy, R, SPM) are used to process and analyze large and often diverse (highly multi-dimensional) data. However, this heterogeneous collection of specialized applications creates several issues that hinder replicable, efficient, and optimal use of neuroimaging analysis approaches: (1) No uniform access to neuroimaging analysis software and usage information; (2) No framework for comparative algorithm development and dissemination; (3) Personnel turnover in laboratories often limits methodological continuity and training new personnel takes time; (4) Neuroimaging software packages do not address computational efficiency; and (5) Methods sections in journal articles are inadequate for reproducing results. To address these issues, we present Nipype (Neuroimaging in Python: Pipelines and Interfaces; http://nipy.org/nipype), an open-source, community-developed, software package, and scriptable library. Nipype solves the issues by providing Interfaces to existing neuroimaging software with uniform usage semantics and by facilitating interaction between these packages using Workflows. Nipype provides an environment that encourages interactive exploration of algorithms, eases the design of Workflows within and between packages, allows rapid comparative development of algorithms and reduces the learning curve necessary to use different packages. Nipype supports both local and remote execution on multi-core machines and clusters, without additional scripting. Nipype is Berkeley Software Distribution licensed, allowing anyone unrestricted usage. An open, community-driven development philosophy allows the software to quickly adapt and address the varied needs of the evolving neuroimaging community, especially in the context of increasing demand for reproducible research.
Nipype: A Flexible, Lightweight and Extensible Neuroimaging Data Processing Framework in Python
Gorgolewski, Krzysztof; Burns, Christopher D.; Madison, Cindee; Clark, Dav; Halchenko, Yaroslav O.; Waskom, Michael L.; Ghosh, Satrajit S.
2011-01-01
Current neuroimaging software offer users an incredible opportunity to analyze their data in different ways, with different underlying assumptions. Several sophisticated software packages (e.g., AFNI, BrainVoyager, FSL, FreeSurfer, Nipy, R, SPM) are used to process and analyze large and often diverse (highly multi-dimensional) data. However, this heterogeneous collection of specialized applications creates several issues that hinder replicable, efficient, and optimal use of neuroimaging analysis approaches: (1) No uniform access to neuroimaging analysis software and usage information; (2) No framework for comparative algorithm development and dissemination; (3) Personnel turnover in laboratories often limits methodological continuity and training new personnel takes time; (4) Neuroimaging software packages do not address computational efficiency; and (5) Methods sections in journal articles are inadequate for reproducing results. To address these issues, we present Nipype (Neuroimaging in Python: Pipelines and Interfaces; http://nipy.org/nipype), an open-source, community-developed, software package, and scriptable library. Nipype solves the issues by providing Interfaces to existing neuroimaging software with uniform usage semantics and by facilitating interaction between these packages using Workflows. Nipype provides an environment that encourages interactive exploration of algorithms, eases the design of Workflows within and between packages, allows rapid comparative development of algorithms and reduces the learning curve necessary to use different packages. Nipype supports both local and remote execution on multi-core machines and clusters, without additional scripting. Nipype is Berkeley Software Distribution licensed, allowing anyone unrestricted usage. An open, community-driven development philosophy allows the software to quickly adapt and address the varied needs of the evolving neuroimaging community, especially in the context of increasing demand for reproducible research. PMID:21897815
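A minimal Nipype workflow showing the Node/Workflow idiom the abstract describes, assuming FSL is installed locally and using a placeholder input filename:

```python
from nipype import Node, Workflow
from nipype.interfaces import fsl

# Two-node pipeline: skull-strip a T1 image, then smooth the result.
# The input filename is a placeholder; FSL must be available on the system.
bet = Node(fsl.BET(in_file='sub-01_T1w.nii.gz'), name='skullstrip')
smooth = Node(fsl.IsotropicSmooth(fwhm=4), name='smooth')

wf = Workflow(name='preproc', base_dir='work')
wf.connect(bet, 'out_file', smooth, 'in_file')   # wire outputs to inputs
wf.run(plugin='MultiProc')  # local parallel execution, no extra scripting
```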
Recovery Act. Advanced Load Identification and Management for Buildings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Yi; Casey, Patrick; Du, Liang
2014-02-12
In response to the U.S. Department of Energy (DOE)'s goal of achieving market-ready, net-zero energy residential and commercial buildings by 2020 and 2025, Eaton partnered with the DOE's National Renewable Energy Laboratory (NREL) and the Georgia Institute of Technology to develop an intelligent load identification and management technology, enabled by a novel "smart power strip", to provide critical intelligence and information to improve the capability and functionality of building load analysis and building power management systems. Buildings account for 41% of the energy consumption in the United States, significantly more than either transportation or industry. Within the building sector, plug loads account for a significant portion of energy consumption: 15-20% of building energy on average. As building managers implement aggressive energy conservation measures, the proportion of plug load energy can increase to as much as 50% of building energy, leaving plug loads as the largest remaining single source of energy consumption. This project focused on plug-in load control and management to further improve building energy efficiency, accomplished through effective load identification. The execution of the project falls into three major aspects. First, an intelligent load modeling, identification and prediction technology was developed to automatically determine the type, energy consumption, power quality, operation status and performance status of plug-in loads, using electric waveforms at the power outlet level; the project demonstrated the effectiveness of the developed technology through a large set of plug-in load measurements and tests. Second, a novel "Smart Power Strip (SPS) / Receptacle" prototype was developed as a vehicle to demonstrate the feasibility of load identification technology as a low-cost, embedded solution. Third, the market environment for plug-in load control and management solutions, in particular advanced power strips (APSs), was studied: the project evaluated the market potential for Smart Power Strips with load identification and the likely impact of a load identification feature on APS adoption and effectiveness, and identified other success factors required for widespread APS adoption and market acceptance. Although the developed technology is applicable to both residential and commercial buildings, this project focused on effective plug-in load control and management for commercial buildings. The project completed Smart Receptacle (SR) prototype development with integrated load ID, control/management, WiFi communication and web service. Twenty SR units were built, tested and demonstrated in the Eaton lab; eight SR units underwent one month of field testing at the National Renewable Energy Laboratory (NREL). Load ID algorithm testing for extended load sets was conducted within the Eaton facility and at local university campuses. This report summarizes the major achievements, activities and outcomes of the project.
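The waveform-to-features step underlying load identification can be illustrated generically; Eaton's actual classifier is not public, and the feature set and sampling parameters below are assumptions.

```python
import numpy as np

def waveform_features(v, i, fs=10000.0, f0=60.0):
    """Toy feature vector for plug-load identification from synchronized
    voltage/current samples: real power, apparent power, power factor and
    low-order current harmonics. Illustrates the idea only."""
    p = np.mean(v * i)                              # real power (W)
    s = np.sqrt(np.mean(v ** 2) * np.mean(i ** 2))  # apparent power (VA)
    spec = np.abs(np.fft.rfft(i)) / len(i)
    freqs = np.fft.rfftfreq(len(i), 1.0 / fs)
    # amplitudes at the fundamental and odd harmonics of the line frequency
    harmonics = [spec[np.argmin(np.abs(freqs - k * f0))] for k in (1, 3, 5, 7)]
    return np.array([p, s, p / s] + harmonics)      # p / s = power factor
```

A classifier trained on such vectors from labeled appliances would then assign a load type to each outlet.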
Implementation of GenePattern within the Stanford Microarray Database.
Hubble, Jeremy; Demeter, Janos; Jin, Heng; Mao, Maria; Nitzberg, Michael; Reddy, T B K; Wymore, Farrell; Zachariah, Zachariah K; Sherlock, Gavin; Ball, Catherine A
2009-01-01
Hundreds of researchers across the world use the Stanford Microarray Database (SMD; http://smd.stanford.edu/) to store, annotate, view, analyze and share microarray data. In addition to providing registered users at Stanford access to their own data, SMD also provides access to public data, and tools with which to analyze those data, to any public user anywhere in the world. Previously, the addition of new microarray data analysis tools to SMD has been limited by available engineering resources, and in addition, the existing suite of tools did not provide a simple way to design, execute and share analysis pipelines, or to document such pipelines for the purposes of publication. To address this, we have incorporated the GenePattern software package directly into SMD, providing access to many new analysis tools, as well as a plug-in architecture that allows users to directly integrate and share additional tools through SMD. In this article, we describe our implementation of the GenePattern microarray analysis software package into the SMD code base. This extension is available with the SMD source code that is fully and freely available to others under an Open Source license, enabling other groups to create a local installation of SMD with an enriched data analysis capability.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whitaker, B; Barkley, A; Cole, Z
2014-05-01
This paper presents an isolated on-board vehicular battery charger that utilizes silicon carbide (SiC) power devices to achieve high density and high efficiency for application in electric vehicles (EVs) and plug-in hybrid EVs (PHEVs). The proposed level 2 charger has a two-stage architecture where the first stage is a bridgeless boost ac-dc converter and the second stage is a phase-shifted full-bridge isolated dc-dc converter. The operation of both topologies is presented and the specific advantages gained through the use of SiC power devices are discussed. The design of the power stage components, the packaging of the multichip power module, and the system-level packaging are presented with a primary focus on system density and a secondary focus on system efficiency. In this work, a hardware prototype is developed and a peak system efficiency of 95% is measured while operating both power stages with a switching frequency of 200 kHz. A maximum output power of 6.1 kW results in a volumetric power density of 5.0 kW/L and a gravimetric power density of 3.8 kW/kg when considering the volume and mass of the system including its case.
FASTSIM2: a second-order accurate frictional rolling contact algorithm
NASA Astrophysics Data System (ADS)
Vollebregt, E. A. H.; Wilders, P.
2011-01-01
In this paper we consider the frictional (tangential) steady rolling contact problem. We confine ourselves to the simplified theory, instead of using full elastostatic theory, in order to be able to compute results fast, as needed for on-line application in vehicle system dynamics (VSD) simulation packages. The FASTSIM algorithm is the leading technology in this field and is employed in all dominant railway vehicle system dynamics packages in the world. The main contribution of this paper is a new version, "FASTSIM2", of the FASTSIM algorithm, which is second-order accurate. This is relevant for VSD because, with the new algorithm, 16 times fewer grid points are required for sufficiently accurate computations of the contact forces. The approach is based on new insights into the characteristics of the rolling contact problem when using the simplified theory, and on taking precise care of the contact conditions in the numerical integration scheme employed.
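One heuristic way to read the factor of 16, stated as our gloss rather than the paper's derivation:

```latex
% The 2-D contact patch discretised with spacing h has N points, and the
% discretisation error improves from first to second order in h:
N \propto h^{-2}, \qquad
\varepsilon_{\mathrm{FASTSIM}} = O(h), \qquad
\varepsilon_{\mathrm{FASTSIM2}} = O(h^{2}).
% Coarsening both in-plane directions by a factor of 4 reduces the point
% count by 4^2 = 16, while a second-order scheme can still reach the
% accuracy the first-order scheme attains on the fine grid, provided the
% error constants of the two schemes are comparable.
```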
Master-slave control scheme in electric vehicle smart charging infrastructure.
Chung, Ching-Yen; Chynoweth, Joshua; Chu, Chi-Cheng; Gadh, Rajit
2014-01-01
WINSmartEV is a software-based plug-in electric vehicle (PEV) monitoring, control, and management system. It not only incorporates intelligence at every level so that charge scheduling can avoid grid bottlenecks, but also multiplies the number of PEVs that can be plugged into a single circuit. This paper proposes, designs, and executes several upgrades to WINSmartEV. These upgrades include new hardware that makes the level 1 and level 2 chargers faster, more robust, and more scalable. They include algorithms that provide improved charge scheduling for the level 2 charger (EVSE) and an enhanced vehicle monitoring/identification module (VMM) system that can automatically identify PEVs and authorize charging.
Master-Slave Control Scheme in Electric Vehicle Smart Charging Infrastructure
Chung, Ching-Yen; Chynoweth, Joshua; Chu, Chi-Cheng; Gadh, Rajit
2014-01-01
WINSmartEV is a software-based plug-in electric vehicle (PEV) monitoring, control, and management system. It not only incorporates intelligence at every level so that charge scheduling can avoid grid bottlenecks, but also multiplies the number of PEVs that can be plugged into a single circuit. This paper proposes, designs, and executes several upgrades to WINSmartEV. These upgrades include new hardware that makes the level 1 and level 2 chargers faster, more robust, and more scalable. They include algorithms that provide improved charge scheduling for the level 2 charger (EVSE) and an enhanced vehicle monitoring/identification module (VMM) system that can automatically identify PEVs and authorize charging. PMID:24982956
AstroCloud: An Agile platform for data visualization and specific analyses in 2D and 3D
NASA Astrophysics Data System (ADS)
Molina, F. Z.; Salgado, R.; Bergel, A.; Infante, A.
2017-07-01
Nowadays, astronomers commonly run their own tools, or distributed computational packages, for data analysis and then visualize the results with generic applications. This chain of processes comes at a high cost: (a) analyses are applied manually and are therefore difficult to automate, and (b) data have to be serialized, increasing the cost of parsing and saving intermediary data. We are developing AstroCloud, an agile, multipurpose visualization platform intended for specific analyses of astronomical images (https://astrocloudy.wordpress.com). The platform incorporates domain-specific languages, which make it easily extensible, and supports customized plug-ins, which reduce the time spent on data analysis. Moreover, it supports 2D and 3D rendering, including interactive features in real time. AstroCloud is under development; we are currently implementing different choices for data reduction and physical analyses.
D3GB: An Interactive Genome Browser for R, Python, and WordPress.
Barrios, David; Prieto, Carlos
2017-05-01
Genome browsers are useful not only for showing final results but also for improving analysis protocols, testing data quality, and generating result drafts. Its integration in analysis pipelines allows the optimization of parameters, which leads to better results. New developments that facilitate the creation and utilization of genome browsers could contribute to improving analysis results and supporting the quick visualization of genomic data. D3 Genome Browser is an interactive genome browser that can be easily integrated in analysis protocols and shared on the Web. It is distributed as an R package, a Python module, and a WordPress plugin to facilitate its integration in pipelines and the utilization of platform capabilities. It is compatible with popular data formats such as GenBank, GFF, BED, FASTA, and VCF, and enables the exploration of genomic data with a Web browser.
Youpi: YOUr processing PIpeline
NASA Astrophysics Data System (ADS)
Monnerville, Mathias; Sémah, Gregory
2012-03-01
Youpi is a portable, easy to use web application providing high level functionalities to perform data reduction on scientific FITS images. Built on top of various open source reduction tools released to the community by TERAPIX (http://terapix.iap.fr), Youpi can help organize data, manage processing jobs on a computer cluster in real time (using Condor) and facilitate teamwork by allowing fine-grain sharing of results and data. Youpi is modular and comes with plugins which perform, from within a browser, various processing tasks such as evaluating the quality of incoming images (using the QualityFITS software package), computing astrometric and photometric solutions (using SCAMP), resampling and co-adding FITS images (using SWarp) and extracting sources and building source catalogues from astronomical images (using SExtractor). Youpi is useful for small to medium-sized data reduction projects; it is free and is published under the GNU General Public License.
Jflow: a workflow management system for web applications.
Mariette, Jérôme; Escudié, Frédéric; Bardou, Philippe; Nabihoudine, Ibouniyamine; Noirot, Céline; Trotard, Marie-Stéphane; Gaspin, Christine; Klopp, Christophe
2016-02-01
Biologists produce large data sets and demand rich and simple web portals in which they can upload and analyze their files. Providing such tools requires masking the complexity induced by the underlying High Performance Computing (HPC) environment. The connection between interface and computing infrastructure is usually specific to each portal. With Jflow, we introduce a Workflow Management System (WMS) composed of jQuery plug-ins, which can easily be embedded in any web application, and a Python library providing all the features required to set up, run and monitor workflows. Jflow is available under the GNU General Public License (GPL) at http://bioinfo.genotoul.fr/jflow. The package comes with full documentation, a quick start and a running test portal. Contact: Jerome.Mariette@toulouse.inra.fr. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Using quantum filters to process images of diffuse axonal injury
NASA Astrophysics Data System (ADS)
Pineda Osorio, Mateo
2014-06-01
Images corresponding to diffuse axonal injury (DAI) are processed using several quantum filters, such as Hermite, Weibull and Morse filters. Diffuse axonal injury is a particular, common and severe case of traumatic brain injury (TBI). DAI involves global damage of brain tissue on a microscopic scale and causes serious neurologic abnormalities. New imaging techniques provide excellent images showing the cellular damage related to DAI. These images can be processed with quantum filters, which achieve high resolution of dendritic and axonal structures in both normal and pathological states. Using the Laplacian operators from the new quantum filters, excellent edge detectors for neurofiber resolution are obtained. Quantum image processing of the DAI images is performed using computer algebra, specifically Maple. The construction of quantum filter plugins is proposed as a future research line; such plugins could be incorporated into the ImageJ software package, making their use simpler for medical personnel.
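A generic stand-in for the Laplacian-based edge detection described here, using a Laplacian-of-Gaussian in Python/scipy rather than Maple's quantum-filter Laplacians:

```python
import numpy as np
from scipy import ndimage

def laplacian_edges(img, sigma=1.5, thresh=0.0):
    """Edge map from zero crossings of a Laplacian-of-Gaussian response."""
    log = ndimage.gaussian_laplace(img.astype(float), sigma=sigma)
    # zero crossings: sign changes between horizontal or vertical neighbours
    zc = (np.diff(np.sign(log), axis=0) != 0)[:, :-1] | \
         (np.diff(np.sign(log), axis=1) != 0)[:-1, :]
    # keep only crossings with sufficient local response magnitude
    return zc & (np.abs(log[:-1, :-1]) > thresh)
```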
Flexible network reconstruction from relational databases with Cytoscape and CytoSQL
2010-01-01
Background: Molecular interaction networks can be efficiently studied using network visualization software such as Cytoscape. The relevant nodes, edges and their attributes can be imported in Cytoscape in various file formats, or directly from external databases through specialized third-party plugins. However, molecular data are often stored in relational databases with their own specific structure, for which dedicated plugins do not exist. Therefore, a more generic solution is presented. Results: A new Cytoscape plugin, 'CytoSQL', is developed to connect Cytoscape to any relational database. It allows launching SQL ('Structured Query Language') queries from within Cytoscape, with the option to inject node or edge features of an existing network as SQL arguments, and to convert the retrieved data to Cytoscape network components. Supported by a set of case studies, we demonstrate the flexibility and the power of the CytoSQL plugin in converting specific data subsets into meaningful network representations. Conclusions: CytoSQL offers a unified approach to let Cytoscape interact with relational databases. Thanks to the power of the SQL syntax, this tool can rapidly generate and enrich networks according to very complex criteria. The plugin is available at http://www.ptools.ua.ac.be/CytoSQL. PMID:20594316
Flexible network reconstruction from relational databases with Cytoscape and CytoSQL.
Laukens, Kris; Hollunder, Jens; Dang, Thanh Hai; De Jaeger, Geert; Kuiper, Martin; Witters, Erwin; Verschoren, Alain; Van Leemput, Koenraad
2010-07-01
Molecular interaction networks can be efficiently studied using network visualization software such as Cytoscape. The relevant nodes, edges and their attributes can be imported in Cytoscape in various file formats, or directly from external databases through specialized third-party plugins. However, molecular data are often stored in relational databases with their own specific structure, for which dedicated plugins do not exist. Therefore, a more generic solution is presented. A new Cytoscape plugin, 'CytoSQL', is developed to connect Cytoscape to any relational database. It allows launching SQL ('Structured Query Language') queries from within Cytoscape, with the option to inject node or edge features of an existing network as SQL arguments, and to convert the retrieved data to Cytoscape network components. Supported by a set of case studies, we demonstrate the flexibility and the power of the CytoSQL plugin in converting specific data subsets into meaningful network representations. CytoSQL offers a unified approach to let Cytoscape interact with relational databases. Thanks to the power of the SQL syntax, this tool can rapidly generate and enrich networks according to very complex criteria. The plugin is available at http://www.ptools.ua.ac.be/CytoSQL.
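The plugin's central idea, turning the rows of an arbitrary SQL query into network components, can be sketched outside Cytoscape with sqlite3 and networkx; the database, table and column names below are hypothetical.

```python
import sqlite3
import networkx as nx

# Illustrative counterpart to CytoSQL's approach: every row returned by the
# query becomes one edge, with a remaining column stored as an attribute.
conn = sqlite3.connect('interactions.db')   # hypothetical database file
g = nx.Graph()
for src, dst, score in conn.execute(
        "SELECT protein_a, protein_b, score FROM interactions WHERE score > ?",
        (0.7,)):                             # injected parameter, as in CytoSQL
    g.add_edge(src, dst, weight=score)
print(g.number_of_nodes(), g.number_of_edges())
```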
An Inverse Modeling Plugin for HydroDesktop using the Method of Anchored Distributions (MAD)
NASA Astrophysics Data System (ADS)
Ames, D. P.; Osorio, C.; Over, M. W.; Rubin, Y.
2011-12-01
The CUAHSI Hydrologic Information System (HIS) software stack is based on an open and extensible architecture that facilitates the addition of new functions and capabilities at both the server side (using HydroServer) and the client side (using HydroDesktop). The HydroDesktop client plugin architecture is used here to expose a new scripting based plugin that makes use of the R statistics software as a means for conducting inverse modeling using the Method of Anchored Distributions (MAD). MAD is a Bayesian inversion technique for conditioning computational model parameters on relevant field observations yielding probabilistic distributions of the model parameters, related to the spatial random variable of interest, by assimilating multi-type and multi-scale data. The implementation of a desktop software tool for using the MAD technique is expected to significantly lower the barrier to use of inverse modeling in education, research, and resource management. The HydroDesktop MAD plugin is being developed following a community-based, open-source approach that will help both its adoption and long term sustainability as a user tool. This presentation will briefly introduce MAD, HydroDesktop, and the MAD plugin and software development effort.
DOE Office of Scientific and Technical Information (OSTI.GOV)
None, None
The U.S. Department of Energy and the U.S. Department of Transportation have published a guide to highlight examples of federal support and technical assistance for plug-in electric vehicles (PEVs) and charging stations. The guide provides a description of each opportunity and a point of contact to assist those interested in advancing PEV technology. The Department of Energy’s Alternative Fuels Data Center provides a comprehensive database of federal and state programs that support plug-in electric vehicles and infrastructure.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schey, Stephen; Francfort, Jim
This report focuses on the NASA White Sands Test Facility (WSTF) fleet to identify daily operational characteristics of select vehicles and report findings on vehicle and mission characterizations to support the successful introduction of plug-in electric vehicles (PEVs) into the agencies’ fleets. Individual observations of these selected vehicles provide the basis for recommendations related to electric vehicle adoption and whether a battery electric vehicle (BEV) or plug-in hybrid electric vehicle (PHEV) (collectively plug-in electric vehicles, or PEVs) can fulfill the mission requirements.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schey, Stephen; Francfort, Jim
2014-11-01
This report focuses on the National Institutes of Health (NIH) fleet to identify daily operational characteristics of select vehicles and report findings on vehicle and mission characterizations to support the successful introduction of plug-in electric vehicles (PEVs) into the agencies’ fleets. Individual observations of these selected vehicles provide the basis for recommendations related to electric vehicle adoption and whether a battery electric vehicle (BEV) or plug-in hybrid electric vehicle (PHEV) (collectively plug-in electric vehicles, or PEVs) can fulfill the mission requirements.
CytoscapeRPC: a plugin to create, modify and query Cytoscape networks from scripting languages.
Bot, Jan J; Reinders, Marcel J T
2011-09-01
CytoscapeRPC is a plugin for Cytoscape which allows users to create, query and modify Cytoscape networks from any programming language which supports XML-RPC. This enables them to access Cytoscape functionality and visualize their data interactively without leaving the programming environment with which they are familiar. Install through the Cytoscape plugin manager or visit the web page: http://wiki.nbic.nl/index.php/CytoscapeRPC for the user tutorial and download. j.j.bot@tudelft.nl.
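Because CytoscapeRPC speaks standard XML-RPC, any language's stock client can drive it. The sketch below uses Python's standard library; the port and the method names shown are assumptions for illustration only — the linked tutorial documents the actual call set.

```python
# Minimal sketch of driving Cytoscape through an XML-RPC plugin such as
# CytoscapeRPC, using only the Python standard library. The server URL/port
# and the method names below are assumptions, not the documented API.
import xmlrpc.client

server = xmlrpc.client.ServerProxy("http://localhost:9000")

# Hypothetical calls: create a network, then add two nodes and an edge.
net_id = server.Cytoscape.createNetwork("demo network")
server.Cytoscape.createNode(net_id, "geneA")
server.Cytoscape.createNode(net_id, "geneB")
server.Cytoscape.createEdge(net_id, "geneA", "geneB", "interacts", True)
```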
AITSO: A Tool for Spatial Optimization Based on Artificial Immune Systems
Zhao, Xiang; Liu, Yaolin; Liu, Dianfeng; Ma, Xiaoya
2015-01-01
A great challenge facing geocomputation and spatial analysis is spatial optimization, given that it involves various high-dimensional, nonlinear, and complicated relationships. Many efforts have been made with regard to this specific issue, and the strong ability of artificial immune system algorithms has been proven in previous studies. However, user-friendly professional software is still unavailable, which is a great impediment to the popularity of artificial immune systems. This paper describes a free, universal tool, named AITSO, which is capable of solving various optimization problems. It provides a series of standard application programming interfaces (APIs) which can (1) assist researchers in the development of their own problem-specific application plugins to solve practical problems and (2) allow the implementation of some advanced immune operators into the platform to improve the performance of an algorithm. As an integrated, flexible, and convenient tool, AITSO contributes to knowledge sharing and practical problem solving. It is therefore believed that it will advance the development and popularity of spatial optimization in geocomputation and spatial analysis. PMID:25678911
Maximizing the Benefits of Plug-in Electric Vehicles - Continuum Magazine
[Photo captions only: electric vehicle charging stations in NREL's parking garage and the Testing and Integration Facility. Photos by Dennis Schroeder, NREL.]
Reference Gene Validation for RT-qPCR, a Note on Different Available Software Packages
De Spiegelaere, Ward; Dern-Wieloch, Jutta; Weigel, Roswitha; Schumacher, Valérie; Schorle, Hubert; Nettersheim, Daniel; Bergmann, Martin; Brehm, Ralph; Kliesch, Sabine; Vandekerckhove, Linos; Fink, Cornelia
2015-01-01
Background: An appropriate normalization strategy is crucial for data analysis from real time reverse transcription polymerase chain reactions (RT-qPCR). It is widely supported to identify and validate stable reference genes, since no single biological gene is stably expressed between cell types or within cells under different conditions. Different algorithms exist to validate optimal reference genes for normalization. Applying human cells, we here compare the three main methods to the online available RefFinder tool that integrates these algorithms along with R-based software packages which include the NormFinder and GeNorm algorithms. Results: 14 candidate reference genes were assessed by RT-qPCR in two sample sets, i.e. a set of samples of human testicular tissue containing carcinoma in situ (CIS), and a set of samples from the human adult Sertoli cell line (FS1) either cultured alone or in co-culture with the seminoma like cell line (TCam-2) or with equine bone marrow derived mesenchymal stem cells (eBM-MSC). Expression stabilities of the reference genes were evaluated using geNorm, NormFinder, and BestKeeper. Similar results were obtained by the three approaches for the most and least stably expressed genes. The R-based packages NormqPCR, SLqPCR and the NormFinder for R script gave identical gene rankings. Interestingly, different outputs were obtained between the original software packages and the RefFinder tool, which is based on raw Cq values for input. When the raw data were reanalysed assuming 100% efficiency for all genes, then the outputs of the original software packages were similar to the RefFinder software, indicating that RefFinder outputs may be biased because PCR efficiencies are not taken into account. Conclusions: This report shows that assay efficiency is an important parameter for reference gene validation. New software tools that incorporate these algorithms should be carefully validated prior to use. PMID:25825906
Reference gene validation for RT-qPCR, a note on different available software packages.
De Spiegelaere, Ward; Dern-Wieloch, Jutta; Weigel, Roswitha; Schumacher, Valérie; Schorle, Hubert; Nettersheim, Daniel; Bergmann, Martin; Brehm, Ralph; Kliesch, Sabine; Vandekerckhove, Linos; Fink, Cornelia
2015-01-01
An appropriate normalization strategy is crucial for data analysis from real time reverse transcription polymerase chain reactions (RT-qPCR). It is widely supported to identify and validate stable reference genes, since no single biological gene is stably expressed between cell types or within cells under different conditions. Different algorithms exist to validate optimal reference genes for normalization. Applying human cells, we here compare the three main methods to the online available RefFinder tool that integrates these algorithms along with R-based software packages which include the NormFinder and GeNorm algorithms. 14 candidate reference genes were assessed by RT-qPCR in two sample sets, i.e. a set of samples of human testicular tissue containing carcinoma in situ (CIS), and a set of samples from the human adult Sertoli cell line (FS1) either cultured alone or in co-culture with the seminoma like cell line (TCam-2) or with equine bone marrow derived mesenchymal stem cells (eBM-MSC). Expression stabilities of the reference genes were evaluated using geNorm, NormFinder, and BestKeeper. Similar results were obtained by the three approaches for the most and least stably expressed genes. The R-based packages NormqPCR, SLqPCR and the NormFinder for R script gave identical gene rankings. Interestingly, different outputs were obtained between the original software packages and the RefFinder tool, which is based on raw Cq values for input. When the raw data were reanalysed assuming 100% efficiency for all genes, then the outputs of the original software packages were similar to the RefFinder software, indicating that RefFinder outputs may be biased because PCR efficiencies are not taken into account. This report shows that assay efficiency is an important parameter for reference gene validation. New software tools that incorporate these algorithms should be carefully validated prior to use.
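The efficiency point in the conclusions can be made concrete with the widely used efficiency-corrected ratio model (Pfaffl). The sketch below, using invented Cq values purely for illustration, shows how the same raw data yield different expression ratios once measured efficiencies replace the 100% assumption.

```python
# Why a 100%-efficiency assumption matters: a small deviation in PCR
# efficiency changes the computed expression ratio. This applies the
# widely used efficiency-corrected (Pfaffl) model; the Cq differences
# below are made-up numbers purely for illustration.
def pfaffl_ratio(e_target, dcq_target, e_ref, dcq_ref):
    """Expression ratio with per-gene amplification efficiencies.

    e_* are efficiencies as fold-amplification per cycle (2.0 = 100%);
    dcq_* are Cq(control) - Cq(sample) for target and reference gene.
    """
    return (e_target ** dcq_target) / (e_ref ** dcq_ref)

# Identical Cq data, evaluated with assumed vs. measured efficiencies:
print(pfaffl_ratio(2.0, 3.0, 2.0, 0.5))    # 100% efficiency assumed
print(pfaffl_ratio(1.9, 3.0, 1.95, 0.5))   # measured efficiencies differ
```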
Colombet, B; Woodman, M; Badier, J M; Bénar, C G
2015-03-15
The importance of digital signal processing in clinical neurophysiology is growing steadily, involving clinical researchers and methodologists. There is a need for crossing the gap between these communities by providing efficient delivery of newly designed algorithms to end users. We have developed such a tool which both visualizes and processes data and, additionally, acts as a software development platform. AnyWave was designed to run on all common operating systems. It provides access to a variety of data formats and it employs high fidelity visualization techniques. It also allows using external tools as plug-ins, which can be developed in languages including C++, MATLAB and Python. In the current version, plug-ins allow computation of connectivity graphs (non-linear correlation h2) and time-frequency representation (Morlet wavelets). The software is freely available under the LGPL3 license. AnyWave is designed as an open, highly extensible solution, with an architecture that permits rapid delivery of new techniques to end users. We have developed AnyWave software as an efficient neurophysiological data visualizer able to integrate state of the art techniques. AnyWave offers an interface well suited to the needs of clinical research and an architecture designed for integrating new tools. We expect this software to strengthen the collaboration between clinical neurophysiologists and researchers in biomedical engineering and signal processing.
Rattner, Alexander S.; Guillen, Donna Post; Joshi, Alark; ...
2016-03-17
Photo- and physically realistic techniques are often insufficient for visualization of fluid flow simulations, especially for 3D and time-varying studies. Substantial research effort has been dedicated to the development of non-photorealistic and illustration-inspired visualization techniques for compact and intuitive presentation of such complex datasets. However, a great deal of work has been reproduced in this field, as many research groups have developed specialized visualization software. Additionally, interoperability between illustrative visualization software is limited due to diverse processing and rendering architectures employed in different studies. In this investigation, a framework for illustrative visualization is proposed, and implemented in MarmotViz, a ParaView plug-in, enabling its use on a variety of computing platforms with various data file formats and mesh geometries. Region-of-interest identification and feature-tracking algorithms incorporated into this tool are described. Implementations of multiple illustrative effect algorithms are also presented to demonstrate the use and flexibility of this framework. Here, by providing an integrated framework for illustrative visualization of CFD data, MarmotViz can serve as a valuable asset for the interpretation of simulations of ever-growing scale.
Xiao, Xun; Geyer, Veikko F.; Bowne-Anderson, Hugo; Howard, Jonathon; Sbalzarini, Ivo F.
2016-01-01
Biological filaments, such as actin filaments, microtubules, and cilia, are often imaged using different light-microscopy techniques. Reconstructing the filament curve from the acquired images constitutes the filament segmentation problem. Since filaments have lower dimensionality than the image itself, there is an inherent trade-off between tracing the filament with sub-pixel accuracy and avoiding noise artifacts. Here, we present a globally optimal filament segmentation method based on B-spline vector level-sets and a generalized linear model for the pixel intensity statistics. We show that the resulting optimization problem is convex and can hence be solved with global optimality. We introduce a simple and efficient algorithm to compute such optimal filament segmentations, and provide an open-source implementation as an ImageJ/Fiji plugin. We further derive an information-theoretic lower bound on the filament segmentation error, quantifying how well an algorithm could possibly do given the information in the image. We show that our algorithm asymptotically reaches this bound in the spline coefficients. We validate our method in comprehensive benchmarks, compare with other methods, and show applications from fluorescence, phase-contrast, and dark-field microscopy. PMID:27104582
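For readers unfamiliar with the parametrization, the sketch below (not the authors' implementation, which ships as an ImageJ/Fiji plugin) shows how a short vector of B-spline coefficients defines a smooth, sub-pixel curve using SciPy; the control points are arbitrary example values.

```python
# The segmentation above represents a filament as a B-spline curve. This
# sketch shows only the representation: a handful of spline coefficients
# define a smooth curve that can be evaluated at sub-pixel positions.
import numpy as np
from scipy.interpolate import BSpline

k = 3                                        # cubic spline
ctrl = np.array([[0, 0], [1, 2], [3, 3], [5, 2], [6, 0]], float)  # example
n = len(ctrl)
# Clamped knot vector so the curve starts/ends at the end control points.
t = np.concatenate(([0] * k, np.linspace(0, 1, n - k + 1), [1] * k))
curve = BSpline(t, ctrl, k)

u = np.linspace(0, 1, 200)
points = curve(u)                            # (200, 2) sub-pixel positions
print(points[:3])
```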
High-Performance Computing for the Electromagnetic Modeling and Simulation of Interconnects
NASA Technical Reports Server (NTRS)
Schutt-Aine, Jose E.
1996-01-01
The electromagnetic modeling of packages and interconnects plays a very important role in the design of high-speed digital circuits, and is most efficiently performed by using computer-aided design algorithms. In recent years, packaging has become a critical area in the design of high-speed communication systems and fast computers, and the importance of the software support for their development has increased accordingly. Throughout this project, our efforts have focused on the development of modeling and simulation techniques and algorithms that permit the fast computation of the electrical parameters of interconnects and the efficient simulation of their electrical performance.
Valuation of plug-in vehicle life-cycle air emissions and oil displacement benefits
Michalek, Jeremy J.; Chester, Mikhail; Jaramillo, Paulina; Samaras, Constantine; Shiau, Ching-Shin Norman; Lave, Lester B.
2011-01-01
We assess the economic value of life-cycle air emissions and oil consumption from conventional vehicles, hybrid-electric vehicles (HEVs), plug-in hybrid-electric vehicles (PHEVs), and battery electric vehicles in the US. We find that plug-in vehicles may reduce or increase externality costs relative to grid-independent HEVs, depending largely on greenhouse gas and SO2 emissions produced during vehicle charging and battery manufacturing. However, even if future marginal damages from emissions of battery and electricity production drop dramatically, the damage reduction potential of plug-in vehicles remains small compared to ownership cost. As such, to offer a socially efficient approach to emissions and oil consumption reduction, lifetime cost of plug-in vehicles must be competitive with HEVs. Current subsidies intended to encourage sales of plug-in vehicles with large capacity battery packs exceed our externality estimates considerably, and taxes that optimally correct for externality damages would not close the gap in ownership cost. In contrast, HEVs and PHEVs with small battery packs reduce externality damages at low (or no) additional cost over their lifetime. Although large battery packs allow vehicles to travel longer distances using electricity instead of gasoline, large packs are more expensive, heavier, and more emissions intensive to produce, with lower utilization factors, greater charging infrastructure requirements, and life-cycle implications that are more sensitive to uncertain, time-sensitive, and location-specific factors. To reduce air emission and oil dependency impacts from passenger vehicles, strategies to promote adoption of HEVs and PHEVs with small battery packs offer more social benefits per dollar spent. PMID:21949359
Valuation of plug-in vehicle life-cycle air emissions and oil displacement benefits.
Michalek, Jeremy J; Chester, Mikhail; Jaramillo, Paulina; Samaras, Constantine; Shiau, Ching-Shin Norman; Lave, Lester B
2011-10-04
We assess the economic value of life-cycle air emissions and oil consumption from conventional vehicles, hybrid-electric vehicles (HEVs), plug-in hybrid-electric vehicles (PHEVs), and battery electric vehicles in the US. We find that plug-in vehicles may reduce or increase externality costs relative to grid-independent HEVs, depending largely on greenhouse gas and SO2 emissions produced during vehicle charging and battery manufacturing. However, even if future marginal damages from emissions of battery and electricity production drop dramatically, the damage reduction potential of plug-in vehicles remains small compared to ownership cost. As such, to offer a socially efficient approach to emissions and oil consumption reduction, lifetime cost of plug-in vehicles must be competitive with HEVs. Current subsidies intended to encourage sales of plug-in vehicles with large capacity battery packs exceed our externality estimates considerably, and taxes that optimally correct for externality damages would not close the gap in ownership cost. In contrast, HEVs and PHEVs with small battery packs reduce externality damages at low (or no) additional cost over their lifetime. Although large battery packs allow vehicles to travel longer distances using electricity instead of gasoline, large packs are more expensive, heavier, and more emissions intensive to produce, with lower utilization factors, greater charging infrastructure requirements, and life-cycle implications that are more sensitive to uncertain, time-sensitive, and location-specific factors. To reduce air emission and oil dependency impacts from passenger vehicles, strategies to promote adoption of HEVs and PHEVs with small battery packs offer more social benefits per dollar spent.
Dual-Drive Production Prototype Project
DOT National Transportation Integrated Search
2009-06-01
This project was an initiative to engineer, develop and build a plug-in hybrid-electric vehicle using the Dual-Drive system. The project aimed to build a plug-in hybrid utilitarian vehicle on a light commercial vehicle platform. The hybrid vehicle wi...
Collaborative WorkBench for Researchers - Work Smarter, Not Harder
NASA Technical Reports Server (NTRS)
Ramachandran, Rahul; Kuo, Kwo-sen; Maskey, Manil; Lynnes, Christopher
2014-01-01
It is important to define some commonly used terminology related to collaboration to facilitate clarity in later discussions. We define provisioning as infrastructure capabilities such as computation, storage, data, and tools provided by some agency or similarly trusted institution. Sharing is defined as the process of exchanging data, programs, and knowledge among individuals (often strangers) and groups. Collaboration is a specialized case of sharing. In collaboration, sharing with others (usually known colleagues) is done in pursuit of a common scientific goal or objective. Collaboration entails more dynamic and frequent interactions and can occur at different speeds. Synchronous collaboration occurs in real time such as editing a shared document on the fly, chatting, video conference, etc., and typically requires a peer-to-peer connection. Asynchronous collaboration is episodic in nature based on a push-pull model. Examples of asynchronous collaboration include email exchanges, blogging, repositories, etc. The purpose of a workbench is to provide a customizable framework for different applications. Since the workbench will be common to all the customized tools, it promotes building modular functionality that can be used and reused by multiple tools. The objective of our Collaborative Workbench (CWB) is thus to create such an open and extensible framework for the Earth Science community via a set of plug-ins. Our CWB is based on the Eclipse [2] Integrated Development Environment (IDE), which is designed as a small kernel containing a plug-in loader for hundreds of plug-ins. The kernel itself is an implementation of a known specification to provide an environment for the plug-ins to execute. This design enables modularity, where discrete chunks of functionality can be reused to build new applications. The minimal set of plug-ins necessary to create a client application is called the Eclipse Rich Client Platform (RCP) [3]. The Eclipse RCP also supports thousands of community-contributed plug-ins, making it a popular development platform for many diverse applications including the Science Activity Planner developed at JPL for the Mars rovers [4] and the scientific experiment tool Gumtree [5]. By leveraging the Eclipse RCP to provide an open, extensible framework, a CWB supports customizations via plug-ins to build rich user applications specific for Earth Science. More importantly, CWB plug-ins can be used by existing science tools built off Eclipse such as IDL or PyDev to provide seamless collaboration functionalities.
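The kernel-plus-plugin-loader design described above is language-neutral, even though Eclipse realizes it in Java. A minimal sketch of the idea in Python follows; the module and entry-point conventions are hypothetical, not the Eclipse or CWB API.

```python
# Sketch of a "small kernel + plugin loader" design in Python. The
# `Plugin` class / `start(kernel)` convention is invented for this
# illustration; Eclipse defines its own, much richer, contract.
import importlib

class Kernel:
    """Tiny plugin kernel: loads and starts plugins by module name."""

    def __init__(self):
        self.plugins = {}

    def load(self, module_name):
        # Each plugin module is expected to expose a `Plugin` class with
        # a `start(kernel)` entry point -- a convention, not a standard.
        module = importlib.import_module(module_name)
        plugin = module.Plugin()
        self.plugins[module_name] = plugin
        plugin.start(self)

if __name__ == "__main__":
    kernel = Kernel()
    # kernel.load("cwb_collaboration")  # hypothetical plugin module
```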
NASA Astrophysics Data System (ADS)
Mohamed, Ahmed
Efficient and reliable techniques for power delivery and utilization are needed to account for the increased penetration of renewable energy sources in electric power systems. Such methods are also required for current and future demands of plug-in electric vehicles and high-power electronic loads. Distributed control and optimal power network architectures will lead to viable solutions to the energy management issue with high level of reliability and security. This dissertation is aimed at developing and verifying new techniques for distributed control by deploying DC microgrids, involving distributed renewable generation and energy storage, through the operating AC power system. To achieve the findings of this dissertation, an energy system architecture was developed involving AC and DC networks, both with distributed generations and demands. The various components of the DC microgrid were designed and built including DC-DC converters, voltage source inverters (VSI) and AC-DC rectifiers featuring novel designs developed by the candidate. New control techniques were developed and implemented to maximize the operating range of the power conditioning units used for integrating renewable energy into the DC bus. The control and operation of the DC microgrids in the hybrid AC/DC system involve intelligent energy management. Real-time energy management algorithms were developed and experimentally verified. These algorithms are based on intelligent decision-making elements along with an optimization process. This was aimed at enhancing the overall performance of the power system and mitigating the effect of heavy non-linear loads with variable intensity and duration. The developed algorithms were also used for managing the charging/discharging process of plug-in electric vehicle emulators. The protection of the proposed hybrid AC/DC power system was studied. Fault analysis and protection scheme and coordination, in addition to ideas on how to retrofit currently available protection concepts and devices for AC systems in a DC network, were presented. A study was also conducted on the effect of changing the distribution architecture and distributing the storage assets on the various zones of the network on the system's dynamic security and stability. A practical shipboard power system was studied as an example of a hybrid AC/DC power system involving pulsed loads. Generally, the proposed hybrid AC/DC power system, together with most of the ideas, controls and algorithms presented in this dissertation, was experimentally verified at the Smart Grid Testbed, Energy Systems Research Laboratory.
DOT National Transportation Integrated Search
2018-02-02
This research project explores the plug-in electric vehicle (PEV) market, including both Battery Electric Vehicles (BEVs) and Plug-in Hybrid Electric Vehicles (PHEVs), and the sociodemographic characteristics of purchasing households. We use detailed...
Innovations for ISS Plug-In Plan (IPiP) Operations
NASA Technical Reports Server (NTRS)
Moore, Kevin D.
2013-01-01
Limited resources and increasing requirements will continue to influence decisions on ISS. The ISS Plug-In Plan (IPiP) supports power and data for utilization, systems, and daily operations through the Electrical Power System (EPS) Secondary Power/Data Subsystem. Given the fluid launch schedule, the focus of the Plug-In Plan has evolved to anticipate future requirements by judicious development and delivery of power supplies, power strips, Alternating Current (AC) power inverters, along with innovative deployment strategies. A partnership of ISS Program Office, Engineering Directorate, Mission Operations, and International Partners poses unique solutions with existing on-board equipment and resources.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schey, Stephen; Francfort, Jim
This report focuses on the Sleeping Bear Dunes National Lakeshore (SLBE) fleet to identify daily operational characteristics of select vehicles and report findings on vehicle and mission characterizations to support the successful introduction of plug-in electric vehicles (PEVs) into the agencies’ fleets. Individual observations of these selected vehicles provide the basis for recommendations related to electric vehicle adoption and whether a battery electric vehicle (BEV) or plug-in hybrid electric vehicle (PHEV) (collectively plug-in electric vehicles, or PEVs) can fulfill the mission requirements.
Research on laser marking speed optimization by using genetic algorithm.
Wang, Dongyun; Yu, Qiwei; Zhang, Yu
2015-01-01
Laser marking machines are the most common coding equipment on product packaging lines. However, the speed of laser marking has become a bottleneck of production. In order to remove this bottleneck, a new method based on a genetic algorithm is designed. On the basis of this algorithm, a controller was designed, and simulations and experiments were performed. The results show that using this algorithm could effectively improve laser marking efficiency by 25%.
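The paper does not reproduce its controller code; as a hedged sketch of the general idea, the following genetic algorithm shortens the travel path between marking points (a selection-and-mutation loop over permutations), which is one route to the reported efficiency gains. All parameters and coordinates are illustrative.

```python
# Generic GA sketch for ordering marking points to minimize travel path.
# Not the paper's controller; population size, generations, and the
# inversion-mutation operator are illustrative choices.
import random

def path_length(order, pts):
    return sum(abs(complex(*pts[a]) - complex(*pts[b]))
               for a, b in zip(order, order[1:]))

def evolve(pts, pop_size=60, generations=300):
    n = len(pts)
    pop = [random.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda o: path_length(o, pts))
        survivors = pop[:pop_size // 2]                 # selection
        children = []
        for parent in survivors:
            child = parent[:]
            i, j = sorted(random.sample(range(n), 2))
            child[i:j] = reversed(child[i:j])           # inversion mutation
            children.append(child)
        pop = survivors + children
    return min(pop, key=lambda o: path_length(o, pts))

marks = [(random.random() * 100, random.random() * 100) for _ in range(25)]
best = evolve(marks)
print("best path length:", round(path_length(best, marks), 1))
```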
Integrated Network Decompositions and Dynamic Programming for Graph Optimization (INDDGO)
DOE Office of Scientific and Technical Information (OSTI.GOV)
The INDDGO software package offers a set of tools for finding exact solutions to graph optimization problems via tree decompositions and dynamic programming algorithms. Currently the framework offers serial and parallel (distributed memory) algorithms for finding tree decompositions and solving the maximum weighted independent set problem. The parallel dynamic programming algorithm is implemented on top of the MADNESS task-based runtime.
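As a small concrete instance of the dynamic programming INDDGO applies over tree decompositions, the sketch below solves maximum weighted independent set in the degenerate case where the graph is itself a tree (treewidth 1). It is not INDDGO code, only an illustration of the two-state recursion.

```python
# Maximum weighted independent set on a tree via the classic include/
# exclude DP -- the treewidth-1 special case of the tree-decomposition
# dynamic programming that INDDGO generalizes.
def mwis_tree(adj, weight, root=0):
    """adj: node -> list of neighbours; weight: node -> weight."""
    include, exclude, visited = {}, {}, set()

    def dfs(u):
        visited.add(u)
        include[u], exclude[u] = weight[u], 0
        for v in adj[u]:
            if v not in visited:
                dfs(v)
                include[u] += exclude[v]          # u taken: children out
                exclude[u] += max(include[v], exclude[v])
        return max(include[u], exclude[u])

    return dfs(root)

# A small path graph 0-1-2-3; the optimum picks nodes 1 and 3.
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
w = {0: 1, 1: 4, 2: 2, 3: 3}
print(mwis_tree(adj, w))   # -> 7
```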
Application Transparent HTTP Over a Disruption Tolerant Smartnet
2014-09-01
[Fragment of an abbreviations list: ASCII, American Standard Code for Information Interchange; BP, Bundle Protocol; BPA, bundle protocol agent; CLA, convergence layer adapters; CPU, central processing unit; ...] ... forwarding them through the plugin pipeline. The initial version of the DTNInput plugin uses the BBN Spindle bundle protocol agent (BPA) implementation.
ARC SDK: A toolbox for distributed computing and data applications
NASA Astrophysics Data System (ADS)
Skou Andersen, M.; Cameron, D.; Lindemann, J.
2014-06-01
Grid middleware suites provide tools to perform the basic tasks of job submission and retrieval and data access, however these tools tend to be low-level, operating on individual jobs or files and lacking in higher-level concepts. User communities therefore generally develop their own application-layer software catering to their specific communities' needs on top of the Grid middleware. It is thus important for the Grid middleware to provide a friendly, well documented and simple to use interface for the applications to build upon. The Advanced Resource Connector (ARC), developed by NorduGrid, provides a Software Development Kit (SDK) which enables applications to use the middleware for job and data management. This paper presents the architecture and functionality of the ARC SDK along with an example graphical application developed with the SDK. The SDK consists of a set of libraries accessible through Application Programming Interfaces (API) in several languages. It contains extensive documentation and example code and is available on multiple platforms. The libraries provide generic interfaces and rely on plugins to support a given technology or protocol and this modular design makes it easy to add a new plugin if the application requires supporting additional technologies. The ARC Graphical Clients package is a graphical user interface built on top of the ARC SDK and the Qt toolkit and it is presented here as a fully functional example of an application. It provides a graphical interface to enable job submission and management at the click of a button, and allows data on any Grid storage system to be manipulated using a visual file system hierarchy, as if it were a regular file system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Herrmann, Matthias
2014-06-16
Nowadays, a large number of different electrochemical energy storage systems are known. In the last two decades the development was strongly driven by a continuously growing market of portable electronic devices (e.g. cellular phones, laptop computers, camcorders, cameras, tools). Current intensive efforts are under way to develop systems for the automotive industry within the framework of electrically propelled mobility (e.g. hybrid electric vehicles, plug-in hybrid electric vehicles, full electric vehicles) and also for the energy storage market (e.g. electrical grid stability, renewable energies). Besides the different systems (cell chemistries), electrochemical cells and batteries were developed and are offered in many shapes, sizes and designs, in order to meet performance and design requirements of the widespread applications. Proper packaging is thereby one important technological step for designing optimum, reliable and safe batteries for operation. In this contribution, current packaging approaches of cells and batteries together with the corresponding materials are discussed. The focus is laid on rechargeable systems for industrial applications (i.e. alkaline systems, lithium-ion, lead-acid). In principle, four different cell types (shapes) can be identified - button, cylindrical, prismatic and pouch. Cell size can be either in accordance with international (e.g. International Electrotechnical Commission, IEC) or other standards or can meet application-specific dimensions. Since cell housing or container, terminals and, if necessary, safety installations as inactive (non-reactive) materials reduce energy density of the battery, the development of low-weight packages is a challenging task. In addition to that, other requirements have to be fulfilled: mechanical stability and durability, sealing (e.g. high permeation barrier against humidity for lithium-ion technology), high packing efficiency, possible installation of safety devices (current interrupt device, valve, etc.), chemical inertness, cost issues, and others. Finally, proper cell design has to be considered for effective thermal management (i.e. cooling and heating) of battery packs.
Winston, R.B.
1999-01-01
This report describes enhancements to a Graphical-User Interface (GUI) for MODFLOW-96, the U.S. Geological Survey (USGS) modular, three-dimensional, finite-difference ground-water flow model, and MOC3D, the USGS three-dimensional, method-of-characteristics solute-transport model. The GUI is a plug-in extension (PIE) for the commercial program Argus ONE. The GUI has been modified to support MODPATH (a particle tracking post-processing package for MODFLOW), ZONEBDGT (a computer program for calculating subregional water budgets), and the Stream, Horizontal-Flow Barrier, and Flow and Head Boundary packages in MODFLOW. Context-sensitive help has been added to make the GUI easier to use and to understand. In large part, the help consists of quotations from the relevant sections of this report and its predecessors. The revised interface includes automatic creation of geospatial information layers required for the added programs and packages, and menus and dialog boxes for input of parameters for simulation control. The GUI creates formatted ASCII files that can be read by MODFLOW-96, MOC3D, MODPATH, and ZONEBDGT. All four programs can be executed within the Argus ONE application (Argus Interware, Inc., 1997). Spatial results of MODFLOW-96, MOC3D, and MODPATH can be visualized within Argus ONE. Results from ZONEBDGT can be visualized in an independent program that can also be used to view budget data from MODFLOW, MOC3D, and SUTRA. Another independent program extracts hydrographs of head or drawdown at individual cells from formatted MODFLOW head and drawdown files. A web-based tutorial on the use of MODFLOW with Argus ONE has also been updated. The internal structure of the GUI has been modified to make it possible for advanced users to easily customize the GUI. Two additional, independent PIEs were developed to allow users to edit the positions of nodes and to facilitate exporting the grid geometry to external programs.
LFSTAT - An R-Package for Low-Flow Analysis
NASA Astrophysics Data System (ADS)
Koffler, D.; Laaha, G.
2012-04-01
When analysing daily streamflow data focusing on low flow and drought, the state of the art is well documented in the Manual on Low-Flow Estimation and Prediction [1] published by the WMO. While it is clear what has to be done, it is not so clear how to perform the analysis and make the calculation as reproducible as possible. Our software solution expands the high-performing statistical open source software package R to analyse daily streamflow data focusing on low flows. As command-line based programs are not everyone's preference, we also offer a plug-in for the R-Commander, an easy to use graphical user interface (GUI) for analysing data in R. Functionality includes estimation of the most important low-flow indices. Besides the standard flow indices, the BFI and recession constants can be computed. The main applications of L-moment based extreme value analysis and regional frequency analysis (RFA) are available. Calculation of streamflow deficits is another important feature. The most common graphics are prepared and can easily be modified according to the user's preferences. Graphics include hydrographs for different periods, flexible streamflow deficit plots, baseflow visualisation, flow duration curves as well as double mass curves, just to name a few. The package uses an S3 class called lfobj (low-flow objects). Once these objects are created, analysis can be performed by mouse click, and a script can be saved to make the analysis easily reproducible. At the moment we offer implementations of all major methods proposed in the WMO Manual on Low-Flow Estimation and Prediction. Future plans include, e.g., report export to odt files using odfWeave. We hope to offer a tool that eases and structures the analysis of streamflow data focusing on low flows and makes analyses transparent and communicable. The package is designed for hydrological research and water management practice, but can also be used to teach students the first steps in low-flow hydrology.
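lfstat is an R package; purely to illustrate the kind of index it computes, the following Python/NumPy sketch derives Q95 (the flow exceeded 95% of the time) and a 7-day mean annual minimum from a synthetic streamflow series. The data are random and the year-by-year slicing is simplified.

```python
# Two common low-flow indices on synthetic daily streamflow data:
# Q95 (flow exceeded 95% of the time) and the mean annual minimum of
# 7-day average flows. Data and year boundaries are illustrative.
import numpy as np

rng = np.random.default_rng(0)
flow = np.exp(rng.normal(1.0, 0.6, size=3 * 365))        # 3 "years" of data

q95 = np.percentile(flow, 5)                  # exceeded 95% of the time
weekly = np.convolve(flow, np.ones(7) / 7, mode="valid")  # 7-day means
# Simplified annual slicing (ignores the 6-day edge loss of the window).
mam7 = np.mean([weekly[y * 365:(y + 1) * 365].min() for y in range(3)])

print(f"Q95 = {q95:.2f}, MAM7 = {mam7:.2f}")
```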
NASA Technical Reports Server (NTRS)
Csank, Jeffrey; Stueber, Thomas
2012-01-01
An inlet system is being tested to evaluate methodologies for a turbine based combined cycle propulsion system to perform a controlled inlet mode transition. Prior to wind tunnel based hardware testing of controlled mode transitions, simulation models are used to test, debug, and validate potential control algorithms. One candidate simulation package for this purpose is the High Mach Transient Engine Cycle Code (HiTECC). The HiTECC simulation package models the inlet system, propulsion systems, thermal energy, geometry, nozzle, and fuel systems. This paper discusses the modification and redesign of the simulation package and control system to represent the NASA large-scale inlet model for Combined Cycle Engine mode transition studies, mounted in NASA Glenn's 10-foot by 10-foot Supersonic Wind Tunnel. This model will be used for designing and testing candidate control algorithms before implementation.
NASA Technical Reports Server (NTRS)
Csank, Jeffrey T.; Stueber, Thomas J.
2012-01-01
An inlet system is being tested to evaluate methodologies for a turbine based combined cycle propulsion system to perform a controlled inlet mode transition. Prior to wind tunnel based hardware testing of controlled mode transitions, simulation models are used to test, debug, and validate potential control algorithms. One candidate simulation package for this purpose is the High Mach Transient Engine Cycle Code (HiTECC). The HiTECC simulation package models the inlet system, propulsion systems, thermal energy, geometry, nozzle, and fuel systems. This paper discusses the modification and redesign of the simulation package and control system to represent the NASA large-scale inlet model for Combined Cycle Engine mode transition studies, mounted in NASA Glenn's 10- by 10-Foot Supersonic Wind Tunnel. This model will be used for designing and testing candidate control algorithms before implementation.
A Midas plugin to enable construction of reproducible web-based image processing pipelines
Grauer, Michael; Reynolds, Patrick; Hoogstoel, Marion; Budin, Francois; Styner, Martin A.; Oguz, Ipek
2013-01-01
Image processing is an important quantitative technique for neuroscience researchers, but difficult for those who lack experience in the field. In this paper we present a web-based platform that allows an expert to create a brain image processing pipeline, enabling execution of that pipeline even by those biomedical researchers with limited image processing knowledge. These tools are implemented as a plugin for Midas, an open-source toolkit for creating web based scientific data storage and processing platforms. Using this plugin, an image processing expert can construct a pipeline, create a web-based User Interface, manage jobs, and visualize intermediate results. Pipelines are executed on a grid computing platform using BatchMake and HTCondor. This represents a new capability for biomedical researchers and offers an innovative platform for scientific collaboration. Current tools work well, but can be inaccessible for those lacking image processing expertise. Using this plugin, researchers in collaboration with image processing experts can create workflows with reasonable default settings and streamlined user interfaces, and data can be processed easily from a lab environment without the need for a powerful desktop computer. This platform allows simplified troubleshooting, centralized maintenance, and easy data sharing with collaborators. These capabilities enable reproducible science by sharing datasets and processing pipelines between collaborators. In this paper, we present a description of this innovative Midas plugin, along with results obtained from building and executing several ITK based image processing workflows for diffusion weighted MRI (DW MRI) of rodent brain images, as well as recommendations for building automated image processing pipelines. Although the particular image processing pipelines developed were focused on rodent brain MRI, the presented plugin can be used to support any executable or script-based pipeline. PMID:24416016
A Midas plugin to enable construction of reproducible web-based image processing pipelines.
Grauer, Michael; Reynolds, Patrick; Hoogstoel, Marion; Budin, Francois; Styner, Martin A; Oguz, Ipek
2013-01-01
Image processing is an important quantitative technique for neuroscience researchers, but difficult for those who lack experience in the field. In this paper we present a web-based platform that allows an expert to create a brain image processing pipeline, enabling execution of that pipeline even by those biomedical researchers with limited image processing knowledge. These tools are implemented as a plugin for Midas, an open-source toolkit for creating web based scientific data storage and processing platforms. Using this plugin, an image processing expert can construct a pipeline, create a web-based User Interface, manage jobs, and visualize intermediate results. Pipelines are executed on a grid computing platform using BatchMake and HTCondor. This represents a new capability for biomedical researchers and offers an innovative platform for scientific collaboration. Current tools work well, but can be inaccessible for those lacking image processing expertise. Using this plugin, researchers in collaboration with image processing experts can create workflows with reasonable default settings and streamlined user interfaces, and data can be processed easily from a lab environment without the need for a powerful desktop computer. This platform allows simplified troubleshooting, centralized maintenance, and easy data sharing with collaborators. These capabilities enable reproducible science by sharing datasets and processing pipelines between collaborators. In this paper, we present a description of this innovative Midas plugin, along with results obtained from building and executing several ITK based image processing workflows for diffusion weighted MRI (DW MRI) of rodent brain images, as well as recommendations for building automated image processing pipelines. Although the particular image processing pipelines developed were focused on rodent brain MRI, the presented plugin can be used to support any executable or script-based pipeline.
NASA Astrophysics Data System (ADS)
Trowler, Derik Wesley
The research objective of this study was to develop a sizing method for community energy storage systems with emphasis on preventing distribution transformer overloading due to plug-in electric vehicle charging. The method as developed showed the formulation of a diversified load profile based upon residential load data for several customers on the American Electric Power system. Once a load profile was obtained, plug-in electric vehicle charging scenarios which were based upon expected adoption and charging trends were superimposed on the load profile to show situations where transformers (in particular 25 kVA, 50 kVA, and 100 kVA) would be overloaded during peak hours. Once the total load profiles were derived, the energy and power requirements of community energy storage systems were calculated for a number of scenarios with different combinations of numbers of homes and plug-in electric vehicles. The results were recorded and illustrated in charts so that one could determine the minimum size per application. Other topics that were covered in this thesis were the state of the art and future trends in plug-in electric vehicle and battery chemistry adoption and development. The goal of the literature review was to confirm the already suspected notion that Li-ion batteries are the best suited and soon to be most cost-effective solution for applications requiring small, efficient, reliable, and light-weight battery systems such as plug-in electric vehicles and community energy storage systems. This thesis also includes a chapter showing system modeling in MATLAB/Simulink. All in all, this thesis covers a wide variety of considerations involved in the designing and deploying of community energy storage systems intended to mitigate the effects of distribution transformer overloading.
Tuikkala, Johannes; Vähämaa, Heidi; Salmela, Pekka; Nevalainen, Olli S; Aittokallio, Tero
2012-03-26
Graph drawing is an integral part of many systems biology studies, enabling visual exploration and mining of large-scale biological networks. While a number of layout algorithms are available in popular network analysis platforms, such as Cytoscape, it remains poorly understood how well their solutions reflect the underlying biological processes that give rise to the network connectivity structure. Moreover, visualizations obtained using conventional layout algorithms, such as those based on the force-directed drawing approach, may become uninformative when applied to larger networks with dense or clustered connectivity structure. We implemented a modified layout plug-in, named Multilevel Layout, which applies the conventional layout algorithms within a multilevel optimization framework to better capture the hierarchical modularity of many biological networks. Using a wide variety of real life biological networks, we carried out a systematic evaluation of the method in comparison with other layout algorithms in Cytoscape. The multilevel approach provided both biologically relevant and visually pleasant layout solutions in most network types, hence complementing the layout options available in Cytoscape. In particular, it could improve drawing of large-scale networks of yeast genetic interactions and human physical interactions. In more general terms, the biological evaluation framework developed here enables one to assess the layout solutions from any existing or future graph drawing algorithm as well as to optimize their performance for a given network type or structure. By making use of the multilevel modular organization when visualizing biological networks, together with the biological evaluation of the layout solutions, one can generate convenient visualizations for many network biology applications.
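The multilevel scheme can be sketched compactly: coarsen the graph by contracting a matching, lay out the coarser graph first, then seed force-directed refinement of the finer graph from that layout. The Python sketch below uses networkx as the force-directed engine; it mimics the approach rather than reproducing the Cytoscape plug-in.

```python
# Multilevel force-directed layout sketch: contract a matching to get a
# coarse graph, lay that out, then refine the fine graph from the coarse
# positions. networkx supplies the spring (force-directed) layout step.
import networkx as nx

def multilevel_layout(g, rounds=1):
    if rounds == 0 or g.number_of_nodes() <= 10:
        return nx.spring_layout(g, seed=1)
    # Greedy matching: merge each matched pair into one "super node".
    merge = {v: u for u, v in nx.maximal_matching(g)}
    rep = lambda n: merge.get(n, n)
    coarse = nx.Graph()
    coarse.add_nodes_from({rep(n) for n in g})
    coarse.add_edges_from((rep(u), rep(v)) for u, v in g.edges()
                          if rep(u) != rep(v))
    coarse_pos = multilevel_layout(coarse, rounds - 1)
    # Seed every fine node at its super node's position, then refine.
    seed_pos = {n: coarse_pos[rep(n)] for n in g}
    return nx.spring_layout(g, pos=seed_pos, iterations=30, seed=1)

g = nx.connected_caveman_graph(6, 8)   # clustered test network
pos = multilevel_layout(g, rounds=2)
```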
QDENSITY—A Mathematica Quantum Computer simulation
NASA Astrophysics Data System (ADS)
Juliá-Díaz, Bruno; Burdis, Joseph M.; Tabakin, Frank
2006-06-01
This Mathematica 5.2 package is a simulation of a Quantum Computer. The program provides a modular, instructive approach for generating the basic elements that make up a quantum circuit. The main emphasis is on using the density matrix, although an approach using state vectors is also implemented in the package. The package commands are defined in Qdensity.m which contains the tools needed in quantum circuits, e.g., multiqubit kets, projectors, gates, etc. Selected examples of the basic commands are presented here and a tutorial notebook, Tutorial.nb is provided with the package (available on our website) that serves as a full guide to the package. Finally, application is made to a variety of relevant cases, including Teleportation, Quantum Fourier transform, Grover's search and Shor's algorithm, in separate notebooks: QFT.nb, Teleportation.nb, Grover.nb and Shor.nb where each algorithm is explained in detail. Finally, two examples of the construction and manipulation of cluster states, which are part of "one way computing" ideas, are included as an additional tool in the notebook Cluster.nb. A Mathematica palette containing most commands in QDENSITY is also included: QDENSpalette.nb. Program summary: Title of program: QDENSITY Catalogue identifier: ADXH_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXH_v1_0 Program available from: CPC Program Library, Queen's University of Belfast, N. Ireland Operating systems: Any which supports Mathematica; tested under Microsoft Windows XP, Macintosh OS X, and Linux FC4 Programming language used: Mathematica 5.2 No. of bytes in distributed program, including test data, etc.: 180 581 No. of lines in distributed program, including test data, etc.: 19 382 Distribution format: tar.gz Method of solution: A Mathematica package is provided which contains commands to create and analyze quantum circuits. Several Mathematica notebooks containing relevant examples: Teleportation, Shor's Algorithm and Grover's search are explained in detail. A tutorial, Tutorial.nb is also enclosed. QDENSITY is available at http://www.pitt.edu/~tabakin/QDENSITY.
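QDENSITY's density-matrix emphasis is easy to mirror outside Mathematica. As a minimal illustration (not package code), the NumPy sketch below prepares |0⟩⟨0|, applies a Hadamard gate as ρ → UρU†, and reads out the measurement probabilities.

```python
# Density-matrix formalism in NumPy, mirroring the approach QDENSITY
# teaches in Mathematica: state as a matrix, gates as unitaries.
import numpy as np

rho = np.array([[1, 0], [0, 0]], dtype=complex)      # |0><0|
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

rho = H @ rho @ H.conj().T                            # rho -> U rho U†
probs = np.real(np.diag(rho))                         # P(0), P(1)
print(probs)                                          # -> [0.5 0.5]
```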
Smooth 2D manifold extraction from 3D image stack
Shihavuddin, Asm; Basu, Sreetama; Rexhepaj, Elton; Delestro, Felipe; Menezes, Nikita; Sigoillot, Séverine M; Del Nery, Elaine; Selimi, Fekrije; Spassky, Nathalie; Genovesio, Auguste
2017-01-01
Three-dimensional fluorescence microscopy followed by image processing is routinely used to study biological objects at various scales such as cells and tissue. However, maximum intensity projection, the most broadly used rendering tool, extracts a discontinuous layer of voxels, obliviously creating important artifacts and possibly misleading interpretation. Here we propose smooth manifold extraction, an algorithm that produces a continuous focused 2D extraction from a 3D volume, hence preserving local spatial relationships. We demonstrate the usefulness of our approach by applying it to various biological applications using confocal and wide-field microscopy 3D image stacks. We provide a parameter-free ImageJ/Fiji plugin that allows 2D visualization and interpretation of 3D image stacks with maximum accuracy. PMID:28561033
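For contrast with the proposed smooth extraction, maximum intensity projection is a one-line reduction over the z-axis; the artifact the authors address comes from the per-pixel argmax depth jumping discontinuously between neighbouring pixels. A sketch on synthetic data:

```python
# Maximum intensity projection, the rendering baseline this paper
# improves on. The per-pixel argmax depth map shows where the selected
# voxels come from -- and why the extracted layer is discontinuous.
import numpy as np

stack = np.random.rand(31, 256, 256)     # synthetic (z, y, x) image stack
mip = stack.max(axis=0)                  # classic maximum intensity projection
depth = stack.argmax(axis=0)             # per-pixel z of the chosen voxel
print(mip.shape, depth.dtype)
```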
DOE Office of Scientific and Technical Information (OSTI.GOV)
PANDOLFI, RONALD; KUMAR, DINESH; VENKATAKRISHNAN, SINGANALLUR
Xi-CAM aims to provide a community driven platform for multimodal analysis in synchrotron science. The platform core provides a robust plugin infrastructure for extensibility, allowing continuing development to simply add further functionality. Current modules include tools for characterization with (GI)SAXS, Tomography, and XAS. This will continue to serve as a development base as algorithms for multimodal analysis develop. Seamless remote data access, visualization and analysis are key elements of Xi-CAM, and will become critical to synchrotron data infrastructure as expectations for future data volume and acquisition rates rise with continuously increasing throughputs. The highly interactive design elements of Xi-CAM will similarly support a generation of users which depend on immediate data quality feedback during high-throughput or burst acquisition modes.
Plug-In Electric Vehicle Handbook for Fleet Managers (Brochure)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2012-04-01
Plug-in electric vehicles (PEVs) are entering the automobile market and are viable alternatives to conventional vehicles. This guide for fleet managers describes the basics of PEV technology, PEV benefits for fleets, how to select the right PEV, charging a PEV, and PEV maintenance.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smart, John Galloway; Salisbury, Shawn Douglas
2015-07-01
This report summarizes key findings in two national plug-in electric vehicle charging infrastructure demonstrations: The EV Project and ChargePoint America. It will be published to the INL/AVTA website for the general public.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Melaina, Marc; Helwig, Michael
The California Statewide Plug-In Electric Vehicle Infrastructure Assessment conveys to interested parties the Energy Commission’s conclusions, recommendations, and intentions with respect to plug-in electric vehicle (PEV) infrastructure development. There are several relatively low-risk and high-priority electric vehicle supply equipment (EVSE) deployment options that will encourage PEV sales and
NASA Astrophysics Data System (ADS)
Oberhauser, Nils; Nurisso, Alessandra; Carrupt, Pierre-Alain
2014-05-01
The molecular lipophilicity potential (MLP) is a well-established method to calculate and visualize lipophilicity on molecules. We are here introducing a new computational tool named MLP Tools, written in the programming language Python, and conceived as a free plugin for the popular open source molecular viewer PyMOL. The plugin is divided into several sub-programs which allow the visualization of the MLP on molecular surfaces, as well as in three-dimensional space in order to analyze lipophilic properties of binding pockets. The sub-program Log MLP also implements the virtual log P which allows the prediction of the octanol/water partition coefficients on multiple three-dimensional conformations of the same molecule. An implementation of the recently introduced MLP GOLD procedure, improving the GOLD docking performance in hydrophobic pockets, is also part of the plugin. In this article, all functions of the MLP Tools will be described through a few chosen examples.
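PyMOL plugins are ordinary Python modules, so the extension hook is compact. The skeleton below registers a new command via cmd.extend; the command name and the stand-in colouring logic are hypothetical placeholders, not MLP Tools code.

```python
# Minimal skeleton of a PyMOL plugin in the spirit of MLP Tools (the real
# plugin is far more elaborate). `cmd.extend` registers a new PyMOL
# command; the colouring below is a stand-in, not an MLP calculation.
from pymol import cmd

def color_by_mlp(selection="all"):
    """Placeholder: colour atoms by a (here: fake) lipophilicity proxy."""
    # A real MLP implementation would sum atomic lipophilicity increments
    # over 3D space; we simply colour carbons vs. the rest to show the hook.
    cmd.color("yellow", selection + " and elem C")
    cmd.color("cyan", selection + " and not elem C")

# Makes the function available as a PyMOL command: `color_by_mlp sele`
cmd.extend("color_by_mlp", color_by_mlp)
```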
Neural Network Prototyping Package Within IRAF
NASA Technical Reports Server (NTRS)
Bazell, David
1997-01-01
The purpose of this contract was to develop a neural network package within the IRAF environment to allow users to easily understand and use different neural network algorithms for the analysis of astronomical data. The package was developed for use within IRAF to allow portability to different computing environments and to provide a familiar and easy to use interface with the routines. In addition to developing the software and supporting documentation, we planned to use the system for the analysis of several sample problems to prove its viability and usefulness.
Application of GA package in functional packaging
NASA Astrophysics Data System (ADS)
Belousova, D. A.; Noskova, E. E.; Kapulin, D. V.
2018-05-01
An approach to an application program for configuring the elements of a commutation circuit in the design of radio-electronic equipment, based on a genetic algorithm, is offered. The efficiency of the approach is demonstrated for commutation circuits with different characteristics in computer-aided design for radio-electronic manufacturing. A prototype of the computer-aided design subsystem, based on the GA package for R with a set of general functions for the optimization of multivariate models, has been programmed.
A Maple package for computing Gröbner bases for linear recurrence relations
NASA Astrophysics Data System (ADS)
Gerdt, Vladimir P.; Robertz, Daniel
2006-04-01
A Maple package for computing Gröbner bases of linear difference ideals is described. The underlying algorithm is based on Janet and Janet-like monomial divisions associated with finite difference operators. The package can be used, for example, for automatic generation of difference schemes for linear partial differential equations and for reduction of multiloop Feynman integrals. These two possible applications are illustrated by simple examples of the Laplace equation and a one-loop scalar integral of propagator type.
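The package operates on difference ideals in Maple; to convey the flavour of a Gröbner basis computation to readers without Maple, here is the ordinary commutative analogue in SymPy. This is an illustration only, not the Janet-division algorithm the paper describes.

```python
# Ordinary (commutative) Groebner basis in SymPy, as an analogue of the
# difference-ideal computation the Maple package performs. Illustrative
# only; the paper's algorithm uses Janet and Janet-like divisions.
from sympy import groebner, symbols

x, y = symbols("x y")
G = groebner([x**2 + y**2 - 1, x - y], x, y, order="lex")
print(list(G))    # reduced basis, e.g. [x - y, y**2 - 1/2]
```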
Artificial Intelligence Software for Assessing Postural Stability
NASA Technical Reports Server (NTRS)
Lieberman, Erez; Forth, Katharine; Paloski, William
2013-01-01
A software package reads and analyzes pressure distributions from sensors mounted under a person's feet. Pressure data from sensors mounted in shoes, or in a platform, can be used to provide a description of postural stability (assessing competence to deficiency) and enables the determination of the person's present activity (running, walking, squatting, falling). This package has three parts: a preprocessing algorithm for reading input from pressure sensors; a Hidden Markov Model (HMM), which is used to determine the person's present activity and level of sensing-motor competence; and a suite of graphical algorithms, which allows visual representation of the person's activity and vestibular function over time.
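To make the HMM component concrete, the sketch below decodes a sequence of discretized pressure observations into the most likely activity sequence with the Viterbi algorithm. Every probability, label, and observation here is invented for illustration; the actual model in the NASA package is not published in this abstract.

```python
# Viterbi decoding of activities from discretized pressure observations.
# All states, transition/emission probabilities, and observations are
# invented for illustration of the HMM idea named in the abstract.
import numpy as np

states = ["standing", "walking", "falling"]
start = np.log([0.8, 0.15, 0.05])
trans = np.log([[0.90, 0.09, 0.01],
                [0.10, 0.85, 0.05],
                [1e-12, 1e-12, 1.0]])   # falling is absorbing; avoid log(0)
emit = np.log([[0.7, 0.2, 0.1],         # P(observation | state)
               [0.2, 0.6, 0.2],
               [0.1, 0.2, 0.7]])

def viterbi(obs):
    v = start + emit[:, obs[0]]
    back = []
    for o in obs[1:]:
        scores = v[:, None] + trans          # all state transitions
        back.append(scores.argmax(axis=0))   # best predecessor per state
        v = scores.max(axis=0) + emit[:, o]
    path = [int(v.argmax())]
    for bp in reversed(back):
        path.append(int(bp[path[-1]]))
    return [states[s] for s in reversed(path)]

print(viterbi([0, 0, 1, 1, 2, 2]))
```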
Stan : A Probabilistic Programming Language
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carpenter, Bob; Gelman, Andrew; Hoffman, Matthew D.
2017-01-01
Stan is a probabilistic programming language for specifying statistical models. A Stan program imperatively defines a log probability function over parameters conditioned on specified data and constants. As of version 2.14.0, Stan provides full Bayesian inference for continuous-variable models through Markov chain Monte Carlo methods such as the No-U-Turn sampler, an adaptive form of Hamiltonian Monte Carlo sampling. Penalized maximum likelihood estimates are calculated using optimization methods such as the limited memory Broyden-Fletcher-Goldfarb-Shanno algorithm. Stan is also a platform for computing log densities and their gradients and Hessians, which can be used in alternative algorithms such as variational Bayes, expectation propagation, and marginal inference using approximate integration. To this end, Stan is set up so that the densities, gradients, and Hessians, along with intermediate quantities of the algorithm such as acceptance probabilities, are easily accessible. Stan can also be called from the command line using the cmdstan package, through R using the rstan package, and through Python using the pystan package. All three interfaces support sampling and optimization-based inference with diagnostics and posterior analysis. rstan and pystan also provide access to log probabilities, gradients, Hessians, parameter transforms, and specialized plotting.
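A minimal sketch of driving Stan from Python through the pystan interface mentioned above (PyStan 2.x calling style; the toy model, which estimates a normal mean and scale, is purely illustrative):

    import pystan

    model_code = """
    data { int<lower=0> N; vector[N] y; }
    parameters { real mu; real<lower=0> sigma; }
    model { y ~ normal(mu, sigma); }
    """

    sm = pystan.StanModel(model_code=model_code)        # compile the Stan program
    fit = sm.sampling(data={"N": 3, "y": [1.2, 0.8, 1.1]},
                      iter=2000, chains=4)              # NUTS sampling by default
    print(fit)                                          # posterior summary table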
Extended shortest path selection for package routing of complex networks
NASA Astrophysics Data System (ADS)
Ye, Fan; Zhang, Lei; Wang, Bing-Hong; Liu, Lu; Zhang, Xing-Yi
The routing strategy plays a very important role in complex networks such as the Internet and peer-to-peer networks. However, most previous work concentrates either on path selection alone, e.g. Flooding and Random Walk, or on finding the shortest path (SP) while rarely considering local load information, as in SP and Distance Vector Routing. Flow-based Routing mainly considers load balance and still cannot achieve the best optimization. Thus, in this paper, we propose a novel dynamic routing strategy on complex networks that incorporates local load information into the SP algorithm to enhance traffic flow optimization. We find that the flow in a network is greatly affected by waiting times, so path selection for packet transmission should account not only for an optimized path but also for node congestion. As a result, packets should be transmitted along a globally optimized path with smaller congestion and relatively short distance. Analysis and simulation experiments show that the proposed algorithm can greatly enhance network flow, achieving maximum throughput within an acceptable computation time. A detailed analysis of the algorithm is also provided to explain its efficiency.
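A minimal sketch of the underlying idea (not the authors' exact formulation): run Dijkstra's algorithm on an effective edge weight that adds a congestion penalty, here the queue length at the next node scaled by a tunable factor beta.

    import heapq

    def congestion_aware_path(adj, queue_len, src, dst, beta=1.0):
        # adj: {node: [(neighbor, distance), ...]}; queue_len: {node: packets waiting}
        dist, prev = {src: 0.0}, {}
        heap = [(0.0, src)]
        while heap:
            d, u = heapq.heappop(heap)
            if u == dst:
                break
            if d > dist.get(u, float("inf")):
                continue                    # stale heap entry
            for v, w in adj.get(u, []):
                nd = d + w + beta * queue_len.get(v, 0)   # distance plus waiting penalty
                if nd < dist.get(v, float("inf")):
                    dist[v], prev[v] = nd, u
                    heapq.heappush(heap, (nd, v))
        path, node = [dst], dst             # assumes dst is reachable from src
        while node != src:
            node = prev[node]
            path.append(node)
        return list(reversed(path))

    g = {"a": [("b", 1), ("c", 1)], "b": [("d", 1)], "c": [("d", 1)], "d": []}
    print(congestion_aware_path(g, {"b": 5, "c": 0}, "a", "d"))   # avoids congested b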
OntoCheck: verifying ontology naming conventions and metadata completeness in Protégé 4.
Schober, Daniel; Tudose, Ilinca; Svatek, Vojtech; Boeker, Martin
2012-09-21
Although policy providers have outlined minimal metadata guidelines and naming conventions, today's ontologies still display inter- and intra-ontology heterogeneities in class labelling schemes and metadata completeness. This is at least partially due to missing or inappropriate tools. Software support can ease this situation and contribute to overall ontology consistency and quality by helping to enforce such conventions. We provide a plugin for the Protégé ontology editor that allows easy checks of compliance with ontology naming conventions and metadata completeness, as well as curation when violations are found. In a requirement analysis, derived from a prior standardization approach carried out within the OBO Foundry, we investigated the capabilities needed for software tools to check, curate and maintain class naming conventions. A Protégé tab plugin was implemented accordingly using the Protégé 4.1 libraries. The plugin was tested on six different ontologies. Based on these test results, the plugin was refined, including the integration of new functionalities. The new Protégé plugin, OntoCheck, allows ontology tests to be carried out on OWL ontologies. In particular, OntoCheck helps to clean up an ontology with regard to lexical heterogeneity, i.e. enforcing naming conventions and metadata completeness, meeting most of the requirements outlined for such a tool. Detected violations can be corrected to foster consistency in entity naming and meta-annotation within an artefact. Once specified, check constraints like name patterns can be stored and exchanged for later re-use. Here we describe a first version of the software, illustrate its capabilities and use within running ontology development efforts, and briefly outline improvements resulting from its application. Further, we discuss OntoCheck's capabilities in the context of related tools and highlight potential future expansions. The OntoCheck plugin facilitates labelling error detection and curation, contributing to lexical quality assurance in OWL ontologies. Ultimately, we hope this Protégé extension will ease ontology alignments as well as lexical post-processing of annotated data and hence increase overall secondary data usage by humans and computers.
ISLE (Image and Signal Processing LISP Environment) reference manual
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sherwood, R.J.; Searfus, R.M.
1990-01-01
ISLE is a rapid prototyping system for performing image and signal processing. It is designed to meet the needs of a person developing image and signal processing algorithms in a research environment. The image and signal processing modules in ISLE form a very capable package in themselves. They also provide a rich environment for quickly and easily integrating user-written software modules into the package. ISLE is well suited to applications in which a processing algorithm must be developed interactively. It is straightforward to develop an algorithm, load it into ISLE, apply the algorithm to an image or signal, display the results, then modify the algorithm and repeat the develop-load-apply-display cycle. ISLE consists of a collection of image and signal processing modules integrated into a cohesive package through a standard command interpreter. The ISLE developers elected to concentrate their effort on developing image and signal processing software rather than on developing a command interpreter. A COMMON LISP interpreter was selected for the command interpreter because it already has the features desired in a command interpreter, supports dynamic loading of modules for customization purposes, supports run-time parameter and argument type checking, is very well documented, and is a commercially supported product. This manual is intended to be a reference manual for the ISLE functions. The functions are grouped into a number of categories and briefly discussed in the Function Summary chapter. Full descriptions of the functions and all their arguments are given in the Function Descriptions chapter. 6 refs.
Research on Laser Marking Speed Optimization by Using Genetic Algorithm
Wang, Dongyun; Yu, Qiwei; Zhang, Yu
2015-01-01
The laser marking machine is the most common coding equipment on product packaging lines. However, the speed of laser marking has become a bottleneck in production. In order to remove this bottleneck, a new method based on a genetic algorithm was designed. On the basis of this algorithm, a controller was designed and simulations and experiments were performed. The results show that using this algorithm could effectively improve laser marking efficiency by 25%. PMID:25955831
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sera White
2012-04-01
This thesis presents a research study using one year of driving data obtained from plug-in hybrid electric vehicles (PHEV) located in Sacramento and San Francisco, California to determine the effectiveness of incorporating geographic information into vehicle performance algorithms. Sacramento and San Francisco were chosen because of the availability of high resolution (1/9 arc second) digital elevation data. First, I present a method for obtaining instantaneous road slope, given a latitude and longitude, and introduce its use into common driving intensity algorithms. I show that for trips characterized by >40 m of net elevation change (from key on to key off), the use of instantaneous road slope significantly changes the results of driving intensity calculations. For trips exhibiting elevation loss, algorithms ignoring road slope overestimated driving intensity by as much as 211 Wh/mile, while for trips exhibiting elevation gain these algorithms underestimated driving intensity by as much as 333 Wh/mile. Second, I describe and test an algorithm that incorporates vehicle route type into computations of city and highway fuel economy. Route type was determined by intersecting trip GPS points with ESRI StreetMap road types and assigning each trip as either city or highway route type according to whichever road type comprised the largest distance traveled. The fuel economy results produced by the geographic classification were compared to the fuel economy results produced by algorithms that assign route type based on average speed or driving style. Most results were within 1 mile per gallon (approximately 3%) of one another; the largest difference was 1.4 miles per gallon for charge depleting highway trips. The methods for acquiring and using geographic data introduced in this thesis will enable other vehicle technology researchers to incorporate geographic data into their research problems.
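A minimal sketch of the instantaneous-road-slope computation described above (illustrative Python; the elevation function is a hypothetical stand-in for a lookup into the 1/9 arc-second digital elevation model):

    import math

    def haversine_m(lat1, lon1, lat2, lon2):
        # Great-circle distance in meters between two GPS fixes.
        r = 6371000.0
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dl = math.radians(lon2 - lon1)
        a = math.sin((p2 - p1) / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

    def road_slope(p, q, elevation):
        # Slope (rise over run) between consecutive GPS fixes p and q, each (lat, lon);
        # `elevation` returns the DEM elevation in meters at a (lat, lon) point.
        run = haversine_m(p[0], p[1], q[0], q[1])
        rise = elevation(*q) - elevation(*p)
        return rise / run if run > 0 else 0.0

    flat = lambda lat, lon: 30.0   # hypothetical constant-elevation DEM
    print(road_slope((38.58, -121.49), (38.59, -121.49), flat))   # 0.0 on flat terrain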
Parallax visualization of full motion video using the Pursuer GUI
NASA Astrophysics Data System (ADS)
Mayhew, Christopher A.; Forgues, Mark B.
2014-06-01
In 2013, the authors reported to the SPIE on the Phase 1 development of a Parallax Visualization (PV) plug-in toolset for Wide Area Motion Imaging (WAMI) data using the Pursuer Graphical User Interface (GUI) [1]. In addition to the ability to PV WAMI data, the Phase 1 plug-in toolset also featured a limited ability to visualize Full Motion Video (FMV) data. The ability to visualize both WAMI and FMV data is a highly advantageous capability for an Electric Light Table (ELT) toolset. This paper reports on the Phase 2 development and addition of a full-featured FMV capability to the Pursuer WAMI PV plug-in.
Modern software approaches applied to a Hydrological model: the GEOtop Open-Source Software Project
NASA Astrophysics Data System (ADS)
Cozzini, Stefano; Endrizzi, Stefano; Cordano, Emanuele; Bertoldi, Giacomo; Dall'Amico, Matteo
2017-04-01
The GEOtop hydrological scientific package is an integrated hydrological model that simulates the heat and water budgets at and below the soil surface. It describes the three-dimensional water flow in the soil and the energy exchange with the atmosphere, considering the radiative and turbulent fluxes. Furthermore, it reproduces the highly non-linear interactions between the water and energy balance during soil freezing and thawing, and simulates the temporal evolution of snow cover, soil temperature and moisture. The core components of the package were presented in the 2.0 version (Endrizzi et al, 2014), which was released as a Free Software Open-source project. However, despite the high scientific quality of the project, a modern software engineering approach was still missing. This weakness hindered its scientific potential and its use both as a standalone package and, more importantly, in an integrated way with other hydrological software tools. In this contribution we present our recent software re-engineering efforts to create a robust and stable scientific software package open to the hydrological community, easily usable by researchers and experts, and interoperable with other packages. The activity takes as its starting point the 2.0 version, scientifically tested and published. This version, together with several test cases based on recently published or available GEOtop applications (Cordano and Rigon, 2013, WRR; Kollet et al, 2016, WRR), provides the baseline code and a number of referenced results as benchmarks. Comparison and scientific validation can then be performed for each software re-engineering activity performed on the package. To keep track of every single change, the package is published on its own GitHub repository geotopmodel.github.io/geotop/ under the GPL v3.0 license. A Continuous Integration mechanism by means of Travis-CI has been enabled on the repository's master and main development branches. The use of the CMake configuration tool and the suite of tests (easily manageable by means of ctest tools) greatly reduces the burden of installation and enhances portability across different compilers and operating-system platforms. The package is also complemented by several software tools which provide web-based visualization of results based on R plugins, in particular the "shiny" (Chang et al, 2016), "geotopbricks" and "geotopOptim2" (Cordano et al, 2016) packages, which allow rapid and efficient scientific validation of new examples and tests. The software re-engineering activities are still under development. However, our first results are promising enough to eventually reach a robust and stable software project that manages a complex state-of-the-art hydrological model like GEOtop in a flexible way and integrates it into wider workflows.
Clean Cities Plug-In Electric Vehicle Handbook for Fleet Managers
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2012-04-01
Plug-in electric vehicles (PEVs) are entering the automobile market and are viable alternatives to conventional vehicles. This guide for fleet managers describes the basics of PEV technology, PEV benefits for fleets, how to select the right PEV, charging a PEV, and PEV maintenance.
Predicting the market potential of plug-in electric vehicles using multiday GPS data.
DOT National Transportation Integrated Search
2011-12-01
"Detailed GPS data for a years worth of travel by 255 households from the Seattle area were used to : investigate how plug-in electric vehicle types may affect adoption rates and use levels. The results suggest : that a battery-electric vehicle (B...
DOT National Transportation Integrated Search
2015-12-01
Plug-in hybrid electric vehicles (PHEV) are likely to increase in popularity in the near future. However, the environmental benefits of PHEVs involve tradeoffs between the benefits of reduced tailpipe emissions against the drawbacks of increased ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
Electric-drive vehicles use electricity as their primary fuel or to improve the efficiency of conventional vehicle designs. These vehicles can be divided into three categories: Hybrid electric vehicles (HEVs), Plug-in hybrid electric vehicles (PHEVs), All-electric vehicles (EVs). Together, PHEVs and EVs can also be referred to as plug-in electric vehicles (PEVs).
76 FR 72028 - Buy America Waiver Notification
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-21
...-battery electric vehicles, 12 plug-in hybrid vehicles, and 5 neighborhood electric vehicles in San... a partial Buy America waiver is appropriate for the purchase of 12 all-battery electric vehicles, 12 plug-in hybrid vehicles, and 5 neighborhood electric vehicles in San Francisco County, California. In...
77 FR 64379 - Proposed Collection; Comment Request for Notice 2009-58
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-19
... Notice 2009-58, Manufacturers' Certification of Specified Plug-in Electric Vehicles. DATES: Written... Electric Vehicles. OMB Number: 1545-2150. Notice Number: Notice 2009-58. Abstract: The American Recovery... certain new specified plug-in electric drive vehicles. This notice provides procedures for a vehicle...
Parachute Dynamics Investigations Using a Sensor Package Airdropped from a Small-Scale Airplane
NASA Technical Reports Server (NTRS)
Dooley, Jessica; Lorenz, Ralph D.
2005-01-01
We explore the utility of various sensors by recovering parachute-probe dynamics information from a package released from a small-scale, remote-controlled airplane. The airdrops aid in the development of datasets for the exploration of planetary probe trajectory recovery algorithms, supplementing data collected from instrumented, full-scale tests and computer models.
What's the Big Deal? Collection Evaluation at the National Level
ERIC Educational Resources Information Center
Jurczyk, Eva; Jacobs, Pamela
2014-01-01
This article discusses a project undertaken to assess the journals in a Big Deal package by applying a weighted value algorithm measuring quality, utility, and value of individual titles. Carried out by a national library consortium in Canada, the project confirmed the value of the Big Deal package while providing a quantitative approach for…
Feigl, Guenther C; Hiergeist, Wolfgang; Fellner, Claudia; Schebesch, Karl-Michael M; Doenitz, Christian; Finkenzeller, Thomas; Brawanski, Alexander; Schlaier, Juergen
2014-01-01
Diffusion tensor imaging (DTI)-based tractography has become an integral part of preoperative diagnostic imaging in many neurosurgical centers, and other nonsurgical specialties depend increasingly on DTI tractography as a diagnostic tool. The aim of this study was to analyze the anatomic accuracy of visualized white matter fiber pathways using different, readily available DTI tractography software programs. Magnetic resonance imaging scans of the heads of 20 healthy volunteers were acquired using a Siemens Symphony TIM 1.5T scanner and a 12-channel head array coil. The standard settings of the scans in this study were 12 diffusion directions and 5-mm slices. The fornices were chosen as the anatomic structure for the comparative fiber tracking. Identical data sets were loaded into nine different fiber tracking packages that used different algorithms. The nine software packages and algorithms were NeuroQLab (modified tensor deflection [TEND] algorithm), the Sörensen DTI task card (modified streamline tracking technique algorithm), the Siemens DTI module (modified fourth-order Runge-Kutta algorithm), six different software packages from Trackvis (interpolated streamline algorithm, modified FACT algorithm, second-order Runge-Kutta algorithm, Q-ball [FACT algorithm], tensorline algorithm, Q-ball [second-order Runge-Kutta algorithm]), DTI Query (modified streamline tracking technique algorithm), Medinria (modified TEND algorithm), Brainvoyager (modified TEND algorithm), DTI Studio (modified FACT algorithm), and the BrainLab DTI module (modified Runge-Kutta algorithm). Three examiners (a neuroradiologist, a magnetic resonance imaging physicist, and a neurosurgeon) evaluated the images. They were double-blinded with respect to the test subject and the fiber tracking software used in the presented images. Each examiner evaluated 301 images. The examiners were instructed to evaluate screenshots from the different programs based on two main criteria: (i) anatomic accuracy of the course of the displayed fibers and (ii) the number of fibers displayed outside the anatomic boundaries. The mean overall grade for anatomic accuracy was 2.2 (range, 1.1-3.6) with a standard deviation (SD) of 0.9. The mean overall grade for incorrectly displayed fibers was 2.5 (range, 1.6-3.5) with an SD of 0.6. The mean grade of the overall program ranking was 2.3 with an SD of 0.6. The overall mean grade of the program ranked number one (NeuroQLab) was 1.7 (range, 1.5-2.8). The mean overall grade of the program ranked last (BrainLab iPlan Cranial 2.6 DTI Module) was 3.3 (range, 1.7-4). The difference between the mean grades of these two programs was statistically highly significant (P < 0.0001). There was no statistically significant difference between the programs ranked 1-3: NeuroQLab, the Sörensen DTI Task Card, and the Siemens DTI module. The results of this study show that there is a statistically significant difference in the anatomic accuracy of the tested DTI fiber tracking programs. Whereas incorrectly displayed fibers could lead to wrong conclusions in the neurosciences, which rely heavily on this noninvasive imaging technique, in neurosurgery they could lead to surgical decisions potentially harmful for the patient if used without intraoperative cortical stimulation. DTI fiber tracking presents a valuable noninvasive preoperative imaging tool, which requires further validation after important standardization of the currently available acquisition and processing techniques.
Copyright © 2014 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Lisitsa, Y. V.; Yatskou, M. M.; Apanasovich, V. V.; Apanasovich, T. V.
2015-09-01
We have developed an algorithm for segmentation of cancer cell nuclei in three-channel luminescent images of microbiological specimens. The algorithm is based on using a correlation between fluorescence signals in the detection channels for object segmentation, which permits complete automation of the data analysis procedure. We have carried out a comparative analysis of the proposed method and conventional algorithms implemented in the CellProfiler and ImageJ software packages. Our algorithm has an object localization uncertainty which is 2-3 times smaller than for the conventional algorithms, with comparable segmentation accuracy.
Plug-in connector socket accepts coaxial cable end
NASA Technical Reports Server (NTRS)
Mitchell, D.; Van Loon, J.
1966-01-01
Connector which includes a spring-loaded contact to receive a protruding center conductor and an internal collet to clamp against a collar attached to a woven outer conductor, is used as a receptacle for the end of a coaxial cable. This plug-in connector socket is used successfully with remote manipulators.
Consumer adoption and grid impact models for plug-in hybrid electric vehicles in Wisconsin.
DOT National Transportation Integrated Search
2010-05-01
This proposed study focuses on assessing the demand for plug-in hybrid electric vehicles (PHEV) in Wisconsin and its economic impacts on the State's energy market and the electric grid. PHEVs are expected to provide a range of about 40 miles per ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Septon, Kendall K
Electric-drive vehicles use electricity as their primary fuel or to improve the efficiency of conventional vehicle designs. These vehicles can be divided into three categories: Hybrid electric vehicles (HEVs), Plug-in hybrid electric vehicles (PHEVs), All-electric vehicles (EVs). Together, PHEVs and EVs can also be referred to as plug-in electric vehicles (PEVs).
40 CFR 86.1816-18 - Emission standards for heavy-duty vehicles.
Code of Federal Regulations, 2014 CFR
2014-07-01
... as specified in this section. (4) Measure emissions from hybrid electric vehicles (including plug-in hybrid electric vehicles) as described in 40 CFR part 1066, subpart F, except that these procedures do not apply for plug-in hybrid electric vehicles during charge-depleting operation. (b) Tier 3 exhaust...
Plug-in Sensors for Air Pollution Monitoring.
ERIC Educational Resources Information Center
Shaw, Manny
Faristors, a type of plug-in sensors used in analyzing equipment, are described in this technical report presented at the 12th Conference on Methods in Air Pollution and Industrial Hygiene Studies, University of Southern California, April, 1971. Their principles of operation, interchangeability, and versatility for measuring air pollution at…
Energy Storage Laboratory | Energy Systems Integration Facility | NREL
The laboratory supports research on energy storage technologies. Key infrastructure includes an energy storage system inverter and energy storage system simulators. The plug-in vehicles/mobile storage hub includes connections for vehicle integration, with ample house power, REDB access, charging stations, and easy vehicle parking access.
Charging Rate Incentive - Georgia Power: Georgia Power offers a Plug-in Electric Vehicle (PEV) time-of-use electricity rate for residential customers who own an electric or plug-in hybrid electric vehicle. The PEV rate is optional and does not require a separate meter. For more information, see the
Integrated, Kerberized Login on MacOS X
NASA Technical Reports Server (NTRS)
Hotz, Henry B.
2006-01-01
Topics covered: context for this information; the MacOS X login process and available hooks; Authorization Services configuration; Authorization Services plug-ins; Kerberos plug-ins; other bugs and recommendations. Authorization Services is called by loginwindow, the screen saver, and fast user switching. It calls Directory Services, the Login Hook, and Login Items (System Preferences).
Viewing Files — EDRN Public Portal
In addition to standard HTML web pages, our website contains files in other formats. You may need additional software or browser plug-ins to view some of these files. The following list shows each format along with links to the corresponding freely available plug-ins or viewers. Documents: Adobe Acrobat Reader (.pdf)
Electric and Plug-In Hybrid Electric Vehicle Publications | Transportation
Publications include: Medium-Duty Plug-in Electric Delivery Truck Fleet Evaluation (2016; Kandler Smith and Kevin Walkowicz); Smith Newton Electric Delivery Trucks; Smith Newton Vehicle Performance Evaluation (Gen 1), Cumulative Report: November 2011-June 2014 (2014; Adam Ragatz); Smith Newton Vehicle Performance Evaluation (2014).
An Elementary Algorithm to Evaluate Trigonometric Functions to High Precision
ERIC Educational Resources Information Center
Johansson, B. Tomas
2018-01-01
Evaluation of the cosine function is done via a simple Cordic-like algorithm, together with a package for handling arbitrary-precision arithmetic in the computer program Matlab. Approximations to the cosine function having hundreds of correct decimals are presented with a discussion around errors and implementation.
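A minimal double-precision Python sketch of such a CORDIC-like evaluation follows (standard rotation-mode CORDIC with a fixed iteration count, rather than the article's arbitrary-precision Matlab setting); it converges for |theta| below roughly 1.74 radians.

    import math

    def cordic_cos(theta, n=40):
        # Approximate cos(theta) by n CORDIC rotations.
        angles = [math.atan(2.0 ** -k) for k in range(n)]
        gain = 1.0
        for k in range(n):                  # pre-scale by the accumulated CORDIC gain
            gain /= math.sqrt(1.0 + 2.0 ** (-2 * k))
        x, y, z = gain, 0.0, theta
        for k in range(n):
            d = 1.0 if z >= 0 else -1.0     # rotate toward zero residual angle
            x, y = x - d * y * 2.0 ** -k, y + d * x * 2.0 ** -k
            z -= d * angles[k]
        return x

    print(cordic_cos(1.0), math.cos(1.0))   # the two values agree to about 12 digits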
Simulation Based Evaluation of Integrated Adaptive Control and Flight Planning Technologies
NASA Technical Reports Server (NTRS)
Campbell, Stefan Forrest; Kaneshige, John T.
2008-01-01
The objective of this work is to leverage NASA resources to enable effective evaluation of resilient aircraft technologies through simulation. This includes examining strengths and weaknesses of adaptive controllers, emergency flight planning algorithms, and flight envelope determination algorithms both individually and as an integrated package.
The fastclime Package for Linear Programming and Large-Scale Precision Matrix Estimation in R.
Pang, Haotian; Liu, Han; Vanderbei, Robert
2014-02-01
We develop an R package fastclime for solving a family of regularized linear programming (LP) problems. Our package efficiently implements the parametric simplex algorithm, which provides a scalable and sophisticated tool for solving large-scale linear programs. As an illustrative example, one use of our LP solver is to implement an important sparse precision matrix estimation method called CLIME (Constrained L1 Minimization Estimator). Compared with existing packages for this problem such as clime and flare, our package has three advantages: (1) it efficiently calculates the full piecewise-linear regularization path; (2) it provides an accurate dual certificate as a stopping criterion; (3) it is completely coded in C and is highly portable. This package is designed to be useful to statisticians and machine learning researchers for solving a wide range of problems.
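For orientation only, a generic linear program solved in Python with SciPy is shown below; this uses SciPy's general-purpose LP solver rather than fastclime's parametric simplex, and the numbers are arbitrary.

    from scipy.optimize import linprog

    # maximize x + 2y  <=>  minimize -x - 2y
    # subject to: -x + y <= 1, 3x + 2y <= 12, x >= 0, y >= 0
    c = [-1.0, -2.0]
    A_ub = [[-1.0, 1.0],
            [3.0, 2.0]]
    b_ub = [1.0, 12.0]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
    print(res.x, -res.fun)   # optimum at x=2, y=3 with objective value 8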
Analysis of Rhythms in Experimental Signals
NASA Astrophysics Data System (ADS)
Desherevskii, A. V.; Zhuravlev, V. I.; Nikolsky, A. N.; Sidorin, A. Ya.
2017-12-01
We compare algorithms designed to extract quasiperiodic components of a signal and estimate the amplitude, phase, stability, and other characteristics of a rhythm in a sliding window in the presence of data gaps. Each algorithm relies on its own rhythm model; therefore, it is necessary to use different algorithms depending on the research objectives. The described set of algorithms and methods is implemented in the WinABD software package, which includes a time-series database management system, a powerful research complex, and an interactive data-visualization environment.
An Object-Oriented Serial DSMC Simulation Package
NASA Astrophysics Data System (ADS)
Liu, Hongli; Cai, Chunpei
2011-05-01
A newly developed three-dimensional direct simulation Monte Carlo (DSMC) simulation package, named GRASP ("Generalized Rarefied gAs Simulation Package"), is reported in this paper. This package utilizes the concept of a simulation engine, many C++ features and software design patterns. The package has an open architecture which can benefit further development and maintenance of the code. In order to reduce the engineering time for three-dimensional models, a hybrid grid scheme, combined with a flexible data structure implemented in C++, is used in this package. This scheme utilizes a local data structure based on the computational cell to achieve high performance on workstation processors. This data structure allows the DSMC algorithm to be very efficiently parallelized with domain decomposition and provides much flexibility in terms of grid types. This package can utilize traditional structured, unstructured or hybrid grids within the framework of a single code to model arbitrarily complex geometries and to simulate rarefied gas flows. Benchmark test cases indicate that this package has satisfactory accuracy for complex rarefied gas flows.
In silico strain optimization by adding reactions to metabolic models.
Correia, Sara; Rocha, Miguel
2012-07-24
Nowadays, concerns about the environment and the need to increase productivity at low cost demand the search for new ways to produce compounds of industrial interest. Based on the increasing knowledge of biological processes gained through genome sequencing projects and high-throughput experimental techniques, as well as the available computational tools, the use of microorganisms has been considered as an approach to produce desirable compounds. However, this usually requires manipulating these organisms by genetic engineering and/or changing the environmental conditions to make the production of these compounds possible. In many cases, it is necessary to enrich the genetic material of those microbes with heterologous pathways from other species, consequently adding the potential to produce novel compounds. This paper introduces a new plug-in for the OptFlux Metabolic Engineering platform, aimed at finding suitable sets of reactions to add to the genomes of selected microbes (wild type strain), as well as finding complementary sets of deletions, so that the mutant becomes able to overproduce compounds of industrial interest while preserving viability. The necessity of adding reactions to the metabolic model arises from gaps in the original model or is motivated by the production of new compounds by the organism. The optimization methods used are metaheuristics such as Evolutionary Algorithms and Simulated Annealing. The usefulness of this plug-in is demonstrated by a case study regarding the production of vanillin by the bacterium E. coli.
In silico strain optimization by adding reactions to metabolic models.
Correia, Sara; Rocha, Miguel
2012-12-01
Nowadays, concerns about the environment and the need to increase productivity at low cost demand the search for new ways to produce compounds of industrial interest. Based on the increasing knowledge of biological processes gained through genome sequencing projects and high-throughput experimental techniques, as well as the available computational tools, the use of microorganisms has been considered as an approach to produce desirable compounds. However, this usually requires manipulating these organisms by genetic engineering and/or changing the environmental conditions to make the production of these compounds possible. In many cases, it is necessary to enrich the genetic material of those microbes with heterologous pathways from other species, consequently adding the potential to produce novel compounds. This paper introduces a new plug-in for the OptFlux Metabolic Engineering platform, aimed at finding suitable sets of reactions to add to the genomes of selected microbes (wild type strain), as well as finding complementary sets of deletions, so that the mutant becomes able to overproduce compounds of industrial interest while preserving viability. The necessity of adding reactions to the metabolic model arises from gaps in the original model or is motivated by the production of new compounds by the organism. The optimization methods used are metaheuristics such as Evolutionary Algorithms and Simulated Annealing. The usefulness of this plug-in is demonstrated by a case study regarding the production of vanillin by the bacterium E. coli.
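As a rough illustration of the metaheuristic side, the following is a generic simulated-annealing loop over candidate reaction sets in Python; the fitness function is a user-supplied placeholder (in practice it would score target-compound production, e.g. via flux balance analysis), and this is not the OptFlux plugin's implementation.

    import math
    import random

    def anneal(candidates, fitness, steps=5000, t0=1.0, cooling=0.999, seed=0):
        # Search for a subset of `candidates` (reaction IDs) maximizing `fitness`.
        random.seed(seed)
        current = {random.choice(candidates)}
        best, t = set(current), t0
        for _ in range(steps):
            neighbor = set(current)
            r = random.choice(candidates)            # toggle one reaction in or out
            neighbor.symmetric_difference_update({r})
            if not neighbor:
                continue                             # keep at least one reaction
            delta = fitness(neighbor) - fitness(current)
            if delta >= 0 or random.random() < math.exp(delta / t):
                current = neighbor                   # accept improvements, and some
            if fitness(current) > fitness(best):     # worsenings while t is high
                best = set(current)
            t *= cooling                             # geometric cooling schedule
        return best

    # Toy demo: prefer subsets of size 3 from ten hypothetical reactions.
    print(anneal([f"R{i}" for i in range(10)], lambda s: -abs(len(s) - 3)))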
SBSI: an extensible distributed software infrastructure for parameter estimation in systems biology.
Adams, Richard; Clark, Allan; Yamaguchi, Azusa; Hanlon, Neil; Tsorman, Nikos; Ali, Shakir; Lebedeva, Galina; Goltsov, Alexey; Sorokin, Anatoly; Akman, Ozgur E; Troein, Carl; Millar, Andrew J; Goryanin, Igor; Gilmore, Stephen
2013-03-01
Complex computational experiments in Systems Biology, such as fitting model parameters to experimental data, can be challenging to perform. Not only do they frequently require a high level of computational power, but the software needed to run the experiment needs to be usable by scientists with varying levels of computational expertise, and modellers need to be able to obtain up-to-date experimental data resources easily. We have developed a software suite, the Systems Biology Software Infrastructure (SBSI), to facilitate the parameter-fitting process. SBSI is a modular software suite composed of three major components: SBSINumerics, a high-performance library containing parallelized algorithms for performing parameter fitting; SBSIDispatcher, a middleware application to track experiments and submit jobs to back-end servers; and SBSIVisual, an extensible client application used to configure optimization experiments and view results. Furthermore, we have created a plugin infrastructure to enable project-specific modules to be easily installed. Plugin developers can take advantage of the existing user-interface and application framework to customize SBSI for their own uses, facilitated by SBSI's use of standard data formats. All SBSI binaries and source-code are freely available from http://sourceforge.net/projects/sbsi under an Apache 2 open-source license. The server-side SBSINumerics runs on any Unix-based operating system; both SBSIVisual and SBSIDispatcher are written in Java and are platform independent, allowing use on Windows, Linux and Mac OS X. The SBSI project website at http://www.sbsi.ed.ac.uk provides documentation and tutorials.
DIANA-microT web server v5.0: service integration into miRNA functional analysis workflows.
Paraskevopoulou, Maria D; Georgakilas, Georgios; Kostoulas, Nikos; Vlachos, Ioannis S; Vergoulis, Thanasis; Reczko, Martin; Filippidis, Christos; Dalamagas, Theodore; Hatzigeorgiou, A G
2013-07-01
MicroRNAs (miRNAs) are small endogenous RNA molecules that regulate gene expression through mRNA degradation and/or translation repression, affecting many biological processes. The DIANA-microT web server (http://www.microrna.gr/webServer) is dedicated to miRNA target prediction/functional analysis, and it has been widely used by the scientific community since its initial launch in 2009. DIANA-microT v5.0, the new version of the microT server, has been significantly enhanced with an improved target prediction algorithm, DIANA-microT-CDS. It has been updated to incorporate miRBase version 18 and Ensembl version 69. The in silico-predicted miRNA-gene interactions in Homo sapiens, Mus musculus, Drosophila melanogaster and Caenorhabditis elegans exceed 11 million in total. The web server was completely redesigned to host a series of sophisticated workflows, which can be used directly from the online web interface, enabling users without the necessary bioinformatics infrastructure to perform advanced multi-step functional miRNA analyses. For instance, one available pipeline performs miRNA target prediction using different thresholds and meta-analysis statistics, followed by pathway enrichment analysis. DIANA-microT web server v5.0 also supports complete integration with the Taverna Workflow Management System (WMS), using the in-house developed DIANA-Taverna Plug-in. This plug-in provides ready-to-use modules for miRNA target prediction and functional analysis, which can be used to form advanced high-throughput analysis pipelines.
DIANA-microT web server v5.0: service integration into miRNA functional analysis workflows
Paraskevopoulou, Maria D.; Georgakilas, Georgios; Kostoulas, Nikos; Vlachos, Ioannis S.; Vergoulis, Thanasis; Reczko, Martin; Filippidis, Christos; Dalamagas, Theodore; Hatzigeorgiou, A.G.
2013-01-01
MicroRNAs (miRNAs) are small endogenous RNA molecules that regulate gene expression through mRNA degradation and/or translation repression, affecting many biological processes. The DIANA-microT web server (http://www.microrna.gr/webServer) is dedicated to miRNA target prediction/functional analysis, and it has been widely used by the scientific community since its initial launch in 2009. DIANA-microT v5.0, the new version of the microT server, has been significantly enhanced with an improved target prediction algorithm, DIANA-microT-CDS. It has been updated to incorporate miRBase version 18 and Ensembl version 69. The in silico-predicted miRNA–gene interactions in Homo sapiens, Mus musculus, Drosophila melanogaster and Caenorhabditis elegans exceed 11 million in total. The web server was completely redesigned to host a series of sophisticated workflows, which can be used directly from the online web interface, enabling users without the necessary bioinformatics infrastructure to perform advanced multi-step functional miRNA analyses. For instance, one available pipeline performs miRNA target prediction using different thresholds and meta-analysis statistics, followed by pathway enrichment analysis. DIANA-microT web server v5.0 also supports complete integration with the Taverna Workflow Management System (WMS), using the in-house developed DIANA-Taverna Plug-in. This plug-in provides ready-to-use modules for miRNA target prediction and functional analysis, which can be used to form advanced high-throughput analysis pipelines. PMID:23680784
NASA Astrophysics Data System (ADS)
Morozov, A.; Defendi, I.; Engels, R.; Fraga, F. A. F.; Fraga, M. M. F. R.; Gongadze, A.; Guerard, B.; Jurkovic, M.; Kemmerling, G.; Manzin, G.; Margato, L. M. S.; Niko, H.; Pereira, L.; Petrillo, C.; Peyaud, A.; Piscitelli, F.; Raspino, D.; Rhodes, N. J.; Sacchetti, F.; Schooneveld, E. M.; Solovov, V.; Van Esch, P.; Zeitelhack, K.
2013-05-01
The software package ANTS (Anger-camera type Neutron detector: Toolkit for Simulations), developed for the simulation of Anger-type gaseous detectors for thermal neutron imaging, was extended to include a module for experimental data processing. Data recorded with a sensor array containing up to 100 photomultiplier tubes (PMT) or silicon photomultipliers (SiPM) in a custom configuration can be loaded, and the positions and energies of the events can be reconstructed using the Center-of-Gravity, Maximum Likelihood or Least Squares algorithm. A particular strength of the new module is the ability to reconstruct the light response functions and relative gains of the photomultipliers from flood field illumination data using adaptive algorithms. The performance of the module is demonstrated with simulated data generated in ANTS and experimental data recorded with a 19-PMT neutron detector. The package executables are publicly available at http://coimbra.lip.pt/~andrei/
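A minimal sketch of the Center-of-Gravity reconstruction named above (illustrative numpy code; ANTS itself is a standalone toolkit and its internals differ):

    import numpy as np

    def center_of_gravity(signals, xy):
        # Event position estimated as the signal-weighted mean of sensor centers.
        # signals: per-PMT/SiPM amplitudes; xy: matching array of (x, y) centers.
        signals = np.asarray(signals, dtype=float)
        xy = np.asarray(xy, dtype=float)
        w = signals / signals.sum()
        return w @ xy

    print(center_of_gravity([0.1, 0.7, 0.2], [(-1, 0), (0, 0), (1, 0)]))  # [0.1 0.]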
GST-PRIME: an algorithm for genome-wide primer design.
Leister, Dario; Varotto, Claudio
2007-01-01
The profiling of mRNA expression based on DNA arrays has become a powerful tool to study genome-wide transcription of genes in a number of organisms. GST-PRIME is a software package created to facilitate large-scale primer design for the amplification of probes to be immobilized on arrays for transcriptome analyses, although it can also be applied in low-throughput approaches. GST-PRIME allows highly efficient, direct amplification of gene-sequence tags (GSTs) from genomic DNA (gDNA), starting from annotated genome or transcript sequences. GST-PRIME provides a user-friendly platform for automatic primer design, and despite the relative simplicity of the algorithm, experimental tests in the model plant species Arabidopsis thaliana confirmed the reliability of the software. This chapter describes the algorithm used for primer design, its input and output files, and the installation of the standalone package and its use.
Model-based clustering for RNA-seq data.
Si, Yaqing; Liu, Peng; Li, Pinghua; Brutnell, Thomas P
2014-01-15
RNA-seq technology has been widely adopted as an attractive alternative to microarray-based methods to study global gene expression. However, robust statistical tools to analyze these complex datasets are still lacking. By grouping genes with similar expression profiles across treatments, cluster analysis provides insight into gene functions and networks, and hence is an important technique for RNA-seq data analysis. In this manuscript, we derive clustering algorithms based on appropriate probability models for RNA-seq data. An expectation-maximization algorithm and another two stochastic versions of expectation-maximization algorithms are described. In addition, a strategy for initialization based on likelihood is proposed to improve the clustering algorithms. Moreover, we present a model-based hybrid-hierarchical clustering method to generate a tree structure that allows visualization of relationships among clusters as well as flexibility of choosing the number of clusters. Results from both simulation studies and analysis of a maize RNA-seq dataset show that our proposed methods provide better clustering results than alternative methods such as the K-means algorithm and hierarchical clustering methods that are not based on probability models. An R package, MBCluster.Seq, has been developed to implement our proposed algorithms. This R package provides fast computation and is publicly available at http://www.r-project.org
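A minimal sketch of the model-based idea (an EM algorithm for a plain Poisson mixture over counts; MBCluster.Seq's actual RNA-seq models are richer, handling library sizes and negative binomial dispersion):

    import numpy as np

    def poisson_mixture_em(counts, k, iters=100, seed=0):
        # Cluster counts with an EM-fitted mixture of k Poisson components.
        # The x! term of the Poisson log-pmf is omitted: it is identical across
        # components and cancels in the responsibilities.
        rng = np.random.default_rng(seed)
        x = np.asarray(counts, dtype=float)
        lam = rng.choice(x[x > 0], size=k, replace=False)   # initial component means
        pi = np.full(k, 1.0 / k)                            # mixing proportions
        for _ in range(iters):
            # E-step: posterior responsibility of each component for each count
            logp = np.log(pi) + x[:, None] * np.log(lam) - lam
            logp -= logp.max(axis=1, keepdims=True)         # numerical stabilization
            resp = np.exp(logp)
            resp /= resp.sum(axis=1, keepdims=True)
            # M-step: re-estimate mixing proportions and Poisson means
            nk = resp.sum(axis=0)
            pi = nk / x.size
            lam = (resp * x[:, None]).sum(axis=0) / nk
        return resp.argmax(axis=1), lam

    labels, means = poisson_mixture_em([1, 2, 1, 40, 38, 45, 2], k=2)
    print(labels, means)   # low counts and high counts fall into separate clusters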
Intelligent QoS routing algorithm based on improved AODV protocol for Ad Hoc networks
NASA Astrophysics Data System (ADS)
Huibin, Liu; Jun, Zhang
2016-04-01
Mobile ad hoc networks play an increasingly important role in disaster relief, military battlefields and scientific exploration. However, routing in such networks remains difficult due to their inherent structure. This paper proposes an improved cuckoo-search-based Ad hoc On-Demand Distance Vector routing protocol (CSAODV). It elaborately designs the calculation method of the optimal routing algorithm used by the protocol and the transmission mechanism for communication packets. In the calculation of the optimal route by the cuckoo search algorithm, adding QoS constraints allows the found route to conform to specified bandwidth and time delay requirements, and a certain balance can be obtained among computational cost, bandwidth and time delay. NS2 simulations were used to test the protocol's performance in three scenarios and to validate the feasibility and validity of CSAODV. In the results, the CSAODV routing protocol adapts better to changes in network topology than AODV, effectively improves the packet delivery fraction, reduces transmission delay, reduces the extra burden that control information places on the network, and improves routing efficiency.
NASA Astrophysics Data System (ADS)
Al, Can Mert; Yaman, Ulas
2018-05-01
In this study, an alternative automated method to the conventional design and fabrication pipeline of 3D printers is developed using an integrated CAD/CAE/CAM approach. It increases the load-carrying capacity of parts by constructing heterogeneous infill structures. Traditional CAM software for additive manufacturing machinery starts with a design model in STL file format, which only includes data about the outer boundary in triangular mesh form. Depending on the given infill percentage, the underlying algorithm constructs the interior of the artifact using homogeneous infill structures. In contrast to current CAM software, the proposed method provides a way to construct heterogeneous infill structures with respect to the von Mises stress field obtained from a finite element analysis. Throughout the work, Rhinoceros3D is used for the design of the parts along with Grasshopper3D, an algorithmic design tool for Rhinoceros3D. In addition, finite element analyses are performed using Karamba3D, a plug-in for Grasshopper3D. According to the results of the tensile tests, the method improves load-carrying capacity by about 50% compared to traditional slicing algorithms for 3D printing.
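A minimal sketch of the stress-to-density mapping idea in Python (illustrative only; the authors' actual pipeline is a Grasshopper3D definition with Karamba3D, and the bounds below are hypothetical):

    def infill_density(stress, s_min, s_max, d_min=0.2, d_max=0.8):
        # Map a local von Mises stress value to an infill density fraction,
        # linearly between d_min (low stress) and d_max (high stress).
        if s_max <= s_min:
            return d_min
        t = (stress - s_min) / (s_max - s_min)
        t = min(max(t, 0.0), 1.0)          # clamp to [0, 1]
        return d_min + t * (d_max - d_min)

    print(infill_density(120.0, s_min=10.0, s_max=200.0))   # denser infill where stress is high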
Xiao, Xun; Geyer, Veikko F; Bowne-Anderson, Hugo; Howard, Jonathon; Sbalzarini, Ivo F
2016-08-01
Biological filaments, such as actin filaments, microtubules, and cilia, are often imaged using different light-microscopy techniques. Reconstructing the filament curve from the acquired images constitutes the filament segmentation problem. Since filaments have lower dimensionality than the image itself, there is an inherent trade-off between tracing the filament with sub-pixel accuracy and avoiding noise artifacts. Here, we present a globally optimal filament segmentation method based on B-spline vector level-sets and a generalized linear model for the pixel intensity statistics. We show that the resulting optimization problem is convex and can hence be solved with global optimality. We introduce a simple and efficient algorithm to compute such optimal filament segmentations, and provide an open-source implementation as an ImageJ/Fiji plugin. We further derive an information-theoretic lower bound on the filament segmentation error, quantifying how well an algorithm could possibly do given the information in the image. We show that our algorithm asymptotically reaches this bound in the spline coefficients. We validate our method in comprehensive benchmarks, compare with other methods, and show applications from fluorescence, phase-contrast, and dark-field microscopy. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
VIP: Vortex Image Processing Package for High-contrast Direct Imaging
NASA Astrophysics Data System (ADS)
Gomez Gonzalez, Carlos Alberto; Wertz, Olivier; Absil, Olivier; Christiaens, Valentin; Defrère, Denis; Mawet, Dimitri; Milli, Julien; Absil, Pierre-Antoine; Van Droogenbroeck, Marc; Cantalloube, Faustine; Hinz, Philip M.; Skemer, Andrew J.; Karlsson, Mikael; Surdej, Jean
2017-07-01
We present the Vortex Image Processing (VIP) library, a python package dedicated to astronomical high-contrast imaging. Our package relies on the extensive python stack of scientific libraries and aims to provide a flexible framework for high-contrast data and image processing. In this paper, we describe the capabilities of VIP related to processing image sequences acquired using the angular differential imaging (ADI) observing technique. VIP implements functionalities for building high-contrast data processing pipelines, encompassing pre- and post-processing algorithms, potential source position and flux estimation, and sensitivity curve generation. Among the reference point-spread function subtraction techniques for ADI post-processing, VIP includes several flavors of principal component analysis (PCA) based algorithms, such as annular PCA and incremental PCA algorithms capable of processing big datacubes (of several gigabytes) on a computer with limited memory. Also, we present a novel ADI algorithm based on non-negative matrix factorization, which comes from the same family of low-rank matrix approximations as PCA and provides fairly similar results. We showcase the ADI capabilities of the VIP library using a deep sequence on HR 8799 taken with the LBTI/LMIRCam and its recently commissioned L-band vortex coronagraph. Using VIP, we investigated the presence of additional companions around HR 8799 and did not find any significant additional point source beyond the four known planets. VIP is available at http://github.com/vortex-exoplanet/VIP and is accompanied by Jupyter notebook tutorials illustrating the main functionalities of the library.
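A minimal numpy sketch of the core PCA-based PSF subtraction for ADI follows (illustrative only; VIP's own routines add masking, frame scaling, annular processing and many other refinements, and the derotation sign convention depends on the instrument):

    import numpy as np
    from scipy.ndimage import rotate

    def adi_pca(cube, angles, ncomp=5):
        # cube: (n_frames, ny, nx) ADI sequence; angles: parallactic angles in degrees.
        n, ny, nx = cube.shape
        mat = cube.reshape(n, -1)
        centered = mat - mat.mean(axis=0)
        _, _, vt = np.linalg.svd(centered, full_matrices=False)
        modes = vt[:ncomp]                              # leading principal components
        psf_model = centered @ modes.T @ modes          # projection onto the modes
        residuals = (centered - psf_model).reshape(n, ny, nx)
        # Derotate residual frames to a common sky orientation, then combine.
        derot = [rotate(f, -a, reshape=False, order=1) for f, a in zip(residuals, angles)]
        return np.median(derot, axis=0)

    cube = np.random.default_rng(1).standard_normal((10, 32, 32))
    print(adi_pca(cube, angles=np.linspace(0, 45, 10)).shape)   # (32, 32)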
NASA Technical Reports Server (NTRS)
Nakazawa, Shohei
1989-01-01
The user options available for running the MHOST finite element analysis package are described. MHOST is a solid and structural analysis program based on mixed finite element technology and is specifically designed for 3-D inelastic analysis. A family of 2- and 3-D continuum elements along with beam and shell structural elements can be utilized, and many options are available in the constitutive equation library, the solution algorithms and the analysis capabilities. The solution algorithms are outlined along with the data input and output, analysis options including the user subroutines, and the definition of the finite elements implemented in the program package.
FoldMiner and LOCK 2: protein structure comparison and motif discovery on the web.
Shapiro, Jessica; Brutlag, Douglas
2004-07-01
The FoldMiner web server (http://foldminer.stanford.edu/) provides remote access to methods for protein structure alignment and unsupervised motif discovery. FoldMiner is unique among such algorithms in that it improves both the motif definition and the sensitivity of a structural similarity search by combining the search and motif discovery methods and using information from each process to enhance the other. In a typical run, a query structure is aligned to all structures in one of several databases of single domain targets in order to identify its structural neighbors and to discover a motif that is the basis for the similarity among the query and statistically significant targets. This process is fully automated, but options for manual refinement of the results are available as well. The server uses the Chime plugin and customized controls to allow for visualization of the motif and of structural superpositions. In addition, we provide an interface to the LOCK 2 algorithm for rapid alignments of a query structure to smaller numbers of user-specified targets.
Müller, Marcel; Mönkemöller, Viola; Hennig, Simon; Hübner, Wolfgang; Huser, Thomas
2016-01-01
Super-resolved structured illumination microscopy (SR-SIM) is an important tool for fluorescence microscopy. SR-SIM microscopes perform multiple image acquisitions with varying illumination patterns, and reconstruct them to a super-resolved image. In its most frequent, linear implementation, SR-SIM doubles the spatial resolution. The reconstruction is performed numerically on the acquired wide-field image data, and thus relies on a software implementation of specific SR-SIM image reconstruction algorithms. We present fairSIM, an easy-to-use plugin that provides SR-SIM reconstructions for a wide range of SR-SIM platforms directly within ImageJ. For research groups developing their own implementations of super-resolution structured illumination microscopy, fairSIM takes away the hurdle of generating yet another implementation of the reconstruction algorithm. For users of commercial microscopes, it offers an additional, in-depth analysis option for their data independent of specific operating systems. As a modular, open-source solution, fairSIM can easily be adapted, automated and extended as the field of SR-SIM progresses. PMID:26996201
Cortico-muscular coherence on artifact corrected EEG-EMG data recorded with a MRI scanner.
Muthuraman, M; Galka, A; Hong, V N; Heute, U; Deuschl, G; Raethjen, J
2013-01-01
Simultaneous recording of the electroencephalogram (EEG) and electromyogram (EMG) with magnetic resonance imaging (MRI) provides great potential for studying human brain activity with high temporal and spatial resolution. However, due to the MRI, the recorded signals are contaminated with artifacts. The correction of these artifacts is important in order to use these signals for further spectral analysis. Coherence can reveal the cortical representation of the peripheral muscle signal in particular motor tasks, e.g. finger movements. The artifact correction of these signals was done by two different algorithms: the Brain Vision Analyzer (BVA) and the Matlab FMRIB plug-in for EEGLAB. The Welch periodogram method was used for estimating the cortico-muscular coherence. Our analysis revealed coherence at a frequency of 5 Hz in the contralateral side of the brain. The entropy of the calculated coherence is estimated to obtain the distribution of coherence over the scalp. The significance of the paper is to identify the optimal algorithm for rectifying the MR artifacts and, as a first step, to use both EEG and EMG signals in conjunction with MRI for further studies.
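For reference, the Welch-based magnitude-squared coherence used in such analyses can be computed directly with SciPy; the sketch below uses synthetic signals sharing a 5 Hz rhythm and is not the study's pipeline.

    import numpy as np
    from scipy.signal import coherence

    fs = 1000.0                                    # sampling rate, Hz
    t = np.arange(0, 60, 1 / fs)
    rng = np.random.default_rng(0)
    common = np.sin(2 * np.pi * 5 * t)             # shared 5 Hz component
    eeg = common + rng.standard_normal(t.size)     # stand-ins for recorded EEG/EMG
    emg = common + rng.standard_normal(t.size)

    f, cxy = coherence(eeg, emg, fs=fs, nperseg=2048)   # Welch segments of 2048 samples
    print(f[np.argmax(cxy)])                       # coherence peaks near 5 Hz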
Error image aware content restoration
NASA Astrophysics Data System (ADS)
Choi, Sungwoo; Lee, Moonsik; Jung, Byunghee
2015-12-01
As the resolution of TV has significantly increased, content consumers have become increasingly sensitive to the subtlest defects in TV content. This rising standard of quality demanded by consumers has posed a new challenge in today's context, where the tape-based process has transitioned to the file-based process: the transition necessitated digitizing old archives, a process which inevitably produces errors such as disordered pixel blocks, scattered white noise, or totally missing pixels. Unsurprisingly, detecting and fixing such errors requires a substantial amount of time and human labor to meet the standard demanded by today's consumers. In this paper, we introduce a novel, automated error restoration algorithm which can be applied to different types of classic errors by utilizing adjacent images while preserving the undamaged parts of an error image as much as possible. We tested our method on error images detected by our quality check system in the KBS (Korean Broadcasting System) video archive. We are also implementing the algorithm as a plugin for a well-known NLE (non-linear editing system), which is a familiar tool for quality control agents.
Code of Federal Regulations, 2012 CFR
2012-07-01
... ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) ENERGY POLICY FUEL ECONOMY AND GREENHOUSE GAS EXHAUST EMISSIONS OF MOTOR VEHICLES Fuel Economy and Carbon-Related Exhaust Emission Test Procedures § 600.116-12 Special procedures related to electric vehicles and plug-in hybrid electric vehicles. (a) Determine fuel economy...
Alternative Fuels Data Center: Availability of Hybrid and Plug-In Electric
Health and performance monitoring of the online computer cluster of CMS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bauer, G.; et al.
2012-01-01
The CMS experiment at the LHC features over 2,500 devices that need constant monitoring in order to ensure proper data taking. The monitoring solution has been migrated from Nagios to Icinga, with several useful plugins. The motivations behind the migration and the selection of the plugins are discussed.
Internet Wargaming with Distributed Processing Using the Client-Server Model
1997-03-01
in for war game development. There are tool kits for writing binary files that are interpreted by a particular plug-in. The most popular plug-in set...multi-player game development, the speed with which the environment is changing should be taken into account. For this project JavaScript was chosen
NREL Validates Plug-In Hybrid Truck for Pacific Gas and Electric Company |
NREL is evaluating and analyzing a Pacific Gas and Electric Company (PG&E) plug-in hybrid electric utility truck developed by Efficient
Integrating plug-in electric vehicles into the electric power system
NASA Astrophysics Data System (ADS)
Wu, Di
This dissertation contributes to our understanding of how plug-in hybrid electric vehicles (PHEVs) and plug-in battery-only electric vehicles (EVs)---collectively termed plug-in electric vehicles (PEVs)---could be successfully integrated with the electric power system. The research addresses issues at a diverse range of levels pertaining to light-duty vehicles, which account for the majority of highway vehicle miles traveled, energy consumed by highway travel modes, and carbon dioxide emissions from on-road sources. Specifically, the following topics are investigated: (i) On-board power electronics topologies for bidirectional vehicle-to-grid and grid-to-vehicle power transfer; (ii) The estimation of the electric energy and power consumption by fleets of light-duty PEVs; (iii) An operating framework for the scheduling and dispatch of electric power by PEV aggregators; (iv) The pricing of electricity by PHEV aggregators and how it affects the decision-making process of a cost-conscious PHEV owner; (v) The impacts on distribution systems from PEVs under aggregator control; (vi) The modeling of light-duty PEVs for long-term energy and transportation planning at a national scale.
PetriScape - A plugin for discrete Petri net simulations in Cytoscape.
Almeida, Diogo; Azevedo, Vasco; Silva, Artur; Baumbach, Jan
2016-06-04
Systems biology plays a central role for biological network analysis in the post-genomic era. Cytoscape is the standard bioinformatics tool offering the community an extensible platform for computational analysis of the emerging cellular network together with experimental omics data sets. However, only few apps/plugins/tools are available for simulating network dynamics in Cytoscape 3. Many approaches of varying complexity exist but none of them have been integrated into Cytoscape as app/plugin yet. Here, we introduce PetriScape, the first Petri net simulator for Cytoscape. Although discrete Petri nets are quite simplistic models, they are capable of modeling global network properties and simulating their behaviour. In addition, they are easily understood and well visualizable. PetriScape comes with the following main functionalities: (1) import of biological networks in SBML format, (2) conversion into a Petri net, (3) visualization as Petri net, and (4) simulation and visualization of the token flow in Cytoscape. PetriScape is the first Cytoscape plugin for Petri nets. It allows a straightforward Petri net model creation, simulation and visualization with Cytoscape, providing clues about the activity of key components in biological networks.
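The discrete Petri net semantics PetriScape simulates (places hold tokens; a transition fires when every input place holds enough tokens) can be sketched in plain Python; this is illustrative only and unrelated to the plugin's Java code.

```python
from dataclasses import dataclass

@dataclass
class Transition:
    inputs: dict   # place -> tokens consumed
    outputs: dict  # place -> tokens produced

def enabled(marking, t):
    return all(marking.get(p, 0) >= w for p, w in t.inputs.items())

def fire(marking, t):
    m = dict(marking)
    for p, w in t.inputs.items():
        m[p] -= w
    for p, w in t.outputs.items():
        m[p] = m.get(p, 0) + w
    return m

# Toy two-reaction network: A -> B, then B -> C
transitions = [Transition({"A": 1}, {"B": 1}), Transition({"B": 1}, {"C": 1})]
marking = {"A": 3, "B": 0, "C": 0}
for _ in range(6):
    for t in transitions:
        if enabled(marking, t):
            marking = fire(marking, t)
            break
print(marking)  # all tokens end up in C
```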
[Plug-in Based Centralized Control System in Operating Rooms].
Wang, Yunlong
2017-05-30
Centralized equipment control in an operating room (OR) is crucial to an efficient workflow in the OR. To achieve centralized control, an integrative OR needs to focus on designing a control panel that can appropriately incorporate equipment from different manufacturers with various connecting ports and controls. Here we propose to achieve equipment integration using plug-in modules. Each OR will be equipped with a dynamic plug-in control panel containing physically removable connecting ports. Matching outlets will be installed onto the control panels of each piece of equipment used at any given time. This dynamic control panel will be backed by a database containing plug-in modules that can connect any two types of connecting ports common among medical equipment manufacturers. The correct connecting ports will be called dynamically using reflection. This database will be updated regularly to include new connecting ports on the market, making it easy to maintain, update and expand, and to remain relevant as new equipment is developed. Together, the physical panel and the database will achieve centralized equipment control in the OR that can be easily adapted to any equipment in the OR.
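The reflection-based lookup described above can be sketched in Python, where an adapter for a given vendor and port type is resolved by name at runtime; the module and class names here are purely hypothetical.

```python
import importlib

def load_port_adapter(vendor: str, port_type: str):
    """Resolve an adapter class by name at runtime (reflection), e.g.
    adapters.acme.Rs232Adapter; names are illustrative only."""
    module = importlib.import_module(f"adapters.{vendor.lower()}")
    cls = getattr(module, f"{port_type.capitalize()}Adapter")
    return cls()

# adapter = load_port_adapter("acme", "rs232"); adapter.connect(...)
```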
Basic mathematical function libraries for scientific computation
NASA Technical Reports Server (NTRS)
Galant, David C.
1989-01-01
Ada packages implementing selected mathematical functions for the support of scientific and engineering applications were written. The packages provide the Ada programmer with the mathematical function support found in the languages Pascal and FORTRAN as well as an extended precision arithmetic and a complete complex arithmetic. The algorithms used are fully described and analyzed. Implementation assumes that the Ada type FLOAT objects fully conform to the IEEE 754-1985 standard for single binary floating-point arithmetic, and that INTEGER objects are 32-bit entities. Codes for the Ada packages are included as appendixes.
Parallel Climate Data Assimilation PSAS Package
NASA Technical Reports Server (NTRS)
Ding, Hong Q.; Chan, Clara; Gennery, Donald B.; Ferraro, Robert D.
1996-01-01
We have designed and implemented a set of highly efficient and highly scalable algorithms for an unstructured computational package, the PSAS data assimilation package, as demonstrated by detailed performance analysis of systematic runs on up to a 512-node Intel Paragon. The equation solver achieves a sustained 18 Gflops performance. As a result, we achieved an unprecedented 100-fold solution time reduction on the Intel Paragon parallel platform over the Cray C90. This not only meets and exceeds the DAO time requirements, but also significantly enlarges the window of exploration in climate data assimilations.
PI2GIS: processing image to geographical information systems, a learning tool for QGIS
NASA Astrophysics Data System (ADS)
Correia, R.; Teodoro, A.; Duarte, L.
2017-10-01
To perform an accurate interpretation of remote sensing images, it is necessary to extract information using different image processing techniques. Nowadays, it has become usual to use image processing plugins to add new capabilities/functionalities integrated in Geographical Information System (GIS) software. The aim of this work was to develop an open source application to automatically process and classify remote sensing images from a set of satellite input data. The application was integrated in a GIS software package (QGIS), automating several image processing steps. The use of QGIS for this purpose is justified since it is easy and quick to develop new plugins using the Python language. This plugin is inspired by the Semi-Automatic Classification Plugin (SCP) developed by Luca Congedo. SCP allows the supervised classification of remote sensing images, the calculation of vegetation indices such as NDVI (Normalized Difference Vegetation Index) and EVI (Enhanced Vegetation Index), and other image processing operations. When analysing SCP, it was realized that a set of operations which are very useful in teaching remote sensing and image processing, such as the visualization of histograms, the application of filters, different image corrections, unsupervised classification and the computation of several environmental indices, was lacking. The new set of operations included in the PI2GIS plugin can be divided into three groups: pre-processing, processing, and classification procedures. The application was tested using a Landsat 8 OLI image of a northern area of Portugal.
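For reference, NDVI is a simple band ratio; a minimal numpy sketch, assuming reflectance arrays for the red and near-infrared bands (bands 4 and 5 on Landsat 8 OLI), is:

```python
import numpy as np

def ndvi(nir, red):
    """NDVI = (NIR - Red) / (NIR + Red)."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    denom = nir + red
    with np.errstate(invalid="ignore", divide="ignore"):
        out = (nir - red) / denom
    return np.where(denom == 0, 0.0, out)  # guard against empty pixels
```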
NASA Astrophysics Data System (ADS)
Illing, Sebastian; Schuster, Mareike; Kadow, Christopher; Kröner, Igor; Richling, Andy; Grieger, Jens; Kruschke, Tim; Lang, Benjamin; Redl, Robert; Schartner, Thomas; Cubasch, Ulrich
2016-04-01
MiKlip is a project for medium-term climate prediction funded by the German Federal Ministry of Education and Research (BMBF), and aims to create a model system that is able to provide reliable decadal climate forecasts. During the first project phase of MiKlip, the sub-project INTEGRATION, located at Freie Universität Berlin, developed a framework for scientific infrastructures (FREVA). More information about FREVA can be found in EGU2016-13060. An instance of this framework is used as the Central Evaluation System (CES) during the MiKlip project. Throughout the first project phase, various sub-projects developed over 25 analysis tools, so-called plugins, for the CES. The main focus of these plugins is on the evaluation and verification of decadal climate prediction data, but most plugins are not limited to this scope. They target a wide range of scientific questions, ranging from preprocessing tools like the "LeadtimeSelector", which creates lead-time dependent time-series from decadal hindcast sets, over tracking tools like the "Zykpak" plugin, which can objectively locate and track mid-latitude cyclones, to plugins like "MurCSS" or "SPECS", which calculate deterministic and probabilistic skill metrics. We also integrated some analyses from the Model Evaluation Tools (MET), which were developed at NCAR. We will show the theoretical background, technical implementation strategies, and some interesting results of the evaluation of the MiKlip Prototype decadal prediction system for a selected set of these tools.
WPBMB Entrez: An interface to NCBI Entrez for Wordpress.
Gohara, David W
2018-03-01
Research-oriented websites are an important means for the timely communication of information. These websites fall under a number of categories including research laboratories, training grant and program projects, and online service portals. Invariably there is content on a site, such as publication listings, that requires frequent updating. A number of content management systems exist to aid in the task of developing and managing a website, each with their strengths and weaknesses. One popular choice is Wordpress, a free, open source and actively developed application for the creation of web content. During a recent site redesign for our department, the need arose to ensure publications were up to date for each of the research labs and the department as a whole. Several plugins for Wordpress offer this type of functionality, but in many cases the plugins are either no longer maintained, are missing features that would require the use of several, possibly incompatible, plugins, or lack features for layout on a webpage. WPBMB Entrez was developed to address these needs. WPBMB Entrez utilizes a subset of the NCBI Entrez and RCSB databases to maintain up-to-date records of publications, and publication-related information, on Wordpress-based websites. The core functionality uses the same search query syntax as the NCBI Entrez site, including advanced query syntax. The plugin is extensible, allowing for rapid development and addition of new data sources as the need arises. WPBMB Entrez was designed to be easy to use, yet flexible enough to address more complex usage scenarios. Features of the plugin include an easy-to-use interface, design customization, multiple templates for displaying publication results, a caching mechanism to reduce page load times, support for multiple distinct queries and retrieval modes, and the ability to aggregate multiple queries into unified lists. Additionally, developer documentation is provided to aid in customization of the plugin. WPBMB Entrez is available at no cost, is open source and works with all recent versions of Wordpress. Copyright © 2017 Elsevier B.V. All rights reserved.
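The Entrez query mechanics the plugin builds on can be illustrated against the public E-utilities esearch endpoint; the following standard-library Python sketch demonstrates the search syntax and is not the Wordpress plugin's PHP code.

```python
import json
import urllib.parse
import urllib.request

BASE = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def pubmed_ids(query, retmax=20):
    """Run a PubMed search using standard Entrez query syntax; returns PMIDs."""
    params = urllib.parse.urlencode(
        {"db": "pubmed", "term": query, "retmode": "json", "retmax": retmax}
    )
    with urllib.request.urlopen(f"{BASE}?{params}") as resp:
        data = json.load(resp)
    return data["esearchresult"]["idlist"]

# print(pubmed_ids("Gohara DW[Author]"))
```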
2012-01-01
Background Graph drawing is an integral part of many systems biology studies, enabling visual exploration and mining of large-scale biological networks. While a number of layout algorithms are available in popular network analysis platforms, such as Cytoscape, it remains poorly understood how well their solutions reflect the underlying biological processes that give rise to the network connectivity structure. Moreover, visualizations obtained using conventional layout algorithms, such as those based on the force-directed drawing approach, may become uninformative when applied to larger networks with dense or clustered connectivity structure. Methods We implemented a modified layout plug-in, named Multilevel Layout, which applies the conventional layout algorithms within a multilevel optimization framework to better capture the hierarchical modularity of many biological networks. Using a wide variety of real life biological networks, we carried out a systematic evaluation of the method in comparison with other layout algorithms in Cytoscape. Results The multilevel approach provided both biologically relevant and visually pleasant layout solutions in most network types, hence complementing the layout options available in Cytoscape. In particular, it could improve drawing of large-scale networks of yeast genetic interactions and human physical interactions. In more general terms, the biological evaluation framework developed here enables one to assess the layout solutions from any existing or future graph drawing algorithm as well as to optimize their performance for a given network type or structure. Conclusions By making use of the multilevel modular organization when visualizing biological networks, together with the biological evaluation of the layout solutions, one can generate convenient visualizations for many network biology applications. PMID:22448851
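The multilevel idea, coarsen the graph, lay out the coarsest level, then project positions back and refine, can be sketched with NetworkX's force-directed layout; this illustrates the approach only and is not the plugin's implementation.

```python
import networkx as nx

def coarsen(G):
    """One coarsening level: greedily contract a maximal edge matching."""
    mapping, used = {}, set()
    for u, v in G.edges():
        if u not in used and v not in used:
            mapping[v] = u          # merge v into u
            used.update((u, v))
    H = nx.relabel_nodes(G, lambda n: mapping.get(n, n), copy=True)
    H.remove_edges_from(nx.selfloop_edges(H))
    return H, mapping

def multilevel_layout(G, levels=3, seed=0):
    graphs, mappings = [G], []
    for _ in range(levels):
        H, mapping = coarsen(graphs[-1])
        if H.number_of_nodes() == graphs[-1].number_of_nodes():
            break                   # no further coarsening possible
        graphs.append(H)
        mappings.append(mapping)
    pos = nx.spring_layout(graphs[-1], seed=seed)   # lay out coarsest graph
    for G_fine, mapping in zip(reversed(graphs[:-1]), reversed(mappings)):
        # inherit the merged node's position, then refine at this level
        init = {n: pos[mapping.get(n, n)] for n in G_fine}
        pos = nx.spring_layout(G_fine, pos=init, iterations=20, seed=seed)
    return pos

# pos = multilevel_layout(nx.les_miserables_graph())
```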
OntoCheck: verifying ontology naming conventions and metadata completeness in Protégé 4
2012-01-01
Background Although policy providers have outlined minimal metadata guidelines and naming conventions, ontologies of today still display inter- and intra-ontology heterogeneities in class labelling schemes and metadata completeness. This fact is at least partially due to missing or inappropriate tools. Software support can ease this situation and contribute to overall ontology consistency and quality by helping to enforce such conventions. Objective We provide a plugin for the Protégé Ontology editor to allow for easy checks on compliance with ontology naming conventions and metadata completeness, as well as curation in case of violations. Implementation In a requirement analysis, derived from a prior standardization approach carried out within the OBO Foundry, we investigate the capabilities needed for software tools to check, curate and maintain class naming conventions. A Protégé tab plugin was implemented accordingly using the Protégé 4.1 libraries. The plugin was tested on six different ontologies. Based on these test results, the plugin could be refined, also by the integration of new functionalities. Results The new Protégé plugin, OntoCheck, allows for ontology tests to be carried out on OWL ontologies. In particular the OntoCheck plugin helps to clean up an ontology with regard to lexical heterogeneity, i.e. enforcing naming conventions and metadata completeness, meeting most of the requirements outlined for such a tool. Found test violations can be corrected to foster consistency in entity naming and meta-annotation within an artefact. Once specified, check constraints like name patterns can be stored and exchanged for later re-use. Here we describe a first version of the software, illustrate its capabilities and use within running ontology development efforts and briefly outline improvements resulting from its application. Further, we discuss OntoCheck's capabilities in the context of related tools and highlight potential future expansions. Conclusions The OntoCheck plugin facilitates labelling error detection and curation, contributing to lexical quality assurance in OWL ontologies. Ultimately, we hope this Protégé extension will ease ontology alignments as well as lexical post-processing of annotated data and hence can increase overall secondary data usage by humans and computers. PMID:23046606
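Such a naming-convention check amounts to matching class labels against a stored pattern; a minimal Python sketch with an illustrative pattern (not OntoCheck's actual constraints) follows.

```python
import re

# Example convention: lowercase, space-separated words
# (the pattern is illustrative, not OntoCheck's default)
LABEL_PATTERN = re.compile(r"^[a-z][a-z0-9]*( [a-z0-9]+)*$")

def check_labels(labels):
    """Return the labels violating the naming convention."""
    return [lbl for lbl in labels if not LABEL_PATTERN.match(lbl)]

print(check_labels(["cell membrane", "Golgi_apparatus", "DNA repair"]))
# -> ['Golgi_apparatus', 'DNA repair']
```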
3D reconstruction of synapses with deep learning based on EM Images
NASA Astrophysics Data System (ADS)
Xiao, Chi; Rao, Qiang; Zhang, Dandan; Chen, Xi; Han, Hua; Xie, Qiwei
2017-03-01
Recently, due to the rapid development of the electron microscope (EM) with its high resolution, stacks delivered by EM can be used to analyze a variety of components that are critical to understanding brain function. Since synaptic study is essential in neurobiology and synapses can be analyzed in EM stacks, automated routines for the reconstruction of synapses based on EM images can become a very useful tool for analyzing large volumes of brain tissue and providing the ability to understand the mechanisms of the brain. In this article, we propose a novel automated method to realize 3D reconstruction of synapses for Automated Tape-collecting Ultramicrotome Scanning Electron Microscopy (ATUM-SEM) with deep learning. Unlike other reconstruction algorithms, which employ a classifier to segment synaptic clefts directly, we utilize a deep learning method together with a segmentation algorithm to obtain synaptic clefts and improve the accuracy of reconstruction. The proposed method contains five parts: (1) using a modified Moving Least Squares (MLS) deformation algorithm and Scale Invariant Feature Transform (SIFT) features to register adjacent sections, (2) adopting the Faster Region-based Convolutional Neural Network (Faster R-CNN) algorithm to detect synapses, (3) utilizing a screening method which takes context cues of synapses into consideration to reduce the false positive rate, (4) combining a practical morphology algorithm with a suitable fitting function to segment synaptic clefts and optimize their shape, and (5) applying a plugin in Fiji to show the final 3D visualization of synapses. Experimental results on ATUM-SEM images demonstrate the effectiveness of our proposed method.
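Registration step (1) can be approximated with SIFT keypoints and a RANSAC-fitted affine transform in OpenCV; this simplified stand-in omits the paper's MLS deformation and assumes 8-bit grayscale section images.

```python
import cv2
import numpy as np

def register_sections(fixed, moving):
    """Align `moving` to `fixed` via matched SIFT keypoints and a
    RANSAC-estimated affine transform (8-bit grayscale images)."""
    sift = cv2.SIFT_create()
    k1, d1 = sift.detectAndCompute(fixed, None)
    k2, d2 = sift.detectAndCompute(moving, None)
    pairs = cv2.BFMatcher(cv2.NORM_L2).knnMatch(d2, d1, k=2)
    good = [p[0] for p in pairs
            if len(p) == 2 and p[0].distance < 0.75 * p[1].distance]  # Lowe ratio
    src = np.float32([k2[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([k1[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    A, _ = cv2.estimateAffinePartial2D(src, dst, method=cv2.RANSAC)
    return cv2.warpAffine(moving, A, (fixed.shape[1], fixed.shape[0]))
```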
PyRosetta: a script-based interface for implementing molecular modeling algorithms using Rosetta.
Chaudhury, Sidhartha; Lyskov, Sergey; Gray, Jeffrey J
2010-03-01
PyRosetta is a stand-alone Python-based implementation of the Rosetta molecular modeling package that allows users to write custom structure prediction and design algorithms using the major Rosetta sampling and scoring functions. PyRosetta contains Python bindings to libraries that define Rosetta functions including those for accessing and manipulating protein structure, calculating energies and running Monte Carlo-based simulations. PyRosetta can be used in two ways: (i) interactively, using iPython and (ii) script-based, using Python scripting. Interactive mode contains a number of help features and is ideal for beginners while script-mode is best suited for algorithm development. PyRosetta has similar computational performance to Rosetta, can be easily scaled up for cluster applications and has been implemented for algorithms demonstrating protein docking, protein folding, loop modeling and design. PyRosetta is a stand-alone package available at http://www.pyrosetta.org under the Rosetta license which is free for academic and non-profit users. A tutorial, user's manual and sample scripts demonstrating usage are also available on the web site.
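A minimal usage sketch, assuming a recent PyRosetta distribution that exposes pose_from_sequence, get_fa_scorefxn and MonteCarlo at the package level (the API has shifted across releases, so treat this as illustrative, not the package's canonical example):

```python
import random
import pyrosetta

pyrosetta.init("-mute all")                      # start the Rosetta runtime
pose = pyrosetta.pose_from_sequence("AAAAAAAA")  # ideal-geometry poly-Ala pose
scorefxn = pyrosetta.get_fa_scorefxn()           # default full-atom score function
print("initial score:", scorefxn(pose))

# Random backbone perturbations accepted via Metropolis Monte Carlo
mc = pyrosetta.MonteCarlo(pose, scorefxn, 1.0)   # kT = 1.0
for _ in range(50):
    i = random.randint(1, pose.total_residue())
    pose.set_phi(i, pose.phi(i) + random.gauss(0, 10))
    pose.set_psi(i, pose.psi(i) + random.gauss(0, 10))
    mc.boltzmann(pose)                           # accept, or restore the pose
print("lowest score seen:", mc.lowest_score())
```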
DynamO: a free O(N) general event-driven molecular dynamics simulator.
Bannerman, M N; Sargant, R; Lue, L
2011-11-30
Molecular dynamics algorithms for systems of particles interacting through discrete or "hard" potentials are fundamentally different from the methods for continuous or "soft" potential systems. Although many software packages have been developed for continuous potential systems, software for discrete potential systems based on event-driven algorithms is relatively scarce and specialized. We present DynamO, a general event-driven simulation package, which displays the optimal O(N) asymptotic scaling of the computational cost with the number of particles N, rather than the O(N log N) scaling found in most standard algorithms. DynamO provides reference implementations of the best available event-driven algorithms. These techniques allow the rapid simulation of both complex and large (>10^6 particles) systems for long times. The performance of the program is benchmarked for elastic hard sphere systems, homogeneous cooling and sheared inelastic hard spheres, and equilibrium Lennard-Jones fluids. This software and its documentation are distributed under the GNU General Public License and can be freely downloaded from http://marcusbannerman.co.uk/dynamo. Copyright © 2011 Wiley Periodicals, Inc.
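The elementary event computation in such a package is the hard-sphere collision time, found by solving |r + v t| = sigma for the relative position r and relative velocity v; a minimal numpy sketch (not DynamO's C++ code) is:

```python
import numpy as np

def collision_time(r1, r2, v1, v2, sigma):
    """Earliest time at which two hard spheres with contact distance `sigma`
    collide, or None if they never do."""
    r = np.asarray(r2) - np.asarray(r1)   # relative position
    v = np.asarray(v2) - np.asarray(v1)   # relative velocity
    b = np.dot(r, v)
    if b >= 0:                            # moving apart
        return None
    vv = np.dot(v, v)
    disc = b * b - vv * (np.dot(r, r) - sigma ** 2)
    if disc < 0:                          # trajectories miss
        return None
    return (-b - np.sqrt(disc)) / vv

print(collision_time([0, 0, 0], [3, 0, 0], [0, 0, 0], [-1, 0, 0], 1.0))  # -> 2.0
```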
NASA Technical Reports Server (NTRS)
Buntine, Wray
1993-01-01
This paper introduces the IND Tree Package to prospective users. IND does supervised learning using classification trees. This learning task is a basic tool used in the development of diagnosis, monitoring and expert systems. The IND Tree Package was developed as part of a NASA project to semi-automate the development of data analysis and modelling algorithms using artificial intelligence techniques. The IND Tree Package integrates features from CART and C4 with newer Bayesian and minimum encoding methods for growing classification trees and graphs. The IND Tree Package also provides an experimental control suite on top. The newer features give improved probability estimates often required in diagnostic and screening tasks. The package comes with a manual, Unix 'man' entries, and a guide to tree methods and research. The IND Tree Package is implemented in C under Unix and was beta-tested at university and commercial research laboratories in the United States.
Myers, Owen D; Sumner, Susan J; Li, Shuzhao; Barnes, Stephen; Du, Xiuxia
2017-09-05
XCMS and MZmine 2 are two widely used software packages for preprocessing untargeted LC/MS metabolomics data. Both construct extracted ion chromatograms (EICs) and detect peaks from the EICs, the first two steps in the data preprocessing workflow. While both packages have performed admirably in peak picking, they also detect a problematic number of false positive EIC peaks and can also fail to detect real EIC peaks. The former and latter translate downstream into spurious and missing compounds and present significant limitations with most existing software packages that preprocess untargeted mass spectrometry metabolomics data. We seek to understand the specific reasons why XCMS and MZmine 2 find the false positive EIC peaks that they do and in what ways they fail to detect real compounds. We investigate differences of EIC construction methods in XCMS and MZmine 2 and find several problems in the XCMS centWave peak detection algorithm which we show are partly responsible for the false positive and false negative compound identifications. In addition, we find a problem with MZmine 2's use of centWave. We hope that a detailed understanding of the XCMS and MZmine 2 algorithms will allow users to work with them more effectively and will also help with future algorithmic development.
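EIC construction itself can be sketched as greedy m/z-tolerance tracing across scans; the following minimal illustration shows the step both packages perform before peak detection, and is not the XCMS or MZmine 2 code.

```python
def build_eics(scans, mz_tol=0.01):
    """Greedy EIC construction: each centroid joins the trace whose running
    mean m/z is within `mz_tol`, otherwise it opens a new trace.
    `scans` is a list of (retention_time, [(mz, intensity), ...])."""
    eics = []  # each: {"mz": running mean, "points": [(rt, mz, intensity)]}
    for rt, centroids in scans:
        for mz, inten in centroids:
            best = None
            for eic in eics:
                if abs(eic["mz"] - mz) <= mz_tol and (
                    best is None or abs(eic["mz"] - mz) < abs(best["mz"] - mz)
                ):
                    best = eic
            if best is None:
                eics.append({"mz": mz, "points": [(rt, mz, inten)]})
            else:
                best["points"].append((rt, mz, inten))
                mzs = [p[1] for p in best["points"]]
                best["mz"] = sum(mzs) / len(mzs)
    return eics
```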
ProtSqueeze: simple and effective automated tool for setting up membrane protein simulations.
Yesylevskyy, Semen O
2007-01-01
The major challenge in setting up membrane protein simulations is embedding the protein into the pre-equilibrated lipid bilayer. Several techniques were proposed to achieve optimal packing of the lipid molecules around the protein. However, all of them possess serious disadvantages, which limit their applicability and discourage the users of simulation packages from using them. In the present work, we analyzed existing approaches and proposed a new procedure of protein insertion into the lipid bilayer, which is implemented in the ProtSqueeze software. The advantages of ProtSqueeze are as follows: (1) the insertion algorithm is simple, understandable, and controllable; (2) the software can work with virtually any simulation package on virtually any platform; (3) no modification of the source code of the simulation package is needed; (4) the procedure of insertion is as automated as possible; (5) ProtSqueeze is distributed for free under a general public license. In this work, we present the architecture and the algorithm of ProtSqueeze and demonstrate its usage in case studies.
scarlet: Source separation in multi-band images by Constrained Matrix Factorization
NASA Astrophysics Data System (ADS)
Melchior, Peter; Moolekamp, Fred; Jerdee, Maximilian; Armstrong, Robert; Sun, Ai-Lei; Bosch, James; Lupton, Robert
2018-03-01
SCARLET performs source separation (aka "deblending") on multi-band images. It is geared towards optical astronomy, where scenes are composed of stars and galaxies, but it is straightforward to apply it to other imaging data. Separation is achieved through a constrained matrix factorization, which models each source with a Spectral Energy Distribution (SED) and a non-parametric morphology, or multiple such components per source. The code performs forced photometry (with PSF matching if needed) using an optimal weight function given by the signal-to-noise weighted morphology across bands. The approach works well if the sources in the scene have different colors and can be further strengthened by imposing various additional constraints/priors on each source. Because of its generic utility, this package provides a stand-alone implementation that contains the core components of the source separation algorithm. However, the development of this package is part of the LSST Science Pipeline; the meas_deblender package contains a wrapper to implement the algorithms here for the LSST stack.
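The underlying model, each source as the outer product of a per-band SED and a non-negative morphology, can be sketched with NMF-style multiplicative updates in numpy; this illustrates the factorization idea only and is not the scarlet API, which adds PSF matching and per-source constraints.

```python
import numpy as np

def fit_sources(cube, n_sources, n_iter=200, eps=1e-9):
    """Factorize a (bands, pixels) image cube as SED @ morphology with
    non-negativity, via standard multiplicative NMF updates."""
    B, P = cube.shape
    rng = np.random.default_rng(0)
    A = rng.random((B, n_sources))      # SEDs: one column per source
    S = rng.random((n_sources, P))      # morphologies: one row per source
    for _ in range(n_iter):
        A *= (cube @ S.T) / (A @ S @ S.T + eps)
        S *= (A.T @ cube) / (A.T @ A @ S + eps)
    return A, S

# e.g. a 5-band scene flattened to (bands, pixels):
# A, S = fit_sources(images.reshape(5, -1), n_sources=3)
```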
PyBoolNet: a python package for the generation, analysis and visualization of boolean networks.
Klarner, Hannes; Streck, Adam; Siebert, Heike
2017-03-01
The goal of this project is to provide a simple interface to working with Boolean networks. Emphasis is put on easy access to a large number of common tasks including the generation and manipulation of networks, attractor and basin computation, model checking and trap space computation, execution of established graph algorithms as well as graph drawing and layouts. PyBoolNet is a Python package for working with Boolean networks that supports simple access to model checking via NuSMV, standard graph algorithms via NetworkX and visualization via dot. In addition, state-of-the-art attractor computation exploiting Potassco ASP is implemented. The package is function-based and uses only native Python and NetworkX data types. https://github.com/hklarner/PyBoolNet. hannes.klarner@fu-berlin.de. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
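The objects PyBoolNet manipulates can be illustrated independently of its API; a minimal plain-Python sketch of a Boolean network and its synchronous state transitions follows.

```python
from itertools import product

# Toy Boolean network: one update function per node
funcs = {
    "x": lambda s: s["y"],
    "y": lambda s: s["x"] and not s["z"],
    "z": lambda s: not s["z"],
}

def sync_step(state):
    """Synchronous update: all nodes change simultaneously."""
    return {n: f(state) for n, f in funcs.items()}

# Exhaustive synchronous state transition graph
states = [dict(zip(funcs, bits)) for bits in product([False, True], repeat=len(funcs))]
for s in states:
    t = sync_step(s)
    print({k: int(v) for k, v in s.items()}, "->", {k: int(v) for k, v in t.items()})
```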
SPOTting model parameters using a ready-made Python package
NASA Astrophysics Data System (ADS)
Houska, Tobias; Kraft, Philipp; Breuer, Lutz
2015-04-01
The selection and parameterization of reliable process descriptions in ecological modelling is driven by several uncertainties. The procedure is highly dependent on various criteria, like the algorithm used, the likelihood function selected and the definition of the prior parameter distributions. A wide variety of tools has been developed in the past decades to optimize parameters. Some of the tools are closed source. Due to this, the choice of a specific parameter estimation method is sometimes more dependent on its availability than on its performance. A toolbox with a large set of methods can support users in deciding about the most suitable method. Further, it enables testing and comparing different methods. We developed SPOT (Statistical Parameter Optimization Tool), an open source Python package containing a comprehensive set of modules to analyze and optimize parameters of (environmental) models. SPOT comes along with a selected set of algorithms for parameter optimization and uncertainty analyses (Monte Carlo, MC; Latin Hypercube Sampling, LHS; Maximum Likelihood, MLE; Markov Chain Monte Carlo, MCMC; Shuffled Complex Evolution, SCE-UA; Differential Evolution Markov Chain, DE-MCZ), together with several likelihood functions (Bias, (log-) Nash-Sutcliffe model efficiency, Correlation Coefficient, Coefficient of Determination, Covariance, (Decomposed-, Relative-, Root-) Mean Squared Error, Mean Absolute Error, Agreement Index) and prior distributions (Binomial, Chi-Square, Dirichlet, Exponential, Laplace, (log-, multivariate-) Normal, Pareto, Poisson, Cauchy, Uniform, Weibull) to sample from. The model-independent structure makes it suitable for analyzing a wide range of applications. We apply all algorithms of the SPOT package in three different case studies. Firstly, we investigate the response of the Rosenbrock function, where the MLE algorithm shows its strengths. Secondly, we study the Griewank function, which has a challenging response surface for optimization methods. Here we see simple algorithms like MCMC struggling to find the global optimum of the function, while algorithms like SCE-UA and DE-MCZ show their strengths. Thirdly, we apply an uncertainty analysis to a one-dimensional physically based hydrological model built with the Catchment Modelling Framework (CMF). The model is driven by meteorological and groundwater data from a Free Air Carbon Enrichment (FACE) experiment in Linden (Hesse, Germany). Simulation results are evaluated with measured soil moisture data. We search for optimal parameter sets of the van Genuchten-Mualem function and find different, equally optimal solutions with some of the algorithms. The case studies reveal that the implemented SPOT methods work sufficiently well. They further show the benefit of having one tool at hand that includes a number of parameter search methods, likelihood functions and a priori parameter distributions within one platform-independent package.
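As a concrete illustration of the simplest method in such a toolbox, the sketch below runs plain Monte Carlo sampling on the Rosenbrock function mentioned above; it uses numpy only and is not the SPOT API.

```python
import numpy as np

# Plain Monte Carlo: draw from uniform priors, keep the best sample
rng = np.random.default_rng(42)
samples = rng.uniform(low=[-2, -2], high=[2, 2], size=(100_000, 2))
x, y = samples[:, 0], samples[:, 1]
losses = (1 - x) ** 2 + 100 * (y - x ** 2) ** 2   # Rosenbrock function

best = samples[np.argmin(losses)]
print(f"best (x, y) = {best}, loss = {losses.min():.4f}")  # optimum is (1, 1)
```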
EVALUATION OF REGISTRATION, COMPRESSION AND CLASSIFICATION ALGORITHMS
NASA Technical Reports Server (NTRS)
Jayroe, R. R.
1994-01-01
Several types of algorithms are generally used to process digital imagery such as Landsat data. The most commonly used algorithms perform the task of registration, compression, and classification. Because there are different techniques available for performing registration, compression, and classification, imagery data users need a rationale for selecting a particular approach to meet their particular needs. This collection of registration, compression, and classification algorithms was developed so that different approaches could be evaluated and the best approach for a particular application determined. Routines are included for six registration algorithms, six compression algorithms, and two classification algorithms. The package also includes routines for evaluating the effects of processing on the image data. This collection of routines should be useful to anyone using or developing image processing software. Registration of image data involves the geometrical alteration of the imagery. Registration routines available in the evaluation package include image magnification, mapping functions, partitioning, map overlay, and data interpolation. The compression of image data involves reducing the volume of data needed for a given image. Compression routines available in the package include adaptive differential pulse code modulation, two-dimensional transforms, clustering, vector reduction, and picture segmentation. Classification of image data involves analyzing the uncompressed or compressed image data to produce inventories and maps of areas of similar spectral properties within a scene. The classification routines available include a sequential linear technique and a maximum likelihood technique. The choice of the appropriate evaluation criteria is quite important in evaluating the image processing functions. The user is therefore given a choice of evaluation criteria with which to investigate the available image processing functions. All of the available evaluation criteria basically compare the observed results with the expected results. For the image reconstruction processes of registration and compression, the expected results are usually the original data or some selected characteristics of the original data. For classification processes the expected result is the ground truth of the scene. Thus, the comparison process consists of determining what changes occur in processing, where the changes occur, how much change occurs, and the amplitude of the change. The package includes evaluation routines for performing such comparisons as average uncertainty, average information transfer, chi-square statistics, multidimensional histograms, and computation of contingency matrices. This collection of routines is written in FORTRAN IV for batch execution and has been implemented on an IBM 360 computer with a central memory requirement of approximately 662K of 8 bit bytes. This collection of image processing and evaluation routines was developed in 1979.
Comparison of particle tracking algorithms in commercial CFD packages: sedimentation and diffusion.
Robinson, Risa J; Snyder, Pam; Oldham, Michael J
2007-05-01
Computational fluid dynamics (CFD) modeling software has enabled microdosimetry patterns of inhaled toxins and toxicants to be predicted and visualized, and is being used in inhalation toxicology and risk assessment. These predicted microdosimetry patterns in airway structures are derived from predicted airflow patterns within these airways and from the particle tracking algorithms used in CFD software packages. Although these commercial CFD codes have been tested for accuracy under various conditions, they have not been well tested for respiratory flows in general, nor has the accuracy of their particle tracking algorithms been well studied. In this study, three software packages, Fluent Discrete Phase Model (DPM), Fluent Fine Particle Model (FPM), and ANSYS CFX, were evaluated. Sedimentation and diffusion were each isolated in a straight tube geometry and tested for accuracy. A range of flow rates corresponding to adult low activity (minute ventilation = 10 L/min) and to heavy exertion (minute ventilation = 60 L/min) was tested by varying the range of dimensionless diffusion and sedimentation parameters found using the Weibel symmetric 23-generation lung morphology. Numerical results for fully developed parabolic and uniform (slip) profiles were compared, respectively, to the analytical sedimentation solutions of Pich (1972) and Yu (1977). The equations of Schum and Yeh (1980) for sedimentation were also compared. Numerical results for diffusional deposition were compared to the analytical solutions of Ingham (1975) for parabolic and uniform profiles. Significant differences were found among the various CFD software packages and between numerical and analytical solutions. Therefore, it is prudent to validate CFD predictions against analytical solutions in an idealized geometry before tackling the complex geometries of the respiratory tract.
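For small particles, the physics isolated in the sedimentation tests reduces to the Stokes terminal settling velocity; a minimal sketch using the standard textbook formula, with assumed air properties as defaults, is:

```python
def settling_velocity(d_p, rho_p, rho_f=1.2, mu=1.8e-5, Cc=1.0, g=9.81):
    """Stokes terminal settling velocity (m/s) for a particle of diameter
    d_p (m) and density rho_p (kg/m^3) in air; Cc is the Cunningham slip
    correction, relevant below roughly 1 um."""
    return (rho_p - rho_f) * g * d_p ** 2 * Cc / (18 * mu)

# 1 um and 10 um unit-density particles
for d in (1e-6, 10e-6):
    print(f"d = {d * 1e6:4.1f} um  v_s = {settling_velocity(d, 1000):.2e} m/s")
```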
NASA Technical Reports Server (NTRS)
Tilmes, Curt A.; Fleig, Albert J.
2008-01-01
NASA's traditional science data processing systems have focused on specific missions, providing data access, processing and services to the funded science teams of those specific missions. Recently NASA has been modifying this stance, changing the focus from Missions to Measurements. Where a specific Mission has a discrete beginning and end, the Measurement considers long-term data continuity across multiple missions. Total Column Ozone, a critical measurement of atmospheric composition, has been monitored for decades on a series of Total Ozone Mapping Spectrometer (TOMS) instruments. Some important European missions also monitor ozone, including the Global Ozone Monitoring Experiment (GOME) and SCIAMACHY. With the U.S./European cooperative launch of the Dutch Ozone Monitoring Instrument (OMI) on NASA's Aura satellite, and the GOME-2 instrument on MetOp, the ozone monitoring record has been further extended. In conjunction with the U.S. Department of Defense (DoD) and the National Oceanic and Atmospheric Administration (NOAA), NASA is now preparing to evaluate data and algorithms for the next-generation Ozone Mapping and Profiler Suite (OMPS), which will launch on the National Polar-orbiting Operational Environmental Satellite System (NPOESS) Preparatory Project (NPP) in 2010. NASA is constructing the Science Data Segment (SDS), which comprises several elements to evaluate the various NPP data products and algorithms. The NPP SDS Ozone Product Evaluation and Test Element (PEATE) will build on the heritage of the TOMS and OMI mission-based processing systems. The overall measurement-based system that will encompass these efforts is the Atmospheric Composition Processing System (ACPS). We have extended the system to include access to publicly available data sets from other instruments where feasible, including non-NASA missions as appropriate. The heritage system was largely monolithic, providing a very controlled processing flow from data ingest of satellite data to the ultimate archive of specific operational data products. The ACPS allows more open access with standard protocols including HTTP, SOAP/XML, RSS and various REST incarnations. External entities can be granted access to various modules within the system, including an extended data archive, metadata searching, production planning and processing. Data access is provided with very fine-grained access control. It is possible to easily designate certain datasets as being available to the public, or restricted to groups of researchers, or limited strictly to the originator. This can be used, for example, to release one's best validated data to the public, but restrict the "new version" of data processed with a new, unproven algorithm until it is ready. Similarly, the system can provide access to algorithms, both as modifiable source code (where possible) and as fully integrated executable Algorithm Plugin Packages (APPs). This enables researchers to download publicly released versions of the processing algorithms and easily reproduce the processing remotely, while interacting with the ACPS. The algorithms can be modified, allowing better experimentation and rapid improvement. The modified algorithms can be easily integrated back into the production system for large-scale bulk processing to evaluate improvements. The system includes complete provenance tracking of algorithms, data and the entire processing environment.
The origin of any data or algorithms is recorded and the entire history of the processing chains is stored such that a researcher can understand the entire data flow. Provenance is captured in a form suitable for the system to guarantee scientific reproducibility of any data product it distributes, even in cases where the physical data products themselves have been deleted due to space constraints. We are currently working on Semantic Web ontologies for representing the various provenance information. A new web site focusing on consolidating information about the measurement, processing system, and data access has been established to encourage interaction with the overall scientific community. We will describe the system, its data processing capabilities, and the methods the community can use to interact with the standard interfaces of the system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Singer, Mark
This presentation includes data captured by the National Renewable Energy Laboratory (NREL) to support the U.S. Department of Energy's Vehicle Technologies Office (VTO) research efforts. The data capture consumer views on fuel economy, plug-in electric vehicle battery range, and willingness to pay for advanced vehicle technologies.
40 CFR 600.302-12 - Fuel economy label-general provisions.
Code of Federal Regulations, 2012 CFR
2012-07-01
... includes hybrid electric vehicles that do not have plug-in capability. Include a logo corresponding to the..., include a fuel pump logo and the designation “E85”. (iii) Identify plug-in hybrid electric vehicles as... fuel pump logo as specified in paragraph (b)(3)(i) of this section and an electric plug logo to the...
40 CFR 600.302-12 - Fuel economy label-general provisions.
Code of Federal Regulations, 2014 CFR
2014-07-01
... includes hybrid electric vehicles that do not have plug-in capability. Include a logo corresponding to the..., include a fuel pump logo and the designation “E85”. (iii) Identify plug-in hybrid electric vehicles as... fuel pump logo as specified in paragraph (b)(3)(i) of this section and an electric plug logo to the...
Alternative Fuels Data Center: Plug-In Hybrid Electric Vehicles
Plug-in hybrid electric vehicles use electricity from the grid to run the vehicle some or all of the time, in combination with another propulsion source, and produce varying levels of emissions depending on the electricity source. There are several light-duty PHEVs commercially available.
Alternative Fuels Data Center: Los Angeles Saves With Hybrid and Plug-In
Households' Stories of Their Encounters with a Plug-In Hybrid Electric Vehicle
ERIC Educational Resources Information Center
Caperello, Nicolette D.; Kurani, Kenneth S.
2012-01-01
One way to progress toward greenhouse gas reductions is for people to drive plug-in hybrid electric vehicles (PHEVs). Households in this study participated in a 4- to 6-week PHEV driving trial. A narrative of each household's encounter with the PHEV was constructed by the researchers from multiple in-home interviews, questionnaires completed by…
Alternative Fuels Data Center: Plug-In Vehicles to Harness Renewable Energy
The state has partnered with the U.S. Department of Energy through the Hawaii Clean Energy Initiative to support plug-in electric vehicle adoption. Honolulu Clean Cities supports the Hawaii Clean Energy Initiative, a partnership between DOE and the state of Hawaii.
DeJongh, Matthew; Bockstege, Benjamin; Frybarger, Paul; Hazekamp, Nicholas; Kammeraad, Joshua; McGeehan, Travis
2012-01-01
Summary: CytoSEED is a Cytoscape plugin for viewing, manipulating and analyzing metabolic models created using the Model SEED. The CytoSEED plugin enables users of the Model SEED to create informative visualizations of the reaction networks generated for their organisms of interest. These visualizations are useful for understanding organism-specific biochemistry and for highlighting the results of flux variability analysis experiments. Availability and Implementation: Freely available for download on the web at http://sourceforge.net/projects/cytoseed/. Implemented in Java SE 6 and supported on all platforms that support Cytoscape. Contact: dejongh@hope.edu Supplementary information: Installation instructions, a tutorial, and full-size figures are available at http://www.cs.hope.edu/cytoseed/. PMID:22210867
Integrating R with GIS for innovative geocomputing - the examples of RQGIS and RSAGA
NASA Astrophysics Data System (ADS)
Muenchow, Jannes; Schratz, Patrick; Bangs, Donovan; Brenning, Alexander
2017-04-01
While Geographical information systems (GIS) are good at efficiently manipulating and processing large amounts of geospatial data, the programming language R excels in (geo-)statistical analyses. Thus, bringing GIS algorithms to the R console combines the best of the two worlds, and paves the way for innovative geocomputing. To promote this approach, we will contrast the new RQGIS package with the established RSAGA package in terms of architecture, functionality and ease-of-use. Both packages use already existing Application Programming Interfaces (API), namely the QGIS Python API and SAGA API, to access GIS functionality from within R. Overall, RQGIS has the advantage of providing a unified interface to several GIS toolboxes (GRASS, SAGA, GDAL, etc.) bringing more than 1000 geocomputing algorithms to the R console (though only a subset of the full functionality of a specific third-party provider might be available). To further support the unified interface, QGIS automatically converts the input data into the formats supported by the respective third-party module. Moreover, RQGIS is easier to use than RSAGA due to special convenience functions (open_help, get_args_man, run_qgis). Nevertheless, the experienced SAGA user will most likely prefer the RSAGA package since it lets the user access all SAGA algorithms for a wide range of SAGA versions (currently 2.0.4 - 2.2.3). Additionally, RSAGA includes numerous user-friendly wrapper functions with arguments and meaningful default values (e.g., rsaga.slope). What is more, RSAGA provides the user with special geocomputing R functions, i.e. functions which solely run in R without using SAGA in the background (e.g., pick.from.ascii.grid, grid.predict and multi.local.function). To demonstrate the advantages of each package, we will derive terrain attributes from digital elevation models to model species richness and landslide susceptibility using non-linear generalized linear or generalized additive models. In the end, the choice of RQGIS or RSAGA depends on the user's preferences, expertise and tasks at hand. But both packages will benefit anyone working with large spatio-temporal data in R.
Capturing User Reading Behaviors for Personalized Document Summarization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, Songhua; Jiang, Hao; Lau, Francis
2011-01-01
We propose a new personalized document summarization method that observes a user's personal reading preferences. These preferences are inferred from the user's reading behaviors, including facial expressions, gaze positions, and reading durations that were captured during the user's past reading activities. We compare the performance of our algorithm with that of a few peer algorithms and software packages. The results of our comparative study show that our algorithm produces superior personalized document summaries, in that the summaries it generates better satisfy a user's personal preferences.
NASA Astrophysics Data System (ADS)
Lemmens, R.; Maathuis, B.; Mannaerts, C.; Foerster, T.; Schaeffer, B.; Wytzisk, A.
2009-12-01
This paper presents easily accessible, integrated web-based analysis of satellite images with plug-in based open source software. The paper is targeted at both users and developers of geospatial software. Guided by a use case scenario, we describe the ILWIS software and its toolbox to access satellite images through the GEONETCast broadcasting system. The last two decades have shown a major shift from stand-alone software systems to networked ones, often client/server applications using distributed geo-(web-)services. This allows organisations to combine, without much effort, their own data with remotely available data and processing functionality. Key to this integrated spatial data analysis is low-cost access to data from within user-friendly and flexible software. Web-based open source software solutions are increasingly a powerful option for developing countries. The Integrated Land and Water Information System (ILWIS) is a PC-based GIS and remote sensing software package, comprising a complete suite of image processing, spatial analysis and digital mapping tools, and was developed as commercial software from the early nineties onwards. Recent project efforts have migrated ILWIS into a modular, plug-in-based open source software package, and provide web-service support for OGC-based web mapping and processing. The core objective of the ILWIS Open source project is to provide a maintainable framework for researchers and software developers to implement training components, scientific toolboxes and (web-) services. The latest plug-ins have been developed for multi-criteria decision making, water resources analysis and spatial statistics analysis. The development of this framework has been carried out since 2007 in the context of 52°North, which is an open initiative that advances the development of cutting-edge open source geospatial software, using the GPL license. GEONETCast, as part of the emerging Global Earth Observation System of Systems (GEOSS), puts essential environmental data at the fingertips of users around the globe. This user-friendly and low-cost information dissemination provides global information as a basis for decision-making in a number of critical areas, including public health, energy, agriculture, weather, water, climate, natural disasters and ecosystems. GEONETCast makes available satellite images via Digital Video Broadcast (DVB) technology. An OGC WMS interface and plug-ins which convert GEONETCast data streams allow an ILWIS user to integrate various distributed data sources with data stored locally on his machine. Our paper describes a use case in which ILWIS is used with GEONETCast satellite imagery for decision making processes in Ghana. We also explain how the ILWIS software can be extended with additional functionality by means of building plug-ins, and unfold our plans to implement other OGC standards, such as WCS and WPS, in the same context. Especially the latter can be seen as a major step forward in terms of moving well-proven desktop-based processing functionality to the web. This enables the embedding of ILWIS functionality in Spatial Data Infrastructures or even its execution in scalable and on-demand cloud computing environments.
Free software for performing physical analysis of systems for digital radiography and mammography.
Donini, Bruno; Rivetti, Stefano; Lanconelli, Nico; Bertolini, Marco
2014-05-01
In this paper, the authors present free software for assisting users in the physical characterization of x-ray digital systems and in image quality checks. The program was developed as a plugin for the well-known public-domain suite ImageJ. The software can assist users in calculating various physical parameters such as the response curve (also termed the signal transfer property), the modulation transfer function (MTF), the noise power spectrum (NPS), and the detective quantum efficiency (DQE). It also includes the computation of some image quality checks: defective pixel analysis, uniformity, dark analysis, and lag. The software was made available in 2009 and has been used during the last couple of years by many users, who gave us valuable feedback for improving its usability. It was tested for the physical characterization of several clinical systems for digital radiography and mammography, and various published papers have made use of the outcomes of the plugin. This software is potentially beneficial to a variety of users working in hospitals and radiological departments, such as medical physicists, physicians, and engineers. The plugin, together with a brief user manual, is freely available online (www.medphys.it/downloads.htm). With our plugin, users can estimate all three of the most important parameters used for physical characterization (MTF, NPS, and DQE). The plugin can run on any operating system equipped with the ImageJ suite. The authors validated the software by comparing MTF and NPS curves on a common set of images with those obtained with other dedicated programs, achieving very good agreement.
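The NPS computation can be sketched from its standard definition (IEC 62220-1 style): average the squared DFT magnitude of mean-subtracted flat-field ROIs and scale by pixel area over ROI size. A minimal numpy illustration, not the plugin's Java code, follows.

```python
import numpy as np

def nps_2d(flat_rois, px=0.1):
    """2D noise power spectrum from a stack of flat-field ROIs:
    NPS(u, v) = (px^2 / (Nx * Ny)) * <|DFT(ROI - mean)|^2>, px in mm."""
    rois = np.asarray(flat_rois, dtype=np.float64)
    n, ny, nx = rois.shape
    acc = np.zeros((ny, nx))
    for roi in rois:
        acc += np.abs(np.fft.fft2(roi - roi.mean())) ** 2
    return (px * px / (nx * ny)) * acc / n

# spatial frequencies along one axis (cycles/mm): np.fft.fftfreq(nx, d=px)
```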
Plug Your Users into Library Resources with OpenSearch Plug-Ins
ERIC Educational Resources Information Center
Baker, Nicholas C.
2007-01-01
To bring the library catalog and other online resources right into users' workspace quickly and easily without needing much more than a short XML file, the author, a reference and Web services librarian at Williams College, learned to build and use OpenSearch plug-ins. OpenSearch is a set of simple technologies and standards that allows the…
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
To assist federal agencies with the transition to plug-in electric vehicles (PEVs), including battery electric vehicles (BEVs) and plug-in hybrid electric vehicles (PHEVs), FEMP offers technical guidance on electric vehicle supply equipment (EVSE) installations and site-specific planning through partnerships with the National Renewable Energy Laboratory’s EVSE Tiger Teams.
DOE Office of Scientific and Technical Information (OSTI.GOV)
This software is a plug-in that interfaces between Phoenix Integration's Model Center and Base SAS 9.2 applications. The end use of the plug-in is to link input and output data that reside in SAS tables or MS SQL to and from "legacy" software programs without recoding. The potential end users are those who need to run legacy code and want data stored in a SQL database.
Code of Federal Regulations, 2013 CFR
2013-07-01
40 CFR § 600.116-12 — Special procedures related to electric vehicles, hybrid electric vehicles, and plug-in hybrid electric vehicles (Protection of Environment, 2013-07-01). The section addresses … technology under § 86.1870-12, and requires the measurement of electrical current (in amps) flowing into the…
NASA Astrophysics Data System (ADS)
Xiao, Jingjie
A key hurdle for implementing real-time pricing of electricity is a lack of consumer response. Solutions to overcome the hurdle include energy management systems that automatically optimize household appliance usage, such as plug-in hybrid electric vehicle charging (and discharging with vehicle-to-grid), via two-way communication with the grid. Real-time pricing, combined with household automation devices, has the potential to accommodate an increasing penetration of plug-in hybrid electric vehicles. In addition, the intelligent energy controller on the consumer side can help increase the utilization rate of intermittent renewable resources, as demand can be managed to match the output profile of renewables, thus making intermittent resources such as wind and solar more economically competitive in the long run. One of the main goals of this dissertation is to present how real-time retail pricing, aided by control automation devices, can be integrated into the wholesale electricity market under various uncertainties through approximate dynamic programming. What distinguishes this study from the existing work in the literature is that wholesale electricity prices are endogenously determined as we solve a system operator's economic dispatch problem on an hourly basis over the entire optimization horizon. This modeling and algorithm framework allows a feedback loop between electricity prices and electricity consumption to be fully captured. While we are interested in a near-optimal solution using approximate dynamic programming, deterministic linear programming benchmarks are used to demonstrate the quality of our solutions. The other goal of the dissertation is to use this framework to provide numerical evidence for the debate on whether real-time pricing is superior to the current flat rate structure in terms of both economic and environmental impacts. For this purpose, the modeling and algorithm framework is tested on a large-scale test case with hundreds of power plants, based on data available for California, making our findings useful for policy makers, system operators and utility companies seeking a concrete understanding of the scale of the impact of real-time pricing.
The swiss army knife of job submission tools: grid-control
NASA Astrophysics Data System (ADS)
Stober, F.; Fischer, M.; Schleper, P.; Stadie, H.; Garbers, C.; Lange, J.; Kovalchuk, N.
2017-10-01
grid-control is a lightweight and highly portable open source submission tool that supports all common workflows in high energy physics (HEP). It has been used by a sizeable number of HEP analyses to process tasks that sometimes consist of up to 100k jobs. grid-control is built around a powerful plugin and configuration system, that allows users to easily specify all aspects of the desired workflow. Job submission to a wide range of local or remote batch systems or grid middleware is supported. Tasks can be conveniently specified through the parameter space that will be processed, which can consist of any number of variables and data sources with complex dependencies on each other. Dataset information is processed through a configurable pipeline of dataset filters, partition plugins and partition filters. The partition plugins can take the number of files, size of the work units, metadata or combinations thereof into account. All changes to the input datasets or variables are propagated through the processing pipeline and can transparently trigger adjustments to the parameter space and the job submission. While the core functionality is completely experiment independent, full integration with the CMS computing environment is provided by a small set of plugins.
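The parameter-space idea at the heart of grid-control can be illustrated in a few lines of Python. The sketch below is conceptual only: the variable names and the derived TAG variable are hypothetical, and grid-control's real parameter space is declared in its configuration files rather than in code.

```python
import itertools

# Hypothetical task variables; grid-control expands such variables into jobs.
parameters = {
    "DATASET": ["/data/runA", "/data/runB"],   # placeholder dataset paths
    "SEED": [1, 2, 3],
    "ENERGY": [7, 13],                         # placeholder energies in TeV
}

def expand(params):
    """Yield one job dict per point in the cross product of all variables."""
    keys = list(params)
    for combo in itertools.product(*(params[k] for k in keys)):
        job = dict(zip(keys, combo))
        # A dependent variable derived from the others, mimicking the kind of
        # inter-variable dependencies the abstract mentions:
        job["TAG"] = f'{job["ENERGY"]}TeV_seed{job["SEED"]}'
        yield job

for i, job in enumerate(expand(parameters)):
    print(i, job)   # 2 datasets x 3 seeds x 2 energies = 12 jobs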
Infrastructure for Rapid Development of Java GUI Programs
NASA Technical Reports Server (NTRS)
Jones, Jeremy; Hostetter, Carl F.; Wheeler, Philip
2006-01-01
The Java Application Shell (JAS) is a software framework that accelerates the development of Java graphical-user-interface (GUI) application programs by enabling the reuse of common, proven GUI elements, as distinguished from writing custom code for GUI elements. JAS is a software infrastructure upon which Java interactive application programs and graphical user interfaces (GUIs) for those programs can be built as sets of plug-ins. JAS provides an application-programming interface that is extensible by application-specific plug-ins that describe and encapsulate both specifications of a GUI and application-specific functionality tied to the specified GUI elements. The desired GUI elements are specified in Extensible Markup Language (XML) descriptions instead of in compiled code. JAS reads and interprets these descriptions, then creates and configures a corresponding GUI from a standard set of generic, reusable GUI elements. These elements are then attached (again, according to the XML descriptions) to application-specific compiled code and scripts. An application program constructed with JAS as its core can be extended by writing new plug-ins and replacing existing plug-ins. Thus, JAS solves many problems that Java programmers generally solve anew for each project, thereby reducing development and testing time.
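The pattern JAS embodies — GUI structure declared in XML, interpreted at runtime, and bound to application code by name — can be sketched compactly. The example below uses Python's tkinter purely to illustrate the idea; the XML vocabulary (label, entry, button, action) is invented for this sketch and is not JAS's actual schema.

```python
import tkinter as tk
import xml.etree.ElementTree as ET

# Hypothetical GUI description: widgets are declared, not hard-coded.
GUI_XML = """
<gui title="Demo">
  <label text="Input file:"/>
  <entry id="path"/>
  <button text="Run" action="run"/>
</gui>
"""

def build_gui(xml_text, actions):
    """Interpret the XML description and build the corresponding widgets,
    binding button actions to callables supplied by the application."""
    root_el = ET.fromstring(xml_text)
    win = tk.Tk()
    win.title(root_el.get("title", ""))
    named = {}
    for el in root_el:
        if el.tag == "label":
            tk.Label(win, text=el.get("text", "")).pack()
        elif el.tag == "entry":
            w = tk.Entry(win)
            w.pack()
            named[el.get("id")] = w
        elif el.tag == "button":
            tk.Button(win, text=el.get("text", ""),
                      command=actions[el.get("action")]).pack()
    return win, named

win, widgets = build_gui(GUI_XML, {"run": lambda: print(widgets["path"].get())})
win.mainloop()
```

Changing the GUI then means editing the XML, not recompiling, which is precisely the development-time saving the abstract claims.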
StackSplit - a plugin for multi-event shear wave splitting analyses in SplitLab
NASA Astrophysics Data System (ADS)
Grund, Michael
2017-08-01
SplitLab is a powerful and widely used tool for analysing seismological shear wave splitting of single event measurements. However, in many cases, especially for temporary station deployments close to the noisy seaside, on the ocean bottom, or for recordings affected by strong anthropogenic noise, only multi-event approaches provide stable and reliable splitting results. In order to extend the original SplitLab environment for such analyses, I present the StackSplit plugin, which can easily be implemented within the well-accepted main program. StackSplit grants easy access to several different analysis approaches within SplitLab, including a new multiple-waveform-based inversion method as well as the most established standard stacking procedures. The possibility to switch between different analysis approaches at any time allows for the most flexible processing of individual multi-event splitting measurements for a single recording station. Beyond the functions provided by the plugin, no other external program is needed for the multi-event analyses, since StackSplit performs within the available SplitLab structure, which is based on MATLAB. The effectiveness and use of this plugin is demonstrated with data from a long-running seismological recording station in Finland.
The deployment of a large scale object store at the RAL Tier-1
NASA Astrophysics Data System (ADS)
Dewhurst, A.; Johnson, I.; Adams, J.; Canning, B.; Vasilakakos, G.; Packer, A.
2017-10-01
Since 2014, the RAL Tier-1 has been working on deploying a Ceph backed object store. The aim is to replace Castor for disk-only storage. This new service must be scalable to meet the data demands of the LHC to 2020 and beyond. As well as offering access protocols the LHC experiments currently use, it must also provide industry standard access protocols. In order to keep costs down the service must use erasure coding rather than replication to ensure data reliability. This paper will present details of the storage service setup, which has been named Echo, as well as the experience gained from running it. The RAL Tier-1 has also been developing XrootD and GridFTP plugins for Ceph. Both plugins are built on top of the same libraries that write striped data into Ceph and therefore data written by one protocol will be accessible by the other. In the long term we hope the LHC experiments will migrate to industry standard protocols, therefore these plugins will only provide the features needed by the LHC experiments. This paper will report on the development and testing of these plugins.
NASA Astrophysics Data System (ADS)
Yen, J. L.; Kremer, P.; Amin, N.; Fung, J.
1989-05-01
The Department of National Defence (Canada) has been conducting studies into multi-beam adaptive arrays for extremely high frequency (EHF) frequency-hopped signals. A three-beam 43 GHz adaptive antenna and a beam control processor are under development. An interactive software package for the operation of the array, capable of applying different control algorithms, is being written. A maximum signal-to-jammer-plus-noise ratio (SJNR) criterion was found to provide superior performance in preventing degradation of user signals in the presence of nearby jammers. A new fast algorithm using a modified conjugate gradient approach was found to be a very efficient way to implement anti-jamming arrays based on the maximum SJNR criterion. The present study was intended to refine and simplify this algorithm and to implement it on an experimental array for real-time evaluation of anti-jamming performance. A three-beam adaptive array was used. A simulation package was used in the evaluation of multi-beam systems using more than three beams and different user-jammer scenarios. An attempt to further reduce the computation burden through continued analysis of maximum SJNR met with limited success. A method to acquire and track an incoming laser beam is proposed.
NASA Astrophysics Data System (ADS)
Yen, J. L.; Kremer, P.; Fung, J.
1990-05-01
The Department of National Defence (Canada) has been conducting studies into multi-beam adaptive arrays for extremely high frequency (EHF) frequency-hopped signals. A three-beam 43 GHz adaptive antenna and a beam control processor are under development. An interactive software package for the operation of the array, capable of applying different control algorithms, is being written. A maximum signal-to-jammer-plus-noise ratio (SJNR) criterion has been found to provide superior performance in preventing degradation of user signals in the presence of nearby jammers. A new fast algorithm using a modified conjugate gradient approach has been found to be a very efficient way to implement anti-jamming arrays based on the maximum SJNR criterion. The present study was intended to refine and simplify this algorithm and to implement it on an experimental array for real-time evaluation of anti-jamming performance. A three-beam adaptive array was used. A simulation package was used in the evaluation of multi-beam systems using more than three beams and different user-jammer scenarios. An attempt to further reduce the computation burden through further analysis of maximum SJNR met with limited success. The investigation of a new angle detector for spatial tracking in heterodyne laser space communications was completed.
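For reference, the maximum-SJNR weight vector solves a generalized eigenvalue problem: it is the principal eigenvector of the pair formed by the signal covariance and the jammer-plus-noise covariance. The Python sketch below illustrates this on a toy three-beam example; the steering vectors and jammer power are invented numbers, and the conjugate-gradient acceleration the abstracts describe is not shown.

```python
import numpy as np
from scipy.linalg import eigh

n_beams = 3
# Toy steering vectors for signal and jammer directions (arbitrary angles).
a_sig = np.exp(1j * np.pi * np.arange(n_beams) * np.sin(0.2))
a_jam = np.exp(1j * np.pi * np.arange(n_beams) * np.sin(-0.5))

R_s = np.outer(a_sig, a_sig.conj())                            # rank-1 signal covariance
R_jn = 10.0 * np.outer(a_jam, a_jam.conj()) + np.eye(n_beams)  # jammer + unit noise

# Generalized eigenproblem R_s w = lambda R_jn w; the top eigenvector
# maximizes the ratio (w^H R_s w) / (w^H R_jn w), i.e. the SJNR.
vals, vecs = eigh(R_s, R_jn)
w = vecs[:, -1]
sjnr = (w.conj() @ R_s @ w).real / (w.conj() @ R_jn @ w).real
print("max SJNR:", sjnr)
```

Iterative schemes such as the modified conjugate gradient method in the study exist precisely to approach this eigenvector without the cost of a full decomposition in real time.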
Implementation of interconnect simulation tools in spice
NASA Technical Reports Server (NTRS)
Satsangi, H.; Schutt-Aine, J. E.
1993-01-01
Accurate computer simulation of high speed digital computer circuits and communication circuits requires a multimode approach to simulate both the devices and the interconnects between devices. Classical circuit analysis algorithms (lumped parameter) are needed for circuit devices and the network formed by the interconnected devices. The interconnects, however, have to be modeled as transmission lines which incorporate electromagnetic field analysis. An approach to writing a multimode simulator is to take an existing software package which performs either lumped parameter analysis or field analysis and add the missing type of analysis routines to the package. In this work a traditionally lumped parameter simulator, SPICE, is modified so that it will perform lossy transmission line analysis using a different model approach. Modifying SPICE3E2 or any other large software package is not a trivial task. An understanding of the programming conventions used, simulation software, and simulation algorithms is required. This thesis was written to clarify the procedure for installing a device into SPICE3E2. The installation of three devices is documented and the installations of the first two provide a foundation for installation of the lossy line which is the third device. The details of discussions are specific to SPICE, but the concepts will be helpful when performing installations into other circuit analysis packages.
Hierarchical Petascale Simulation Framework For Stress Corrosion Cracking
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grama, Ananth
2013-12-18
A number of major accomplishments resulted from the project. These include: • Data Structures, Algorithms, and Numerical Methods for Reactive Molecular Dynamics. We have developed a range of novel data structures, algorithms, and solvers (amortized ILU, Spike) for use with ReaxFF and charge equilibration. • Parallel Formulations of Reactive MD (Purdue Reactive Molecular Dynamics Package: PuReMD, PuReMD-GPU, and PG-PuReMD) for Messaging, GPU, and GPU Cluster Platforms. We have developed efficient serial, parallel (MPI), GPU (CUDA), and GPU cluster (MPI/CUDA) implementations. Our implementations have been demonstrated to be significantly better than the state of the art, both in terms of performance and scalability. • Comprehensive Validation in the Context of Diverse Applications. We have demonstrated the use of our software in diverse systems, including silica-water and silicon-germanium nanorods, and, as part of other projects, extended it to applications ranging from explosives (RDX) to lipid bilayers (biomembranes under oxidative stress). • Open Source Software Packages for Reactive Molecular Dynamics. All versions of our software have been released into the public domain. There are over 100 major research groups worldwide using our software. • Implementation into the Department of Energy LAMMPS Software Package. We have integrated our software into the Department of Energy LAMMPS software package.
Ho, Michelle L; Adler, Benjamin A; Torre, Michael L; Silberg, Jonathan J; Suh, Junghae
2013-12-20
Adeno-associated virus (AAV) recombination can result in chimeric capsid protein subunits whose ability to assemble into an oligomeric capsid, package a genome, and transduce cells depends on the inheritance of sequence from different AAV parents. To develop quantitative design principles for guiding site-directed recombination of AAV capsids, we have examined how capsid structural perturbations predicted by the SCHEMA algorithm correlate with experimental measurements of disruption in seventeen chimeric capsid proteins. In our small chimera population, created by recombining AAV serotypes 2 and 4, we found that protection of viral genomes and cellular transduction were inversely related to calculated disruption of the capsid structure. Interestingly, however, we did not observe a correlation between genome packaging and calculated structural disruption; a majority of the chimeric capsid proteins formed at least partially assembled capsids and more than half packaged genomes, including those with the highest SCHEMA disruption. These results suggest that the sequence space accessed by recombination of divergent AAV serotypes is rich in capsid chimeras that assemble into 60-mer capsids and package viral genomes. Overall, the SCHEMA algorithm may be useful for delineating quantitative design principles to guide the creation of libraries enriched in genome-protecting virus nanoparticles that can effectively transduce cells. Such improvements to the virus design process may help advance not only gene therapy applications but also other bionanotechnologies dependent upon the development of viruses with new sequences and functions.
Ho, Michelle L.; Adler, Benjamin A.; Torre, Michael L.; Silberg, Jonathan J.; Suh, Junghae
2013-01-01
Adeno-associated virus (AAV) recombination can result in chimeric capsid protein subunits whose ability to assemble into an oligomeric capsid, package a genome, and transduce cells depends on the inheritance of sequence from different AAV parents. To develop quantitative design principles for guiding site-directed recombination of AAV capsids, we have examined how capsid structural perturbations predicted by the SCHEMA algorithm correlate with experimental measurements of disruption in seventeen chimeric capsid proteins. In our small chimera population, created by recombining AAV serotypes 2 and 4, we found that protection of viral genomes and cellular transduction were inversely related to calculated disruption of the capsid structure. Interestingly, however, we did not observe a correlation between genome packaging and calculated structural disruption; a majority of the chimeric capsid proteins formed at least partially assembled capsids and more than half packaged genomes, including those with the highest SCHEMA disruption. These results suggest that the sequence space accessed by recombination of divergent AAV serotypes is rich in capsid chimeras that assemble into 60-mer capsids and package viral genomes. Overall, the SCHEMA algorithm may be useful for delineating quantitative design principles to guide the creation of libraries enriched in genome-protecting virus nanoparticles that can effectively transduce cells. Such improvements to the virus design process may help advance not only gene therapy applications, but also other bionanotechnologies dependent upon the development of viruses with new sequences and functions. PMID:23899192
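The SCHEMA disruption score both copies of this abstract refer to can be summarized compactly: for each residue-residue contact in the structure, a chimera is penalized when the pair of amino acids it places at that contact appears in neither parent. The sketch below is a simplified two-parent illustration with invented sequences and contacts, not the published SCHEMA implementation.

```python
def schema_E(parent1, parent2, assignment, contacts):
    """Count broken contacts E for a chimera of two equal-length parents.
    assignment[i] is 0 or 1 (which parent position i inherits from);
    contacts is a list of (i, j) residue pairs that touch in the structure."""
    parents = (parent1, parent2)
    chimera = [parents[assignment[i]][i] for i in range(len(parent1))]
    E = 0
    for i, j in contacts:
        pair = (chimera[i], chimera[j])
        # A contact is "broken" if this amino-acid pair occurs in neither parent.
        if all((p[i], p[j]) != pair for p in parents):
            E += 1
    return E

# Invented toy inputs for illustration only:
p1 = "MKTAYIAKQR"
p2 = "MKSAYLAKHR"
contacts = [(2, 5), (3, 8), (0, 9)]
print(schema_E(p1, p2, [0, 0, 0, 1, 1, 1, 1, 1, 0, 0], contacts))
```

Lower E predicts less structural disruption, which is the quantity the study correlates against genome protection and transduction.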
pyRMSD: a Python package for efficient pairwise RMSD matrix calculation and handling.
Gil, Víctor A; Guallar, Víctor
2013-09-15
We introduce pyRMSD, an open source standalone Python package that aims at offering an integrative and efficient way of performing Root Mean Square Deviation (RMSD)-related calculations on large sets of structures. It is specially tuned for fast collective RMSD calculations, such as pairwise RMSD matrices, and implements three well-known superposition algorithms. pyRMSD provides its own symmetric distance matrix class that, besides being usable as a regular matrix, helps to save memory and increases memory access speed. This last feature can dramatically improve the overall performance of any Python algorithm using it. In addition, its extensibility, test suites and documentation make it a good choice for those in need of a workbench for developing or testing new algorithms. The source code (under MIT license), installer, test suites and benchmarks can be found at https://pele.bsc.es/ under the tools section. victor.guallar@bsc.es Supplementary data are available at Bioinformatics online.
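The core computation pyRMSD accelerates can be written down directly: optimal superposition via the Kabsch algorithm, applied to every pair of conformations to fill a condensed (upper-triangle) distance matrix in the spirit of pyRMSD's matrix class. The numpy sketch below shows the underlying math only; it is not pyRMSD's API and omits the package's optimized calculators.

```python
import numpy as np

def kabsch_rmsd(P, Q):
    """Minimum RMSD between two (N, 3) conformations after optimal superposition."""
    P = P - P.mean(axis=0)
    Q = Q - Q.mean(axis=0)
    V, S, Wt = np.linalg.svd(P.T @ Q)
    S[-1] *= np.sign(np.linalg.det(V @ Wt))   # correct for reflections
    # Kabsch identity: MSD = (|P|^2 + |Q|^2 - 2 * sum of corrected singular values) / N
    msd = (np.sum(P**2) + np.sum(Q**2) - 2.0 * S.sum()) / len(P)
    return np.sqrt(max(msd, 0.0))

def pairwise_matrix(coordsets):
    """Condensed pairwise RMSD matrix (upper triangle, row-major)."""
    n = len(coordsets)
    return np.array([kabsch_rmsd(coordsets[i], coordsets[j])
                     for i in range(n) for j in range(i + 1, n)])

confs = np.random.rand(5, 20, 3)   # 5 random 20-atom conformations
print(pairwise_matrix(confs))
```

Because the matrix is symmetric with a zero diagonal, storing only the condensed triangle halves the memory, which is the design choice the abstract highlights.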
Kobayashi, Chigusa; Jung, Jaewoon; Matsunaga, Yasuhiro; Mori, Takaharu; Ando, Tadashi; Tamura, Koichi; Kamiya, Motoshi; Sugita, Yuji
2017-09-30
GENeralized-Ensemble SImulation System (GENESIS) is a software package for molecular dynamics (MD) simulation of biological systems. It is designed to push the limits of system size and accessible time scale by adopting highly parallelized schemes and enhanced conformational sampling algorithms. In this new version, GENESIS 1.1, new functions and advanced algorithms have been added. The all-atom and coarse-grained potential energy functions used in the AMBER and GROMACS packages are now available in addition to the CHARMM energy functions. The performance of MD simulations has been greatly improved by further optimization, multiple time-step integration, and hybrid (CPU + GPU) computing. The string method and replica-exchange umbrella sampling with flexible collective variable choice are used for finding the minimum free-energy pathway and obtaining free-energy profiles for conformational changes of a macromolecule. These new features increase the usefulness and power of GENESIS for modeling and simulation in biological research. © 2017 Wiley Periodicals, Inc.
RGG: A general GUI Framework for R scripts
Visne, Ilhami; Dilaveroglu, Erkan; Vierlinger, Klemens; Lauss, Martin; Yildiz, Ahmet; Weinhaeusel, Andreas; Noehammer, Christa; Leisch, Friedrich; Kriegner, Albert
2009-01-01
Background: R is the leading open source statistics software, with a vast number of biostatistical and bioinformatical analysis packages. To exploit the advantages of R, extensive scripting/programming skills are required. Results: We have developed a software tool called R GUI Generator (RGG) which enables the easy generation of Graphical User Interfaces (GUIs) for the programming language R by adding a few Extensible Markup Language (XML) tags. RGG consists of an XML-based GUI definition language and a Java-based GUI engine. GUIs are generated at runtime from GUI tags that are embedded into the R script. User GUI input is returned to the R code and replaces the XML tags. RGG files can be developed using any text editor. The current version of RGG is available as stand-alone software (RGGRunner) and as a plug-in for JGR. Conclusion: RGG is a general GUI framework for R that has the potential to introduce R statistics (R packages, built-in functions and scripts) to users with limited programming skills and helps to bridge the gap between R developers and GUI-dependent users. RGG aims to abstract GUI development from individual GUI toolkits by using an XML-based GUI definition language, and can thus be easily integrated in any software. The RGG project further includes the development of a web-based repository for RGG GUIs. RGG is an open source project licensed under the Lesser General Public License (LGPL) and can be downloaded freely. PMID:19254356
Las Palmeras Molecular Dynamics: A flexible and modular molecular dynamics code
NASA Astrophysics Data System (ADS)
Davis, Sergio; Loyola, Claudia; González, Felipe; Peralta, Joaquín
2010-12-01
Las Palmeras Molecular Dynamics (LPMD) is a highly modular and extensible molecular dynamics (MD) code using interatomic potential functions. LPMD is able to perform equilibrium MD simulations of bulk crystalline solids, amorphous solids and liquids, as well as non-equilibrium MD (NEMD) simulations such as shock wave propagation, projectile impacts, cluster collisions, shearing, deformation under load, heat conduction and heterogeneous melting, among others, which involve unusual MD features like non-moving atoms and walls, atoms moving at constant velocity, and external forces like electric fields. LPMD is written in C++ as a compromise between efficiency and clarity of design, and its architecture is based on separate components or plug-ins, implemented as modules which are loaded on demand at runtime. The advantage of this architecture is the ability to link the desired components involved in the simulation together in different ways at runtime, using a user-friendly control file language which describes the simulation work-flow. As an added bonus, the plug-in API (Application Programming Interface) makes it possible to use the LPMD components to analyze data coming from other simulation packages, convert between input file formats, apply different transformations to saved MD atomic trajectories, and visualize dynamical processes either in real time or as a post-processing step. Individual components, such as a new potential function, a new integrator, a new file format, new properties to calculate, new real-time visualizers, and even a new algorithm for handling neighbor lists, can be easily coded, compiled and tested within LPMD by virtue of its object-oriented API, without the need to modify the rest of the code. LPMD already includes several pair potential functions such as Lennard-Jones, Morse, Buckingham, MCY and the harmonic potential, as well as embedded-atom model (EAM) functions such as the Sutton-Chen and Gupta potentials. Available integrators include Euler (if only for demonstration purposes), Verlet and Velocity Verlet, Leapfrog and Beeman, among others. Electrostatic forces are treated as another potential function, by default using the plug-in implementing the Ewald summation method. Program summary — Program title: LPMD. Catalogue identifier: AEHG_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEHG_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: GNU General Public License version 3. No. of lines in distributed program, including test data, etc.: 509 490. No. of bytes in distributed program, including test data, etc.: 6 814 754. Distribution format: tar.gz. Programming language: C++. Computer: 32-bit and 64-bit workstation. Operating system: UNIX. RAM: minimum 1024 bytes. Classification: 7.7. External routines: zlib, OpenGL. Nature of problem: study of the statistical mechanics and thermodynamics of condensed matter systems, as well as the kinetics of non-equilibrium processes in the same systems. Solution method: equilibrium and non-equilibrium molecular dynamics methods, Monte Carlo methods. Restrictions: rigid molecules are not supported, nor are polarizable atoms or chemical bonds (proteins). Unusual features: the program is able to change the temperature of the simulation cell, the pressure, cut regions of the cell, and color the atoms by properties, even during the simulation; it is also possible to fix the positions and/or velocities of groups of atoms, and to visualize atoms and some physical properties during the simulation. Additional comments: the program not only performs molecular dynamics and Monte Carlo simulations; it is also able to filter and manipulate atomic configurations, read and write different file formats, convert between them, and evaluate different structural and dynamical properties. Running time: 50 seconds for a 1000-step simulation of 4000 argon atoms, running on a single 2.67 GHz Intel processor.
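As a flavor of what an MD engine like LPMD does at each step, here is a self-contained sketch of velocity-Verlet integration with a Lennard-Jones pair potential (one of the potentials LPMD ships). It is a toy cluster simulation in Python with unit masses and no cutoff or periodic boundaries, not LPMD's C++ plug-in code.

```python
import numpy as np

def lj_forces(pos, eps=1.0, sigma=1.0):
    """Pairwise Lennard-Jones forces for a small cluster (no cutoff, no PBC)."""
    f = np.zeros_like(pos)
    n = len(pos)
    for i in range(n):
        for j in range(i + 1, n):
            r = pos[i] - pos[j]
            d2 = r @ r
            s6 = (sigma * sigma / d2) ** 3
            # F = 24*eps*(2*(sigma/r)^12 - (sigma/r)^6) / r^2 * r_vec
            fij = 24.0 * eps * (2.0 * s6 * s6 - s6) / d2 * r
            f[i] += fij
            f[j] -= fij
    return f

def velocity_verlet(pos, vel, dt=1e-3, steps=100):
    """Standard velocity-Verlet loop with unit masses."""
    f = lj_forces(pos)
    for _ in range(steps):
        vel += 0.5 * dt * f
        pos += dt * vel
        f = lj_forces(pos)
        vel += 0.5 * dt * f
    return pos, vel

pos = np.random.rand(8, 3) * 3.0
vel = np.zeros_like(pos)
pos, vel = velocity_verlet(pos, vel)
```

In a plug-in architecture, the potential, the integrator and the neighbor-list strategy in such a loop are each swappable modules, which is exactly the extensibility the abstract describes.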
NASA Astrophysics Data System (ADS)
Mozaffari, Ahmad; Vajedi, Mahyar; Azad, Nasser L.
2015-06-01
The main proposition of the current investigation is to develop a computational intelligence-based framework which can be used for the real-time estimation of optimum battery state-of-charge (SOC) trajectory in plug-in hybrid electric vehicles (PHEVs). The estimated SOC trajectory can be then employed for an intelligent power management to significantly improve the fuel economy of the vehicle. The devised intelligent SOC trajectory builder takes advantage of the upcoming route information preview to achieve the lowest possible total cost of electricity and fossil fuel. To reduce the complexity of real-time optimization, the authors propose an immune system-based clustering approach which allows categorizing the route information into a predefined number of segments. The intelligent real-time optimizer is also inspired on the basis of interactions in biological immune systems, and is called artificial immune algorithm (AIA). The objective function of the optimizer is derived from a computationally efficient artificial neural network (ANN) which is trained by a database obtained from a high-fidelity model of the vehicle built in the Autonomie software. The simulation results demonstrate that the integration of immune inspired clustering tool, AIA and ANN, will result in a powerful framework which can generate a near global optimum SOC trajectory for the baseline vehicle, that is, the Toyota Prius PHEV. The outcomes of the current investigation prove that by taking advantage of intelligent approaches, it is possible to design a computationally efficient and powerful SOC trajectory builder for the intelligent power management of PHEVs.
Evaluation of copy number variation detection for a SNP array platform
2014-01-01
Background: Copy Number Variations (CNVs) are usually inferred from Single Nucleotide Polymorphism (SNP) arrays using software packages based on given algorithms. However, there is no clear understanding of the performance of these packages, so it is difficult to select one or several of them for CNV detection on the SNP array platform. We selected four publicly available software packages designed for CNV calling from Affymetrix SNP arrays: Birdsuite, dChip, Genotyping Console (GTC) and PennCNV. A publicly available dataset generated by array-based Comparative Genomic Hybridization (CGH), with a resolution of 24 million probes per sample, was taken as the "gold standard". Against this standard, the success rate, average stability rate, sensitivity, consistency and reproducibility of the four packages were assessed. In particular, we also compared the efficiency of detecting CNVs simultaneously by two, three or all of the packages with that of a single package. Results: In terms of the sheer number of detected CNVs, Birdsuite detected the most while GTC detected the least. We found that Birdsuite and dChip had obvious detection bias, and GTC seemed inferior because it detected the fewest CNVs. We then investigated the detection consistency between each package and the other three, finding that the consistency of dChip was the lowest while that of GTC was the highest. Compared with the CGH-based CNV calls, GTC called the most matching CNVs in the matching group, with PennCNV-Affy ranking second; in the non-overlapping group, GTC called the fewest CNVs. With regard to the reproducibility of CNV calling, larger CNVs were usually replicated better; PennCNV-Affy showed the best consistency while Birdsuite showed the poorest. Conclusion: We found that PennCNV outperformed the other three packages in the sensitivity and specificity of CNV calling. Each calling method has its own limitations and advantages for different data analyses; optimized calling might therefore be achieved by using multiple algorithms and evaluating the concordance and discordance of SNP-array-based CNV calls. PMID:24555668
NASA Astrophysics Data System (ADS)
Pilz, Tobias; Francke, Till; Bronstert, Axel
2017-08-01
The characteristics of a landscape pose essential factors for hydrological processes, so an adequate representation of a catchment's landscape in hydrological models is vital. However, many such models exist, differing in, among other things, spatial concept and discretisation. The latter constitutes an essential pre-processing step, for which many different algorithms along with numerous software implementations exist. In that context, existing solutions are often model-specific, commercial, or dependent on commercial back-end software, and allow limited or no workflow automation. Consequently, a new package for the scientific software and scripting environment R, called lumpR, was developed. lumpR employs an algorithm for hillslope-based landscape discretisation aimed at large-scale application via a hierarchical multi-scale approach. The package addresses the existing limitations as it is free and open source, easily extendible to other hydrological models, and its workflow can be fully automated. Moreover, it is user-friendly, as direct coupling to a GIS allows for immediate visual inspection and manual adjustment. Sufficient control is retained via parameter specification and the option to include expert knowledge. Conversely, completely automatic operation also allows for extensive analysis of aspects related to landscape discretisation. In a case study, the application of the package is presented. A sensitivity analysis of the most important discretisation parameters demonstrates its efficient workflow automation. Considering multiple streamflow metrics, the employed model proved reasonably robust to the discretisation parameters. However, parameters determining the sizes of subbasins and hillslopes proved to be more important than the others, including the number of representative hillslopes, the number of attributes employed for the lumping algorithm, and the number of sub-discretisations of the representative hillslopes.
Battery Second Use for Plug-In Electric Vehicles | Transportation Research
Battery second use (B2U) can help address lithium-ion battery cost barriers to the deployment of both plug-in electric vehicles (PEVs) and grid applications: by repurposing the battery after its automotive service, the total lifetime value of the battery is increased and its cost can be shared. For B2U, NREL's repurposing cost calculator is available.
NASA Astrophysics Data System (ADS)
Kadow, Christopher; Illing, Sebastian; Kunst, Oliver; Schartner, Thomas; Kirchner, Ingo; Rust, Henning W.; Cubasch, Ulrich; Ulbrich, Uwe
2016-04-01
The Freie Univ Evaluation System Framework (Freva - freva.met.fu-berlin.de) is a software infrastructure for standardized data and tool solutions in Earth system science. Freva runs on high performance computers to handle customizable evaluation systems of research projects, institutes or universities. It combines different software technologies into one common hybrid infrastructure, including all features present in the shell and web environment. The database interface satisfies the international standards provided by the Earth System Grid Federation (ESGF). Freva indexes different data projects into one common search environment by storing the meta data information of the self-describing model, reanalysis and observational data sets in a database. This meta data system, with its advanced but easy-to-handle search tool, supports users, developers and their plugins in retrieving the required information. A generic application programming interface (API) allows scientific developers to connect their analysis tools with the evaluation system independently of the programming language used. Users of the evaluation techniques benefit from the common interface of the evaluation system without any need to understand the different scripting languages. Facilitating the provision and usage of tools and climate data automatically increases the number of scientists working with the data sets and identifying discrepancies. The integrated web-shell (shellinabox) adds a degree of freedom in the choice of the working environment and can be used as a gate to the research project's HPC. Plugins are able to integrate, e.g., their post-processed results into the user's database. This allows, e.g., post-processing plugins to feed statistical analysis plugins, which fosters an active exchange between plugin developers of a research project. Additionally, the history and configuration sub-system stores every analysis performed with the evaluation system in a database. Configurations and results of the tools can be shared among scientists via the shell or web system. Therefore, plugged-in tools benefit from transparency and reproducibility. Furthermore, if configurations match when starting an evaluation plugin, the system suggests using results already produced by other users - saving CPU hours, I/O, disk space and time. The efficient interaction between different technologies improves the Earth system modeling science framed by Freva.
NASA Astrophysics Data System (ADS)
Kadow, C.; Illing, S.; Schartner, T.; Grieger, J.; Kirchner, I.; Rust, H.; Cubasch, U.; Ulbrich, U.
2017-12-01
The Freie Univ Evaluation System Framework (Freva - freva.met.fu-berlin.de) is a software infrastructure for standardized data and tool solutions in Earth system science (e.g. www-miklip.dkrz.de, cmip-eval.dkrz.de). Freva runs on high performance computers to handle customizable evaluation systems of research projects, institutes or universities. It combines different software technologies into one common hybrid infrastructure, including all features present in the shell and web environment. The database interface satisfies the international standards provided by the Earth System Grid Federation (ESGF). Freva indexes different data projects into one common search environment by storing the meta data information of the self-describing model, reanalysis and observational data sets in a database. This meta data system, with its advanced but easy-to-handle search tool, supports users, developers and their plugins in retrieving the required information. A generic application programming interface (API) allows scientific developers to connect their analysis tools with the evaluation system independently of the programming language used. Users of the evaluation techniques benefit from the common interface of the evaluation system without any need to understand the different scripting languages. The integrated web-shell (shellinabox) adds a degree of freedom in the choice of the working environment and can be used as a gate to the research project's HPC. Plugins are able to integrate, e.g., their post-processed results into the user's database. This allows, e.g., post-processing plugins to feed statistical analysis plugins, which fosters an active exchange between plugin developers of a research project. Additionally, the history and configuration sub-system stores every analysis performed with the evaluation system in a database. Configurations and results of the tools can be shared among scientists via the shell or web system. Furthermore, if configurations match when starting an evaluation plugin, the system suggests using results already produced by other users - saving CPU hours, I/O, disk space and time. The efficient interaction between different technologies improves the Earth system modeling science framed by Freva.
Multiattribute Decision Modeling Techniques: A Comparative Analysis
1988-08-01
Analytic Hierarchy Process (AHP). It is structurally similar to SMART, but elicitation methods are different and there are several algorithms for … reconciliation of inconsistent judgments and for consistency checks that are not available in any of the utility procedures. The AHP has been applied … of commercially available software packages that implement the AHP algorithms. Elicitation Methods. The AHP builds heavily on value trees, which …
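The consistency machinery this excerpt alludes to is easy to show concretely. In the sketch below, priorities are taken as the principal eigenvector of a pairwise comparison matrix, and the consistency ratio compares Saaty's consistency index against the random index (0.58 for a 3x3 matrix). The judgment values themselves are invented for illustration.

```python
import numpy as np

# Illustrative 3x3 pairwise comparison matrix (reciprocal by construction).
A = np.array([[1.0,  3.0, 5.0],
              [1/3., 1.0, 2.0],
              [1/5., 1/2., 1.0]])

vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.real)            # principal (Perron) eigenvalue
w = np.abs(vecs[:, k].real)
w /= w.sum()                        # normalized priority weights

n = A.shape[0]
lam = vals[k].real
CI = (lam - n) / (n - 1)            # consistency index
CR = CI / 0.58                      # random index RI = 0.58 for n = 3
print("weights:", w, "consistency ratio:", CR)
```

A consistency ratio below roughly 0.1 is conventionally taken to mean the judgments are acceptably coherent; this reconciliation step is the feature the comparison says utility-based procedures lack.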
Comparison of rotation algorithms for digital images
NASA Astrophysics Data System (ADS)
Starovoitov, Valery V.; Samal, Dmitry
1999-09-01
The paper presents a comparative study of several algorithms developed for digital image rotation. Without loss of generality, we studied gray-scale images. We tested methods that preserve the gray values of the original images, methods that perform interpolation, and two procedures implemented in the Corel Photo-Paint and Adobe Photoshop software packages. Methods for rotating color images may be evaluated in a similar way.
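As an example of the interpolation-based class of methods compared in the paper, the sketch below rotates a gray-scale image by inverse-mapping each output pixel into the source image and interpolating bilinearly among its four neighbours. It is a deliberately unoptimized Python illustration, not one of the evaluated implementations.

```python
import numpy as np

def rotate_bilinear(img, angle_deg):
    """Rotate a gray-scale image about its centre with bilinear interpolation."""
    a = np.deg2rad(angle_deg)
    h, w = img.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    cos_a, sin_a = np.cos(a), np.sin(a)
    out = np.zeros_like(img, dtype=float)
    for y in range(h):
        for x in range(w):
            # Inverse rotation: where did this output pixel come from?
            xs = cos_a * (x - cx) + sin_a * (y - cy) + cx
            ys = -sin_a * (x - cx) + cos_a * (y - cy) + cy
            x0, y0 = int(np.floor(xs)), int(np.floor(ys))
            if 0 <= x0 < w - 1 and 0 <= y0 < h - 1:
                fx, fy = xs - x0, ys - y0
                out[y, x] = ((1 - fx) * (1 - fy) * img[y0, x0]
                             + fx * (1 - fy) * img[y0, x0 + 1]
                             + (1 - fx) * fy * img[y0 + 1, x0]
                             + fx * fy * img[y0 + 1, x0 + 1])
    return out

rot = rotate_bilinear(np.random.rand(64, 64), 30.0)
```

Gray-value-preserving methods avoid the smoothing such interpolation introduces, which is one of the trade-offs a comparison of this kind measures.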
Parallel Climate Data Assimilation PSAS Package Achieves 18 GFLOPs on 512-Node Intel Paragon
NASA Technical Reports Server (NTRS)
Ding, H. Q.; Chan, C.; Gennery, D. B.; Ferraro, R. D.
1995-01-01
Several algorithms were added to the Physical-space Statistical Analysis System (PSAS) from Goddard, which assimilates observational weather data by correcting for different levels of uncertainty about the data and different locations for mobile observation platforms. The new algorithms and use of the 512-node Intel Paragon allowed a hundred-fold decrease in processing time.
Exact and Monte carlo resampling procedures for the Wilcoxon-Mann-Whitney and Kruskal-Wallis tests.
Berry, K J; Mielke, P W
2000-12-01
Exact and Monte Carlo resampling FORTRAN programs are described for the Wilcoxon-Mann-Whitney rank sum test and the Kruskal-Wallis one-way analysis of variance for ranks test. The program algorithms compensate for tied values and do not depend on asymptotic approximations for probability values, unlike most algorithms contained in PC-based statistical software packages.
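The resampling idea behind those FORTRAN programs is straightforward to sketch: pool the observations, assign mid-ranks (which compensates for ties without asymptotic corrections), and compare the observed rank sum against its distribution under random permutations of the group labels. Below is a small Python illustration of the Monte Carlo variant for the Wilcoxon-Mann-Whitney statistic; sample data and resample count are arbitrary.

```python
import numpy as np
from scipy.stats import rankdata

def mc_ranksum_pvalue(x, y, n_resamples=20_000, seed=0):
    """Two-sided Monte Carlo resampling p-value for the rank-sum test."""
    rng = np.random.default_rng(seed)
    pooled = np.concatenate([x, y])
    ranks = rankdata(pooled)              # mid-ranks handle tied values
    w_obs = ranks[:len(x)].sum()
    mu = ranks.mean() * len(x)            # permutation-null mean of W
    count = 0
    for _ in range(n_resamples):
        perm = rng.permutation(ranks)     # equivalent to shuffling group labels
        if abs(perm[:len(x)].sum() - mu) >= abs(w_obs - mu):
            count += 1
    return count / n_resamples

x = np.array([1.1, 2.0, 2.0, 3.5, 4.2])
y = np.array([2.0, 3.9, 4.4, 5.1, 6.0, 6.0])
print(mc_ranksum_pvalue(x, y))
```

The exact variant enumerates all label assignments instead of sampling them, which is feasible only for small group sizes.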
Novel Techniques for Millimeter-Wave Packages
NASA Technical Reports Server (NTRS)
Herman, Martin I.; Lee, Karen A.; Kolawa, Elzbieta A.; Lowry, Lynn E.; Tulintseff, Ann N.
1995-01-01
A new millimeter-wave package architecture, with supporting electrical, mechanical and material science experiments and analysis, is presented. This package is well suited for discrete devices, monolithic microwave integrated circuits (MMICs) and multichip module (MCM) applications. It has low-loss wide-band RF transitions, which are necessary to overcome manufacturing tolerances, leading to lower per-unit cost. Potential applications of this new packaging architecture which go beyond the standard requirements of device protection include integration of antennas, compatibility with photonic networks and direct transitions to waveguide systems. Techniques for electromagnetic analysis, thermal control and hermetic sealing were explored. Three-dimensional electromagnetic analysis was performed using a finite-difference time-domain (FDTD) algorithm and experimentally verified for millimeter-wave package input and output transitions. New multi-material system concepts (AlN, Cu, and diamond thin films), which allow excellent surface finishes to be achieved with enhanced thermal management, have been investigated. A new approach utilizing block copolymer coatings was employed to hermetically seal packages, which met MIL-STD-883.
Introduction of A New Toolbox for Processing Digital Images From Multiple Camera Networks: FMIPROT
NASA Astrophysics Data System (ADS)
Melih Tanis, Cemal; Nadir Arslan, Ali
2017-04-01
Webcam networks intended for scientific monitoring of ecosystems provide digital images and other environmental data for various studies. Other types of camera networks can also be used for scientific purposes, e.g. traffic webcams for phenological studies, or camera networks for ski tracks and avalanche monitoring over mountains for hydrological studies. To efficiently harness the potential of these camera networks, easy-to-use software which can obtain and handle images from different networks, with their different protocols and standards, is necessary. Numerous software packages for analysing images from webcam networks are freely available, with different strong features not only for analysing but also for post-processing digital images. But specifically for ease of use, applicability and scalability, a different set of features could be added. Thus, a more customized approach would be of high value, not only for analysing images of comprehensive camera networks, but also considering the possibility of creating operational data extraction and processing with an easy-to-use toolbox. In this paper, we introduce such a toolbox, entitled Finnish Meteorological Institute Image PROcessing Tool (FMIPROT), in which a customized approach is followed. FMIPROT currently has the following features: • straightforward installation, • no software dependencies requiring extra installations, • communication with multiple camera networks, • automatic downloading and handling of images, • user-friendly and simple user interface, • data filtering, • visualizing results on customizable plots, • plugins, allowing users to add their own algorithms. Current image analyses in FMIPROT include "Color Fraction Extraction" and "Vegetation Indices". The color fraction extraction analysis calculates the fractions of the colors in a region of interest, for red, green and blue, along with brightness and luminance parameters. The vegetation indices analysis is a collection of indices used in vegetation phenology and includes "Green Fraction" (green chromatic coordinate), "Green-Red Vegetation Index" and "Green Excess Index". A "Snow Cover Fraction" analysis, which detects snow-covered pixels in the images and georeferences them on a geospatial plane to calculate the snow cover fraction, is being implemented at the moment. FMIPROT is being developed during the EU Life+ MONIMET project. Altogether we mounted 28 cameras at 14 different sites in Finland as the MONIMET camera network. In this paper, we present details of FMIPROT and analysis results from the MONIMET camera network. We also discuss future planned developments of FMIPROT.
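The vegetation indices named above have simple closed forms: the green chromatic coordinate is G/(R+G+B), the green-red vegetation index is (G−R)/(G+R), and the green excess index is 2G−R−B. The Python sketch below computes them from the mean channel values of a region of interest; it mirrors the formulas, not FMIPROT's implementation, and the random ROI stands in for a camera image.

```python
import numpy as np

def vegetation_indices(rgb_roi):
    """Phenology indices over a region of interest.
    rgb_roi: float array of shape (H, W, 3), channels ordered R, G, B."""
    r = rgb_roi[..., 0].mean()
    g = rgb_roi[..., 1].mean()
    b = rgb_roi[..., 2].mean()
    total = r + g + b
    return {
        "GCC": g / total,             # green chromatic coordinate
        "GRVI": (g - r) / (g + r),    # green-red vegetation index
        "ExG": 2 * g - r - b,         # green excess index
        "brightness": total / 3.0,
    }

roi = np.random.rand(100, 100, 3)     # placeholder for an image ROI
print(vegetation_indices(roi))
```

Computed per image over a fixed ROI, these scalars form the seasonal time series that phenological studies track.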
al3c: high-performance software for parameter inference using Approximate Bayesian Computation.
Stram, Alexander H; Marjoram, Paul; Chen, Gary K
2015-11-01
The development of Approximate Bayesian Computation (ABC) algorithms for parameter inference which are both computationally efficient and scalable in parallel computing environments is an important area of research. Monte Carlo rejection sampling, a fundamental component of ABC algorithms, is trivial to distribute over multiple processors but is inherently inefficient. While the development of algorithms such as ABC Sequential Monte Carlo (ABC-SMC) helps address the inherent inefficiencies of rejection sampling, such approaches are not as easily scaled on multiple processors. As a result, current Bayesian inference software offerings that use ABC-SMC lack the ability to scale in parallel computing environments. We present al3c, a C++ framework for implementing ABC-SMC in parallel. By requiring only that users define essential functions such as the simulation model and prior distribution function, al3c abstracts the user from both the complexities of parallel programming and the details of the ABC-SMC algorithm. By using the al3c framework, the user is able to scale the ABC-SMC algorithm in parallel computing environments for his or her specific application, with minimal programming overhead. al3c is offered as a static binary for Linux and OS-X computing environments. The user completes an XML configuration file and a C++ plug-in template for the specific application, which are used by al3c to obtain the desired results. Users can download the static binaries, source code, reference documentation and examples (including those in this article) by visiting https://github.com/ahstram/al3c. astram@usc.edu Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
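For readers unfamiliar with the building block al3c parallelizes, here is a minimal ABC rejection sampler in Python: draw a parameter from the prior, simulate data under it, and accept the draw when the simulated summary statistic falls within a tolerance of the observed one. The model, prior, tolerance and sample sizes are invented for illustration; al3c's actual interface is the C++ plug-in and XML configuration described above.

```python
import numpy as np

rng = np.random.default_rng(1)
observed = rng.normal(3.0, 1.0, size=50)   # synthetic "observed" data
obs_stat = observed.mean()                 # summary statistic

def prior():
    return rng.uniform(-10, 10)            # flat prior over the unknown mean

def simulate(theta):
    return rng.normal(theta, 1.0, size=50).mean()   # forward model summary

eps = 0.1
accepted = []
while len(accepted) < 500:
    theta = prior()
    if abs(simulate(theta) - obs_stat) < eps:       # rejection step
        accepted.append(theta)

print("approximate posterior mean:", np.mean(accepted))
```

Because each draw is independent, this loop distributes trivially over processors, which is exactly the property the abstract contrasts with the harder-to-parallelize sequential structure of ABC-SMC.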
Higher-Order Corrections to Timelike Jets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Giele, W.T.; /Fermilab; Kosower, D.A.
2011-02-01
We present a simple formalism for the evolution of timelike jets in which tree-level matrix element corrections can be systematically incorporated, up to arbitrary parton multiplicities and over all of phase space, in a way that exponentiates the matching corrections. The scheme is cast as a shower Markov chain which generates one single unweighted event sample that can be passed to standard hadronization models. Remaining perturbative uncertainties are estimated by providing several alternative weight sets for the same events, at a relatively modest additional overhead. As an explicit example, we consider Z → qq̄ evolution with unpolarized, massless quarks and include several formally subleading improvements as well as matching to tree-level matrix elements through α_s^4. The resulting algorithm is implemented in the publicly available VINCIA plugin to the PYTHIA8 event generator.
EV-Grid Integration (EVGI) Control and System Implementation - Research Overview
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kisacikoglu, Mithat; Markel, Tony; Meintz, Andrew
2016-03-23
Plug-in electric vehicles (PEVs) are being increasingly adopted in industry today. Microgrid applications of PEVs require the development of charging and discharging algorithms and individual characterization of vehicles, including the on-board chargers and vehicle mobility. This study summarizes the capabilities of the Electric Vehicle Grid Integration (EVGI) Team at NREL and outlines the Team's recent projects. Our studies include V1G, V2G, and V2H control of PEVs as well as testing and analysis of stationary and dynamic wireless power transfer (WPT) systems. The presentation also includes the future scope of study, which implements real-time simulation of PEVs in a microgrid scenario. The capabilities at the Vehicle Testing and Integration Facility (VTIF) and the Energy Systems Integration Facility (ESIF) are described within the scope of the EVGI research.
Apparatus for controlling the scan width of a scanning laser beam
Johnson, Gary W.
1996-01-01
Swept-wavelength lasers are often used in absorption spectroscopy applications. In experiments where high accuracy is required, it is desirable to continuously monitor and control the range of wavelengths scanned (the scan width). A system has been demonstrated whereby the scan width of a swept ring-dye laser, or semiconductor diode laser, can be measured and controlled in real-time with a resolution better than 0.1%. Scan linearity, or conformity to a nonlinear scan waveform, can be measured and controlled. The system of the invention consists of a Fabry-Perot interferometer, three CAMAC interface modules, and a microcomputer running a simple analysis and proportional-integral control algorithm. With additional modules, multiple lasers can be simultaneously controlled. The invention also includes an embodiment implemented on an ordinary PC with a multifunction plug-in board.
Apparatus for controlling the scan width of a scanning laser beam
Johnson, G.W.
1996-10-22
Swept-wavelength lasers are often used in absorption spectroscopy applications. In experiments where high accuracy is required, it is desirable to continuously monitor and control the range of wavelengths scanned (the scan width). A system has been demonstrated whereby the scan width of a swept ring-dye laser, or semiconductor diode laser, can be measured and controlled in real-time with a resolution better than 0.1%. Scan linearity, or conformity to a nonlinear scan waveform, can be measured and controlled. The system of the invention consists of a Fabry-Perot interferometer, three CAMAC interface modules, and a microcomputer running a simple analysis and proportional-integral control algorithm. With additional modules, multiple lasers can be simultaneously controlled. The invention also includes an embodiment implemented on an ordinary PC with a multifunction plug-in board. 8 figs.
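Both records describe the same control loop: measure the scan width against a Fabry-Perot reference and correct the sweep amplitude with a proportional-integral law. The Python sketch below closes that loop on a toy linear "plant"; the gains, setpoint and plant response are invented numbers, not values from the patent.

```python
# Toy closed-loop sketch of proportional-integral scan-width control.
def plant(command):
    """Made-up linear response: sweep-amplitude command -> measured scan width."""
    return 98.0 * command

def pi_loop(setpoint=100.0, kp=0.005, ki=0.1, dt=0.01, steps=5000):
    integral = 0.0
    width = 0.0
    for _ in range(steps):
        error = setpoint - width          # deviation from desired scan width
        integral += error * dt            # accumulate for zero steady-state error
        command = kp * error + ki * integral
        width = plant(command)
    return width, command

width, command = pi_loop()
print(f"settled width {width:.3f} with command {command:.5f}")
```

The integral term is what lets the loop hold the stated 0.1% resolution despite a plant gain that is not known exactly, since it keeps adjusting until the error vanishes.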
Athena X-IFU event reconstruction software: SIRENA
NASA Astrophysics Data System (ADS)
Ceballos, Maria Teresa; Cobo, Beatriz; Peille, Philippe; Wilms, Joern; Brand, Thorsten; Dauser, Thomas; Bandler, Simon; Smith, Stephen
2015-09-01
This contribution describes the status and technical details of the SIRENA package, the software currently in development to perform the on-board event energy reconstruction for the Athena calorimeter X-IFU. This on-board processing will be done in the X-IFU DRE unit and will consist of an initial triggering of event pulses followed by an analysis (with the SIRENA package) to determine the energy content of such events. The current algorithm used by SIRENA is the optimal filtering technique (also used by the ASTRO-H processor), although some other algorithms are also being tested. Here we present these studies and some preliminary results on the energy resolution of the instrument, based on simulations done with the SIXTE simulator (http://www.sternwarte.uni-erlangen.de/research/sixte/), in which SIRENA is integrated.
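A hedged sketch of the optimal-filtering estimate that SIRENA builds on: the pulse record is matched against a known template, with frequency bins weighted inversely by the noise power spectrum. The template shape, white-noise assumption and amplitude scaling below are synthetic stand-ins, not X-IFU calibration data.

```python
import numpy as np

def optimal_filter_energy(record, template, noise_psd):
    """Amplitude of `record` relative to `template`, noise-weighted in frequency."""
    D = np.fft.rfft(record)
    S = np.fft.rfft(template)
    num = np.sum((D * S.conj()).real / noise_psd)
    den = np.sum((S * S.conj()).real / noise_psd)
    return num / den      # energy in units of the template's amplitude

n = 1024
t = np.arange(n)
template = np.exp(-t / 100.0) - np.exp(-t / 10.0)      # canonical pulse shape
noise_psd = np.ones(n // 2 + 1)                        # white-noise assumption
record = 6.2 * template + np.random.normal(0, 0.01, n) # noisy "6.2 keV" pulse
print(optimal_filter_energy(record, template, noise_psd))  # ~6.2
```

For stationary Gaussian noise this weighting is the minimum-variance linear estimator, which is why optimal filtering is the baseline against which alternative algorithms are being tested.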
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Lijuan; Gonder, Jeff; Brooker, Aaron
This study evaluated the costs and benefits associated with the use of stationary-wireless-power-transfer-enabled plug-in hybrid electric buses and determined their cost effectiveness relative to conventional buses and hybrid electric buses. A factorial design was performed over a number of different battery sizes, charging power levels, and numbers of bus stop charging stations. The net present costs were calculated for each vehicle design and provided the basis for design evaluation. In all cases, given the assumed economic conditions, the conventional bus achieved the lowest net present cost, while the optimal plug-in hybrid electric bus scenario beat out the hybrid electric comparison scenario. The parameter sensitivity was also investigated under favorable and unfavorable market penetration assumptions.
A plug-in to Eclipse for VHDL source codes: functionalities
NASA Astrophysics Data System (ADS)
Niton, B.; Poźniak, K. T.; Romaniuk, R. S.
The paper presents an original application, written by the authors, which supports writing and editing source code in the VHDL language. It is a step towards fully automatic, augmented code writing for photonic and electronic systems, including systems based on FPGAs and/or DSP processors. An implementation is described, based on VEditor, a free-license program; the work presented in this paper thus supplements and extends this free license. The introduction briefly characterizes the tools available on the market for aiding the design of electronic systems in VHDL. Particular attention is paid to plug-ins for the Eclipse environment and the Emacs program. Detailed properties of the written plug-in are presented, such as the programming extension concept and the results of the formatter, re-factorizer, code hider, and other new additions to the VEditor program.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Lijuan; Gonder, Jeff; Burton, Evan
This study evaluates the costs and benefits associated with the use of a plug-in hybrid electric bus and determines the cost effectiveness relative to a conventional bus and a hybrid electric bus. A sensitivity sweep analysis was performed over a number of different battery sizes, charging powers, and charging stations. The net present value was calculated for each vehicle design and provided the basis for the design evaluation. In all cases, given present-day economic assumptions, the conventional bus achieved the lowest net present value, while the optimal plug-in hybrid electric bus scenario reached lower lifetime costs than the hybrid electric bus. The study also performed parameter sensitivity analysis under low and high market potential assumptions. The net present value of the plug-in hybrid electric bus is close to that of the conventional bus.
tsiR: An R package for time-series Susceptible-Infected-Recovered models of epidemics.
Becker, Alexander D; Grenfell, Bryan T
2017-01-01
tsiR is an open source software package implemented in the R programming language designed to analyze infectious disease time-series data. The software extends a well-studied and widely-applied algorithm, the time-series Susceptible-Infected-Recovered (TSIR) model, to infer parameters from incidence data, such as contact seasonality, and to forward simulate the underlying mechanistic model. The tsiR package aggregates a number of different fitting features previously described in the literature in a user-friendly way, providing support for their broader adoption in infectious disease research. Also included in tsiR are a number of diagnostic tools to assess the fit of the TSIR model. This package should be useful for researchers analyzing incidence data for fully-immunizing infectious diseases.
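The mechanistic core that tsiR fits and forward-simulates can be stated in two update equations: expected incidence I_{t+1} = beta_t * S_t * I_t^alpha / N with a seasonal contact rate beta_t, and susceptible reconstruction S_{t+1} = S_t + births − I_{t+1}. The Python sketch below forward-simulates that model; every parameter value is illustrative, not fitted, and tsiR itself works in R.

```python
import numpy as np

def simulate_tsir(beta, alpha=0.97, S0=5e4, I0=10, N=1e6, births=300, steps=520):
    """Forward simulation of the TSIR model with Poisson demographic noise."""
    S, I = S0, I0
    series = []
    for t in range(steps):
        lam = beta[t % len(beta)] * S * I**alpha / N   # expected new cases
        I = np.random.poisson(lam)                     # stochastic incidence
        S = max(S + births - I, 0)                     # susceptible balance
        series.append(I)
    return np.array(series)

# 26 biweekly contact-seasonality values (illustrative term-time forcing):
seasonality = 20 * (1 + 0.3 * np.cos(2 * np.pi * np.arange(26) / 26))
cases = simulate_tsir(seasonality)
```

Fitting runs this logic in reverse: the seasonal beta_t and the mixing exponent alpha are inferred from reported incidence, which is the regression machinery the package wraps in a user-friendly interface.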
NASA Astrophysics Data System (ADS)
Tolson, B.; Matott, L. S.; Gaffoor, T. A.; Asadzadeh, M.; Shafii, M.; Pomorski, P.; Xu, X.; Jahanpour, M.; Razavi, S.; Haghnegahdar, A.; Craig, J. R.
2015-12-01
We introduce asynchronous parallel implementations of the Dynamically Dimensioned Search (DDS) family of algorithms, including DDS, discrete DDS, PA-DDS and DDS-AU. These parallel algorithms differ from most existing parallel optimization algorithms in the water resources field in that parallel DDS is asynchronous and does not require an entire population (set of candidate solutions) to be evaluated before generating and then sending a new candidate solution for evaluation. One key advance in this study is developing the first parallel PA-DDS multi-objective optimization algorithm. The other key advance is enhancing the computational efficiency of solving optimization problems (such as model calibration) by combining a parallel optimization algorithm with the deterministic model pre-emption concept. These two efficiency techniques can only be combined because of the asynchronous nature of parallel DDS. Model pre-emption terminates simulation model runs early, prior to completely simulating the model calibration period for example, when intermediate results indicate the candidate solution is so poor that it will definitely have no influence on the generation of further candidate solutions. The computational savings of deterministic model pre-emption available in serial implementations of population-based algorithms (e.g., PSO) disappear in synchronous parallel implementations of these algorithms. In addition to the key advances above, we implement the algorithms across a range of computation platforms (Windows and Unix-based operating systems, from multi-core desktops to a supercomputer system) and package these for future modellers within a model-independent calibration software package called Ostrich, as well as in MATLAB versions. Results across multiple platforms and multiple case studies (from 4 to 64 processors) demonstrate the vast improvement over serial DDS-based algorithms and highlight the important role model pre-emption plays in the performance of parallel, pre-emptable DDS algorithms. Case studies include single- and multiple-objective optimization problems in water resources model calibration, and in many cases linear or near-linear speedups are observed.
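To ground the discussion, here is a sketch of how a single DDS candidate is generated from the current best solution, following the published perturbation rule: each dimension is perturbed with a probability that decays with the iteration count, so the search narrows from global to local automatically. This is a serial illustration in Python; the asynchronous parallel variants in the abstract dispatch many such candidates without waiting for a full population to finish.

```python
import numpy as np

def dds_candidate(xbest, lo, hi, i, max_iter, r=0.2, rng=None):
    """One DDS candidate from the best solution; i is the 1-based iteration."""
    rng = rng or np.random.default_rng()
    d = len(xbest)
    p = 1.0 - np.log(i) / np.log(max_iter)   # perturbation probability decays
    mask = rng.random(d) < p
    if not mask.any():
        mask[rng.integers(d)] = True         # always perturb at least one dim
    x = xbest.copy()
    sigma = r * (hi - lo)                    # perturbation scale per dimension
    x[mask] += rng.normal(0.0, sigma[mask])
    x = np.where(x < lo, 2 * lo - x, x)      # reflect at the lower bound
    x = np.where(x > hi, 2 * hi - x, x)      # reflect at the upper bound
    return np.clip(x, lo, hi)                # guard against double violations

lo, hi = np.zeros(5), np.ones(5)
xbest = np.full(5, 0.5)
print(dds_candidate(xbest, lo, hi, i=10, max_iter=1000))
```

Because each candidate depends only on the current best solution, an idle processor can always be handed a fresh candidate immediately, which is what makes the asynchronous scheme and model pre-emption combinable.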
Suppes, Trisha; Rush, A John; Dennehy, Ellen B; Crismon, M Lynn; Kashner, T Michael; Toprac, Marcia G; Carmody, Thomas J; Brown, E Sherwood; Biggs, Melanie M; Shores-Wilson, Kathy; Witte, Bradley P; Trivedi, Madhukar H; Miller, Alexander L; Altshuler, Kenneth Z; Shon, Steven P
2003-04-01
The Texas Medication Algorithm Project (TMAP) assessed the clinical and economic impact of algorithm-driven treatment (ALGO) as compared with treatment-as-usual (TAU) in patients served in public mental health centers. This report presents clinical outcomes in patients with a history of mania (BD), including bipolar I and schizoaffective disorder, bipolar type, during 12 months of treatment beginning March 1998 and ending with the final active patient visit in April 2000. Patients were diagnosed with bipolar I disorder or schizoaffective disorder, bipolar type, according to DSM-IV criteria. ALGO comprised a medication algorithm and a manual to guide treatment decisions. Physicians and clinical coordinators received training and expert consultation throughout the project. ALGO also provided a disorder-specific patient and family education package. TAU clinics had no exposure to the medication algorithms. Quarterly outcome evaluations were obtained by independent raters. Hierarchical linear modeling, based on a declining effects model, was used to assess clinical outcome of ALGO versus TAU. ALGO and TAU patients showed significant initial decreases in symptoms (p =.03 and p <.001, respectively) measured by the 24-item Brief Psychiatric Rating Scale (BPRS-24) at the 3-month assessment interval, with significantly greater effects for the ALGO group. Limited catch-up by TAU was observed over the remaining 3 quarters. Differences were also observed in measures of mania and psychosis but not in depression, side-effect burden, or functioning. For patients with a history of mania, relative to TAU, the ALGO intervention package was associated with greater initial and sustained improvement on the primary clinical outcome measure, the BPRS-24, and the secondary outcome measure, the Clinician-Administered Rating Scale for Mania (CARS-M). Further research is planned to clarify which elements of the ALGO package contributed to this between-group difference.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ramroth, L. A.; Gonder, J.; Brooker, A.
2012-09-01
The National Renewable Energy Laboratory verified diesel-conventional and diesel-hybrid parcel delivery vehicle models to evaluate petroleum reduction and cost implications of plug-in hybrid gasoline and diesel variants. These variants are run on a field-data-derived design matrix to analyze the effects of drive cycle, distance, battery replacements, battery capacity, and motor power on fuel consumption and lifetime cost. Two cost scenarios, using fuel prices corresponding to forecasted highs for 2011 and 2030 and battery costs per kilowatt-hour representing current and long-term targets, compare plug-in hybrid lifetime costs with diesel conventional lifetime costs. Under a future cost scenario of $100/kWh battery energy and $5/gal fuel, plug-in hybrids are cost effective. Assuming a current cost of $700/kWh and $3/gal fuel, they rarely recoup the additional motor and battery cost. The results highlight the importance of understanding the application's drive cycle, daily driving distance, and kinetic intensity. For instances in the current-cost scenario where the additional plug-in hybrid cost is regained in fuel savings, the combination of kinetic intensity and daily distance travelled does not coincide with the usage patterns observed in the field data. If the usage patterns were adjusted, the hybrids could become cost effective.
NASA Astrophysics Data System (ADS)
Tarroja, Brian; Eichman, Joshua D.; Zhang, Li; Brown, Tim M.; Samuelsen, Scott
2015-03-01
A study has been performed that analyzes the effectiveness of utilizing plug-in vehicles to meet holistic environmental goals across the combined electricity and transportation sectors. In this study, plug-in hybrid electric vehicle (PHEV) penetration levels are varied from 0 to 60% and base renewable penetration levels are varied from 10 to 63%. The first part focused on the effect of installing plug-in hybrid electric vehicles on the environmental performance of the combined electricity and transportation sectors. The second part addresses impacts on the design and operation of load-balancing resources on the electric grid associated with fleet capacity factor, peaking and load-following generator capacity, efficiency, ramp rates, start-up events and the levelized cost of electricity. PHEVs using smart charging are found to counteract many of the disruptive impacts of intermittent renewable power on balancing generators for a wide range of renewable penetration levels, only becoming limited at high renewable penetration levels due to lack of flexibility and finite load size. This study highlights synergy between sustainability measures in the electric and transportation sectors and the importance of communicative dispatch of these vehicles.
Gathering and Exploring Scientific Knowledge in Pharmacovigilance
Lopes, Pedro; Nunes, Tiago; Campos, David; Furlong, Laura Ines; Bauer-Mehren, Anna; Sanz, Ferran; Carrascosa, Maria Carmen; Mestres, Jordi; Kors, Jan; Singh, Bharat; van Mulligen, Erik; Van der Lei, Johan; Diallo, Gayo; Avillach, Paul; Ahlberg, Ernst; Boyer, Scott; Diaz, Carlos; Oliveira, José Luís
2013-01-01
Pharmacovigilance plays a key role in the healthcare domain through the assessment, monitoring and discovery of interactions amongst drugs and their effects in the human organism. However, technological advances in this field have been slowing down over the last decade due to miscellaneous legal, ethical and methodological constraints. Pharmaceutical companies started to realize that collaborative and integrative approaches boost current drug research and development processes. Hence, new strategies are required to connect researchers, datasets, biomedical knowledge and analysis algorithms, allowing them to fully exploit the true value behind state-of-the-art pharmacovigilance efforts. This manuscript introduces a new platform directed towards pharmacovigilance knowledge providers. This system, based on a service-oriented architecture, adopts a plugin-based approach to solve fundamental pharmacovigilance software challenges. With the wealth of collected clinical and pharmaceutical data, it is now possible to connect knowledge providers’ analysis and exploration algorithms with real data. As a result, new strategies allow a faster identification of high-risk interactions between marketed drugs and adverse events, and enable the automated uncovering of scientific evidence behind them. With this architecture, the pharmacovigilance field has a new platform to coordinate large-scale drug evaluation efforts in a unique ecosystem, publicly available at http://bioinformatics.ua.pt/euadr/. PMID:24349421
NASA Astrophysics Data System (ADS)
Diaz, K. S.; Kim, E. H.; Jones, R. M.; de Leon, K. C.; Woodcroft, B. J.; Tyson, G. W.; Rich, V. I.
2014-12-01
The growing field of metaproteomics links microbial communities to their expressed functions by using mass spectrometry methods to characterize community proteins. Comparison of mass spectrometry protein search algorithms and their biases is crucial for maximizing the quality and amount of protein identifications in mass spectral data. Available algorithms employ different approaches when mapping mass spectra to peptides against a database. We compared mass spectra from four microbial proteomes derived from high-organic content soils searched with two search algorithms: 1) Sequest HT as packaged within Proteome Discoverer (v.1.4) and 2) X!Tandem as packaged in TransProteomicPipeline (v.4.7.1). Searches used matched metagenomes, and results were filtered to allow identification of high probability proteins. There was little overlap in proteins identified by both algorithms, on average just ~24% of the total. However, when adjusted for spectral abundance, the overlap improved to ~70%. Proteome Discoverer generally outperformed X!Tandem, identifying an average of 12.5% more proteins than X!Tandem, with X!Tandem identifying more proteins only in the first two proteomes. For spectrally-adjusted results, the algorithms were similar, with X!Tandem marginally outperforming Proteome Discoverer by an average of ~4%. We then assessed differences in heat shock protein (HSP) identification by the two algorithms by BLASTing identified proteins against the Heat Shock Protein Information Resource, because HSP hits typically account for the majority of the signal in proteomes, due to extraction protocols. Total HSP identifications for each of the 4 proteomes were ~15%, ~11%, ~17%, and ~19%, with ~14% for total HSPs with redundancies removed. Of the ~15% average of proteins from the 4 proteomes identified as HSPs, ~10% of proteins and spectra were identified by both algorithms. On average, Proteome Discoverer identified ~9% more HSPs than X!Tandem.
Multi-agent coordination algorithms for control of distributed energy resources in smart grids
NASA Astrophysics Data System (ADS)
Cortes, Andres
Sustainable energy is a top priority for researchers these days, since electricity and transportation are pillars of modern society. Integration of clean energy technologies such as wind, solar, and plug-in electric vehicles (PEVs) is a major engineering challenge in the operation and management of power systems. This is due to the uncertain nature of renewable energy technologies and the large amount of extra load that PEVs would add to the power grid. Given the networked structure of a power system, multi-agent control and optimization strategies are natural approaches to address the various problems of interest for the safe and reliable operation of the power grid. The distributed computation in multi-agent algorithms addresses three problems at the same time: i) it allows for the handling of problems with millions of variables that a single processor cannot compute, ii) it allows certain independence and privacy to electricity customers by not requiring any usage information, and iii) it is robust to localized failures in the communication network, being able to solve problems by simply neglecting the failing section of the system. We propose various algorithms to coordinate storage, generation, and demand resources in a power grid using multi-agent computation and decentralized decision making. First, we introduce a hierarchical vehicle-one-grid (V1G) algorithm for coordination of PEVs under usage constraints, where energy only flows from the grid into the batteries of PEVs. We then present a hierarchical vehicle-to-grid (V2G) algorithm for PEV coordination that takes into consideration line capacity constraints in the distribution grid, and where energy flows both ways, from the grid into the batteries, and from the batteries to the grid. Next, we develop a greedy-like hierarchical algorithm for management of demand response events with on/off loads. Finally, we introduce distributed algorithms for the optimal control of distributed energy resources, i.e., generation and storage in a microgrid. The algorithms we present are provably correct and tested in simulation. Each algorithm is assumed to work on a particular network topology, and simulation studies are carried out in order to demonstrate their convergence properties to a desired solution.
Gonçalves, Cristina P; Mohallem, José R
2004-11-15
We report the development of a simple algorithm to modify quantum chemistry codes based on the LCAO procedure, to account for the isotope problem in electronic structure calculations. No extra computations are required compared to standard Born-Oppenheimer calculations. An upgrade of the Gamess package called ISOTOPE is presented, and its applicability is demonstrated in some examples.
CONRAD—A software framework for cone-beam imaging in radiology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maier, Andreas; Choi, Jang-Hwan; Riess, Christian
2013-11-15
Purpose: In the community of x-ray imaging, there is a multitude of tools and applications that are used in scientific practice. Many of these tools are proprietary and can only be used within a certain lab. Often the same algorithm is implemented multiple times by different groups in order to enable comparison. In an effort to tackle this problem, the authors created CONRAD, a software framework that provides many of the tools that are required to simulate basic processes in x-ray imaging and perform image reconstruction with consideration of nonlinear physical effects. Methods: CONRAD is a Java-based state-of-the-art software platform with extensive documentation. It is based on platform-independent technologies. Special libraries offer access to hardware acceleration such as OpenCL. There is an easy-to-use interface for parallel processing. The software package includes different simulation tools that are able to generate up to 4D projection and volume data and respective vector motion fields. Well-known reconstruction algorithms such as FBP, DBP, and ART are included. All algorithms in the package are referenced to a scientific source. Results: A total of 13 different phantoms and 30 processing steps have already been integrated into the platform at the time of writing. The platform comprises 74,000 non-blank lines of code, of which 19% are used for documentation. The software package is available for download at http://conrad.stanford.edu. To demonstrate the use of the package, the authors reconstructed images from two different scanners, a table-top system and a clinical C-arm system. Runtimes were evaluated using the RabbitCT platform and demonstrate state-of-the-art runtimes with 2.5 s for the 256 problem size and 12.4 s for the 512 problem size. Conclusions: As a common software framework, CONRAD enables the medical physics community to share algorithms and develop new ideas. In particular this offers new opportunities for scientific collaboration and quantitative performance comparison between the methods of different groups.
NASA Astrophysics Data System (ADS)
Krinitskiy, Mikhail; Sinitsyn, Alexey; Gulev, Sergey
2014-05-01
Cloud fraction is a critical parameter for the accurate estimation of short-wave and long-wave radiation - one of the most important surface fluxes over sea and land. Massive estimates of the total cloud cover, as well as cloud amount for different layers of clouds, are available from visual observations, satellite measurements and reanalyses. However, these data are subject to different uncertainties and need continuous validation against highly accurate in-situ measurements. Sky imaging with a high-resolution fish-eye camera provides an excellent opportunity for collecting cloud cover data supplemented with additional characteristics hardly available from routine visual observations (e.g. the structure of cloud cover under broken-cloud conditions, parameters of the distribution of cloud dimensions). We present an operational automatic observational package based on a fish-eye camera taking sky images with high temporal resolution (up to 1 Hz) and a spatial resolution of 968x648 px. This spatial resolution has been justified as optimal by several sensitivity experiments. For use of the package on a research vessel, where horizontal positioning becomes critical, special hardware and software extensions to the package have been developed. These modules provide the explicit detection of the optimal moment for shooting. For the post-processing of sky images we developed software implementing an algorithm for filtering the sunburn effect under small and moderate cloud cover and broken-cloud conditions. The same algorithm accurately quantifies the cloud fraction by analyzing the color mixture for each point and introducing a so-called "grayness rate index" for every pixel. The accuracy of the algorithm has been tested using the data collected during several campaigns in 2005-2011 in the North Atlantic Ocean. The collection of images included more than 3000 images for different cloud conditions, supplied with observations of standard parameters. The system is fully autonomous and has a block for digital data collection on a hard disk. The system has been tested for a wide range of open-ocean cloud conditions, and we will demonstrate some pilot results of data processing and physical interpretation of fractional cloud cover estimation.
Peters, Sanne A E; Dunford, Elizabeth; Jones, Alexandra; Ni Mhurchu, Cliona; Crino, Michelle; Taylor, Fraser; Woodward, Mark; Neal, Bruce
2017-07-05
The Health Star Rating (HSR) is an interpretive front-of-pack labelling system that rates the overall nutritional profile of packaged foods. The algorithm underpinning the HSR includes total sugar content as one of the components. This has been criticised because intrinsic sugars naturally present in dairy, fruits, and vegetables are treated the same as sugars added during food processing. We assessed whether the HSR could better discriminate between core and discretionary foods by including added sugar in the underlying algorithm. Nutrition information was extracted for 34,135 packaged foods available in The George Institute's Australian FoodSwitch database. Added sugar levels were imputed from food composition databases. Products were classified as 'core' or 'discretionary' based on the Australian Dietary Guidelines. The ability of each of the nutrients included in the HSR algorithm, as well as added sugar, to discriminate between core and discretionary foods was estimated using the area under the curve (AUC). 15,965 core and 18,350 discretionary foods were included. Of these, 8230 (52%) core foods and 15,947 (87%) discretionary foods contained added sugar. Median (Q1, Q3) HSRs were 4.0 (3.0, 4.5) for core foods and 2.0 (1.0, 3.0) for discretionary foods. Median added sugar contents (g/100 g) were 3.3 (1.5, 5.5) for core foods and 14.6 (1.8, 37.2) for discretionary foods. Of all the nutrients used in the current HSR algorithm, total sugar had the greatest individual capacity to discriminate between core and discretionary foods; AUC 0.692 (0.686; 0.697). Added sugar alone achieved an AUC of 0.777 (0.772; 0.782). A model with all nutrients in the current HSR algorithm had an AUC of 0.817 (0.812; 0.821), which increased to 0.871 (0.867; 0.874) with inclusion of added sugar. The HSR nutrients discriminate well between core and discretionary packaged foods. However, discrimination was improved when added sugar was also included. These data argue for inclusion of added sugar in an updated HSR algorithm and declaration of added sugar as part of mandatory nutrient declarations.
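The discriminatory-capacity statistic quoted above is an ordinary ROC AUC computed per component. A minimal sketch of that calculation follows; the eight products and their labels are toy stand-ins for the FoodSwitch extract.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Toy data: added sugar (g/100 g) and core (0) vs discretionary (1) labels.
added_sugar = np.array([3.3, 1.5, 5.5, 0.4, 14.6, 37.2, 1.8, 22.0])
is_discretionary = np.array([0, 0, 0, 0, 1, 1, 1, 1])

print(f"AUC for added sugar alone: {roc_auc_score(is_discretionary, added_sugar):.3f}")
```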
Lepre, Jorge; Rice, J Jeremy; Tu, Yuhai; Stolovitzky, Gustavo
2004-05-01
Despite the growing literature devoted to finding differentially expressed genes in assays probing different tissues types, little attention has been paid to the combinatorial nature of feature selection inherent to large, high-dimensional gene expression datasets. New flexible data analysis approaches capable of searching relevant subgroups of genes and experiments are needed to understand multivariate associations of gene expression patterns with observed phenotypes. We present in detail a deterministic algorithm to discover patterns of multivariate gene associations in gene expression data. The patterns discovered are differential with respect to a control dataset. The algorithm is exhaustive and efficient, reporting all existent patterns that fit a given input parameter set while avoiding enumeration of the entire pattern space. The value of the pattern discovery approach is demonstrated by finding a set of genes that differentiate between two types of lymphoma. Moreover, these genes are found to behave consistently in an independent dataset produced in a different laboratory using different arrays, thus validating the genes selected using our algorithm. We show that the genes deemed significant in terms of their multivariate statistics will be missed using other methods. Our set of pattern discovery algorithms including a user interface is distributed as a package called Genes@Work. This package is freely available to non-commercial users and can be downloaded from our website (http://www.research.ibm.com/FunGen).
Exploring Convergent Evolution to Provide a Foundation for Protein Engineering
2009-02-26
… the DivergentSet and MotifCluster algorithms. Using support from this grant, we developed two software packages that provide key infrastructure for … The software package we developed, MotifCluster, provides a novel way of detecting distantly related homologs, one of the key aims of the proposal. Unlike …
RevEcoR: an R package for the reverse ecology analysis of microbiomes.
Cao, Yang; Wang, Yuanyuan; Zheng, Xiaofei; Li, Fei; Bo, Xiaochen
2016-07-29
All species live in complex ecosystems. The structure and complexity of a microbial community reflects not only diversity and function, but also the environment in which it occurs. However, traditional ecological methods can only be applied on a small scale and for relatively well-understood biological systems. Recently, a graph-theory-based algorithm called the reverse ecology approach has been developed that can analyze the metabolic networks of all the species in a microbial community, and predict the metabolic interface between species and their environment. Here, we present RevEcoR, an R package and a Shiny Web application that implements the reverse ecology algorithm for determining microbe-microbe interactions in microbial communities. This software allows users to obtain large-scale ecological insights into species' ecology directly from high-throughput metagenomic data. The software has great potential for facilitating the study of microbiomes. RevEcoR is open source software for the study of microbial community ecology. The RevEcoR R package is freely available under the GNU General Public License v. 2.0 at http://cran.r-project.org/web/packages/RevEcoR/ with the vignette and typical usage examples, and the interactive Shiny web application is available at http://yiluheihei.shinyapps.io/shiny-RevEcoR , or can be installed locally with the source code accessed from https://github.com/yiluheihei/shiny-RevEcoR .
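As a rough illustration of the graph-theoretic step (not RevEcoR's actual code), the seed compounds of a directed metabolic network can be read off the source components of its condensation; the toy network below is hypothetical.

```python
import networkx as nx

# Toy metabolic network: an edge means "substrate -> product".
g = nx.DiGraph([("A", "B"), ("B", "C"), ("C", "B"), ("D", "C"), ("C", "E")])

cond = nx.condensation(g)  # DAG of strongly connected components
seeds = [m for node, data in cond.nodes(data=True)
         if cond.in_degree(node) == 0      # source SCCs have no producers
         for m in data["members"]]
print(sorted(seeds))                       # compounds the organism must import
```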
An Algorithm and R Program for Fitting and Simulation of Pharmacokinetic and Pharmacodynamic Data.
Li, Jijie; Yan, Kewei; Hou, Lisha; Du, Xudong; Zhu, Ping; Zheng, Li; Zhu, Cairong
2017-06-01
Pharmacokinetic/pharmacodynamic link models are widely used in dose-finding studies. By applying such models, the results of initial pharmacokinetic/pharmacodynamic studies can be used to predict the potential therapeutic dose range. This knowledge can improve the design of later comparative large-scale clinical trials by reducing the number of participants and saving time and resources. However, the modeling process can be challenging, time consuming, and costly, even when using cutting-edge, powerful pharmacological software. Here, we provide a freely available R program for expediently analyzing pharmacokinetic/pharmacodynamic data, including data importation, parameter estimation, simulation, and model diagnostics. First, we explain the theory related to the establishment of the pharmacokinetic/pharmacodynamic link model. Subsequently, we present the algorithms used for parameter estimation and potential therapeutic dose computation. The implementation of the R program is illustrated by a clinical example. The software package is then validated by comparing the model parameters and the goodness-of-fit statistics generated by our R package with those generated by the widely used pharmacological software WinNonlin. The pharmacokinetic and pharmacodynamic parameters as well as the potential recommended therapeutic dose can be acquired with the R package. The validation process shows that the parameters estimated using our package are satisfactory. The R program developed and presented here provides pharmacokinetic researchers with a simple and easy-to-access tool for pharmacokinetic/pharmacodynamic analysis on personal computers.
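As an illustration of the kind of model being fitted (not the package's actual code), the sketch below chains a one-compartment oral-absorption PK model to an Emax pharmacodynamic model and estimates both by least squares; all data, parameter names, and starting values are hypothetical.

```python
import numpy as np
from scipy.optimize import curve_fit

def conc(t, ka, ke, V, dose=100.0):
    """One-compartment oral PK: absorption rate ka, elimination rate ke, volume V."""
    return dose * ka / (V * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

def effect(C, e0, emax, ec50):
    """Emax pharmacodynamic link from concentration to effect."""
    return e0 + emax * C / (ec50 + C)

t = np.array([0.5, 1, 2, 4, 8, 12, 24.0])          # hours (toy observations)
c_obs = np.array([1.8, 2.9, 3.4, 2.8, 1.6, 0.9, 0.2])
(ka, ke, V), _ = curve_fit(conc, t, c_obs, p0=[1.0, 0.2, 20.0])

e_obs = np.array([12, 18, 21, 19, 14, 10, 6.0])
(e0, emax, ec50), _ = curve_fit(effect, conc(t, ka, ke, V), e_obs, p0=[5, 20, 2])
```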
Data Fusion of Geographically Dispersed Information: Experience With the Scalable Data Grid
2011-03-01
… framework that exposes calls to this HLA interface to registered plug-ins (Figure 2). The SDG exploits this plug-in to intercept and log messages and … implementation and use of the SDG and Hadoop … (Figure 4: schematic meshrouter topology; Figure 5: factored meshrouter implementation with application-specific communications primitives) … show promise for the T&E environments. It reported on experiments implementing the SDG and on …
FibrilJ: ImageJ plugin for fibrils' diameter and persistence length determination
NASA Astrophysics Data System (ADS)
Sokolov, P. A.; Belousov, M. V.; Bondarev, S. A.; Zhouravleva, G. A.; Kasyanenko, N. A.
2017-05-01
Application of microscopy to evaluate the morphology and size of filamentous proteins and amyloids requires new and creative approaches to simplify and automate the image processing. The estimation of mean values of fibril diameter, length and bending stiffness on micrographs is a major challenge. For this purpose we developed an open-source FibrilJ plugin for the ImageJ/FiJi program. It automatically recognizes the fibrils on the surface of mica, silicon, gold or formvar films and further analyzes them to calculate the distribution of fibrils by diameters, lengths and persistence lengths. The plugin has been validated by the processing of TEM images of fibrils formed by Sup35NM yeast protein and artificially created images of rod-shaped objects with predefined parameters. Novel data obtained by SEM for Sup35NM protein fibrils immobilized on silicon and gold substrates are also presented and analyzed.
The NASA Auralization Framework and Plugin Architecture
NASA Technical Reports Server (NTRS)
Aumann, Aric R.; Tuttle, Brian C.; Chapin, William L.; Rizzi, Stephen A.
2015-01-01
NASA has a long history of investigating human response to aircraft flyover noise and in recent years has developed a capability to fully auralize the noise of aircraft during their design. This capability is particularly useful for unconventional designs with noise signatures significantly different from the current fleet. To that end, a flexible software architecture has been developed to facilitate rapid integration of new simulation techniques for noise source synthesis and propagation, and to foster collaboration amongst researchers through a common releasable code base. The NASA Auralization Framework (NAF) is a skeletal framework written in C++ with basic functionalities and a plugin architecture that allows users to mix and match NAF capabilities with their own methods through the development and use of dynamically linked libraries. This paper presents the NAF software architecture and discusses several advanced auralization techniques that have been implemented as plugins to the framework.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Lijuan; Gonder, Jeff; Burton, Evan
This study evaluates the costs and benefits associated with the use of a stationary-wireless-power-transfer-enabled plug-in hybrid electric bus and determines the cost effectiveness relative to a conventional bus and a hybrid electric bus. A sensitivity sweep was performed over many different battery sizes, charging power levels, and numbers/locations of bus stop charging stations. The net present cost was calculated for each vehicle design and provided the basis for design evaluation. In all cases, given the assumed economic conditions, the conventional bus achieved the lowest net present cost, while the optimal plug-in hybrid electric bus scenario beat the hybrid electric comparison scenario. The study also performed parameter sensitivity analysis under favorable and unfavorable market penetration assumptions. The analysis identifies fuel saving opportunities with plug-in hybrid electric bus scenarios at cumulative net present costs not too dissimilar from those for conventional buses.
Adaptive powertrain control for plugin hybrid electric vehicles
Kedar-Dongarkar, Gurunath; Weslati, Feisel
2013-10-15
A powertrain control system for a plugin hybrid electric vehicle. The system comprises an adaptive charge sustaining controller; at least one internal data source connected to the adaptive charge sustaining controller; and a memory connected to the adaptive charge sustaining controller for storing data generated by the at least one internal data source. The adaptive charge sustaining controller is operable to select an operating mode of the vehicle's powertrain along a given route based on programming generated from data stored in the memory associated with that route. Further described is a method of adaptively controlling operation of a plugin hybrid electric vehicle powertrain comprising identifying a route being traveled, activating stored adaptive charge sustaining mode programming for the identified route and controlling operation of the powertrain along the identified route by selecting from a plurality of operational modes based on the stored adaptive charge sustaining mode programming.
Silicon Carbide High Temperature Anemometer and Method for Assembling the Same
NASA Technical Reports Server (NTRS)
Okojie, Robert S. (Inventor); Fralick, Gustave C. (Inventor); Saad, George J. (Inventor)
2003-01-01
A high temperature anemometer includes a pair of substrates. One of the substrates has a plurality of electrodes on a facing surface, while the other of the substrates has a sensor cavity on a facing surface. A sensor is received in the sensor cavity, wherein the sensor has a plurality of bondpads, and wherein the bond pads contact the plurality of electrodes when the facing surfaces are mated with one another. The anemometer further includes a plurality of plug-in pins, wherein the substrate with the cavity has a plurality of trenches with each one receiving a plurality of plug-in pins. The plurality of plug-in pins contact the plurality of electrodes when the substrates are mated with one another. The sensor cavity is at an end of one of the substrates such that the sensor partially extends from the substrate. The sensor and the substrates are preferably made of silicon carbide.
NASA Technical Reports Server (NTRS)
Hruby, R. J.; Bjorkman, W. S.; Schmidt, S. F.; Carestia, R. A.
1979-01-01
Algorithms were developed that attempt to identify which sensor in a tetrad configuration has experienced a step failure. An algorithm is also described that provides a measure of the confidence with which the correct identification was made. Experimental results are presented from real-time tests conducted on a three-axis motion facility utilizing an ortho-skew tetrad strapdown inertial sensor package. The effects of prediction errors and of quantization on correct failure identification are discussed as well as an algorithm for detecting second failures through prediction.
Rocker: Open source, easy-to-use tool for AUC and enrichment calculations and ROC visualization.
Lätti, Sakari; Niinivehmas, Sanna; Pentikäinen, Olli T
2016-01-01
Receiver operating characteristics (ROC) curve with the calculation of area under curve (AUC) is a useful tool to evaluate the performance of biomedical and chemoinformatics data. For example, in virtual drug screening, ROC curves are very often used to visualize the efficiency of the used application to separate active ligands from inactive molecules. Unfortunately, most of the available tools for ROC analysis are implemented in commercially available software packages, or are plugins in statistical software, which are not always the easiest to use. Here, we present Rocker, a simple ROC curve visualization tool that can be used for the generation of publication-quality images. Rocker also includes automatic calculation of the AUC for the ROC curve and the Boltzmann-enhanced discrimination of ROC (BEDROC). Furthermore, in virtual screening campaigns it is often important to understand the early enrichment of active ligand identification; for this, Rocker offers an automated calculation routine. To enable further development of Rocker, it is freely available (MIT-GPL license) for use and modification from our website (http://www.jyu.fi/rocker).
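For reference, a compact sketch of the BEDROC metric (Truchon and Bayly, 2007) that Rocker computes is given below; `ranks` are the 1-based ranks of the actives in the score-sorted list, and the example values are illustrative.

```python
import numpy as np

def bedroc(ranks, n_total, alpha=20.0):
    """Early-recognition BEDROC score following Truchon & Bayly (2007)."""
    n = len(ranks)
    ra = n / n_total                         # fraction of actives
    x = np.asarray(ranks) / n_total          # relative ranks
    rie = np.exp(-alpha * x).sum() / (n * (1 - np.exp(-alpha)) /
                                      (n_total * (np.exp(alpha / n_total) - 1)))
    return (rie * ra * np.sinh(alpha / 2) /
            (np.cosh(alpha / 2) - np.cosh(alpha / 2 - alpha * ra)) +
            1 / (1 - np.exp(alpha * (1 - ra))))

print(bedroc(ranks=[1, 3, 7, 20], n_total=1000))   # toy ranking
```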
SOCR: Statistics Online Computational Resource
Dinov, Ivo D.
2011-01-01
The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an integrated educational web-based framework for: interactive distribution modeling, virtual online probability experimentation, statistical data analysis, visualization and integration. Following years of experience in statistical teaching at all college levels using established licensed statistical software packages, like STATA, S-PLUS, R, SPSS, SAS, Systat, etc., we have attempted to engineer a new statistics education environment, the Statistics Online Computational Resource (SOCR). This resource performs many of the standard types of statistical analysis, much like other classical tools. In addition, it is designed in a plug-in object-oriented architecture and is completely platform independent, web-based, interactive, extensible and secure. Over the past 4 years we have tested, fine-tuned and reanalyzed the SOCR framework in many of our undergraduate and graduate probability and statistics courses and have evidence that SOCR resources build students' intuition and enhance their learning. PMID:21451741
Real-Time Data Processing Onboard Remote Sensor Platforms: Annual Review #3 Data Package
NASA Technical Reports Server (NTRS)
Cook, Sid; Harsanyi, Joe
2003-01-01
The current program status reviewed by this viewgraph presentation includes: 1) New Evaluation Results; 2) Algorithm Improvement Investigations; 3) Electronic Hardware Design; 4) Software Hardware Interface Design.
Oostenveld, Robert; Fries, Pascal; Maris, Eric; Schoffelen, Jan-Mathijs
2011-01-01
This paper describes FieldTrip, an open source software package that we developed for the analysis of MEG, EEG, and other electrophysiological data. The software is implemented as a MATLAB toolbox and includes a complete set of consistent and user-friendly high-level functions that allow experimental neuroscientists to analyze experimental data. It includes algorithms for simple and advanced analysis, such as time-frequency analysis using multitapers, source reconstruction using dipoles, distributed sources and beamformers, connectivity analysis, and nonparametric statistical permutation tests at the channel and source level. The implementation as toolbox allows the user to perform elaborate and structured analyses of large data sets using the MATLAB command line and batch scripting. Furthermore, users and developers can easily extend the functionality and implement new algorithms. The modular design facilitates the reuse in other software packages.
Ionospheric Specifications for SAR Interferometry (ISSI)
NASA Technical Reports Server (NTRS)
Pi, Xiaoqing; Chapman, Bruce D; Freeman, Anthony; Szeliga, Walter; Buckley, Sean M.; Rosen, Paul A.; Lavalle, Marco
2013-01-01
The ISSI software package is designed to image the ionosphere from space by calibrating and processing polarimetric synthetic aperture radar (PolSAR) data collected from low Earth orbit satellites. Signals transmitted and received by a PolSAR are subject to the Faraday rotation effect as they traverse the magnetized ionosphere. The ISSI algorithms combine the horizontally and vertically polarized (with respect to the radar system) SAR signals to estimate Faraday rotation and ionospheric total electron content (TEC) with spatial resolutions of sub-kilometers to kilometers, and to derive radar system calibration parameters. The ISSI software package has been designed and developed to integrate the algorithms, process PolSAR data, and image as well as visualize the ionospheric measurements. A number of tests have been conducted using ISSI with PolSAR data collected from various latitude regions using the Phased Array type L-band Synthetic Aperture Radar (PALSAR) onboard Japan Aerospace Exploration Agency's Advanced Land Observing Satellite mission, and also with Global Positioning System data. These tests have demonstrated and validated SAR-derived ionospheric images and data correction algorithms.
Ho, ThienLuan; Oh, Seung-Rohk
2017-01-01
Approximate string matching with k-differences has a number of practical applications, ranging from pattern recognition to computational biology. This paper proposes an efficient memory-access algorithm for parallel approximate string matching with k-differences on Graphics Processing Units (GPUs). In the proposed algorithm, all threads in the same GPU warp share data using the warp-shuffle operation instead of accessing the shared memory. Moreover, we implement the proposed algorithm by exploiting the memory structure of GPUs to optimize its performance. Experimental results for real DNA packages revealed that the proposed algorithm and its implementation achieved speedups of up to 122.64 and 1.53 times relative to the sequential algorithm on a CPU and a previous parallel approximate string matching algorithm on GPUs, respectively. PMID:29016700
STAR Algorithm Integration Team - Facilitating operational algorithm development
NASA Astrophysics Data System (ADS)
Mikles, V. J.
2015-12-01
The NOAA/NESDIS Center for Satellite Research and Applications (STAR) provides technical support of the Joint Polar Satellite System (JPSS) algorithm development and integration tasks. Utilizing data from the S-NPP satellite, JPSS generates over thirty Environmental Data Records (EDRs) and Intermediate Products (IPs) spanning atmospheric, ocean, cryosphere, and land weather disciplines. The Algorithm Integration Team (AIT) brings technical expertise and support to product algorithms, specifically in testing and validating science algorithms in a pre-operational environment. The AIT verifies that new and updated algorithms function in the development environment, enforces established software development standards, and ensures that delivered packages are functional and complete. AIT facilitates the development of new JPSS-1 algorithms by implementing a review approach based on the Enterprise Product Lifecycle (EPL) process. Building on relationships established during the S-NPP algorithm development process and coordinating directly with science algorithm developers, the AIT has implemented structured reviews with self-contained document suites. The process has supported algorithm improvements for products such as ozone, active fire, vegetation index, and temperature and moisture profiles.
2012-09-30
… recognition. Algorithm design and statistical analysis and feature analysis. Post-Doctoral Associate, Cornell University, Bioacoustics Research … short. The HPC-ADA was designed based on fielded systems [1-4, 6] that offer a variety of desirable attributes, specifically dynamic resource … The software package was designed to utilize parallel and distributed processing for running recognition and other advanced algorithms. DeLMA
Detection and Classification of Objects in Synthetic Aperture Radar Imagery
2006-02-01
… a higher False Alarm Rate (FAR). Currently, a standard edge detector is the Canny algorithm, which is available with the mathematics package MATLAB … the algorithm used to calculate the Radon transform. The MATLAB implementation uses the built-in Radon transform procedure, which is extremely … MATLAB code for a faster forward-backwards selection process has also been provided. In both cases, the feature selection was accomplished by using …
NASA Technical Reports Server (NTRS)
Kincaid, D. R.; Young, D. M.
1984-01-01
Adapting and designing mathematical software to achieve optimum performance on the CYBER 205 is discussed. Comments and observations are made in light of recent work done on modifying the ITPACK software package and on writing new software for vector supercomputers. The goal was to develop very efficient vector algorithms and software for solving large sparse linear systems using iterative methods.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zamora, Richard; Voter, Arthur; Uberuaga, Blas
2017-10-23
The SpecTAD software represents a refactoring of the Temperature Accelerated Dynamics (TAD2) code authored by Arthur F. Voter and Blas P. Uberuaga (LA-CC-02-05). SpecTAD extends the capabilities of TAD2, by providing algorithms for both temporal and spatial parallelism. The novel algorithms for temporal parallelism include both speculation and replication based techniques. SpecTAD also offers the optional capability to dynamically link to the open-source LAMMPS package.
A permutation-based non-parametric analysis of CRISPR screen data.
Jia, Gaoxiang; Wang, Xinlei; Xiao, Guanghua
2017-07-19
Clustered regularly-interspaced short palindromic repeats (CRISPR) screens are usually implemented in cultured cells to identify genes with critical functions. Although several methods have been developed or adapted to analyze CRISPR screening data, no single specific algorithm has gained popularity. Thus, rigorous procedures are needed to overcome the shortcomings of existing algorithms. We developed a Permutation-Based Non-Parametric Analysis (PBNPA) algorithm, which computes p-values at the gene level by permuting sgRNA labels, and thus it avoids restrictive distributional assumptions. Although PBNPA is designed to analyze CRISPR data, it can also be applied to analyze genetic screens implemented with siRNAs or shRNAs and drug screens. We compared the performance of PBNPA with competing methods on simulated data as well as on real data. PBNPA outperformed recent methods designed for CRISPR screen analysis, as well as methods used for analyzing other functional genomics screens, in terms of Receiver Operating Characteristics (ROC) curves and False Discovery Rate (FDR) control for simulated data under various settings. Remarkably, the PBNPA algorithm showed better consistency and FDR control on published real data as well. PBNPA yields more consistent and reliable results than its competitors, especially when the data quality is low. R package of PBNPA is available at: https://cran.r-project.org/web/packages/PBNPA/ .
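A minimal sketch of the permutation step is shown below: a gene-level statistic is computed from its sgRNAs' log fold-changes and compared with a null distribution built by permuting sgRNA labels across the screen. The mean statistic, toy data, and one-sided depletion p-value are simplifications relative to the published algorithm.

```python
import numpy as np

def gene_pvalue(gene_fc, all_fc, n_perm=10_000, seed=1):
    """One-sided (depletion) permutation p-value for one gene's sgRNA set."""
    rng = np.random.default_rng(seed)
    obs = np.mean(gene_fc)                 # observed gene-level statistic
    null = np.array([rng.choice(all_fc, size=len(gene_fc), replace=False).mean()
                     for _ in range(n_perm)])
    return (np.sum(null <= obs) + 1) / (n_perm + 1)

all_fc = np.random.default_rng(0).normal(0.0, 1.0, 5000)  # toy screen-wide log FCs
print(gene_pvalue([-2.1, -1.8, -2.5, -1.9], all_fc))
```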
An alternative to FASTSIM for tangential solution of the wheel-rail contact
NASA Astrophysics Data System (ADS)
Sichani, Matin Sh.; Enblom, Roger; Berg, Mats
2016-06-01
In most rail vehicle dynamics simulation packages, the tangential solution of the wheel-rail contact is obtained by means of Kalker's FASTSIM algorithm. While 5-25% error is expected for creep force estimation, the errors of the shear stress distribution, needed for wheel-rail damage analysis, may rise above 30% due to the parabolic traction bound. Therefore, a novel algorithm named FaStrip is proposed as an alternative to FASTSIM. It is based on the strip theory, which extends the two-dimensional rolling contact solution to three-dimensional contacts. To form FaStrip, the original strip theory is amended to obtain accurate estimates for any contact ellipse size, and it is combined with a numerical algorithm to handle spin. The comparison between the two algorithms shows that using FaStrip improves the accuracy of the estimated shear stress distribution and the creep force estimation in all studied cases. In combined lateral creepage and spin cases, for instance, the error in force estimation reduces from 18% to less than 2%. The estimation of the slip velocities in the slip zone, needed for wear analysis, is also studied. Since FaStrip is as fast as FASTSIM, it can be an alternative for the tangential solution of the wheel-rail contact in simulation packages.
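For orientation, a heavily simplified, pure-longitudinal sketch of the FASTSIM scheme the paper improves on is shown below; the flexibility, friction, and pressure values are illustrative assumptions, and FaStrip itself replaces the parabolic traction bound with the strip-theory solution.

```python
import numpy as np

def fastsim_strip(creep_x, a=0.006, mu=0.4, p0=9.0e8, L=6.0e-9, n=50):
    """Traction along the centre strip for pure longitudinal creepage."""
    xs = np.linspace(a, -a, n)         # leading edge -> trailing edge
    dx = 2 * a / n
    px, tractions = 0.0, []
    for x in xs:
        px += creep_x * dx / L                            # elastic build-up
        bound = mu * p0 * max(1.0 - (x / a) ** 2, 0.0)    # parabolic traction bound
        px = float(np.clip(px, -bound, bound))            # slip zone once bound is hit
        tractions.append(px)
    return xs, np.array(tractions)

xs, q = fastsim_strip(creep_x=-0.001)
```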
NASA Astrophysics Data System (ADS)
Witzel, Gunther; Lu, Jessica R.; Ghez, Andrea M.; Martinez, Gregory D.; Fitzgerald, Michael P.; Britton, Matthew; Sitarski, Breann N.; Do, Tuan; Campbell, Randall D.; Service, Maxwell; Matthews, Keith; Morris, Mark R.; Becklin, E. E.; Wizinowich, Peter L.; Ragland, Sam; Doppmann, Greg; Neyman, Chris; Lyke, James; Kassis, Marc; Rizzi, Luca; Lilley, Scott; Rampy, Rachel
2016-07-01
General relativity can be tested in the strong gravity regime by monitoring stars orbiting the supermassive black hole at the Galactic Center with adaptive optics. However, the limiting source of uncertainty is the spatial PSF variability due to atmospheric anisoplanatism and instrumental aberrations. The Galactic Center Group at UCLA has completed a project developing algorithms to predict PSF variability for Keck AO images. We have created a new software package (AIROPA), based on modified versions of StarFinder and Arroyo, that takes atmospheric turbulence profiles, instrumental aberration maps, and images as inputs and delivers improved photometry and astrometry on crowded fields. This software package will be made publicly available soon.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bartlett, Roscoe
2010-03-31
GlobiPack contains a small collection of optimization globalization algorithms. These algorithms are used by optimization and various nonlinear equation solver algorithms, serving as the line-search procedure in Newton and quasi-Newton optimization and nonlinear equation solver methods. These are standard published one-dimensional line search algorithms, such as those described in Nocedal and Wright, Numerical Optimization, 2nd edition, 2006. One set of algorithms was copied and refactored from the existing open-source Trilinos package MOOCHO, where the line search code is used to globalize SQP methods. This software is generic to any mathematical optimization problem where smooth derivatives exist; it has no connection to any specific application.
An Improved Recovery Algorithm for Decayed AES Key Schedule Images
NASA Astrophysics Data System (ADS)
Tsow, Alex
A practical algorithm that recovers AES key schedules from decayed memory images is presented. Halderman et al. [1] established this recovery capability, dubbed the cold-boot attack, as a serious vulnerability for several widespread software-based encryption packages. Our algorithm recovers AES-128 key schedules tens of millions of times faster than the original proof-of-concept release. In practice, it enables reliable recovery of key schedules at 70% decay, well over twice the decay capacity of previous methods. The algorithm is generalized to AES-256 and is empirically shown to recover 256-bit key schedules that have suffered 65% decay. When solutions are unique, the algorithm efficiently validates this property and outputs the solution for memory images decayed up to 60%.
Single-pass incremental force updates for adaptively restrained molecular dynamics.
Singh, Krishna Kant; Redon, Stephane
2018-03-30
Adaptively restrained molecular dynamics (ARMD) allows users to perform more integration steps in wall-clock time by switching positional degrees of freedom on and off. This article presents new single-pass incremental force-update algorithms to efficiently simulate a system using ARMD. We assessed the speedups of the different algorithms and implemented them in the LAMMPS MD package. We validated the single-pass incremental force-update algorithm on four different benchmarks using diverse pair potentials. The proposed algorithm allows us to perform simulation of a system faster than traditional MD in both NVE and NVT ensembles. Moreover, ARMD using the new single-pass algorithm speeds up the convergence of observables in wall-clock time. © 2017 Wiley Periodicals, Inc.
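A heavily simplified sketch of the incremental idea follows: the expensive pair-force evaluation is skipped for pairs whose particles are both restrained, and their cached contributions are reused. This is not the LAMMPS implementation; a true single-pass version also patches the accumulated totals rather than re-summing, and uses neighbor lists instead of an O(n^2) loop.

```python
import numpy as np

def incremental_forces(pos, active, pair_cache, pair_force):
    """pair_cache maps (i, j), i < j, to the last force on i from j; it is
    assumed pre-filled by one ordinary, fully active force evaluation."""
    f = np.zeros_like(pos)
    n = len(pos)
    for i in range(n):
        for j in range(i + 1, n):
            if active[i] or active[j]:
                pair_cache[(i, j)] = pair_force(pos[i], pos[j])  # recompute
            fij = pair_cache[(i, j)]   # restrained-restrained pairs reuse cache
            f[i] += fij
            f[j] -= fij                # Newton's third law
    return f

toy = lambda ri, rj: 0.01 * (ri - rj)  # stand-in pair force, not a real potential
pos = np.random.default_rng(0).random((4, 3))
cache = {}
f_full = incremental_forces(pos, [True] * 4, cache, toy)   # fills the cache
f_next = incremental_forces(pos, [True, False, False, False], cache, toy)
```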
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barbee, D; McCarthy, A; Galavis, P
Purpose: Errors found during initial physics plan checks frequently require replanning and reprinting, resulting in decreased departmental efficiency. Additionally, errors may be missed during physics checks, resulting in potential treatment errors or interruption. This work presents a process control created using the Eclipse Scripting API (ESAPI), enabling dosimetrists and physicists to detect potential errors in the Eclipse treatment planning system prior to performing any plan approvals or printing. Methods: Potential failure modes for five categories were generated based on available ESAPI (v11) patient object properties: Images, Contours, Plans, Beams, and Dose. An Eclipse script plugin (PlanCheck) was written in C# to check the errors most frequently observed clinically in each of the categories. The PlanCheck algorithms were devised to check technical aspects of plans, such as deliverability (e.g. minimum EDW MUs), in addition to ensuring that policies and procedures relating to planning were being followed. The effect on clinical workflow efficiency was measured by tracking the plan document error rate and plan revision/retirement rates in the Aria database over monthly intervals. Results: The number of potential failure modes the PlanCheck script is currently capable of checking for in each category is: Images (6), Contours (7), Plans (8), Beams (17), and Dose (4). Prior to implementation of the PlanCheck plugin, the observed error rates in errored plan documents and revised/retired plans in the Aria database were 20% and 22%, respectively. Error rates were seen to decrease gradually over time as adoption of the script improved. Conclusion: A process control created using the Eclipse Scripting API enabled plan checks to occur within the planning system, resulting in reduced error rates and improved efficiency. Future work includes: initiating a full FMEA for the planning workflow, extending categories to include additional checks outside of ESAPI via Aria database queries, and eventual automated plan checks.
SBSI: an extensible distributed software infrastructure for parameter estimation in systems biology
Adams, Richard; Clark, Allan; Yamaguchi, Azusa; Hanlon, Neil; Tsorman, Nikos; Ali, Shakir; Lebedeva, Galina; Goltsov, Alexey; Sorokin, Anatoly; Akman, Ozgur E.; Troein, Carl; Millar, Andrew J.; Goryanin, Igor; Gilmore, Stephen
2013-01-01
Summary: Complex computational experiments in Systems Biology, such as fitting model parameters to experimental data, can be challenging to perform. Not only do they frequently require a high level of computational power, but the software needed to run the experiment needs to be usable by scientists with varying levels of computational expertise, and modellers need to be able to obtain up-to-date experimental data resources easily. We have developed a software suite, the Systems Biology Software Infrastructure (SBSI), to facilitate the parameter-fitting process. SBSI is a modular software suite composed of three major components: SBSINumerics, a high-performance library containing parallelized algorithms for performing parameter fitting; SBSIDispatcher, a middleware application to track experiments and submit jobs to back-end servers; and SBSIVisual, an extensible client application used to configure optimization experiments and view results. Furthermore, we have created a plugin infrastructure to enable project-specific modules to be easily installed. Plugin developers can take advantage of the existing user-interface and application framework to customize SBSI for their own uses, facilitated by SBSI’s use of standard data formats. Availability and implementation: All SBSI binaries and source-code are freely available from http://sourceforge.net/projects/sbsi under an Apache 2 open-source license. The server-side SBSINumerics runs on any Unix-based operating system; both SBSIVisual and SBSIDispatcher are written in Java and are platform independent, allowing use on Windows, Linux and Mac OS X. The SBSI project website at http://www.sbsi.ed.ac.uk provides documentation and tutorials. Contact: stg@inf.ed.ac.uk Supplementary information: Supplementary data are available at Bioinformatics online. PMID:23329415
Flexible Software Architecture for Visualization and Seismic Data Analysis
NASA Astrophysics Data System (ADS)
Petunin, S.; Pavlov, I.; Mogilenskikh, D.; Podzyuban, D.; Arkhipov, A.; Baturuin, N.; Lisin, A.; Smith, A.; Rivers, W.; Harben, P.
2007-12-01
Research in the field of seismology requires software and signal processing utilities for seismogram manipulation and analysis. Seismologists and data analysts often encounter a major problem in the use of any particular software application specific to seismic data analysis: the tuning of commands and windows to the specific waveforms and hot-key combinations so as to fit their familiar informational environment. The ability to modify the user's interface independently from the developer requires an adaptive code structure. An adaptive code structure also allows for expansion of software capabilities such as new signal processing modules and implementation of more efficient algorithms. Our approach is to use a flexible "open" architecture for development of geophysical software. This report presents an integrated solution for organizing a logical software architecture based on the Unix version of the Geotool software implemented on the Microsoft NET 2.0 platform. Selection of this platform greatly expands the variety and number of computers that can implement the software, including laptops that can be utilized in field conditions. It also facilitates implementation of communication functions for seismic data requests from remote databases through the Internet. The main principle of the new architecture for Geotool is that scientists should be able to add new routines for digital waveform analysis via software plug-ins that utilize the basic Geotool display for GUI interaction. The use of plug-ins allows the efficient integration of diverse signal-processing software, including software still in preliminary development, into an organized platform without changing the fundamental structure of that platform itself. An analyst's use of Geotool is tracked via a metadata file so that future studies can reconstruct, and alter, the original signal processing operations. The work has been completed in the framework of a joint Russian-American project.
LVFS: A Big Data File Storage Bridge for the HPC Community
NASA Astrophysics Data System (ADS)
Golpayegani, N.; Halem, M.; Mauoka, E.; Fonseca, L. F.
2015-12-01
Merging Big Data capabilities into High Performance Computing architecture starts at the file storage level. Heterogeneous storage systems are emerging which offer enhanced features for dealing with Big Data, such as the IBM GPFS storage system's integration into Hadoop Map-Reduce. Taking advantage of these capabilities requires file storage systems to be adaptive and accommodate these new storage technologies. We present the extension of the Lightweight Virtual File System (LVFS), currently running as the production system for the MODIS Level 1 and Atmosphere Archive and Distribution System (LAADS), to incorporate a flexible plugin architecture which allows easy integration of new HPC hardware and/or software storage technologies without disrupting workflows or system architectures, and with only minimal impact on existing tools. We consider two essential aspects provided by the LVFS plugin architecture that the future HPC community needs. First, it allows for the seamless integration of new and emerging hardware technologies which are significantly different from existing technologies, such as Seagate's Kinetic disks and Intel's 3D XPoint non-volatile storage. Second is the transparent and instantaneous conversion between new software technologies and various file formats. With most current storage systems, a switch in file format would require costly reprocessing and a near doubling of storage requirements. We will install LVFS on UMBC's IBM iDataPlex cluster with a heterogeneous storage architecture utilizing local, remote, and Seagate Kinetic storage as a case study. LVFS merges different kinds of storage architectures to show users a uniform layout and, therefore, prevents any disruption in workflows, architecture design, or tool usage. We will show how LVFS converts HDF data (CO2 surface fluxes produced by applying machine learning algorithms to XCO2 Level 2 data from the OCO-2 satellite) into GeoTIFF for visualization.
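As a rough sketch of the kind of on-the-fly format bridging described (not LVFS's internal code), the snippet below reads a gridded field from an HDF5 file and writes it out as GeoTIFF; the file name, dataset name, and 1-degree global grid are hypothetical.

```python
import h5py
import rasterio
from rasterio.transform import from_origin

# Hypothetical file and dataset names; assumes a global 1-degree grid.
with h5py.File("co2_flux.h5", "r") as f:
    grid = f["surface_flux"][...]               # 2-D array of flux values

transform = from_origin(-180.0, 90.0, 1.0, 1.0)  # west, north, x-res, y-res
with rasterio.open("co2_flux.tif", "w", driver="GTiff",
                   height=grid.shape[0], width=grid.shape[1],
                   count=1, dtype=str(grid.dtype),
                   crs="EPSG:4326", transform=transform) as dst:
    dst.write(grid, 1)                           # band 1
```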
Real-time inverse kinematics for the upper limb: a model-based algorithm using segment orientations.
Borbély, Bence J; Szolgay, Péter
2017-01-17
Model based analysis of human upper limb movements has key importance in understanding the motor control processes of our nervous system. Various simulation software packages have been developed over the years to perform model based analysis. These packages provide computationally intensive, and therefore off-line, solutions to calculate the anatomical joint angles from motion-captured raw measurement data (a task also referred to as inverse kinematics). In addition, recent developments in inertial motion sensing technology show that it may replace large, immobile and expensive optical systems with small, mobile and cheaper solutions in cases when a laboratory-free measurement setup is needed. The objective of the presented work is to extend the workflow of measurement and analysis of human arm movements with an algorithm that allows accurate and real-time estimation of anatomical joint angles for a widely used OpenSim upper limb kinematic model when inertial sensors are used for movement recording. The internal structure of the selected upper limb model is analyzed and used as the underlying platform for the development of the proposed algorithm. Based on this structure, a prototype marker set is constructed that facilitates the reconstruction of model-based joint angles using orientation data directly available from inertial measurement systems. The mathematical formulation of the reconstruction algorithm is presented along with the validation of the algorithm on various platforms, including embedded environments. Execution performance tables of the proposed algorithm show significant improvement on all tested platforms: compared to OpenSim's Inverse Kinematics tool, a 50-15,000x speedup is achieved while maintaining numerical accuracy. The proposed algorithm is capable of real-time reconstruction of standardized anatomical joint angles even in embedded environments, establishing a new way for complex applications to take advantage of accurate and fast model-based inverse kinematics calculations.
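The core step of orientation-based inverse kinematics can be sketched compactly: given the absolute orientations of two adjacent segments from inertial sensors, the joint rotation is their relative quaternion, from which an anatomical angle can be read off. This is a minimal illustration of the general idea, not the authors' algorithm; the (w, x, y, z) component convention and the single-angle readout are simplifying assumptions.

```java
/** Minimal quaternion utilities; components ordered (w, x, y, z). */
final class Quat {
    final double w, x, y, z;
    Quat(double w, double x, double y, double z) { this.w = w; this.x = x; this.y = y; this.z = z; }

    Quat conjugate() { return new Quat(w, -x, -y, -z); }

    /** Hamilton product this * q. */
    Quat multiply(Quat q) {
        return new Quat(
            w * q.w - x * q.x - y * q.y - z * q.z,
            w * q.x + x * q.w + y * q.z - z * q.y,
            w * q.y - x * q.z + y * q.w + z * q.x,
            w * q.z + x * q.y - y * q.x + z * q.w);
    }
}

class JointAngleDemo {
    /** Rotation of the distal segment relative to the proximal one. */
    static Quat relative(Quat proximal, Quat distal) {
        return proximal.conjugate().multiply(distal);
    }

    /** Total rotation angle of the relative orientation, in degrees. */
    static double angleDeg(Quat q) {
        return Math.toDegrees(2.0 * Math.acos(Math.min(1.0, Math.abs(q.w))));
    }

    public static void main(String[] args) {
        Quat upperArm = new Quat(1, 0, 0, 0);  // identity orientation
        Quat forearm = new Quat(Math.cos(Math.PI / 8), Math.sin(Math.PI / 8), 0, 0); // 45 deg about x
        System.out.printf("elbow flexion ~ %.1f deg%n", angleDeg(relative(upperArm, forearm)));
    }
}
```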
Implementing Workplace Charging with Federal Agencies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Margaret Smith
The number of Americans choosing to purchase plug-in electric vehicles (PEVs), which include plug-in hybrid electric vehicles (PHEVs) and all-electric vehicles (EVs), has increased steadily since 2011. Many of these drivers commute to federal worksites in communities across the country. The opportunity to charge a personal vehicle while at work is valuable to PEV drivers: employees who have access to workplace charging are six times more likely to own a PEV than those who lack such access.
Plug-In Electric Vehicle Handbook for Workplace Charging Hosts
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2013-08-01
Plug-in electric vehicles (PEVs) have immense potential for increasing the country's energy, economic, and environmental security, and they will play a key role in the future of U.S. transportation. By providing PEV charging at the workplace, employers are perfectly positioned to contribute to and benefit from the electrification of transportation. This handbook answers basic questions about PEVs and charging equipment, helps employers assess whether to offer workplace charging for employees, and outlines important steps for implementation.
Validation of a Custom-made Software for DQE Assessment in Mammography Digital Detectors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ayala-Dominguez, L.; Perez-Ponce, H.; Brandan, M. E.
2010-12-07
This work presents the validation of a custom-made software package, designed and developed in Matlab, intended for routine evaluation of the detective quantum efficiency (DQE) according to the algorithms described in the IEC 62220-1-2 standard. The DQE, normalized noise power spectrum (NNPS) and pre-sampling modulation transfer function (MTF) were calculated from RAW images from a GE Senographe DS (FineView disabled) and a Siemens Novation system. The calculated MTF is in close agreement with results obtained with alternative codes: MTF_tool (Maidment), an ImageJ plug-in (Perez-Ponce) and MIQuaELa (Ayala). Overall agreement better than ≈90% was found in the MTF; the largest differences were observed at frequencies close to the Nyquist limit. For the measurement of NNPS and DQE, agreement is similar to that obtained for the MTF. These results suggest that the developed software can be used with confidence for image quality assessment.
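For reference, the combination step behind such tools is compact: under the IEC approach, the DQE at spatial frequency u is commonly computed as DQE(u) = MTF(u)^2 / (q * NNPS(u)), where q is the incident number of quanta per unit area for the exposure used. Below is a minimal sketch of that final step, assuming MTF and NNPS have already been estimated on a common frequency grid; it illustrates the formula, not the validated Matlab package.

```java
/** Combines pre-sampled MTF and NNPS into DQE: DQE(u) = MTF(u)^2 / (q * NNPS(u)).
 *  Assumes mtf and nnps were measured on the same spatial-frequency grid;
 *  q is the incident photon fluence (quanta per mm^2) for the exposure used. */
class DqeCalculator {
    static double[] dqe(double[] mtf, double[] nnps, double q) {
        double[] out = new double[mtf.length];
        for (int i = 0; i < mtf.length; i++) {
            out[i] = (mtf[i] * mtf[i]) / (q * nnps[i]);
        }
        return out;
    }
}
```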
NASA Astrophysics Data System (ADS)
Kraljić, K.; Strüngmann, L.; Fimmel, E.; Gumbel, M.
2018-01-01
The genetic code is degenerate, and it is assumed that this redundancy provides error detection and correction mechanisms in the translation process. However, the biological meaning of the code's structure is still under active research. This paper presents the Genetic Code Analysis Toolkit (GCAT), which provides workflows and algorithms for analyzing the structure of nucleotide sequences. In particular, sets or sequences of codons can be transformed and tested for circularity, comma-freeness, dichotomic partitions and other properties. GCAT comes with a fertile editor custom-built to work with the genetic code and a batch mode for multi-sequence processing. With the ability to read FASTA files or load sequences from GenBank, the tool can be used for the mathematical and statistical analysis of existing sequence data. GCAT is Java-based and provides a plug-in concept for extensibility. Availability: open source. Homepage: http://www.gcat.bio/
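One of the properties mentioned above, comma-freeness, has a compact characterization that makes a good illustration: a codon set C is comma-free if and only if no codon of C occurs at offset 1 or 2 inside any concatenation of two codons from C. The sketch below implements that test directly; it illustrates the concept and is not GCAT's actual code.

```java
import java.util.Set;

class CommaFreeTest {
    /** A codon set is comma-free iff no codon appears at offset 1 or 2
     *  inside any concatenation of two codons from the set. */
    static boolean isCommaFree(Set<String> codons) {
        for (String a : codons) {
            for (String b : codons) {
                String pair = a + b;  // 6 letters
                if (codons.contains(pair.substring(1, 4))) return false;
                if (codons.contains(pair.substring(2, 5))) return false;
            }
        }
        return true;
    }

    public static void main(String[] args) {
        System.out.println(isCommaFree(Set.of("ACG")));  // true
        System.out.println(isCommaFree(Set.of("AAA")));  // false: AAA recurs at offset 1 in AAAAAA
    }
}
```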
SpotCaliper: fast wavelet-based spot detection with accurate size estimation.
Püspöki, Zsuzsanna; Sage, Daniel; Ward, John Paul; Unser, Michael
2016-04-15
SpotCaliper is a novel wavelet-based image-analysis software tool providing a fast automatic detection scheme for circular patterns (spots), combined with precise estimation of their size. It is implemented as an ImageJ plugin with a friendly user interface. The user can edit the results by modifying the measurements in a semi-automated way and extract data for further analysis. Fine tuning of the detections includes the possibility of adjusting or removing the original detections, as well as adding further spots. The main advantage of the software is its ability to capture the size of spots in a fast and accurate way. Availability: http://bigwww.epfl.ch/algorithms/spotcaliper/. Contact: zsuzsanna.puspoki@epfl.ch. Supplementary data are available at Bioinformatics online.
OpenCOR: a modular and interoperable approach to computational biology
Garny, Alan; Hunter, Peter J.
2015-01-01
Computational biologists have been developing standards and formats for nearly two decades, with the aim of easing the description and exchange of experimental data, mathematical models, simulation experiments, etc. One of those efforts is CellML (cellml.org), an XML-based markup language for the encoding of mathematical models. Early CellML-based environments include COR and OpenCell. However, both of those tools have limitations and were eventually replaced with OpenCOR (opencor.ws). OpenCOR is an open source modeling environment that is supported on Windows, Linux and OS X. It relies on a modular approach, which means that all of its features come in the form of plugins. Those plugins can be used to organize, edit, simulate and analyze models encoded in the CellML format. We start with an introduction to CellML and two of its early adopters, whose limitations eventually led to the development of OpenCOR. We then go on to describe the general philosophy behind OpenCOR, as well as its openness and its development process. Next, we illustrate various aspects of OpenCOR, such as its user interface and some of the plugins that come bundled with it (e.g., its editing and simulation plugins). Finally, we discuss some of the advantages and limitations of OpenCOR before drawing some concluding remarks. PMID:25705192
Clock Scan Protocol for Image Analysis: ImageJ Plugins.
Dobretsov, Maxim; Petkau, Georg; Hayar, Abdallah; Petkau, Eugen
2017-06-19
The clock scan protocol for image analysis is an efficient tool to quantify the average pixel intensity within, at the border of, and outside (background) a closed or segmented convex-shaped region of interest, leading to the generation of an averaged integral radial pixel-intensity profile. The protocol was originally developed in 2006 as a Visual Basic 6 script, but as such it had limited distribution. To address this problem, and to join similar recent efforts by others, we converted the original clock scan protocol code into two Java-based plugins compatible with the NIH-sponsored, freely available image analysis programs ImageJ and Fiji. These plugins also have several new functions that further expand the range of capabilities of the original protocol, such as the analysis of multiple regions of interest and of image stacks. The latter feature is especially useful in applications in which it is important to determine changes related to time and location. Thus, clock scan analysis of stacks of biological images may potentially be applied to the spreading of Na+ or Ca++ within a single cell, as well as to the analysis of spreading activity (e.g., Ca++ waves) in populations of synaptically connected or gap-junction-coupled cells. Here, we describe these new clock scan plugins and show examples of their applications in image analysis.
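The underlying computation, averaging pixel intensity as a function of normalized distance from the ROI center, can be sketched for the simplest case of a circular ROI. The bin count, the normalization to the ROI radius, and the 20% margin beyond the border are illustrative choices in this sketch, not the plugins' actual parameters.

```java
class ClockScanSketch {
    /** Averaged radial intensity profile for a circular ROI of radius r centered at (cx, cy).
     *  Radii are normalized so that 1.0 is the ROI border; the profile extends to 1.2*r
     *  so the tail of the curve samples the background just outside the border. */
    static double[] radialProfile(double[][] image, int cx, int cy, double r, int bins) {
        double[] sum = new double[bins];
        int[] count = new int[bins];
        double maxR = 1.2 * r;
        for (int y = 0; y < image.length; y++) {
            for (int x = 0; x < image[0].length; x++) {
                double d = Math.hypot(x - cx, y - cy);
                if (d >= maxR) continue;
                int bin = (int) (d / maxR * bins);
                sum[bin] += image[y][x];
                count[bin]++;
            }
        }
        double[] profile = new double[bins];
        for (int i = 0; i < bins; i++) profile[i] = count[i] > 0 ? sum[i] / count[i] : 0;
        return profile;
    }
}
```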
Luk, Jason M; Saville, Bradley A; MacLean, Heather L
2015-04-21
This paper aims to comprehensively distinguish among the merits of different vehicles using a common primary energy source. In this study, we consider compressed natural gas (CNG) used directly in conventional vehicles (CVs) and hybrid electric vehicles (HEVs), and natural gas-derived electricity (NG-e) used in plug-in battery electric vehicles (BEVs). This study evaluates the incremental life cycle air emissions (climate change and human health) impacts and life cycle ownership costs of non-plug-in (CV and HEV) and plug-in light-duty vehicles. Replacing a gasoline CV with a CNG CV, or a CNG CV with a CNG HEV, can provide life cycle air emissions impact benefits without increasing ownership costs; however, the NG-e BEV will likely increase costs (90% confidence interval: $1,000 to $31,000 incremental cost per vehicle lifetime). Furthermore, eliminating HEV tailpipe emissions via plug-in vehicles has an insignificant incremental benefit, due to high uncertainties, with emissions cost benefits between -$1,000 and $2,000. Vehicle criteria air contaminants are a relatively minor contributor to life cycle air emissions impacts because of strict vehicle emissions standards. Therefore, policies should focus on the adoption of plug-in vehicles in nonattainment regions, because CNG vehicles are likely more cost-effective at providing overall life cycle air emissions impact benefits.
A CAD approach to magnetic bearing design
NASA Technical Reports Server (NTRS)
Jeyaseelan, M.; Anand, D. K.; Kirk, J. A.
1988-01-01
A design methodology has been developed at the Magnetic Bearing Research Laboratory for designing magnetic bearings using a CAD approach, and it forms the algorithmic core of an interactive design software package. The package is a design tool that enables the designer to simulate the entire process of design and analysis of the system. Its capabilities include interactive input/modification of geometry, detection of possible saturation at critical sections of the system, and the design and analysis of a control system that stabilizes and maintains magnetic suspension.
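In the simplest textbook idealization, the saturation check mentioned above reduces to comparing the air-gap flux density against the core material's saturation limit; for a magnetic circuit dominated by two air gaps of length g, B ≈ μ0·N·I/(2g). The sketch below is that generic calculation, not the package's algorithm, and all numbers are illustrative.

```java
class SaturationCheck {
    static final double MU0 = 4e-7 * Math.PI;  // vacuum permeability, H/m

    /** Gap flux density for a circuit idealized as two air gaps of length gapM (meters),
     *  an N-turn coil carrying currentA amperes, with core reluctance neglected. */
    static double gapFluxDensity(int turns, double currentA, double gapM) {
        return MU0 * turns * currentA / (2.0 * gapM);
    }

    public static void main(String[] args) {
        double b = gapFluxDensity(200, 1.5, 0.5e-3);  // illustrative coil and gap
        double bSat = 1.6;                            // typical silicon-steel saturation, T
        System.out.printf("B = %.2f T -> %s%n", b, b < bSat ? "OK" : "saturated");
    }
}
```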
spMC: an R-package for 3D lithological reconstructions based on spatial Markov chains
NASA Astrophysics Data System (ADS)
Sartore, Luca; Fabbri, Paolo; Gaetan, Carlo
2016-09-01
The paper presents the spatial Markov Chains (spMC) R-package and a case study of subsoil simulation/prediction at a plain site in Northeastern Italy. spMC is a fairly complete collection of advanced methods for data inspection; in addition, it implements Markov chain models to estimate experimental transition probabilities of categorical lithological data. Furthermore, simulation methods based on well-known prediction methods (such as indicator kriging and co-kriging) are implemented in the spMC package. Other, more advanced methods are also available for simulation, e.g. path methods and Bayesian procedures that exploit the maximum entropy principle. Since the spMC package was developed for intensive geostatistical computations, part of the code is implemented for parallel computation via OpenMP constructs. A final analysis of computational efficiency compares the simulation/prediction algorithms using different numbers of CPU cores, considering the example data set of the case study included in the package.
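Estimating experimental transition probabilities reduces, in the simplest one-dimensional case, to counting category-to-category transitions along a sequence (e.g., lithological classes down a borehole) and row-normalizing the counts. The sketch below shows that counting step in this collection's Java idiom; it is independent of spMC's actual R/OpenMP implementation.

```java
class TransitionProbabilities {
    /** Row-normalized transition matrix estimated from a 1-D categorical sequence,
     *  where states are coded 0..nStates-1 (e.g., lithological classes down a borehole). */
    static double[][] estimate(int[] sequence, int nStates) {
        double[][] counts = new double[nStates][nStates];
        for (int i = 0; i + 1 < sequence.length; i++) {
            counts[sequence[i]][sequence[i + 1]]++;
        }
        for (int r = 0; r < nStates; r++) {
            double rowSum = 0;
            for (double c : counts[r]) rowSum += c;
            if (rowSum > 0) {
                for (int c = 0; c < nStates; c++) counts[r][c] /= rowSum;
            }
        }
        return counts;  // counts[i][j] is now P(next = j | current = i)
    }
}
```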
A geometry package for generation of input data for a three-dimensional potential-flow program
NASA Technical Reports Server (NTRS)
Halsey, N. D.; Hess, J. L.
1978-01-01
The preparation of geometric data for input to three-dimensional potential flow programs was automated and simplified by a geometry package incorporated into the NASA Langley version of the 3-D lifting potential flow program. Input to the computer program for the geometry package consists of a very sparse set of coordinate data, often with an order of magnitude fewer points than required for the actual potential flow calculations. Isolated components, such as wings and fuselages, are paneled automatically using one of several possible element distribution algorithms (a classical example is sketched below). Curves of intersection between components are calculated using a hybrid curve-fit/surface-fit approach. Intersecting components are repaneled so that adjacent elements on either side of the intersection curves line up in a satisfactory manner for the potential-flow calculations. Many cases may be run completely (from input, through the geometry package, and through the flow calculations) without interruption. Use of the package significantly reduces the time and expense involved in making three-dimensional potential flow calculations.
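A classical element distribution algorithm of the kind alluded to above is cosine spacing, which clusters panel edges near a component's leading and trailing edges, where surface curvature is highest. Whether this particular scheme is among those implemented is not stated in the abstract, so the sketch is generic:

```java
class CosineSpacing {
    /** Chordwise panel-edge stations on [0, 1], clustered toward both ends:
     *  x_i = (1 - cos(i*pi/n)) / 2 for i = 0..n. */
    static double[] stations(int n) {
        double[] x = new double[n + 1];
        for (int i = 0; i <= n; i++) {
            x[i] = 0.5 * (1.0 - Math.cos(i * Math.PI / n));
        }
        return x;
    }

    public static void main(String[] args) {
        // Edges cluster near 0 and 1, where a wing section curves most sharply.
        for (double xi : stations(8)) System.out.printf("%.4f ", xi);
    }
}
```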
A reduction package for cross-dispersed echelle spectrograph data in IDL
NASA Astrophysics Data System (ADS)
Hall, Jeffrey C.; Neff, James E.
1992-12-01
We have written an IDL package that performs reduction and extraction of cross-dispersed echelle spectrograph data. The present package includes a complete set of tools for extracting data from any number of spectral orders with arbitrary tilt and curvature. Essential elements include debiasing and flat-fielding of the raw CCD image, removal of the scattered-light background, either nonoptimal or optimal extraction of the data, and wavelength calibration and continuum normalization of the extracted orders. A growing set of support routines permits examination of the frame being processed, providing continuing checks on the statistical properties of the data and on the accuracy of the extraction. We will display some sample reductions and discuss the algorithms used. The inherent simplicity and user-friendliness of the IDL interface make this package a useful tool for spectroscopists. We will provide an email distribution list for those interested in receiving the package, and further documentation will be distributed at the meeting.
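The per-frame arithmetic of the essential steps named above (debiasing, flat-fielding, and simple nonoptimal extraction by summing across each tilted order) is easy to sketch. The package itself is written in IDL; the fragment below merely illustrates the operations in this collection's Java idiom, with a per-column trace array standing in for the package's handling of order tilt and curvature.

```java
class EchelleReductionSketch {
    /** Debias and flat-field a raw CCD frame: corrected = (raw - bias) / flat. */
    static double[][] calibrate(double[][] raw, double[][] bias, double[][] flat) {
        int ny = raw.length, nx = raw[0].length;
        double[][] out = new double[ny][nx];
        for (int y = 0; y < ny; y++)
            for (int x = 0; x < nx; x++)
                out[y][x] = (raw[y][x] - bias[y][x]) / flat[y][x];
        return out;
    }

    /** Nonoptimal extraction: sum a window of rows centered on the order trace.
     *  traceRow[x] gives the row of the order center at column x, crudely following
     *  tilt and curvature; assumes the trace stays at least halfWidth rows from the edge. */
    static double[] extractOrder(double[][] frame, int[] traceRow, int halfWidth) {
        int nx = frame[0].length;
        double[] spectrum = new double[nx];
        for (int x = 0; x < nx; x++)
            for (int dy = -halfWidth; dy <= halfWidth; dy++)
                spectrum[x] += frame[traceRow[x] + dy][x];
        return spectrum;
    }
}
```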
Visualization of protein interaction networks: problems and solutions
2013-01-01
Background Visualization concerns the representation of data visually and is an important task in scientific research. Protein-protein interactions (PPI) are discovered using either wet lab techniques, such as mass spectrometry, or in silico prediction tools, resulting in large collections of interactions stored in specialized databases. The set of all interactions of an organism forms a protein-protein interaction network (PIN) and is an important tool for studying the behaviour of the cell machinery. Since graphic representation of PINs may highlight important substructures, e.g. protein complexes, visualization is increasingly used to study the underlying graph structure of PINs. Although graphs are well-known data structures, several open problems affect PIN visualization: the high number of nodes and connections, the heterogeneity of nodes (proteins) and edges (interactions), and the possibility of annotating proteins and interactions with biological information extracted from ontologies (e.g. Gene Ontology), which enriches PINs with semantic information but complicates their visualization. Methods In recent years many software tools for the visualization of PINs have been developed. Initially intended for visualization only, some of them have since been enriched with new functions for PPI data management and PIN analysis. The paper analyzes the main software tools for PIN visualization considering four main criteria: (i) technology, i.e. availability/license of the software and supported OS (Operating System) platforms; (ii) interoperability, i.e. ability to import/export networks in various formats, ability to export data in a graphic format, and extensibility of the system, e.g. through plug-ins; (iii) visualization, i.e. supported layout and rendering algorithms and availability of parallel implementations; (iv) analysis, i.e. availability of network analysis functions, such as clustering or mining of the graph, and the possibility to interact with external databases. Results Currently, many tools are available and it is not easy for users to choose among them. Some tools offer sophisticated 2D and 3D network visualization with many layout algorithms; other tools are more data-oriented and support integration of interaction data coming from different sources as well as data annotation. Finally, some specialized tools are dedicated to the analysis of pathways and cellular processes and are oriented toward systems biology studies, where the dynamic aspects of the processes being studied are central. Conclusion A current trend is the deployment of open, extensible visualization tools (e.g. Cytoscape) that may be incrementally enriched by the interactomics community with novel and more powerful functions for PIN analysis, through the development of plug-ins. Another emerging trend concerns the efficient and parallel implementation of the visualization engine, which may provide high interactivity and near real-time response, as in NAViGaTOR. From a technological point of view, open-source, free and extensible tools like Cytoscape guarantee long-term sustainability due to the size of the developer and user communities and provide great flexibility, since new functions are continuously added by the developer community through new plug-ins; on the other hand, emerging parallel, often closed-source tools like NAViGaTOR can offer near real-time response even in the analysis of very large PINs. PMID:23368786
Kassiopeia: a modern, extensible C++ particle tracking package
DOE Office of Scientific and Technical Information (OSTI.GOV)
Furse, Daniel; Groh, Stefan; Trost, Nikolaus
The Kassiopeia particle tracking framework is an object-oriented software package using modern C++ techniques, written originally to meet the needs of the KATRIN collaboration. Kassiopeia features a new algorithmic paradigm for particle tracking simulations which targets experiments containing complex geometries and electromagnetic fields, with high priority put on calculation efficiency, customizability, extensibility, and ease-of-use for novice programmers. To solve Kassiopeia's target physics problem the software is capable of simulating particle trajectories governed by arbitrarily complex differential equations of motion, continuous physics processes that may in part be modeled as terms perturbing that equation of motion, stochastic processes that occur in flight such as bulk scattering and decay, and stochastic surface processes occurring at interfaces, including transmission and reflection effects. This entire set of computations takes place against the backdrop of a rich geometry package which serves a variety of roles, including initialization of electromagnetic field simulations and the support of state-dependent algorithm-swapping and behavioral changes as a particle's state evolves. Thanks to the very general approach taken by Kassiopeia it can be used by other experiments facing similar challenges when calculating particle trajectories in electromagnetic fields. It is publicly available at https://github.com/KATRIN-Experiment/Kassiopeia.
JPSS Science Data Services for the Direct Readout Community
NASA Technical Reports Server (NTRS)
Chander, Gyanesh; Lutz, Bob
2014-01-01
The Suomi National Polar-orbiting Partnership (S-NPP) and Joint Polar Satellite System (JPSS) High Rate Data (HRD) link provides Direct Broadcast data to users in real time, utilizing their own remote field terminals. Field Terminal Support (FTS) provides the resources needed by the Direct Readout communities, supplying software, documentation, and periodic updates that enable them to produce data products from S-NPP and JPSS. The FTS distribution server also provides the necessary ancillary and auxiliary data needed for processing the broadcasts, as well as orbital data to assist in locating the satellites of interest. In addition, FTS provides development support for algorithms and software through the GSFC Direct Readout Laboratory (DRL) International Polar Orbiter Processing Package (IPOPP) and the University of Wisconsin (UWISC) Community Satellite Processing Package (CSPP), enabling users to integrate the algorithms into their remote terminals. The support that the JPSS Program provides to the institutions developing and maintaining these two software packages demonstrates the ability to produce ready-to-use products from the HRD link and provides a risk-reduction effort at minimal cost. This paper discusses the key functions and system architecture of FTS.
SPOTting Model Parameters Using a Ready-Made Python Package
NASA Astrophysics Data System (ADS)
Houska, Tobias; Kraft, Philipp; Chamorro-Chavez, Alejandro; Breuer, Lutz
2017-04-01
The choice of a specific parameter estimation method is often driven more by its availability than by its performance. We developed SPOTPY (Statistical Parameter Optimization Tool), an open source Python package containing a comprehensive set of methods typically used to calibrate, analyze and optimize parameters for a wide range of ecological models. SPOTPY currently contains eight widely used algorithms and 11 objective functions, and can sample from eight parameter distributions. SPOTPY has a model-independent structure and can be run in parallel, from a single workstation up to large computation clusters, using the Message Passing Interface (MPI). We tested SPOTPY in five case studies: parameterizing the Rosenbrock, Griewank and Ackley functions; a one-dimensional, physically based soil moisture routine, in which we searched for parameters of the van Genuchten-Mualem function; and the calibration of a biogeochemistry model with different objective functions. The case studies reveal that the implemented SPOTPY methods can be used for any model with only a minimal amount of code, for maximal power of parameter optimization. They further show the benefit of having one package at hand that includes a number of well-performing parameter search methods, since not every case study can be solved sufficiently with every algorithm or every objective function.
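SPOTPY's pattern, a model-independent sampler that needs only parameter bounds and an objective function, can be illustrated with the crudest member of the family: uniform random search on the Rosenbrock function (one of the paper's test functions). The sketch below renders the idea in this collection's Java idiom and does not reflect SPOTPY's actual Python API.

```java
import java.util.Random;

class RandomSearchSketch {
    /** Rosenbrock function: global minimum 0 at (1, 1). */
    static double rosenbrock(double x, double y) {
        return Math.pow(1 - x, 2) + 100 * Math.pow(y - x * x, 2);
    }

    public static void main(String[] args) {
        Random rng = new Random(42);
        double lo = -2, hi = 2;  // uniform prior bounds for both parameters
        double bestX = 0, bestY = 0, bestObj = Double.MAX_VALUE;
        for (int i = 0; i < 100_000; i++) {
            double x = lo + (hi - lo) * rng.nextDouble();
            double y = lo + (hi - lo) * rng.nextDouble();
            double obj = rosenbrock(x, y);
            if (obj < bestObj) { bestObj = obj; bestX = x; bestY = y; }
        }
        System.out.printf("best f(%.3f, %.3f) = %.5f%n", bestX, bestY, bestObj);
    }
}
```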
Kassiopeia: a modern, extensible C++ particle tracking package
NASA Astrophysics Data System (ADS)
Furse, Daniel; Groh, Stefan; Trost, Nikolaus; Babutzka, Martin; Barrett, John P.; Behrens, Jan; Buzinsky, Nicholas; Corona, Thomas; Enomoto, Sanshiro; Erhard, Moritz; Formaggio, Joseph A.; Glück, Ferenc; Harms, Fabian; Heizmann, Florian; Hilk, Daniel; Käfer, Wolfgang; Kleesiek, Marco; Leiber, Benjamin; Mertens, Susanne; Oblath, Noah S.; Renschler, Pascal; Schwarz, Johannes; Slocum, Penny L.; Wandkowsky, Nancy; Wierman, Kevin; Zacher, Michael
2017-05-01
The Kassiopeia particle tracking framework is an object-oriented software package using modern C++ techniques, written originally to meet the needs of the KATRIN collaboration. Kassiopeia features a new algorithmic paradigm for particle tracking simulations which targets experiments containing complex geometries and electromagnetic fields, with high priority put on calculation efficiency, customizability, extensibility, and ease-of-use for novice programmers. To solve Kassiopeia's target physics problem the software is capable of simulating particle trajectories governed by arbitrarily complex differential equations of motion, continuous physics processes that may in part be modeled as terms perturbing that equation of motion, stochastic processes that occur in flight such as bulk scattering and decay, and stochastic surface processes occurring at interfaces, including transmission and reflection effects. This entire set of computations takes place against the backdrop of a rich geometry package which serves a variety of roles, including initialization of electromagnetic field simulations and the support of state-dependent algorithm-swapping and behavioral changes as a particle’s state evolves. Thanks to the very general approach taken by Kassiopeia it can be used by other experiments facing similar challenges when calculating particle trajectories in electromagnetic fields. It is publicly available at https://github.com/KATRIN-Experiment/Kassiopeia.
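The kind of trajectory computation described can be illustrated by its simplest special case: a charged particle in static electric and magnetic fields advanced with a classical fourth-order Runge-Kutta step. This is a generic sketch of the technique, not Kassiopeia's code, which supports far more general equations of motion, perturbation terms, and swappable integrators.

```java
class LorentzRk4 {
    /** State: position s[0..2] and velocity s[3..5]; qOverM is the charge-to-mass ratio.
     *  Returns the time derivative of the state under a = (q/m)(E + v x B). */
    static double[] deriv(double[] s, double qOverM, double[] e, double[] b) {
        double vx = s[3], vy = s[4], vz = s[5];
        return new double[] {
            vx, vy, vz,
            qOverM * (e[0] + vy * b[2] - vz * b[1]),
            qOverM * (e[1] + vz * b[0] - vx * b[2]),
            qOverM * (e[2] + vx * b[1] - vy * b[0])
        };
    }

    /** One classical RK4 step of size dt in static fields E and B. */
    static double[] rk4Step(double[] s, double dt, double qOverM, double[] e, double[] b) {
        double[] k1 = deriv(s, qOverM, e, b);
        double[] k2 = deriv(add(s, k1, dt / 2), qOverM, e, b);
        double[] k3 = deriv(add(s, k2, dt / 2), qOverM, e, b);
        double[] k4 = deriv(add(s, k3, dt), qOverM, e, b);
        double[] out = new double[6];
        for (int i = 0; i < 6; i++)
            out[i] = s[i] + dt / 6 * (k1[i] + 2 * k2[i] + 2 * k3[i] + k4[i]);
        return out;
    }

    private static double[] add(double[] s, double[] k, double h) {
        double[] out = new double[6];
        for (int i = 0; i < 6; i++) out[i] = s[i] + h * k[i];
        return out;
    }
}
```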