Sample records for command line tool

  1. chemalot and chemalot_knime: Command line programs as workflow tools for drug discovery.

    PubMed

    Lee, Man-Ling; Aliagas, Ignacio; Feng, Jianwen A; Gabriel, Thomas; O'Donnell, T J; Sellers, Benjamin D; Wiswedel, Bernd; Gobbi, Alberto

    2017-06-12

    Analyzing files containing chemical information is at the core of cheminformatics. Each analysis may require a unique workflow. This paper describes the chemalot and chemalot_knime open source packages. Chemalot is a set of command line programs with a wide range of functionalities for cheminformatics. The chemalot_knime package allows command line programs that read SD files from stdin and write SD files to stdout to be wrapped into KNIME nodes. The combination of chemalot and chemalot_knime not only facilitates the compilation and maintenance of sequences of command line programs but also allows KNIME workflows to take advantage of the compute power of a Linux cluster. Use of the command line programs is demonstrated in three different workflow examples: (1) a workflow to create a data file with project-relevant data for structure-activity or property analysis and other types of investigation, (2) the creation of a quantitative structure-property-relationship model using the command line programs via KNIME nodes, and (3) the analysis of strain energy in small-molecule ligand conformations from the Protein Data Bank database. The chemalot and chemalot_knime packages provide lightweight and powerful tools for many tasks in cheminformatics. They are easily integrated with other open source and commercial command line tools and can be combined to build new and even more powerful tools. The chemalot_knime package facilitates the generation and maintenance of user-defined command line workflows, taking advantage of the graphical design capabilities in KNIME. Graphical abstract: Example KNIME workflow with chemalot nodes and the corresponding command line pipe.
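
    The pattern this record depends on — small programs that treat stdin/stdout as an SD-record stream so several tools can be chained with UNIX pipes — can be sketched in a few lines. This is an illustrative Python sketch, not chemalot code; the <LogP> property tag is hypothetical.

```python
import sys

SD_DELIM = "$$$$"  # record separator in SD files

def filter_records(text, keep):
    """Split SD-formatted text into records on the '$$$$' delimiter,
    keep only records for which keep(record) is true, and re-join."""
    records = [r for r in text.split(SD_DELIM + "\n") if r.strip()]
    return "".join(r + SD_DELIM + "\n" for r in records if keep(r))

if __name__ == "__main__":
    # A real filter would use sys.stdin.read(); a literal string is
    # used here so the sketch runs stand-alone.
    demo = "mol1\n> <LogP>\n1.2\n$$$$\nmol2\n$$$$\n"
    sys.stdout.write(filter_records(demo, lambda r: "<LogP>" in r))
```

    Because each such filter reads a stream and writes a stream, a pipeline like `tool_a < in.sdf | tool_b > out.sdf` composes them without intermediate files.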

  2. C3, A Command-line Catalog Cross-match Tool for Large Astrophysical Catalogs

    NASA Astrophysics Data System (ADS)

    Riccio, Giuseppe; Brescia, Massimo; Cavuoti, Stefano; Mercurio, Amata; di Giorgio, Anna Maria; Molinari, Sergio

    2017-02-01

    Modern astrophysics is based on multi-wavelength data organized into large and heterogeneous catalogs. Hence, the need for efficient, reliable and scalable catalog cross-matching methods plays a crucial role in the era of the petabyte scale. Furthermore, multi-band data often have very different angular resolutions, requiring the highest generality of cross-matching features, mainly in terms of region shape and resolution. In this work we present C3 (Command-line Catalog Cross-match), a multi-platform application designed to efficiently cross-match massive catalogs. It is based on a multi-core parallel processing paradigm and conceived to be executed as a stand-alone command-line process or integrated within any generic data reduction/analysis pipeline, providing maximum flexibility to the end user in terms of portability, parameter configuration, catalog formats, angular resolution, region shapes, coordinate units and cross-matching types. Using real data extracted from public surveys, we discuss the cross-matching capabilities and computing-time efficiency, also through a direct comparison with some publicly available tools chosen among the most used within the community and representative of different interface paradigms. We verified that the C3 tool has excellent capabilities for efficient and reliable cross-matching between large data sets. Although the elliptical cross-match and the parametric handling of angular orientation and offset are known concepts in the astrophysical context, their availability in the presented command-line tool makes C3 competitive in the context of public astronomical tools.
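
    The core operation C3 performs — matching sources from two catalogs within an angular radius — can be illustrated with a brute-force sketch. This is not C3's implementation (which adds sky partitioning and multi-core parallelism); it only shows the underlying cross-match.

```python
import math

def ang_sep(ra1, dec1, ra2, dec2):
    """Angular separation in degrees (haversine formula on the sphere)."""
    r1, d1, r2, d2 = map(math.radians, (ra1, dec1, ra2, dec2))
    h = (math.sin((d2 - d1) / 2) ** 2
         + math.cos(d1) * math.cos(d2) * math.sin((r2 - r1) / 2) ** 2)
    return math.degrees(2 * math.asin(math.sqrt(h)))

def cross_match(cat_a, cat_b, radius_deg):
    """Return (i, j) index pairs of sources within radius_deg of each
    other. Catalogs are lists of (ra, dec) tuples in degrees."""
    return [(i, j)
            for i, (ra1, dec1) in enumerate(cat_a)
            for j, (ra2, dec2) in enumerate(cat_b)
            if ang_sep(ra1, dec1, ra2, dec2) <= radius_deg]
```

    The brute-force loop is O(N*M); partitioning the sky into cells so only neighboring cells are compared is what makes tools like C3 scale to massive catalogs.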

  3. Improve Problem Solving Skills through Adapting Programming Tools

    NASA Technical Reports Server (NTRS)

    Shaykhian, Linda H.; Shaykhian, Gholam Ali

    2007-01-01

    There are numerous ways for engineers and students to become better problem-solvers. The use of command line and visual programming tools can help to model a problem and formulate a solution through visualization. The analysis of problem attributes and constraints provides insight into the scope and complexity of the problem. The visualization aspect of the problem-solving approach tends to make students and engineers more systematic in their thought process and helps them catch errors before proceeding too far in the wrong direction. The problem-solver identifies and defines important terms, variables, rules, and procedures required for solving a problem. Every step required to construct the problem solution can be defined in program commands that produce intermediate output. This paper advocates improving problem-solving skills through the use of a programming tool. MatLab, created by MathWorks, is an interactive numerical computing environment and programming language. It is a matrix-based system that easily lends itself to matrix manipulation and the plotting of functions and data. MatLab can be used interactively from a command line, or sequences of commands can be saved in a file as a script or as named functions. Prior programming experience is not required to use MatLab commands. GNU Octave, part of the GNU project, is a free program for performing numerical computations that is comparable to MatLab. MatLab visual and command programming are presented here.

  4. BuddySuite: Command-Line Toolkits for Manipulating Sequences, Alignments, and Phylogenetic Trees.

    PubMed

    Bond, Stephen R; Keat, Karl E; Barreira, Sofia N; Baxevanis, Andreas D

    2017-06-01

    The ability to manipulate sequence, alignment, and phylogenetic tree files has become an increasingly important skill in the life sciences, whether to generate summary information or to prepare data for further downstream analysis. The command line can be an extremely powerful environment for interacting with these resources, but only if the user has the appropriate general-purpose tools on hand. BuddySuite is a collection of four independent yet interrelated command-line toolkits that facilitate each step in the workflow of sequence discovery, curation, alignment, and phylogenetic reconstruction. Most common sequence, alignment, and tree file formats are automatically detected and parsed, and over 100 tools have been implemented for manipulating these data. The project has been engineered to easily accommodate the addition of new tools, is written in the popular programming language Python, and is hosted on the Python Package Index and GitHub to maximize accessibility. Documentation for each BuddySuite tool, including usage examples, is available at http://tiny.cc/buddysuite_wiki. All software is open source and freely available through http://research.nhgri.nih.gov/software/BuddySuite. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution 2017. This work is written by US Government employees and is in the public domain in the US.
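
    Automatic format detection of the kind BuddySuite performs usually starts with simple content sniffing. The following toy heuristic is an assumption about how such detection can work, not BuddySuite's actual parser.

```python
def sniff_format(text):
    """Guess a sequence/alignment/tree file format from its first
    non-blank characters. A toy heuristic for illustration only."""
    head = text.lstrip()
    if head.startswith(">"):
        return "fasta"      # FASTA records begin with '>'
    if head.startswith("@"):
        return "fastq"      # FASTQ records begin with '@'
    if head.startswith("("):
        return "newick"     # Newick trees, e.g. ((A,B),C);
    if head.upper().startswith("#NEXUS"):
        return "nexus"      # NEXUS files begin with '#NEXUS'
    return "unknown"
```

    Real detectors also validate the body of the file rather than trusting the first byte, but the dispatch-on-signature idea is the same.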

  5. PDS4: Harnessing the Power of Generate and Apache Velocity

    NASA Astrophysics Data System (ADS)

    Padams, J.; Cayanan, M.; Hardman, S.

    2018-04-01

    The PDS4 Generate Tool is a Java-based command-line tool developed by the Cartography and Imaging Sciences Nodes (PDSIMG) for generating PDS4 XML labels, from Apache Velocity templates and input metadata.

  6. ASCIIGenome: a command line genome browser for console terminals.

    PubMed

    Beraldi, Dario

    2017-05-15

    Current genome browsers are designed to work via graphical user interfaces (GUIs), which, however intuitive, are not amenable to operate within console terminals and therefore are difficult to streamline or integrate in scripts. To circumvent these limitations, ASCIIGenome runs exclusively via command line interface to display genomic data directly in a terminal window. By following the same philosophy of UNIX tools, ASCIIGenome aims to be easily integrated with the command line, including batch processing of data, and therefore enables an effective exploration of the data. ASCIIGenome is written in Java. Consequently, it is a cross-platform tool and requires minimal or no installation. Some of the common genomic data types are supported and data access on remote ftp servers is possible. Speed and memory footprint are comparable to or better than those of common genome browsers. Software and source code (MIT License) are available at https://github.com/dariober/ASCIIGenome with detailed documentation at http://asciigenome.readthedocs.io . Dario.beraldi@cruk.cam.ac.uk. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press.
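
    The idea of drawing genomic data directly in a terminal can be illustrated with a few lines that render per-base read depths as a text bar chart. This is a conceptual sketch, not ASCIIGenome code.

```python
def ascii_track(depths, height=4):
    """Render per-base read depths as a text bar chart: one output row
    per height level, '*' where depth reaches that level's cutoff."""
    top = max(depths) or 1                      # avoid division by zero
    rows = []
    for level in range(height, 0, -1):          # draw top row first
        cutoff = top * level / height
        rows.append("".join("*" if d >= cutoff else " " for d in depths))
    return "\n".join(rows)
```

    Because the output is plain text, it can be piped, logged, or diffed like any other command-line result, which is exactly what makes a terminal browser scriptable.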

  7. XMGR5 users manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, K.R.; Fisher, J.E.

    1997-03-01

    ACE/gr is an XY plotting tool for workstations or X terminals using X. A few of its features are: user-defined scaling, tick marks, labels, symbols, line styles, and colors; batch mode for unattended plotting; reading and writing of parameters used during a session; polynomial regression, splines, running averages, DFT/FFT, and cross-/auto-correlation; and hardcopy support for PostScript, HP-GL, and FrameMaker .mif formats. While ACE/gr has a convenient point-and-click interface, most parameter settings and operations are available through a command line interface (found in Files/Commands).

  8. MIA - A free and open source software for gray scale medical image analysis

    PubMed Central

    2013-01-01

    Background Gray scale images make up the bulk of data in bio-medical image analysis, and hence the main focus of many image processing tasks lies in the processing of these monochrome images. With ever-improving acquisition devices, spatial and temporal image resolution increases and data sets become very large. Various image processing frameworks exist that make the development of new algorithms easy by using high-level programming languages or visual programming. These frameworks are also accessible to researchers who have little or no background in software development, because they take care of otherwise complex tasks. Specifically, the management of working memory is handled automatically, usually at the price of requiring more of it. As a result, processing large data sets with these tools becomes increasingly difficult on workstation-class computers. One alternative to using these high-level processing tools is the development of new algorithms in a language like C++, which gives the developer full control over how memory is handled, but the resulting workflow for prototyping new algorithms is rather time-intensive and also not appropriate for a researcher with little or no knowledge of software development. Another alternative is using command line tools that run image processing tasks, use the hard disk to store intermediate results, and provide automation through shell scripts. Although not as convenient as, e.g., visual programming, this approach is still accessible to researchers without a background in computer science. However, only few tools exist that provide this kind of processing interface; they are usually quite task-specific and don't provide a clear approach when one wants to shape a new command line tool from a prototype shell script.
    Results The proposed framework, MIA, provides a combination of command line tools, plug-ins, and libraries that make it possible to run image processing tasks interactively in a command shell and to prototype using the corresponding shell scripting language. Since the hard disk becomes the temporary storage, memory management is usually a non-issue in the prototyping phase. By using string-based descriptions for filters, optimizers, and the like, the transition from shell scripts to full-fledged programs implemented in C++ is also made easy. In addition, its design based on atomic plug-ins and single-task command line tools makes it easy to extend MIA, usually without the requirement to touch or recompile existing code. Conclusion In this article, we describe the general design of MIA, a general-purpose framework for gray scale image processing. We demonstrate the applicability of the software with example applications from three different research scenarios, namely motion compensation in myocardial perfusion imaging, the processing of high-resolution image data that arises in virtual anthropology, and retrospective analysis of treatment outcome in orthognathic surgery. With MIA, prototyping algorithms using shell scripts that combine small, single-task command line tools is a viable alternative to the use of high-level languages, an approach that is especially useful when large data sets need to be processed. PMID:24119305
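
    The disk-backed prototyping style described above — single-task tools chained so that each intermediate result lives on disk — can be sketched as follows. The tool names and the "image" format (a whitespace-separated list of gray values) are illustrative, not MIA's.

```python
import os
import tempfile

def step(infile, outfile, transform):
    """One single-task 'tool': read gray values from disk, transform
    them, and write the result back to disk for the next stage."""
    with open(infile) as f:
        pixels = [int(v) for v in f.read().split()]
    with open(outfile, "w") as f:
        f.write(" ".join(str(v) for v in transform(pixels)))

# A two-stage pipeline with the intermediate result on disk, as a
# shell script chaining small command line tools would do.
workdir = tempfile.mkdtemp()
raw, clamped, scaled = (os.path.join(workdir, n)
                        for n in ("raw.txt", "clamped.txt", "scaled.txt"))
with open(raw, "w") as f:
    f.write("10 300 30")
step(raw, clamped, lambda px: [min(v, 255) for v in px])   # clamp to 8-bit
step(clamped, scaled, lambda px: [v // 2 for v in px])     # halve intensity
with open(scaled) as f:
    print(f.read())    # prints 5 127 15
```

    Because every stage's input and output is a file, memory use stays bounded by one stage at a time, which is the point the abstract makes about processing large data sets.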

  9. MIA - A free and open source software for gray scale medical image analysis.

    PubMed

    Wollny, Gert; Kellman, Peter; Ledesma-Carbayo, María-Jesus; Skinner, Matthew M; Hublin, Jean-Jaques; Hierl, Thomas

    2013-10-11

    Gray scale images make up the bulk of data in bio-medical image analysis, and hence the main focus of many image processing tasks lies in the processing of these monochrome images. With ever-improving acquisition devices, spatial and temporal image resolution increases and data sets become very large. Various image processing frameworks exist that make the development of new algorithms easy by using high-level programming languages or visual programming. These frameworks are also accessible to researchers who have little or no background in software development, because they take care of otherwise complex tasks. Specifically, the management of working memory is handled automatically, usually at the price of requiring more of it. As a result, processing large data sets with these tools becomes increasingly difficult on workstation-class computers. One alternative to using these high-level processing tools is the development of new algorithms in a language like C++, which gives the developer full control over how memory is handled, but the resulting workflow for prototyping new algorithms is rather time-intensive and also not appropriate for a researcher with little or no knowledge of software development. Another alternative is using command line tools that run image processing tasks, use the hard disk to store intermediate results, and provide automation through shell scripts. Although not as convenient as, e.g., visual programming, this approach is still accessible to researchers without a background in computer science. However, only few tools exist that provide this kind of processing interface; they are usually quite task-specific and don't provide a clear approach when one wants to shape a new command line tool from a prototype shell script.
    The proposed framework, MIA, provides a combination of command line tools, plug-ins, and libraries that make it possible to run image processing tasks interactively in a command shell and to prototype using the corresponding shell scripting language. Since the hard disk becomes the temporary storage, memory management is usually a non-issue in the prototyping phase. By using string-based descriptions for filters, optimizers, and the like, the transition from shell scripts to full-fledged programs implemented in C++ is also made easy. In addition, its design based on atomic plug-ins and single-task command line tools makes it easy to extend MIA, usually without the requirement to touch or recompile existing code. In this article, we describe the general design of MIA, a general-purpose framework for gray scale image processing. We demonstrate the applicability of the software with example applications from three different research scenarios, namely motion compensation in myocardial perfusion imaging, the processing of high-resolution image data that arises in virtual anthropology, and retrospective analysis of treatment outcome in orthognathic surgery. With MIA, prototyping algorithms using shell scripts that combine small, single-task command line tools is a viable alternative to the use of high-level languages, an approach that is especially useful when large data sets need to be processed.

  10. ADOMA: A Command Line Tool to Modify ClustalW Multiple Alignment Output.

    PubMed

    Zaal, Dionne; Nota, Benjamin

    2016-01-01

    We present ADOMA, a command line tool that produces alternative outputs from ClustalW multiple alignments of nucleotide or protein sequences. ADOMA can simplify the output of alignments by showing only the different residues between sequences, which is often desirable when only small differences such as single nucleotide polymorphisms are present (e.g., between different alleles). Another feature of ADOMA is that it can enhance the ClustalW output by coloring the residues in the alignment. This tool is easily integrated into automated Linux pipelines for next-generation sequencing data analysis, and may be useful for researchers in a broad range of scientific disciplines including evolutionary biology and biomedical sciences. The source code is freely available at https://sourceforge.net/projects/adoma/. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
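
    The difference-only display ADOMA provides can be illustrated with a short function that masks identical residues between two aligned sequences. A minimal sketch, not ADOMA's implementation.

```python
def diff_view(ref, other, match_char="."):
    """Show only the residues of `other` that differ from `ref`,
    replacing identical positions with a placeholder character."""
    return "".join(o if o != r else match_char
                   for r, o in zip(ref, other))

# Two aligned sequences differing at two positions (e.g. SNPs):
print(diff_view("ACGTACGT", "ACCTACGA"))   # prints ..C....A
```

    With long alignments this kind of masking makes the handful of polymorphic positions stand out immediately, which is the use case the abstract describes.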

  11. Fast Multiscale Algorithms for Information Representation and Fusion

    DTIC Science & Technology

    2011-07-01

    We are also developing convenient command-line invocation tools in addition to the previously developed APIs. Various real-world data sets...This knowledge is important in geolocation applications where knowing whether a received signal is line-of-sight or not is necessary for the

  12. fluff: exploratory analysis and visualization of high-throughput sequencing data

    PubMed Central

    Georgiou, Georgios

    2016-01-01

    Summary. In this article we describe fluff, a software package that allows for simple exploration, clustering and visualization of high-throughput sequencing data mapped to a reference genome. The package contains three command-line tools to generate publication-quality figures in an uncomplicated manner using sensible defaults. Genome-wide data can be aggregated, clustered and visualized in a heatmap, according to different clustering methods. This includes a predefined setting to identify dynamic clusters between different conditions or developmental stages. Alternatively, clustered data can be visualized in a bandplot. Finally, fluff includes a tool to generate genomic profiles. As command-line tools, the fluff programs can easily be integrated into standard analysis pipelines. The installation is straightforward and documentation is available at http://fluff.readthedocs.org. Availability. fluff is implemented in Python and runs on Linux. The source code is freely available for download at https://github.com/simonvh/fluff. PMID:27547532

  13. The ChIP-Seq tools and web server: a resource for analyzing ChIP-seq and other types of genomic data.

    PubMed

    Ambrosini, Giovanna; Dreos, René; Kumar, Sunil; Bucher, Philipp

    2016-11-18

    ChIP-seq and related high-throughput chromatin profiling assays generate ever-increasing volumes of highly valuable biological data. To make sense of them, biologists need versatile, efficient and user-friendly tools for access, visualization and integrative analysis of such data. Here we present the ChIP-Seq command line tools and web server, implementing basic algorithms for ChIP-seq data analysis starting with a read alignment file. The tools are optimized for memory efficiency and speed, thus allowing the processing of large data volumes on inexpensive hardware. The web interface provides access to a large database of public data. The ChIP-Seq tools have a modular and interoperable design in that the output from one application can serve as input to another one. Complex and innovative tasks can thus be achieved by running several tools in a cascade. The various ChIP-Seq command line tools and web services either complement or compare favorably to related bioinformatics resources in terms of computational efficiency, ease of access to public data and interoperability with other web-based tools. The ChIP-Seq server is accessible at http://ccg.vital-it.ch/chipseq/.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schaumberg, Andrew

    The Omics Tools package provides several small utility tools for work in genomics. This single portable package, the “omics.jar” file, is a toolbox that works in any Java-based environment, including PCs, Macs, and supercomputers. The number of tools is expected to grow. One tool (called cmsearch.hadoop or cmsearch.local) calls the external cmsearch program to predict non-coding RNA in a genome. The cmsearch program is part of the third-party Infernal package. Omics Tools does not contain Infernal; Infernal may be installed separately. The cmsearch.hadoop subtool requires Apache Hadoop and runs on a supercomputer, whereas cmsearch.local does not and runs on a server. Omics Tools does not contain Hadoop; Hadoop may be installed separately. The other tools (cmgbk, cmgff, fastats, pal, randgrp, randgrpr, randsub) do not interface with third-party tools. Omics Tools is written in the Java and Scala programming languages. Invoking the “help” command shows the currently available tools, as shown below:

      schaumbe@gpint06:~/proj/omics$ java -jar omics.jar help
      Known commands are:
        cmgbk           : compare cmsearch and GenBank Infernal hits
        cmgff           : compare hits among two GFF (version 3) files
        cmsearch.hadoop : find Infernal hits in a genome, on your supercomputer
        cmsearch.local  : find Infernal hits in a genome, on your workstation
        fastats         : FASTA stats, e.g. # bases, GC content
        pal             : stem-loop motif detection by palindromic sequence search (code stub)
        randgrp         : random subsample without replacement, of groups
        randgrpr        : random subsample with replacement, of groups (fast)
        randsub         : random subsample without replacement, of file lines
      For more help regarding a particular command, use:
        java -jar omics.jar command help
      Usage: java -jar omics.jar command args

  15. WASP Model Tutorials

    EPA Pesticide Factsheets

    Contains WASP tutorial videos. WASP Command Line, WASP, Modeling Dissolved Oxygen, Building a Steady State Example, Modeling Nutrients in Rivers, Nutrient Cycles, Interpreting Water Quality Models, Linking with LSPC, WRDB, BASINS, WCS, WASP Network Tool

  16. Boutiques: a flexible framework to integrate command-line applications in computing platforms.

    PubMed

    Glatard, Tristan; Kiar, Gregory; Aumentado-Armstrong, Tristan; Beck, Natacha; Bellec, Pierre; Bernard, Rémi; Bonnet, Axel; Brown, Shawn T; Camarasu-Pop, Sorina; Cervenansky, Frédéric; Das, Samir; Ferreira da Silva, Rafael; Flandin, Guillaume; Girard, Pascal; Gorgolewski, Krzysztof J; Guttmann, Charles R G; Hayot-Sasson, Valérie; Quirion, Pierre-Olivier; Rioux, Pierre; Rousseau, Marc-Étienne; Evans, Alan C

    2018-05-01

    We present Boutiques, a system to automatically publish, integrate, and execute command-line applications across computational platforms. Boutiques applications are installed through software containers described in a rich and flexible JSON language. A set of core tools facilitates the construction, validation, import, execution, and publishing of applications. Boutiques is currently supported by several distinct virtual research platforms, and it has been used to describe dozens of applications in the neuroinformatics domain. We expect Boutiques to improve the quality of application integration in computational platforms, to reduce redundancy of effort, to contribute to computational reproducibility, and to foster Open Science.
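
    The descriptor-driven approach can be sketched as a JSON template whose placeholders are substituted with concrete values to produce a runnable command line. The field names below echo the Boutiques style but are a simplified, illustrative schema, not the real one; the tool and file names are invented.

```python
import json

# A schematic tool descriptor (simplified, illustrative schema).
descriptor = json.loads("""
{
  "name": "example-smoothing-tool",
  "command-line": "smooth [INPUT] [FWHM] [OUTPUT]",
  "inputs": [
    {"value-key": "[INPUT]",  "type": "File"},
    {"value-key": "[FWHM]",   "type": "Number"},
    {"value-key": "[OUTPUT]", "type": "String"}
  ]
}
""")

def render(desc, values):
    """Substitute concrete values into the command-line template."""
    cmd = desc["command-line"]
    for inp in desc["inputs"]:
        cmd = cmd.replace(inp["value-key"], str(values[inp["value-key"]]))
    return cmd

print(render(descriptor,
             {"[INPUT]": "scan.nii", "[FWHM]": 6, "[OUTPUT]": "out.nii"}))
# prints: smooth scan.nii 6 out.nii
```

    Because the descriptor is plain JSON, any platform can validate it, generate a user interface from it, and build the invocation inside a container, which is what makes such applications portable across systems.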

  17. Simple proteomics data analysis in the object-oriented PowerShell.

    PubMed

    Mohammed, Yassene; Palmblad, Magnus

    2013-01-01

    Scripting languages such as Perl and Python are appreciated for solving simple, everyday tasks in bioinformatics. A more recent, object-oriented command shell and scripting language, Windows PowerShell, has many attractive features: an object-oriented interactive command line, fluent navigation and manipulation of XML files, ability to consume Web services from the command line, consistent syntax and grammar, rich regular expressions, and advanced output formatting. The key difference between classical command shells and scripting languages, such as bash, and object-oriented ones, such as PowerShell, is that in the latter the result of a command is a structured object with inherited properties and methods rather than a simple stream of characters. Conveniently, PowerShell is included in all new releases of Microsoft Windows and therefore already installed on most computers in classrooms and teaching labs. In this chapter we demonstrate how PowerShell in particular allows easy interaction with mass spectrometry data in XML formats, connection to Web services for tools such as BLAST, and presentation of results as formatted text or graphics. These features make PowerShell much more than "yet another scripting language."
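
    The key distinction the chapter draws — object pipelines versus character streams — can be mimicked in Python by passing structured records instead of parsing text. The process data below is invented for illustration.

```python
# In a classical shell, filtering processes by memory means parsing
# columns of text output. In an object pipeline, each record carries
# named properties, so filtering and sorting need no text parsing.
processes = [
    {"name": "notepad", "memory_mb": 12},
    {"name": "chrome",  "memory_mb": 950},
    {"name": "pwsh",    "memory_mb": 85},
]

# Object style: select and sort on properties directly.
big = sorted((p for p in processes if p["memory_mb"] > 50),
             key=lambda p: p["memory_mb"], reverse=True)
print([p["name"] for p in big])   # prints ['chrome', 'pwsh']
```

    This is the same shape as a PowerShell pipeline such as filtering processes by a property and sorting on it, where each stage receives objects with properties rather than lines of text.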

  18. Boutiques: a flexible framework to integrate command-line applications in computing platforms

    PubMed Central

    Glatard, Tristan; Kiar, Gregory; Aumentado-Armstrong, Tristan; Beck, Natacha; Bellec, Pierre; Bernard, Rémi; Bonnet, Axel; Brown, Shawn T; Camarasu-Pop, Sorina; Cervenansky, Frédéric; Das, Samir; Ferreira da Silva, Rafael; Flandin, Guillaume; Girard, Pascal; Gorgolewski, Krzysztof J; Guttmann, Charles R G; Hayot-Sasson, Valérie; Quirion, Pierre-Olivier; Rioux, Pierre; Rousseau, Marc-Étienne; Evans, Alan C

    2018-01-01

    We present Boutiques, a system to automatically publish, integrate, and execute command-line applications across computational platforms. Boutiques applications are installed through software containers described in a rich and flexible JSON language. A set of core tools facilitates the construction, validation, import, execution, and publishing of applications. Boutiques is currently supported by several distinct virtual research platforms, and it has been used to describe dozens of applications in the neuroinformatics domain. We expect Boutiques to improve the quality of application integration in computational platforms, to reduce redundancy of effort, to contribute to computational reproducibility, and to foster Open Science. PMID:29718199

  19. Army Networks: Opportunities Exist to Better Utilize Results from Network Integration Evaluations

    DTIC Science & Technology

    2013-08-01

    monitor operations; a touch screen-based mission command planning tool; and an antenna mast. The Army will field only one of these systems in capability...Office JTRS Joint Tactical Radio System NIE Network Integration Evaluation OSD Office of the Secretary of Defense SUE System under Evaluation...command systems. A robust transport layer capable of delivering voice, data, imagery, and video to the tactical edge (i.e., the forward battle lines

  20. CADDIS Volume 4. Data Analysis: Download Software

    EPA Pesticide Factsheets

    Overview of the data analysis tools available for download on CADDIS. Provides instructions for downloading and installing CADStat, access to a Microsoft Excel macro for computing SSDs, and a brief overview of command-line use of R, a statistical software package.

  1. FreeSASA: An open source C library for solvent accessible surface area calculations.

    PubMed

    Mitternacht, Simon

    2016-01-01

    Calculating solvent accessible surface areas (SASA) is a run-of-the-mill calculation in structural biology. Although there are many programs available for this calculation, there are no free-standing, open-source tools designed for easy tool-chain integration. FreeSASA is an open source C library for SASA calculations that provides both command-line and Python interfaces in addition to its C API. The library implements both Lee and Richards' and Shrake and Rupley's approximations, and is highly configurable to allow the user to control molecular parameters, accuracy and output granularity. It only depends on standard C libraries and should therefore be easy to compile and install on any platform. The library is well-documented, stable and efficient. The command-line interface can easily replace closed source legacy programs, with comparable or better accuracy and speed, and with some added functionality.
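
    The Shrake and Rupley approximation that FreeSASA implements places test points on each atom's probe-expanded sphere and counts the points not buried inside neighboring atoms. A minimal Python sketch of the algorithm, not FreeSASA's optimized C implementation:

```python
import math

def sphere_points(n):
    """Roughly uniform points on a unit sphere (golden-spiral layout)."""
    pts = []
    phi = math.pi * (3 - math.sqrt(5))
    for i in range(n):
        z = 1 - 2 * (i + 0.5) / n
        r = math.sqrt(1 - z * z)
        pts.append((math.cos(phi * i) * r, math.sin(phi * i) * r, z))
    return pts

def sasa(atoms, probe=1.4, n_points=200):
    """Shrake-Rupley sketch: atoms are (x, y, z, radius) tuples; for
    each atom, count test points on its probe-expanded sphere that lie
    outside every other atom's expanded sphere."""
    pts = sphere_points(n_points)
    areas = []
    for i, (x, y, z, r) in enumerate(atoms):
        big_r = r + probe
        free = 0
        for dx, dy, dz in pts:
            px, py, pz = x + big_r * dx, y + big_r * dy, z + big_r * dz
            buried = any(
                (px - x2) ** 2 + (py - y2) ** 2 + (pz - z2) ** 2
                < (r2 + probe) ** 2
                for j, (x2, y2, z2, r2) in enumerate(atoms) if j != i)
            if not buried:
                free += 1
        areas.append(4 * math.pi * big_r ** 2 * free / n_points)
    return areas

# An isolated atom is fully exposed: its SASA equals the whole
# expanded-sphere area, 4*pi*(1.7 + 1.4)^2.
lone = sasa([(0.0, 0.0, 0.0, 1.7)])[0]
```

    Accuracy is controlled by the number of test points, which is the kind of accuracy/granularity knob the abstract says the library exposes.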

  2. VAGUE: a graphical user interface for the Velvet assembler.

    PubMed

    Powell, David R; Seemann, Torsten

    2013-01-15

    Velvet is a popular open-source de novo genome assembly software tool, which is run from the Unix command line. Most of the problems experienced by new users of Velvet revolve around constructing syntactically and semantically correct command lines, getting input files into acceptable formats and assessing the output. Here, we present Velvet Assembler Graphical User Environment (VAGUE), a multi-platform graphical front-end for Velvet. VAGUE aims to make sequence assembly accessible to a wider audience and to facilitate better usage amongst existing users of Velvet. VAGUE is implemented in JRuby and targets the Java Virtual Machine. It is available under an open-source GPLv2 licence from http://www.vicbioinformatics.com/. torsten.seemann@monash.edu.

  3. C3: A Command-line Catalogue Cross-matching tool for modern astrophysical survey data

    NASA Astrophysics Data System (ADS)

    Riccio, Giuseppe; Brescia, Massimo; Cavuoti, Stefano; Mercurio, Amata; di Giorgio, Anna Maria; Molinari, Sergio

    2017-06-01

    In the current data-driven science era, data analysis techniques must evolve quickly to cope with data whose dimensions have grown to the petabyte scale. In particular, since modern astrophysics is based on multi-wavelength data organized into large catalogues, it is crucial that astronomical catalogue cross-matching methods, whose cost depends strongly on catalogue size, ensure efficiency, reliability and scalability. Furthermore, multi-band data are archived and reduced in different ways, so the resulting catalogues may differ from each other in format, resolution, data structure, etc., thus requiring the highest generality of cross-matching features. We present C3 (Command-line Catalogue Cross-match), a multi-platform application designed to efficiently cross-match massive catalogues from modern surveys. Conceived as a stand-alone command-line process or as a module within a generic data reduction/analysis pipeline, it provides maximum flexibility in terms of portability, configuration, coordinates and cross-matching types, ensuring high performance by using a multi-core parallel processing paradigm and a sky-partitioning algorithm.

  4. Improvement of the efficient referencing and sample positioning system for micro focused synchrotron X-ray techniques

    NASA Astrophysics Data System (ADS)

    Spangenberg, T.; Göttlicher, J.; Steininger, R.

    2016-05-01

    An efficient referencing and sample positioning system is a basic tool for a micro-focus beamline at a synchrotron. The command-line-based system introduced seven years ago has been upgraded at the SUL-X beamline at ANKA [1]. A new combination of current server-client techniques offers direct control and makes this frequently used tool easier for inexperienced users to handle.

  5. Rapid Diagnostics of Onboard Sequences

    NASA Technical Reports Server (NTRS)

    Starbird, Thomas W.; Morris, John R.; Shams, Khawaja S.; Maimone, Mark W.

    2012-01-01

    Keeping track of sequences onboard a spacecraft is challenging. When reviewing Event Verification Records (EVRs) of sequence executions on the Mars Exploration Rover (MER), operators often found themselves wondering which version of a named sequence the EVR corresponded to. The lack of this information drastically impacts the operators' diagnostic capabilities as well as their situational awareness with respect to the commands the spacecraft has executed, since the EVRs do not provide argument values or explanatory comments. Having this information immediately available can be instrumental in diagnosing critical events and can significantly enhance the overall safety of the spacecraft. This software provides auditing capability that can eliminate that uncertainty while diagnosing critical conditions. Furthermore, the RESTful interface provides a simple way for sequencing tools to automatically retrieve binary compiled sequence SCMFs (Space Command Message Files) on demand. It also enables developers to change the underlying database, while maintaining the same interface to the existing applications. The logging capabilities are also beneficial to operators when they are trying to recall how they solved a similar problem many days ago: this software enables automatic recovery of SCMF and RML (Robot Markup Language) sequence files directly from the command EVRs, eliminating the need for people to find and validate the corresponding sequences. To address the lack of auditing capability for sequences onboard a spacecraft during earlier missions, extensive logging support was added on the Mars Science Laboratory (MSL) sequencing server. This server is responsible for generating all MSL binary SCMFs from RML input sequences. The sequencing server logs every SCMF it generates into a MySQL database, as well as the high-level RML file and dictionary name inputs used to create the SCMF.
First, the SCMF is indexed by a hash value that is automatically included in all command EVRs by the onboard flight software. Second, both the binary SCMF result and the RML input file can be retrieved simply by specifying the hash to a RESTful web interface. This interface enables command line tools as well as large sophisticated programs to download the SCMFs and RMLs on demand from the database, enabling a vast array of tools to be built on top of it. One such command line tool can retrieve and display RML files, or annotate a list of EVRs by interleaving them with the original sequence commands. This software has been integrated with the MSL sequencing pipeline, where it serves sequences useful in diagnostics, debugging, and situational awareness throughout the mission.
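    The hash-keyed retrieval and EVR annotation described above can be sketched in a few lines of Python. The endpoint URL, path scheme, and EVR field names below are hypothetical illustrations, not the actual MSL interface:

```python
import urllib.request

# Hypothetical endpoint: the abstract describes a RESTful interface keyed by
# the hash embedded in command EVRs; this URL scheme is illustrative only.
BASE_URL = "https://sequencing-server.example/api/sequences"

def fetch_sequence(evr_hash: str, artifact: str = "scmf") -> bytes:
    """Retrieve the binary SCMF (or the source RML) for a given EVR hash."""
    url = f"{BASE_URL}/{evr_hash}/{artifact}"
    with urllib.request.urlopen(url) as resp:
        return resp.read()

def annotate_evrs(evrs, rml_lookup):
    """Interleave EVR log lines with the original sequence commands,
    mirroring the annotation tool described above."""
    for evr in evrs:
        yield evr["text"]
        cmd = rml_lookup.get(evr.get("hash"))
        if cmd:
            yield f"    # from sequence: {cmd}"
```

    A tool built this way needs no local copy of the sequence products; it resolves each hash against the server on demand.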

  6. Effective Cyber Situation Awareness (CSA) Assessment and Training

    DTIC Science & Technology

    2013-11-01

    activity/scenario. y. Save Wireshark Captures. z. Save SNORT logs. aa. Save MySQL databases. 4. After the completion of the scenario, the reversion...line or from custom Java code. • Cisco ASA Parser: Builds normalized vendor-neutral firewall rule specifications from Cisco ASA and PIX firewall...The Service tool lets analysts build Cauldron models from either the command line or from custom Java code. Functionally, it corresponds to the

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jarocki, John Charles; Zage, David John; Fisher, Andrew N.

    LinkShop is a software tool for applying the method of Linkography to the analysis of time-sequence data. LinkShop provides command line, web, and application programming interfaces (API) for input and processing of time-sequence data, abstraction models, and ontologies. The software creates graph representations of the abstraction model, ontology, and derived linkograph. Finally, the tool allows the user to perform statistical measurements of the linkograph and refine the ontology through direct manipulation of the linkograph.

  8. Task Report for Task Authorization 1 for: Technology Demonstration of the Joint Network Defence and Management System (JNDMS) Project

    DTIC Science & Technology

    2009-01-30

    tool written in Java to support the automated creation of simulated subnets. It can be run giving it a subnet, the number of hosts to create, the...network and can also be used to create subnets with specific profiles. Subnet Creator command line: > java –jar SubnetCreator.jar –j [path to client...command: > java –jar jss_client.jar com.mdacorporation.jndms.JSS.Client.JSSBatchClient [file] 5. Software: This is the output file that will store the

  9. WinHPC System Programming | High-Performance Computing | NREL

    Science.gov Websites

    Programming WinHPC System Programming Learn how to build and run an MPI (Message Passing Interface) application... where the MPI header (mpi.h) and library (msmpi.lib) are. To build from the command line, run... Start > Intel Software Development Tools > Intel C++ Compiler Professional... > C++ Build Environment for applications running

  10. Requirements Document for Development of a Livermore Tomography Tools Interface

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seetho, I. M.

    In this document, we outline an exercise performed at LLNL to evaluate the user interface deficits of an LLNL-developed CT reconstruction software package, Livermore Tomography Tools (LTT). We observe that a difficult-to-use command line interface and the lack of support functions compound to generate a bottleneck in the CT reconstruction process when input parameters to key functions are not well known. Through the exercise of systems engineering best practices, we generate key performance parameters for an LTT interface refresh, and specify a combination of back-end (“test-mode” functions) and front-end (graphical user interface visualization and command scripting tools) solutions to LTT’s poor user interface that aim to mitigate issues and lower costs associated with CT reconstruction using LTT. Key functional and non-functional requirements and risk mitigation strategies for the solution are outlined and discussed.

  11. Interfaces and Integration of Medical Image Analysis Frameworks: Challenges and Opportunities.

    PubMed

    Covington, Kelsie; McCreedy, Evan S; Chen, Min; Carass, Aaron; Aucoin, Nicole; Landman, Bennett A

    2010-05-25

    Clinical research with medical imaging typically involves large-scale data analysis with interdependent software toolsets tied together in a processing workflow. Numerous, complementary platforms are available, but these are not readily compatible in terms of workflows or data formats. Both image scientists and clinical investigators could benefit from using the framework that is the most natural fit for the specific problem at hand, but pragmatic choices often dictate that a compromise platform is used for collaboration. Manual merging of platforms through carefully tuned scripts has been effective, but is exceptionally time consuming and not feasible for large-scale integration efforts. Hence, the benefits of innovation are constrained by platform dependence. Removing this constraint via integration of algorithms from one framework into another is the focus of this work. We propose and demonstrate a light-weight interface system to expose parameters across platforms and provide seamless integration. In this initial effort, we focus on four platforms: Medical Image Analysis and Visualization (MIPAV), Java Image Science Toolkit (JIST), command line tools, and 3D Slicer. We explore three case studies: (1) providing a system for MIPAV to expose internal algorithms and utilize these algorithms within JIST, (2) exposing JIST modules through a self-documenting command line interface for inclusion in scripting environments, and (3) detecting and using JIST modules in 3D Slicer. We review the challenges and opportunities for light-weight software integration both within a development language (e.g., Java in MIPAV and JIST) and across languages (e.g., C/C++ in 3D Slicer and shell in command line tools).
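    Case study (2), exposing a module through a self-documenting command line interface, can be sketched by generating a parser from a module's declared parameters. The parameter declarations and module name below are illustrative placeholders, not the actual JIST API:

```python
import argparse

# Hypothetical module description: JIST-style modules declare typed
# parameters; these names are illustrative, not the real JIST declarations.
MODULE_PARAMS = [
    {"name": "input_volume", "type": str, "help": "Path to input image volume"},
    {"name": "iterations", "type": int, "default": 10, "help": "Solver iterations"},
    {"name": "smoothing", "type": float, "default": 0.5, "help": "Smoothing weight"},
]

def build_cli(module_name: str, params) -> argparse.ArgumentParser:
    """Generate a self-documenting command line interface from a module's
    declared parameters, so scripting environments can invoke it directly."""
    parser = argparse.ArgumentParser(prog=module_name)
    for p in params:
        flag = "--" + p["name"].replace("_", "-")
        parser.add_argument(flag, type=p["type"],
                            default=p.get("default"), help=p["help"])
    return parser

# Example invocation as a script would see it:
parser = build_cli("jist-module", MODULE_PARAMS)
args = parser.parse_args(["--input-volume", "brain.nii", "--iterations", "20"])
```

    Because the help text is generated from the same declarations the module itself uses, the CLI stays in sync with the module's parameters, which is the point of the "self-documenting" design.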

  12. LC-IMS-MS Feature Finder: detecting multidimensional liquid chromatography, ion mobility and mass spectrometry features in complex datasets.

    PubMed

    Crowell, Kevin L; Slysz, Gordon W; Baker, Erin S; LaMarche, Brian L; Monroe, Matthew E; Ibrahim, Yehia M; Payne, Samuel H; Anderson, Gordon A; Smith, Richard D

    2013-11-01

    The addition of ion mobility spectrometry to liquid chromatography-mass spectrometry experiments requires new, or updated, software tools to facilitate data processing. We introduce a command line software application, LC-IMS-MS Feature Finder, that searches for molecular ion signatures in multidimensional liquid chromatography-ion mobility spectrometry-mass spectrometry (LC-IMS-MS) data by clustering deisotoped peaks with similar monoisotopic mass, charge state, LC elution time and ion mobility drift time values. The software application includes an algorithm for detecting and quantifying co-eluting chemical species, including species that exist in multiple conformations that may have been separated in the IMS dimension. LC-IMS-MS Feature Finder is available as a command-line tool for download at http://omics.pnl.gov/software/LC-IMS-MS_Feature_Finder.php. The Microsoft .NET Framework 4.0 is required to run the software. All other dependencies are included with the software package. Usage of this software is limited to non-profit research (see README). Contact: rds@pnnl.gov. Supplementary data are available at Bioinformatics online.
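    The clustering idea described above — grouping deisotoped peaks that agree in monoisotopic mass, charge state, elution time, and drift time — can be sketched with a greedy pass. The tolerance values here are illustrative placeholders; the actual algorithm and thresholds in LC-IMS-MS Feature Finder may differ:

```python
# Illustrative tolerances for the four clustering dimensions (placeholder values).
TOL = {"mass": 0.01, "elution": 0.2, "drift": 0.5}

def same_feature(a, b):
    """Two deisotoped peaks belong to one feature if charge matches and the
    other three dimensions agree within tolerance."""
    return (a["charge"] == b["charge"]
            and abs(a["mass"] - b["mass"]) <= TOL["mass"]
            and abs(a["elution"] - b["elution"]) <= TOL["elution"]
            and abs(a["drift"] - b["drift"]) <= TOL["drift"])

def cluster_peaks(peaks):
    """Greedy clustering: assign each peak to the first feature whose seed
    peak it matches, otherwise start a new feature."""
    features = []
    for peak in peaks:
        for feat in features:
            if same_feature(feat[0], peak):
                feat.append(peak)
                break
        else:
            features.append([peak])
    return features
```

    Species in multiple IMS-separated conformations would appear here as distinct features sharing mass, charge, and elution time but differing in drift time.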

  13. Chapter 21: Programmatic Interfaces - STILTS

    NASA Astrophysics Data System (ADS)

    Fitzpatrick, M. J.

    STILTS is the Starlink Tables Infrastructure Library Tool Set developed by Mark Taylor of the former Starlink Project. STILTS is a command-line tool (see the NVOSS_HOME/bin/stilts command) providing access to the same functionality driving the TOPCAT application and can be run using either the STILTS-specific jar file, or the more general TOPCAT jar file (both are available in the NVOSS_HOME/java/lib directory and are included in the default software environment classpath). The heart of both STILTS and TOPCAT is the STIL Java library. STIL is designed to efficiently handle the input, output and processing of very large tabular datasets and the STILTS task interface makes it an ideal tool for the scripting environment. Multiple formats are supported (including FITS Binary Tables, VOTable, CSV, SQL databases and ASCII, amongst others) and while some tools will generically handle all supported formats, others are specific to the VOTable format. Converting a VOTable to a more script-friendly format is the first thing most users will encounter, but there are many other useful tools as well.

  14. 14 CFR 135.299 - Pilot in command: Line checks: Routes and airports.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 3 2014-01-01 2014-01-01 false Pilot in command: Line checks: Routes and... Crewmember Testing Requirements § 135.299 Pilot in command: Line checks: Routes and airports. (a) No certificate holder may use a pilot, nor may any person serve, as a pilot in command of a flight unless, since...

  15. 14 CFR 135.299 - Pilot in command: Line checks: Routes and airports.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 3 2012-01-01 2012-01-01 false Pilot in command: Line checks: Routes and... Crewmember Testing Requirements § 135.299 Pilot in command: Line checks: Routes and airports. (a) No certificate holder may use a pilot, nor may any person serve, as a pilot in command of a flight unless, since...

  16. 14 CFR 135.299 - Pilot in command: Line checks: Routes and airports.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 3 2010-01-01 2010-01-01 false Pilot in command: Line checks: Routes and... Crewmember Testing Requirements § 135.299 Pilot in command: Line checks: Routes and airports. (a) No certificate holder may use a pilot, nor may any person serve, as a pilot in command of a flight unless, since...

  17. 14 CFR 135.299 - Pilot in command: Line checks: Routes and airports.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 3 2013-01-01 2013-01-01 false Pilot in command: Line checks: Routes and... Crewmember Testing Requirements § 135.299 Pilot in command: Line checks: Routes and airports. (a) No certificate holder may use a pilot, nor may any person serve, as a pilot in command of a flight unless, since...

  18. 14 CFR 135.299 - Pilot in command: Line checks: Routes and airports.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 3 2011-01-01 2011-01-01 false Pilot in command: Line checks: Routes and... Crewmember Testing Requirements § 135.299 Pilot in command: Line checks: Routes and airports. (a) No certificate holder may use a pilot, nor may any person serve, as a pilot in command of a flight unless, since...

  19. BatTool: an R package with GUI for assessing the effect of White-nose syndrome and other take events on Myotis spp. of bats

    PubMed Central

    2014-01-01

    Background: Myotis species of bats such as the Indiana Bat and Little Brown Bat are facing population declines because of White-nose syndrome (WNS). These species also face threats from anthropogenic activities such as wind energy development. Population models may be used to provide insights into threats facing these species. We developed a population model, BatTool, as an R package to help decision makers and natural resource managers examine factors influencing the dynamics of these species. The R package includes two components: 1) a deterministic and stochastic model that are accessible from the command line and 2) a graphical user interface (GUI). Results: BatTool is an R package allowing natural resource managers and decision makers to understand Myotis spp. population dynamics. Through the use of a GUI, the model allows users to understand how WNS and other take events may affect the population. The results are saved both graphically and as data files. Additionally, R-savvy users may access the population functions through the command line and reuse the code as part of future research. This R package could also be used as part of a population dynamics or wildlife management course. Conclusions: BatTool provides access to a Myotis spp. population model. This tool can help natural resource managers and decision makers with the Endangered Species Act deliberations for these species and with issuing take permits as part of regulatory decision making. The tool is available online as part of this publication. PMID:24955110

  20. BatTool: an R package with GUI for assessing the effect of White-nose syndrome and other take events on Myotis spp. of bats

    USGS Publications Warehouse

    Erickson, Richard A.; Thogmartin, Wayne E.; Szymanski, Jennifer A.

    2014-01-01

    Background: Myotis species of bats such as the Indiana Bat and Little Brown Bat are facing population declines because of White-nose syndrome (WNS). These species also face threats from anthropogenic activities such as wind energy development. Population models may be used to provide insights into threats facing these species. We developed a population model, BatTool, as an R package to help decision makers and natural resource managers examine factors influencing the dynamics of these species. The R package includes two components: 1) a deterministic and stochastic model that are accessible from the command line and 2) a graphical user interface (GUI). Results: BatTool is an R package allowing natural resource managers and decision makers to understand Myotis spp. population dynamics. Through the use of a GUI, the model allows users to understand how WNS and other take events may affect the population. The results are saved both graphically and as data files. Additionally, R-savvy users may access the population functions through the command line and reuse the code as part of future research. This R package could also be used as part of a population dynamics or wildlife management course. Conclusions: BatTool provides access to a Myotis spp. population model. This tool can help natural resource managers and decision makers with the Endangered Species Act deliberations for these species and with issuing take permits as part of regulatory decision making. The tool is available online as part of this publication.

  1. BatTool: an R package with GUI for assessing the effect of White-nose syndrome and other take events on Myotis spp. of bats.

    PubMed

    Erickson, Richard A; Thogmartin, Wayne E; Szymanski, Jennifer A

    2014-01-01

    Myotis species of bats such as the Indiana Bat and Little Brown Bat are facing population declines because of White-nose syndrome (WNS). These species also face threats from anthropogenic activities such as wind energy development. Population models may be used to provide insights into threats facing these species. We developed a population model, BatTool, as an R package to help decision makers and natural resource managers examine factors influencing the dynamics of these species. The R package includes two components: 1) a deterministic and stochastic model that are accessible from the command line and 2) a graphical user interface (GUI). BatTool is an R package allowing natural resource managers and decision makers to understand Myotis spp. population dynamics. Through the use of a GUI, the model allows users to understand how WNS and other take events may affect the population. The results are saved both graphically and as data files. Additionally, R-savvy users may access the population functions through the command line and reuse the code as part of future research. This R package could also be used as part of a population dynamics or wildlife management course. BatTool provides access to a Myotis spp. population model. This tool can help natural resource managers and decision makers with the Endangered Species Act deliberations for these species and with issuing take permits as part of regulatory decision making. The tool is available online as part of this publication.
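    The deterministic half of a model like BatTool's can be illustrated with a minimal two-stage (juvenile/adult) projection that applies an annual WNS-related proportional loss. BatTool itself is an R package; this Python sketch and all of its rates are placeholder illustrations, not BatTool's actual parameterization:

```python
# Toy stage-structured projection: survival, fecundity, and WNS mortality
# rates below are illustrative defaults, not fitted Myotis parameters.
def project(years, juveniles, adults,
            juv_survival=0.6, adult_survival=0.85,
            fecundity=0.45, wns_mortality=0.3):
    """Project (juvenile, adult) counts forward, returning yearly history."""
    history = [(juveniles, adults)]
    for _ in range(years):
        new_adults = juveniles * juv_survival + adults * adult_survival
        new_juveniles = new_adults * fecundity
        # Apply the WNS/take event as an extra proportional loss each year.
        new_adults *= (1 - wns_mortality)
        new_juveniles *= (1 - wns_mortality)
        juveniles, adults = new_juveniles, new_adults
        history.append((juveniles, adults))
    return history
```

    A stochastic variant, as BatTool offers alongside the deterministic one, would draw the survival and mortality rates from distributions each year instead of holding them fixed.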

  2. Interferometric correction system for a numerically controlled machine

    DOEpatents

    Burleson, Robert R.

    1978-01-01

    An interferometric correction system for a numerically controlled machine is provided to improve the positioning accuracy of a machine tool, for example, for a high-precision numerically controlled machine. A laser interferometer feedback system is used to monitor the positioning of the machine tool which is being moved by command pulses to a positioning system to position the tool. The correction system compares the commanded position as indicated by a command pulse train applied to the positioning system with the actual position of the tool as monitored by the laser interferometer. If the tool position lags the commanded position by a preselected error, additional pulses are added to the pulse train applied to the positioning system to advance the tool closer to the commanded position, thereby reducing the lag error. If the actual tool position is leading in comparison to the commanded position, pulses are deleted from the pulse train where the advance error exceeds the preselected error magnitude to correct the position error of the tool relative to the commanded position.
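    The compare-and-correct logic of the patent abstract above reduces to a simple rule: convert the interferometer reading to pulse counts, compare against the pulses commanded, and inject or delete pulses when the error exceeds the preselected threshold. The units, scale factor, and threshold below are illustrative, not values from the patent:

```python
# Simplified model of the pulse-train correction loop. A positive return
# value means pulses to add (tool lagging); negative means pulses to delete
# (tool leading). Scale factor and threshold are illustrative placeholders.
def correct_pulse_train(commanded_pulses, measured_position,
                        pulses_per_unit=1000, error_threshold=3):
    """Return the signed pulse correction for one comparison cycle."""
    measured_pulses = round(measured_position * pulses_per_unit)
    error = commanded_pulses - measured_pulses  # lag if positive, lead if negative
    if abs(error) > error_threshold:
        return error  # close the gap by adding or deleting this many pulses
    return 0  # within the preselected error band: no correction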

  3. STILTS -- Starlink Tables Infrastructure Library Tool Set

    NASA Astrophysics Data System (ADS)

    Taylor, Mark

    STILTS is a set of command-line tools for processing tabular data. It has been designed for, but is not restricted to, use on astronomical data such as source catalogues. It contains both generic (format-independent) table processing tools and tools for processing VOTable documents. Facilities offered include crossmatching, format conversion, format validation, column calculation and rearrangement, row selection, sorting, plotting, statistical calculations and metadata display. Calculations on cell data can be performed using a powerful and extensible expression language. The package is written in pure Java and based on STIL, the Starlink Tables Infrastructure Library. This gives it high portability, support for many data formats (including FITS, VOTable, text-based formats and SQL databases), extensibility and scalability. Where possible the tools are written to accept streamed data so the size of tables which can be processed is not limited by available memory. As well as the tutorial and reference information in this document, detailed on-line help is available from the tools themselves. STILTS is available under the GNU General Public Licence.

  4. A software architecture for automating operations processes

    NASA Technical Reports Server (NTRS)

    Miller, Kevin J.

    1994-01-01

    The Operations Engineering Lab (OEL) at JPL has developed a software architecture based on an integrated toolkit approach for simplifying and automating mission operations tasks. The toolkit approach is based on building adaptable, reusable graphical tools that are integrated through a combination of libraries, scripts, and system-level user interface shells. The graphical interface shells are designed to integrate and visually guide a user through the complex steps in an operations process. They provide a user with an integrated system-level picture of an overall process, defining the required inputs and possible output through interactive on-screen graphics. The OEL has developed the software for building these process-oriented graphical user interface (GUI) shells. The OEL Shell development system (OEL Shell) is an extension of JPL's Widget Creation Library (WCL). The OEL Shell system can be used to easily build user interfaces for running complex processes, applications with extensive command-line interfaces, and tool-integration tasks. The interface shells display a logical process flow using arrows and box graphics. They also allow a user to select which output products are desired and which input sources are needed, eliminating the need to know which program and its associated command-line parameters must be executed in each case. The shells have also proved valuable for use as operations training tools because of the OEL Shell hypertext help environment. The OEL toolkit approach is guided by several principles, including the use of ASCII text file interfaces with a multimission format, Perl scripts for mission-specific adaptation code, and programs that include a simple command-line interface for batch mode processing. Projects can adapt the interface shells by simple changes to the resources configuration file. This approach has allowed the development of sophisticated, automated software systems that are easy, cheap, and fast to build. 
This paper will discuss our toolkit approach and the OEL Shell interface builder in the context of a real operations process example. The paper will discuss the design and implementation of a Ulysses toolkit for generating the mission sequence of events. The Sequence of Events Generation (SEG) system provides an adaptable multimission toolkit for producing a time-ordered listing and timeline display of spacecraft commands, state changes, and required ground activities.

  5. Using AI/expert system technology to automate planning and replanning for the HST servicing missions

    NASA Technical Reports Server (NTRS)

    Bogovich, L.; Johnson, J.; Tuchman, A.; Mclean, D.; Page, B.; Kispert, A.; Burkhardt, C.; Littlefield, R.; Potter, W.

    1993-01-01

    This paper describes a knowledge-based system that has been developed to automate planning and scheduling for the Hubble Space Telescope (HST) Servicing Missions. This new system is the Servicing Mission Planning and Replanning Tool (SM/PART). SM/PART has been delivered to the HST Flight Operations Team (FOT) at Goddard Space Flight Center (GSFC) where it is being used to build integrated time lines and command plans to control the activities of the HST, Shuttle, Crew and ground systems for the next HST Servicing Mission. SM/PART reuses and extends AI/expert system technology from Interactive Experimenter Planning System (IEPS) systems to build or rebuild time lines and command plans more rapidly than was possible for previous missions where they were built manually. This capability provides an important safety factor for the HST, Shuttle and Crew in case unexpected events occur during the mission.

  6. New Tools for a New Terrain Air Force Support to Special Operations in the Cyber Environment

    DTIC Science & Technology

    2016-08-01

    54 3 PREFACE As a career targeteer for the US...capabilities of a toolkit of cyber options, from hardware on the front lines to “digital reachback” relationships with USCYBERCOM, is to leave...of career fields, including, but not limited to, cyberspace operations, intelligence, aircrew operations, command and control systems operations, and

  7. Addressing the Digital Divide in Contemporary Biology: Lessons from Teaching UNIX.

    PubMed

    Mangul, Serghei; Martin, Lana S; Hoffmann, Alexander; Pellegrini, Matteo; Eskin, Eleazar

    2017-10-01

    Life and medical science researchers increasingly rely on applications that lack a graphical interface. Scientists who are not trained in computer science face an enormous challenge analyzing high-throughput data. We present a training model for use of command-line tools when the learner has little to no prior knowledge of UNIX. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. SeqLib: a C++ API for rapid BAM manipulation, sequence alignment and sequence assembly

    PubMed Central

    Wala, Jeremiah; Beroukhim, Rameen

    2017-01-01

    We present SeqLib, a C++ API and command line tool that provides a rapid and user-friendly interface to BAM/SAM/CRAM files, global sequence alignment operations and sequence assembly. Four C libraries perform core operations in SeqLib: HTSlib for BAM access, BWA-MEM and BLAT for sequence alignment and Fermi for error correction and sequence assembly. Benchmarking indicates that SeqLib has lower CPU and memory requirements than leading C++ sequence analysis APIs. We demonstrate an example of how minimal SeqLib code can extract, error-correct and assemble reads from a CRAM file and then align with BWA-MEM. SeqLib also provides additional capabilities, including chromosome-aware interval queries and read plotting. Command line tools are available for performing integrated error correction, micro-assemblies and alignment. Availability and Implementation: SeqLib is available on Linux and OSX for the C++98 standard and later at github.com/walaj/SeqLib. SeqLib is released under the Apache2 license. Additional capabilities for BLAT alignment are available under the BLAT license. Contact: jwala@broadinstitue.org; rameen@broadinstitute.org PMID:28011768

  9. SeqLib: a C++ API for rapid BAM manipulation, sequence alignment and sequence assembly.

    PubMed

    Wala, Jeremiah; Beroukhim, Rameen

    2017-03-01

    We present SeqLib, a C++ API and command line tool that provides a rapid and user-friendly interface to BAM/SAM/CRAM files, global sequence alignment operations and sequence assembly. Four C libraries perform core operations in SeqLib: HTSlib for BAM access, BWA-MEM and BLAT for sequence alignment and Fermi for error correction and sequence assembly. Benchmarking indicates that SeqLib has lower CPU and memory requirements than leading C++ sequence analysis APIs. We demonstrate an example of how minimal SeqLib code can extract, error-correct and assemble reads from a CRAM file and then align with BWA-MEM. SeqLib also provides additional capabilities, including chromosome-aware interval queries and read plotting. Command line tools are available for performing integrated error correction, micro-assemblies and alignment. SeqLib is available on Linux and OSX for the C++98 standard and later at github.com/walaj/SeqLib. SeqLib is released under the Apache2 license. Additional capabilities for BLAT alignment are available under the BLAT license. Contact: jwala@broadinstitue.org; rameen@broadinstitute.org. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com

  10. Systematically evaluating interfaces for RNA-seq analysis from a life scientist perspective.

    PubMed

    Poplawski, Alicia; Marini, Federico; Hess, Moritz; Zeller, Tanja; Mazur, Johanna; Binder, Harald

    2016-03-01

    RNA-sequencing (RNA-seq) has become an established way for measuring gene expression in model organisms and humans. While methods development for refining the corresponding data processing and analysis pipeline is ongoing, protocols for typical steps have been proposed and are widely used. Several user interfaces have been developed for making such analysis steps accessible to life scientists without extensive knowledge of command line tools. We performed a systematic search and evaluation of such interfaces to investigate to what extent these can indeed facilitate RNA-seq data analysis. We found a total of 29 open source interfaces, and six of the more widely used interfaces were evaluated in detail. Central criteria for evaluation were ease of configuration, documentation, usability, computational demand and reporting. No interface scored best in all of these criteria, indicating that the final choice will depend on the specific perspective of users and the corresponding weighting of criteria. Considerable technical hurdles had to be overcome in our evaluation. For many users, this will diminish potential benefits compared with command line tools, leaving room for future improvement of interfaces. © The Author 2015. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  11. Application of PSAT to Load Flow Analysis with STATCOM under Load Increase Scenario and Line Contingencies

    NASA Astrophysics Data System (ADS)

    Telang, Aparna S.; Bedekar, P. P.

    2017-09-01

    Load flow analysis is the initial and essential step for any power system computation. It is required for choosing better options for power system expansion to meet ever-increasing load demand. Implementation of a Flexible AC Transmission System (FACTS) device like STATCOM, which has fast and very flexible control, in the load flow is one of the important tasks for power system researchers. This paper presents a simple and systematic approach for steady state power flow calculations with the FACTS controller static synchronous compensator (STATCOM) using command line usage of the MATLAB tool Power System Analysis Toolbox (PSAT). The complexity of MATLAB language programming increases due to incorporation of STATCOM in an existing Newton-Raphson load flow algorithm. Thus, the main contribution of this paper is to show how command line usage of the user-friendly MATLAB tool PSAT can extensively be used for quicker and wider interpretation of load flow results with STATCOM. The novelty of this paper lies in the method of applying the load increase pattern, where the active and reactive loads have been changed simultaneously at all the load buses under consideration to create stressed conditions for load flow analysis with STATCOM. The performance has been evaluated on many standard IEEE test systems, and the results for the standard IEEE-30 bus, IEEE-57 bus, and IEEE-118 bus systems are presented.

  12. Enhancing the Breadth and Efficacy of Therapeutic Vaccines for Breast Cancer

    DTIC Science & Technology

    2012-10-01

    this collaboration. BODY: Generate tumor lysates pooled from BC cell lines of each major subtype (luminal, HER2+, basal) [Task 4a] Given the...presented epitopes. The IEDB provides downloadable command-line driven tools for the prediction of input sequence binding affinity to MHC class I or class...

  13. Improving Earth Science Metadata: Modernizing ncISO

    NASA Astrophysics Data System (ADS)

    O'Brien, K.; Schweitzer, R.; Neufeld, D.; Burger, E. F.; Signell, R. P.; Arms, S. C.; Wilcox, K.

    2016-12-01

    ncISO is a package of tools developed at NOAA's National Center for Environmental Information (NCEI) that facilitates the generation of ISO 19115-2 metadata from NetCDF data sources. The tool currently exists in two iterations: a command line utility and a web-accessible service within the THREDDS Data Server (TDS). Several projects, including NOAA's Unified Access Framework (UAF), depend upon ncISO to generate the ISO-compliant metadata from their data holdings and use the resulting information to populate discovery tools such as NCEI's ESRI Geoportal and NOAA's data.noaa.gov CKAN system. In addition to generating ISO 19115-2 metadata, the tool calculates a rubric score based on how well the dataset follows the Attribute Conventions for Dataset Discovery (ACDD). The result of this rubric calculation, along with information about what has been included and what is missing, is displayed in an HTML document generated by the ncISO software package. Recently ncISO has fallen behind in terms of supporting updates to conventions such as updates to the ACDD. With the blessing of the original programmer, NOAA's UAF has been working to modernize the ncISO software base. In addition to upgrading ncISO to utilize version 1.3 of the ACDD, we have been working with partners at Unidata and IOOS to unify the tool's code base. In essence, we are merging the command line capabilities into the same software that will now be used by the TDS service, allowing easier updates when conventions such as ACDD are updated in the future. In this presentation, we will discuss the work the UAF project has done to support updated conventions within ncISO, as well as describe how the updated tool is helping to improve metadata throughout the earth and ocean sciences.
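    The rubric idea above — scoring a dataset by which ACDD-recommended attributes it carries — can be sketched as a simple attribute check. The attribute list here is a small illustrative subset of ACDD, and the scoring is a toy version, not ncISO's actual rubric:

```python
# Partial, illustrative subset of ACDD-recommended global attributes.
ACDD_ATTRS = [
    "title", "summary", "keywords", "Conventions",
    "creator_name", "license", "time_coverage_start", "time_coverage_end",
]

def rubric_score(global_attrs):
    """Score a dataset's global-attribute dict against the ACDD subset,
    reporting what is present and what is missing (as ncISO's HTML report does)."""
    present = [a for a in ACDD_ATTRS if a in global_attrs]
    missing = [a for a in ACDD_ATTRS if a not in global_attrs]
    return {"score": len(present) / len(ACDD_ATTRS),
            "present": present, "missing": missing}
```

    In ncISO itself the attributes come from the NetCDF file's global metadata and the rubric follows the full ACDD specification; this sketch only shows the shape of the calculation.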

  14. D-peaks: a visual tool to display ChIP-seq peaks along the genome.

    PubMed

    Brohée, Sylvain; Bontempi, Gianluca

    2012-01-01

    ChIP-sequencing is a method of choice for localizing the positions of protein binding sites on DNA at the whole-genome scale. Deciphering the sequencing data produced by this technique is challenging and requires rigorous interpretation using dedicated tools and adapted visualization programs. Here, we present a bioinformatics tool (D-peaks) that complements the well-known existing visualization browsers and databases with several features, including user-friendliness, high-quality output, and display of peak positions relative to genomic features. D-peaks is directly available through its web interface at http://rsat.ulb.ac.be/dpeaks/ and as a command line tool.

  15. Simplifying structure analysis projects with customizable chime-based templates*.

    PubMed

    Thompson, Scott E; Sears, Duane W

    2005-09-01

    Structure/function relationships are fundamental to understanding the properties of biological molecules, and thus it is imperative that biochemistry students learn how to analyze such relationships. Here we describe Chime-based web page templates and tutorials designed to help students develop their own strategies for exploring macromolecular three-dimensional structures like those on our course website. The templates can easily be customized for any structure of interest, and some templates include a Command Entry Line and a Message Recall Box for more refined macromolecular exploration using RasMol/Chime image modification commands. The tutorials present students with an integrated overview of the image modification capabilities of the Chime plug-in and its underlying RasMol-based command structure as accessed through the Command Entry Line. The tutorial also illustrates how RasMol/Chime command syntax addresses specific formatted structural information in a standard Protein Data Bank file. Judging by the high quality of structure-based presentations given by students who have used these templates and tutorials, it appears that these resources can help students learn to analyze complex macromolecular structures while also providing them with convenient tools for creating scientifically meaningful and visually effective molecular images to share with others. (The templates, tutorials, and our course website can be viewed at the following URLs, respectively: tutor.lscf.ucsb.edu/instdev/sears/biochemistry/presentations/demos-downloads.htm, tutor.lscf.ucsb.edu/instdev/sears/biochemistry/tutorials/pdbtutorial/frontwindow.html, and tutor.lscf.ucsb.edu/instdev/sears/biochemistry/.). Copyright © 2005 International Union of Biochemistry and Molecular Biology, Inc.

  16. The Biological Reference Repository (BioR): a rapid and flexible system for genomics annotation.

    PubMed

    Kocher, Jean-Pierre A; Quest, Daniel J; Duffy, Patrick; Meiners, Michael A; Moore, Raymond M; Rider, David; Hossain, Asif; Hart, Steven N; Dinu, Valentin

    2014-07-01

    The Biological Reference Repository (BioR) is a toolkit for annotating variants. BioR stores public and user-specific annotation sources in indexed JSON-encoded flat files (catalogs). The BioR toolkit provides the functionality to combine and retrieve annotation from these catalogs via the command-line interface. Several catalogs from commonly used annotation sources and instructions for creating user-specific catalogs are provided. Commands from the toolkit can be combined with other UNIX commands for advanced annotation processing. We also provide instructions for the development of custom annotation pipelines. The package is implemented in Java and makes use of external tools written in Java and Perl. The toolkit can be executed on Mac OS X 10.5 and above or any Linux distribution. The BioR application, quickstart, and user guide documents and many biological examples are available at http://bioinformaticstools.mayo.edu. © The Author 2014. Published by Oxford University Press.
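The catalog mechanism described above (JSON-encoded flat files queried from the command line and combined with other UNIX commands) can be pictured with a toy sketch; the file layout, positions, and field names below are invented, and none of the real BioR commands are used:

```shell
# A miniature "catalog": tab-delimited lines keyed by chromosome and
# position, with a JSON payload holding the annotation for each entry.
printf '1\t12345\t{"gene":"GENE_A","maf":0.01}\n1\t67890\t{"gene":"GENE_B","maf":0.20}\n' > toy_catalog.tsv

# "Annotate" a variant at chr1:67890 by pulling the matching payload,
# the kind of lookup that can then be piped into further UNIX commands.
awk -F'\t' '$1 == "1" && $2 == "67890" { print $3 }' toy_catalog.tsv > hit.json
cat hit.json
```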

  17. Overview of Virtual Observatory Tools

    NASA Astrophysics Data System (ADS)

    Allen, M. G.

    2009-07-01

    I provide a brief introduction and tour of selected Virtual Observatory tools to highlight some of the core functions provided by the VO, and the way that astronomers may use the tools and services for doing science. VO tools provide advanced functions for searching and using images, catalogues and spectra that have been made available in the VO. The tools may work together by providing efficient and innovative browsing and analysis of data, and I also describe how many VO services may be accessed by a scripting or command line environment. Early science usage of the VO provides important feedback on the development of the system, and I show how VO portals try to address early user comments about the navigation and use of the VO.

  18. LAMPAT and LAMPATNL User’s Manual

    DTIC Science & Technology

    2012-09-01

    nonlinearity. These tools are implemented as subroutines in the finite element software ABAQUS. This user's manual provides information on the proper...model either through the General tab of the Edit Job dialog box in Abaqus/CAE or the command line with user=(subroutine filename). Table 1...Selection of software product and subroutine. Static Analysis With Abaqus/Standard Dynamic Analysis With Abaqus/Explicit Linear, uncoupled

  19. gPhoton: Time-tagged GALEX photon events analysis tools

    NASA Astrophysics Data System (ADS)

    Million, Chase C.; Fleming, S. W.; Shiao, B.; Loyd, P.; Seibert, M.; Smith, M.

    2016-03-01

    Written in Python, gPhoton calibrates and sky-projects the ~1.1 trillion ultraviolet photon events detected by the microchannel plates on the Galaxy Evolution Explorer Spacecraft (GALEX), archives these events in a publicly accessible database at the Mikulski Archive for Space Telescopes (MAST), and provides tools for working with the database to extract scientific results, particularly over short time domains. The software includes a re-implementation of core functionality of the GALEX mission calibration pipeline to produce photon list files from raw spacecraft data as well as a suite of command line tools to generate calibrated light curves, images, and movies from the MAST database.

  20. Docker Container Manager: A Simple Toolkit for Isolated Work with Shared Computational, Storage, and Network Resources

    NASA Astrophysics Data System (ADS)

    Polyakov, S. P.; Kryukov, A. P.; Demichev, A. P.

    2018-01-01

    We present a simple set of command line interface tools called Docker Container Manager (DCM) that allow users to create and manage Docker containers with preconfigured SSH access while keeping the users isolated from each other and restricting their access to the Docker features that could potentially disrupt the work of the server. Users can access the DCM server via SSH and are automatically redirected to the DCM interface tool. From there, they can create new containers, stop, restart, pause, unpause, and remove containers and view the status of the existing containers. By default, the containers are also accessible via SSH using the same private key(s) but through different server ports. Additional publicly available ports can be mapped to the respective ports of a container, allowing some network services to be run within it. The containers are started from read-only filesystem images. Some initial images must be provided by the DCM server administrators, and after containers are configured to meet one's needs, the changes can be saved as new images. Users can see the available images and remove their own images. DCM server administrators are provided with commands to create and delete users. All commands were implemented as Python scripts. The tools make it possible to deploy and debug medium-sized distributed systems for simulation in different fields on one or several local computers.

  1. Unix Survival Guide.

    PubMed

    Stein, Lincoln D

    2015-09-03

    Most bioinformatics software has been designed to run on Linux and other Unix-like systems. Unix is different from most desktop operating systems because it makes extensive use of a text-only command-line interface. It can be a challenge to become familiar with the command line, but once a person becomes used to it, there are significant rewards, such as the ability to string a commonly used series of commands together with a script. This appendix will get you started with the command line and other Unix essentials. Copyright © 2015 John Wiley & Sons, Inc.
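The reward mentioned above, stringing a commonly used series of commands together, is easiest to see with a classic pipeline built entirely from standard tools:

```shell
# Find the most frequent word in a stream by chaining small tools
# with pipes -- each stage does exactly one job.
printf 'to be or not to be to\n' |
  tr ' ' '\n' |   # one word per line
  sort |          # group identical words together
  uniq -c |       # count each group
  sort -rn |      # most frequent first
  head -n 1 > top_word.txt
cat top_word.txt
```

Saved to a script file, a pipeline like this becomes a reusable one-word command.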

  2. ATLAS software configuration and build tool optimisation

    NASA Astrophysics Data System (ADS)

    Rybkin, Grigory; Atlas Collaboration

    2014-06-01

    ATLAS software code base is over 6 million lines organised in about 2000 packages. It makes use of some 100 external software packages, is developed by more than 400 developers and used by more than 2500 physicists from over 200 universities and laboratories on 6 continents. To meet the challenge of configuring and building this software, the Configuration Management Tool (CMT) is used. CMT expects each package to describe its build targets, build and environment setup parameters, and dependencies on other packages in a text file called requirements, and each project (group of packages) to describe its policies and dependencies on other projects in a text project file. Based on the effective set of configuration parameters read from the requirements files of dependent packages and project files, CMT commands build the packages, generate the environment for their use, or query the packages. The main focus was on build time performance, which was optimised through several approaches: reduction of the number of reads of requirements files, which are now read once per package by a CMT build command that generates cached requirements files for subsequent CMT build commands; introduction of more fine-grained build parallelism at the package task level, i.e., dependent applications and libraries are compiled in parallel; code optimisation of the CMT commands used for build; and introduction of package-level build parallelism, i.e., parallelising the build of independent packages. By default, CMT launches NUMBER-OF-PROCESSORS build commands in parallel. The other focus was on CMT command optimisation in general, which made the commands approximately 2 times faster. CMT can generate a cached requirements file for the environment setup command, which is especially useful for deployment on distributed file systems like AFS or CERN VMFS.
The use of parallelism, caching and code optimisation significantly (by several times) reduced software build time and environment setup time, increased the efficiency of multi-core computing resource utilisation, and considerably improved the software developer and user experience.
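The per-package requirements file described above can be sketched as follows; the package, dependency, and target names are invented for illustration and are not taken from the ATLAS code base:

```
package MyAnalysis

# Packages this one depends on; CMT reads their requirements
# files to assemble the effective configuration.
use AtlasPolicy     AtlasPolicy-*
use GaudiInterface  GaudiInterface-*  External

# Build targets: a library from the package sources and a
# test application that links against it.
library     MyAnalysis      *.cxx
application MyAnalysisTest  test/*.cxx

# Environment setup generated for users of the package.
set MYANALYSIS_DATA  $(MYANALYSIS_ROOT)/data
```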

  3. Interactive Spectral Analysis and Computation (ISAAC)

    NASA Technical Reports Server (NTRS)

    Lytle, D. M.

    1992-01-01

    Isaac is a task in the NSO external package for IRAF. A descendant of a FORTRAN program written to analyze data from a Fourier transform spectrometer, the current implementation has been generalized sufficiently to make it useful for general spectral analysis and other one dimensional data analysis tasks. The user interface for Isaac is implemented as an interpreted mini-language containing a powerful, programmable vector calculator. Built-in commands provide much of the functionality needed to produce accurate line lists from input spectra. These built-in functions include automated spectral line finding, least squares fitting of Voigt profiles to spectral lines including equality constraints, various filters including an optimal filter construction tool, continuum fitting, and various I/O functions.

  4. Reporting Differences Between Spacecraft Sequence Files

    NASA Technical Reports Server (NTRS)

    Khanampompan, Teerapat; Gladden, Roy E.; Fisher, Forest W.

    2010-01-01

    A suite of computer programs, called seq diff suite, reports differences between the products of other computer programs involved in the generation of sequences of commands for spacecraft. These products consist of files of several types: replacement sequence of events (RSOE), DSN keyword file [DKF (wherein DSN signifies Deep Space Network)], spacecraft activities sequence file (SASF), spacecraft sequence file (SSF), and station allocation file (SAF). These products can include line numbers, request identifications, and other pieces of information that are not relevant when generating command sequence products, though these fields can result in the appearance of many changes to the files, particularly when using the UNIX diff command to inspect file differences. The outputs of prior software tools for reporting differences between such products include differences in these non-relevant pieces of information. In contrast, seq diff suite removes the fields containing the irrelevant pieces of information before processing to extract differences, so that only relevant differences are reported. Thus, seq diff suite is especially useful for reporting changes between successive versions of the various products and, in particular, flagging differences in fields relevant to the sequence command generation and review process.
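The field-stripping idea can be sketched with standard tools; the file contents and field layout below are invented toy data, and seq diff suite itself handles several real product formats rather than this one:

```shell
# Two versions of a toy sequence product whose first field is a
# line number that changes between builds but carries no meaning.
printf '0001 CMD_A ON\n0002 CMD_B OFF\n' > v1.txt
printf '0101 CMD_A ON\n0102 CMD_B ON\n'  > v2.txt

# Plain diff flags every line, because the line numbers changed:
diff v1.txt v2.txt > raw.diff || true

# Strip the irrelevant first field, then diff again; only the
# substantive CMD_B change remains.
cut -d' ' -f2- v1.txt > v1.clean
cut -d' ' -f2- v2.txt > v2.clean
diff v1.clean v2.clean > clean.diff || true
cat clean.diff
```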

  5. MOTIFSIM 2.1: An Enhanced Software Platform for Detecting Similarity in Multiple DNA Motif Data Sets

    PubMed Central

    Huang, Chun-Hsi

    2017-01-01

    Finding binding site motifs plays an important role in bioinformatics, as it reveals the transcription factors that control gene expression. Motif-finder development has flourished in recent years, with many tools introduced to the research community. Although these tools possess exceptional features for detecting motifs, they report different results for an identical data set. Hence, using multiple tools is recommended, because motifs reported by several tools are likely to be biologically significant. However, the results from multiple tools need to be compared to obtain the common significant motifs. The MOTIFSIM web tool and command-line tool were developed for this purpose. In this work, we present several technical improvements as well as additional features to further support motif analysis in our new release, MOTIFSIM 2.1. PMID:28632401
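The comparison step that MOTIFSIM automates can be sketched for two tools' motif lists with sort and comm; the motif strings below are arbitrary examples, not output of any real finder:

```shell
# Motif lists reported by two (imaginary) finders, sorted because
# comm requires sorted input.
printf 'TGACGTCA\nGGGCGG\nTATAAA\n' | sort > tool1_motifs.txt
printf 'CAAT\nTATAAA\nTGACGTCA\n'   | sort > tool2_motifs.txt

# Lines present in both lists are the candidate common motifs.
comm -12 tool1_motifs.txt tool2_motifs.txt > common.txt
cat common.txt
```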

  6. XTCE GOVSAT Tool Suite 1.0

    NASA Technical Reports Server (NTRS)

    Rice, J. Kevin

    2013-01-01

    The XTCE GOVSAT software suite contains three tools: validation, search, and reporting. The Extensible Markup Language (XML) Telemetric and Command Exchange (XTCE) GOVSAT Tool Suite is written in Java for manipulating XTCE XML files. XTCE is a Consultative Committee for Space Data Systems (CCSDS) and Object Management Group (OMG) specification for describing the format and information in telemetry and command packet streams. These descriptions are files that are used to configure real-time telemetry and command systems for mission operations. XTCE's purpose is to exchange database information between different systems. XTCE GOVSAT consists of rules for narrowing the use of XTCE for missions. The Validation Tool is used to syntax-check GOVSAT XML files. The Search Tool is used to search the GOVSAT XML files (i.e., command and telemetry mnemonics) and view the results. Finally, the Reporting Tool is used to create command and telemetry reports. These reports can be displayed or printed for use by the operations team.

  7. cloudPEST - A python module for cloud-computing deployment of PEST, a program for parameter estimation

    USGS Publications Warehouse

    Fienen, Michael N.; Kunicki, Thomas C.; Kester, Daniel E.

    2011-01-01

    This report documents cloudPEST, a Python module with functions to facilitate deployment of the model-independent parameter estimation code PEST in a cloud-computing environment. cloudPEST makes use of low-level, freely available command-line tools that interface with the Amazon Elastic Compute Cloud (EC2™) and are unlikely to change dramatically. This report describes the preliminary setup for both Python and EC2 tools and subsequently describes the functions themselves. The code and guidelines have been tested primarily on the Windows® operating system but are extensible to Linux®.

  8. orthAgogue: an agile tool for the rapid prediction of orthology relations.

    PubMed

    Ekseth, Ole Kristian; Kuiper, Martin; Mironov, Vladimir

    2014-03-01

    The comparison of genes and gene products across species depends on high-quality tools to determine the relationships between gene or protein sequences from various species. Although some excellent applications are available and widely used, their performance leaves room for improvement. We developed orthAgogue: a multithreaded C application for high-speed estimation of homology relations in massive datasets, operated via a flexible and easy command-line interface. The orthAgogue software is distributed under the GNU license. The source code and binaries compiled for Linux are available at https://code.google.com/p/orthagogue/.

  9. Command and Control Software Development

    NASA Technical Reports Server (NTRS)

    Wallace, Michael

    2018-01-01

    The future of the National Aeronautics and Space Administration (NASA) depends on its innovation and efficiency in the coming years. With ambitious goals to reach Mars and explore the vast universe, correct steps must be taken to ensure our space program reaches its destination safely. The interns in the Exploration Systems and Operations Division at the Kennedy Space Center (KSC) have been tasked with building command line tools to ease the process of managing and testing the data being produced by the ground control systems while its recording system is not in use. While working alongside full-time engineers, we were able to create multiple programs that reduce the cost and time it takes to test the subsystems that launch rockets to outer space.

  10. NanoPack: visualizing and processing long read sequencing data.

    PubMed

    De Coster, Wouter; D'Hert, Svenn; Schultz, Darrin T; Cruts, Marc; Van Broeckhoven, Christine

    2018-03-14

    Here we describe NanoPack, a set of tools developed for visualization and processing of long read sequencing data from Oxford Nanopore Technologies and Pacific Biosciences. The NanoPack tools are written in Python3 and released under the GNU GPL3.0 License. The source code can be found at https://github.com/wdecoster/nanopack, together with links to separate scripts and their documentation. The scripts are compatible with Linux, Mac OS and the MS Windows 10 subsystem for Linux and are available as a graphical user interface, a web service at http://nanoplot.bioinf.be and command line tools. wouter.decoster@molgen.vib-ua.be. Supplementary tables and figures are available at Bioinformatics online.

  11. FAST: FAST Analysis of Sequences Toolbox

    PubMed Central

    Lawrence, Travis J.; Kauffman, Kyle T.; Amrine, Katherine C. H.; Carper, Dana L.; Lee, Raymond S.; Becich, Peter J.; Canales, Claudia J.; Ardell, David H.

    2015-01-01

    FAST (FAST Analysis of Sequences Toolbox) provides simple, powerful open source command-line tools to filter, transform, annotate and analyze biological sequence data. Modeled after the GNU (GNU's Not Unix) Textutils such as grep, cut, and tr, FAST tools such as fasgrep, fascut, and fastr make it easy to rapidly prototype expressive bioinformatic workflows in a compact and generic command vocabulary. Compact combinatorial encoding of data workflows with FAST commands can simplify the documentation and reproducibility of bioinformatic protocols, supporting better transparency in biological data science. Interface self-consistency and conformity with conventions of GNU, Matlab, Perl, BioPerl, R, and GenBank help make FAST easy and rewarding to learn. FAST automates numerical, taxonomic, and text-based sorting, selection and transformation of sequence records and alignment sites based on content, index ranges, descriptive tags, annotated features, and in-line calculated analytics, including composition and codon usage. Automated content- and feature-based extraction of sites and support for molecular population genetic statistics make FAST useful for molecular evolutionary analysis. FAST is portable, easy to install and secure thanks to the relative maturity of its Perl and BioPerl foundations, with stable releases posted to CPAN. Development as well as a publicly accessible Cookbook and Wiki are available on the FAST GitHub repository at https://github.com/tlawrence3/FAST. The default data exchange format in FAST is Multi-FastA (specifically, a restriction of BioPerl FastA format). Sanger and Illumina 1.8+ FastQ formatted files are also supported. FAST makes it easier for non-programmer biologists to interactively investigate and control biological data at the speed of thought. PMID:26042145
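The textutils analogy can be made concrete: for simple one-line-per-sequence FastA, a record-aware filter in the spirit of fasgrep can be approximated with awk (the real fasgrep is far more capable; this only illustrates the parallel with grep):

```shell
# A tiny FastA file with one sequence line per record.
cat > seqs.fa <<'EOF'
>seq1 human
ATGCATGC
>seq2 mouse
GGGCCC
EOF

# On each header line, decide whether to keep the record; the bare
# "keep" pattern then prints every line while the flag is set.
awk '/^>/ { keep = /human/ } keep' seqs.fa > human.fa
cat human.fa
```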

  12. Command Center Library Model Document. Comprehensive Approach to Reusable Defense Software (CARDS)

    DTIC Science & Technology

    1992-05-31

    system, and functionality for specifying the layout of the document. 3.7.16.1 FrameMaker FrameMaker is a Commercial Off The Shelf (COTS) component...facilitating WYSIWYG creation of formatted reports with embedded graphics. FrameMaker is an advanced publishing tool that integrates word processing...available for the component FrameMaker: • Product evaluation reports in ASCII and PostScript formats • Product assessment on line in model • Product

  13. 32 CFR 700.1058 - Command of a submarine.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 32 National Defense 5 2011-07-01 2011-07-01 false Command of a submarine. 700.1058 Section 700... Command Detail to Duty § 700.1058 Command of a submarine. The officer detailed to command a submarine shall be an officer of the line in the Navy, eligible for command at sea and qualified for command of...

  14. 32 CFR 700.1058 - Command of a submarine.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 32 National Defense 5 2010-07-01 2010-07-01 false Command of a submarine. 700.1058 Section 700... Command Detail to Duty § 700.1058 Command of a submarine. The officer detailed to command a submarine shall be an officer of the line in the Navy, eligible for command at sea and qualified for command of...

  15. 32 CFR 700.1058 - Command of a submarine.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 32 National Defense 5 2012-07-01 2012-07-01 false Command of a submarine. 700.1058 Section 700... Command Detail to Duty § 700.1058 Command of a submarine. The officer detailed to command a submarine shall be an officer of the line in the Navy, eligible for command at sea and qualified for command of...

  16. 32 CFR 700.1058 - Command of a submarine.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 32 National Defense 5 2014-07-01 2014-07-01 false Command of a submarine. 700.1058 Section 700... Command Detail to Duty § 700.1058 Command of a submarine. The officer detailed to command a submarine shall be an officer of the line in the Navy, eligible for command at sea and qualified for command of...

  17. 32 CFR 700.1058 - Command of a submarine.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 32 National Defense 5 2013-07-01 2013-07-01 false Command of a submarine. 700.1058 Section 700... Command Detail to Duty § 700.1058 Command of a submarine. The officer detailed to command a submarine shall be an officer of the line in the Navy, eligible for command at sea and qualified for command of...

  18. BpWrapper: BioPerl-based sequence and tree utilities for rapid prototyping of bioinformatics pipelines.

    PubMed

    Hernández, Yözen; Bernstein, Rocky; Pagan, Pedro; Vargas, Levy; McCaig, William; Ramrattan, Girish; Akther, Saymon; Larracuente, Amanda; Di, Lia; Vieira, Filipe G; Qiu, Wei-Gang

    2018-03-02

    Automated bioinformatics workflows are more robust, easier to maintain, and their results more reproducible when built with command-line utilities than with custom-coded scripts. Command-line utilities have the further benefit of relieving bioinformatics developers of the need to learn, or interact directly with, biological software libraries. There is, however, a lack of command-line utilities that leverage popular Open Source biological software toolkits such as BioPerl (http://bioperl.org) to make many of the well-designed, robust, and routinely used biological classes available to a wider base of end users. Designed as standard utilities for UNIX-family operating systems, BpWrapper makes the functionality of some of the most popular BioPerl modules readily accessible on the command line to novice as well as experienced bioinformatics practitioners. The initial release of BpWrapper includes four utilities with concise command-line user interfaces, bioseq, bioaln, biotree, and biopop, specialized for the manipulation of molecular sequences, sequence alignments, phylogenetic trees, and DNA polymorphisms, respectively. Over a hundred methods are currently available as command-line options, and new methods are easily incorporated. Performance of BpWrapper utilities lags that of precompiled utilities but is equivalent to that of other utilities based on BioPerl. BpWrapper has been tested on BioPerl Release 1.6, Perl versions 5.10.1 to 5.25.10, and operating systems including Apple macOS, Microsoft Windows, and GNU/Linux. Release code is available from the Comprehensive Perl Archive Network (CPAN) at https://metacpan.org/pod/Bio::BPWrapper. Source code is available on GitHub at https://github.com/bioperl/p5-bpwrapper. BpWrapper improves on existing sequence utilities by following the design principles of Unix text utilities, including a concise user interface, extensive command-line options, and standard input/output for serialized operations.
Further, dozens of novel methods for manipulation of sequences, alignments, and phylogenetic trees, unavailable in existing utilities (e.g., EMBOSS, Newick Utilities, and FAST), are provided. Bioinformaticians should find BpWrapper useful for rapid prototyping of workflows on the command-line without creating custom scripts for comparative genomics and other bioinformatics applications.

  19. BamTools: a C++ API and toolkit for analyzing and managing BAM files.

    PubMed

    Barnett, Derek W; Garrison, Erik K; Quinlan, Aaron R; Strömberg, Michael P; Marth, Gabor T

    2011-06-15

    Analysis of genomic sequencing data requires efficient, easy-to-use access to alignment results and flexible data management tools (e.g. filtering, merging, sorting, etc.). However, the enormous amount of data produced by current sequencing technologies is typically stored in compressed, binary formats that are not easily handled by the text-based parsers commonly used in bioinformatics research. We introduce a software suite for programmers and end users that facilitates research analysis and data management using BAM files. BamTools provides both the first C++ API publicly available for BAM file support as well as a command-line toolkit. BamTools was written in C++, and is supported on Linux, Mac OSX and MS Windows. Source code and documentation are freely available at http://github.org/pezmaster31/bamtools.

  20. The Commander’s Emergency Response Program: A Model for Future Implementation

    DTIC Science & Technology

    2010-04-07

    unintended Effects. The INVEST-E methodology serves as a tool for commanders and their designated practitioners to properly select projects, increasing the effectiveness of CERP funds.

  1. MicroShell Minimalist Shell for Xilinx Microprocessors

    NASA Technical Reports Server (NTRS)

    Werne, Thomas A.

    2011-01-01

    MicroShell is a lightweight shell environment for engineers and software developers working with embedded microprocessors in Xilinx FPGAs. (MicroShell has also been successfully ported to run on ARM Cortex-M1 microprocessors in Actel ProASIC3 FPGAs, but without project-integration support.) MicroShell decreases the time spent performing initial tests of field-programmable gate array (FPGA) designs, simplifies running customizable one-time-only experiments, and provides a familiar-feeling command-line interface. The program comes with a collection of useful functions and enables the designer to add an unlimited number of custom commands, which are callable from the command line. The commands are parameterizable (using the C-based command-line parameter idiom), so the designer can use one function to exercise hardware with different values. Also, since many hardware peripherals instantiated in FPGAs have reasonably simple register-mapped I/O interfaces, the engineer can edit and view hardware parameter settings at any time without stopping the processor. MicroShell comes with a set of support scripts that interface seamlessly with Xilinx's EDK tool. Adding an instance of MicroShell to a project is as simple as marking a check box in a library configuration dialog box and specifying a software project directory. The support scripts then examine the hardware design, build design-specific functions, conditionally include processor-specific functions, and complete the compilation process. For code-size-constrained designs, most of the stock functionality can be excluded from the compiled library. When all of the configurable options are removed from the binary, MicroShell has an unoptimized memory footprint of about 4.8 kB and a size-optimized footprint of about 2.3 kB. Since MicroShell allows unfettered access to all processor-accessible memory locations, it is possible to perform live patching on a running system.
This can be useful, for instance, if a bug is discovered in a routine but the system cannot be rebooted: MicroShell allows a skilled operator to directly edit the binary executable in memory. With some forethought, MicroShell code can be located in a different memory location from custom code, permitting the custom functionality to be overwritten at any time without stopping the controlling shell.

  2. C-Shell Cookbook

    NASA Astrophysics Data System (ADS)

    Currie, Malcolm J.

    This cookbook describes the fundamentals of writing scripts using the UNIX C shell. It shows how to combine Starlink and private applications with shell commands and constructs to create powerful and time-saving tools for performing repetitive jobs, creating data-processing pipelines, and encapsulating useful recipes. The cookbook aims to give practical and reassuring examples to at least get you started without having to consult a UNIX manual. However, it does not offer a comprehensive description of C-shell syntax to prevent you from being overwhelmed or intimidated. The topics covered are: how to run a script, defining shell variables, prompting, arithmetic and string processing, passing information between Starlink applications, obtaining dataset attributes and FITS header information, processing multiple files and filename modification, command-line arguments and options, and loops. There is also a glossary.

  3. New Version of SeismicHandler (SHX) based on ObsPy

    NASA Astrophysics Data System (ADS)

    Stammler, Klaus; Walther, Marcus

    2016-04-01

    The command line version of SeismicHandler (SH), a scientific analysis tool for seismic waveform data developed around 1990, has been redesigned in recent years in a project funded by the Deutsche Forschungsgemeinschaft (DFG). The aim was to address new data access techniques, simplified metadata handling, and a modularized software design. As a result, the program was rewritten in Python in its main parts, taking advantage of the simplicity of this script language and its variety of well-developed software libraries, including ObsPy. SHX provides easy access to waveforms and metadata via the arclink and FDSN webservice protocols; access to event catalogs is also implemented. With single commands, whole networks or stations within a certain area may be read in; the metadata are retrieved from the servers and stored in a local database. For data processing, the large set of SH commands is available, as well as the SH scripting language. The command set of SHX is easily extendable via SH scripts or additional Python modules. The program is open source and tested on Linux operating systems; documentation and downloads are found at the URL "https://www.seismic-handler.org/".

  4. Measuring the Influence of Mainstream Media on Twitter Users

    DTIC Science & Technology

    2014-07-01

    dataset or called from a Java code. Weka contains tools for data pre-processing, classification, regression, clustering, association rules, and...server at CAU. The command line to start Weka is: java -jar /opt/weka-3-6-9/weka.jar & The first window that appears is the Weka’s graphical user...website hosts all detailed information at the fedora website at1. We chose the 140dev streaming API to store the tweets into our fedora using MySQL

  5. The Generic Mapping Tools 6: Classic versus Modern Mode

    NASA Astrophysics Data System (ADS)

    Wessel, P.; Uieda, L.; Luis, J. M. F.; Scharroo, R.; Smith, W. H. F.; Wobbe, F.

    2017-12-01

    The Generic Mapping Tools (GMT; gmt.soest.hawaii.edu) is a 25-year-old, mature open-source software package for the analysis and display of geoscience data (e.g., interpolate, filter, manipulate, project and plot temporal and spatial data). The GMT "toolbox" includes about 80 core and 40 supplemental modules sharing a common set of command options, file structures, and documentation. GMT5, when released in 2013, introduced an application programming interface (API) to allow programmatic access to GMT from other computing environments. Since then, we have released a GMT/MATLAB toolbox, an experimental GMT/Julia package, and will soon introduce a GMT/Python module. In developing these extensions, we wanted to simplify the GMT learning curve but quickly realized that the main stumbling blocks to GMT command-line mastery would be ported to the external environments unless we introduced major changes. With thousands of GMT scripts already in use by scientists around the world, we were acutely aware of the need for backwards compatibility. Our solution, to be released as GMT 6, was to add a modern run mode that complements the classic mode offered so far. Modern mode completely eliminates the top three obstacles for new (and not so new) GMT users: (1) the responsibility to properly stack PostScript layers manually (i.e., the -O -K dance), (2) the responsibility of handling output redirection of PostScript (create versus append), and (3) the need to provide commands with repeated information about regions (-R) and projections (-J). Thus, modern mode results in shorter, simpler scripts with fewer pitfalls, without interfering with classic scripts. Our implementation adds five new commands that begin and end a modern session, simplify figure management, automate the conversion of PostScript to more suitable formats, automate region detection, and offer a new automated subplot environment for multi-panel illustrations.
Here, we highlight the GMT modern mode and the simplifications it offers, both for command-line use and in external environments. GMT 6 is in beta mode but accessible from our repository. Numerous improvements have been added in addition to modern mode; we expect a formal release in early 2018. Publication partially supported by FCT project UID/GEO/50019/2013 - Instituto D. Luiz.

  6. Freiburg RNA tools: a central online resource for RNA-focused research and teaching.

    PubMed

    Raden, Martin; Ali, Syed M; Alkhnbashi, Omer S; Busch, Anke; Costa, Fabrizio; Davis, Jason A; Eggenhofer, Florian; Gelhausen, Rick; Georg, Jens; Heyne, Steffen; Hiller, Michael; Kundu, Kousik; Kleinkauf, Robert; Lott, Steffen C; Mohamed, Mostafa M; Mattheis, Alexander; Miladi, Milad; Richter, Andreas S; Will, Sebastian; Wolff, Joachim; Wright, Patrick R; Backofen, Rolf

    2018-05-21

    The Freiburg RNA tools webserver is a well established online resource for RNA-focused research. It provides a unified user interface and comprehensive result visualization for efficient command line tools. The webserver includes RNA-RNA interaction prediction (IntaRNA, CopraRNA, metaMIR), sRNA homology search (GLASSgo), sequence-structure alignments (LocARNA, MARNA, CARNA, ExpaRNA), CRISPR repeat classification (CRISPRmap), sequence design (antaRNA, INFO-RNA, SECISDesign), structure aberration evaluation of point mutations (RaSE), and RNA/protein-family model visualization (CMV), among other methods. Open education resources offer interactive visualizations of RNA structure and RNA-RNA interaction prediction as well as basic and advanced sequence alignment algorithms. The services are freely available at http://rna.informatik.uni-freiburg.de.

  7. 32 CFR 700.1054 - Command of a naval base.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 32 National Defense 5 2012-07-01 2012-07-01 false Command of a naval base. 700.1054 Section 700... Command Detail to Duty § 700.1054 Command of a naval base. The officer detailed to command a naval base shall be an officer of the line in the Navy, eligible for command at sea. ...

  8. 32 CFR 700.1054 - Command of a naval base.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 32 National Defense 5 2014-07-01 2014-07-01 false Command of a naval base. 700.1054 Section 700... Command Detail to Duty § 700.1054 Command of a naval base. The officer detailed to command a naval base shall be an officer of the line in the Navy, eligible for command at sea. ...

  9. 32 CFR 700.1056 - Command of a ship.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 32 National Defense 5 2013-07-01 2013-07-01 false Command of a ship. 700.1056 Section 700.1056... Command Detail to Duty § 700.1056 Command of a ship. (a) The officer detailed to command a commissioned ship shall be an officer of the line in the Navy eligible for command at sea. (b) The officer detailed...

  10. 32 CFR 700.1054 - Command of a naval base.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 32 National Defense 5 2011-07-01 2011-07-01 false Command of a naval base. 700.1054 Section 700... Command Detail to Duty § 700.1054 Command of a naval base. The officer detailed to command a naval base shall be an officer of the line in the Navy, eligible for command at sea. ...

  11. 32 CFR 700.1054 - Command of a naval base.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 32 National Defense 5 2013-07-01 2013-07-01 false Command of a naval base. 700.1054 Section 700... Command Detail to Duty § 700.1054 Command of a naval base. The officer detailed to command a naval base shall be an officer of the line in the Navy, eligible for command at sea. ...

  12. 32 CFR 700.1054 - Command of a naval base.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 32 National Defense 5 2010-07-01 2010-07-01 false Command of a naval base. 700.1054 Section 700... Command Detail to Duty § 700.1054 Command of a naval base. The officer detailed to command a naval base shall be an officer of the line in the Navy, eligible for command at sea. ...

  13. 32 CFR 700.1056 - Command of a ship.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 32 National Defense 5 2014-07-01 2014-07-01 false Command of a ship. 700.1056 Section 700.1056... Command Detail to Duty § 700.1056 Command of a ship. (a) The officer detailed to command a commissioned ship shall be an officer of the line in the Navy eligible for command at sea. (b) The officer detailed...

  14. 32 CFR 700.1056 - Command of a ship.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 32 National Defense 5 2010-07-01 2010-07-01 false Command of a ship. 700.1056 Section 700.1056... Command Detail to Duty § 700.1056 Command of a ship. (a) The officer detailed to command a commissioned ship shall be an officer of the line in the Navy eligible for command at sea. (b) The officer detailed...

  15. 32 CFR 700.1056 - Command of a ship.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 32 National Defense 5 2011-07-01 2011-07-01 false Command of a ship. 700.1056 Section 700.1056... Command Detail to Duty § 700.1056 Command of a ship. (a) The officer detailed to command a commissioned ship shall be an officer of the line in the Navy eligible for command at sea. (b) The officer detailed...

  16. 32 CFR 700.1056 - Command of a ship.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 32 National Defense 5 2012-07-01 2012-07-01 false Command of a ship. 700.1056 Section 700.1056... Command Detail to Duty § 700.1056 Command of a ship. (a) The officer detailed to command a commissioned ship shall be an officer of the line in the Navy eligible for command at sea. (b) The officer detailed...

  17. Passing in Command Line Arguments and Parallel Cluster/Multicore Batching in R with batch.

    PubMed

    Hoffmann, Thomas J

    2011-03-01

    It is often useful to rerun a command line R script with some slight change in the parameters used to run it - a new set of parameters for a simulation, a different dataset to process, etc. The R package batch provides a means to easily pass multiple command line options, including vectors of values in the usual R format, into R. The same script can be set up to run things in parallel via different command line arguments. The R package batch also simplifies this parallel batching by allowing one to use R and an R-like syntax for arguments to spread a script across a cluster or a local multicore/multiprocessor computer, with automated syntax for several popular cluster types. Finally, it provides a means to aggregate the results of multiple processes run on a cluster.
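    The batch package itself is an R tool, but the core idea - turning `name value` pairs from the command line, including R-style vector ranges such as `1:5`, into typed variables - can be sketched in Python. The parser below is a hypothetical illustration of that pattern, not the package's actual implementation:

```python
def parse_args(tokens):
    """Parse name/value pairs using an R-like value syntax:
    "100" -> 100, "0.5" -> 0.5, "1:5" -> [1, 2, 3, 4, 5] (R's colon range);
    anything else stays a string."""
    def convert(value):
        if ":" in value:
            lo, hi = value.split(":", 1)
            if lo.lstrip("-").isdigit() and hi.lstrip("-").isdigit():
                return list(range(int(lo), int(hi) + 1))
        for caster in (int, float):
            try:
                return caster(value)
            except ValueError:
                pass
        return value

    # Tokens alternate name, value, name, value, ...
    names = tokens[0::2]
    values = tokens[1::2]
    return {n: convert(v) for n, v in zip(names, values)}

# e.g. a script invoked as:  myscript.py reps 100 seed 1:5 label simA
params = parse_args(["reps", "100", "seed", "1:5", "label", "simA"])
print(params)
```

    The same dictionary of parsed parameters could then select the simulation slice each cluster job works on, which is essentially how command-line-driven batching spreads one script across many processes.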

  18. Neural network submodel as an abstraction tool: relating network performance to combat outcome

    NASA Astrophysics Data System (ADS)

    Jablunovsky, Greg; Dorman, Clark; Yaworsky, Paul S.

    2000-06-01

    Simulation of Command and Control (C2) networks has historically emphasized individual system performance with little architectural context or credible linkage to `bottom-line' measures of combat outcomes. Renewed interest in modeling C2 effects and relationships stems from emerging network-intensive operational concepts. This demands improved methods to span the analytical hierarchy between C2 system performance models and theater-level models. Neural network technology offers a modeling approach that can abstract the essential behavior of higher resolution C2 models within a campaign simulation. The proposed methodology uses off-line learning of the relationships between network state and campaign-impacting performance of a complex C2 architecture, and then approximation of that performance as a time-varying parameter in an aggregated simulation. Ultimately, this abstraction tool offers an increased fidelity of C2 system simulation that captures dynamic network dependencies within a campaign context.

  19. MEGA7: Molecular Evolutionary Genetics Analysis Version 7.0 for Bigger Datasets.

    PubMed

    Kumar, Sudhir; Stecher, Glen; Tamura, Koichiro

    2016-07-01

    We present the latest version of the Molecular Evolutionary Genetics Analysis (MEGA) software, which contains many sophisticated methods and tools for phylogenomics and phylomedicine. In this major upgrade, MEGA has been optimized for use on 64-bit computing systems for analyzing larger datasets. Researchers can now explore and analyze tens of thousands of sequences in MEGA. The new version also provides an advanced wizard for building timetrees and includes a new functionality to automatically predict gene duplication events in gene family trees. The 64-bit MEGA is made available in two interfaces: graphical and command line. The graphical user interface (GUI) is a native Microsoft Windows application that can also be used on Mac OS X. The command line MEGA is available as native applications for Windows, Linux, and Mac OS X. They are intended for use in high-throughput and scripted analysis. Both versions are available from www.megasoftware.net free of charge. © The Author 2016. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  20. BamTools: a C++ API and toolkit for analyzing and managing BAM files

    PubMed Central

    Barnett, Derek W.; Garrison, Erik K.; Quinlan, Aaron R.; Strömberg, Michael P.; Marth, Gabor T.

    2011-01-01

    Motivation: Analysis of genomic sequencing data requires efficient, easy-to-use access to alignment results and flexible data management tools (e.g. filtering, merging, sorting, etc.). However, the enormous amount of data produced by current sequencing technologies is typically stored in compressed, binary formats that are not easily handled by the text-based parsers commonly used in bioinformatics research. Results: We introduce a software suite for programmers and end users that facilitates research analysis and data management using BAM files. BamTools provides both the first C++ API publicly available for BAM file support as well as a command-line toolkit. Availability: BamTools was written in C++, and is supported on Linux, Mac OSX and MS Windows. Source code and documentation are freely available at http://github.com/pezmaster31/bamtools. Contact: barnetde@bc.edu PMID:21493652

  1. The Volume Grid Manipulator (VGM): A Grid Reusability Tool

    NASA Technical Reports Server (NTRS)

    Alter, Stephen J.

    1997-01-01

    This document is a manual describing how to use the Volume Grid Manipulation (VGM) software. The code is specifically designed to alter or manipulate existing surface and volume structured grids to improve grid quality through the reduction of grid line skewness, removal of negative volumes, and adaptation of surface and volume grids to flow field gradients. The software uses a command language to perform all manipulations, thereby offering the capability of executing multiple manipulations on a single grid during an execution of the code. The command language can be input to the VGM code by a UNIX-style redirected file, or interactively while the code is executing. The manual consists of 14 sections. The first is an introduction to grid manipulation: where it is most applicable and where the strengths of such software can be utilized. The next two sections describe the memory management and the manipulation command language. The following 8 sections describe simple and complex manipulations that can be used in conjunction with one another to smooth, adapt, and reuse existing grids for various computations. These are accompanied by a tutorial section that describes how to use the commands and manipulations to solve actual grid generation problems. The last two sections are a command reference guide and a troubleshooting section, which aid in the use of the code and describe problems associated with generated scripts for manipulation control.

  2. Military Review: The Professional Journal of the U.S. Army. Volume 82, Number 5, September-October 2002

    DTIC Science & Technology

    2002-10-01

    with modern technology.11 The history of warfare is full of examples of people who relied on the sophistication of their own technology while they... optics, with no surprises in between.8 Using combinations of enemy template overlay, circular, and direct line-of-sight tools, the commander can visualize...continued through the current campaign against terrorism resulted in the Army performing a wide range of military operations across the full spectrum

  3. MYRaf: An Easy Aperture Photometry GUI for IRAF

    NASA Astrophysics Data System (ADS)

    Niaei, M. S.; Kılıç, Y.; Özeren, F. F.

    2015-07-01

    We describe the design and development of MYRaf, a GUI (Graphical User Interface) that aims to be completely open source under the General Public License (GPL). MYRaf is an easy-to-use, reliable, and fast IRAF aperture photometry GUI tool for those who are conversant with text-based software and command-line procedures in GNU/Linux OSs. MYRaf uses IRAF, PyRAF, matplotlib, ginga, alipy, and SExtractor with the general-purpose, high-level programming language Python, and uses the Qt framework.

  4. Neural-Network-Development Program

    NASA Technical Reports Server (NTRS)

    Phillips, Todd A.

    1993-01-01

    NETS, software tool for development and evaluation of neural networks, provides simulation of neural-network algorithms plus computing environment for development of such algorithms. Uses back-propagation learning method for all of networks it creates. Enables user to customize patterns of connections between layers of network. Also provides features for saving, during learning process, values of weights, providing more-precise control over learning process. Written in ANSI standard C language. Machine-independent version (MSC-21588) includes only code for command-line-interface version of NETS 3.0.
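    NETS itself is written in ANSI C; as a language-agnostic illustration of the back-propagation learning method the abstract names, here is a minimal Python sketch of a 2-2-1 network trained by gradient descent. The network shape, learning rate, and XOR training data are arbitrary illustrative choices, not NETS defaults:

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Tiny 2-2-1 network: w1[j] holds hidden unit j's weights (two inputs + bias),
# w2 holds the output unit's weights (two hidden activations + bias).
w1 = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
w2 = [random.uniform(-1, 1) for _ in range(3)]

data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]  # XOR

def forward(x):
    h = [sigmoid(w[0] * x[0] + w[1] * x[1] + w[2]) for w in w1]
    o = sigmoid(w2[0] * h[0] + w2[1] * h[1] + w2[2])
    return h, o

def mse():
    return sum((forward(x)[1] - t) ** 2 for x, t in data) / len(data)

def train(epochs=2000, lr=0.5):
    for _ in range(epochs):
        for x, t in data:
            h, o = forward(x)
            # Back-propagation: delta at the output, then deltas at the hidden
            # layer (the sigmoid derivative of an activation s is s * (1 - s)).
            d_o = (o - t) * o * (1 - o)
            d_h = [d_o * w2[j] * h[j] * (1 - h[j]) for j in range(2)]
            for j in range(2):
                w2[j] -= lr * d_o * h[j]
                for i in range(2):
                    w1[j][i] -= lr * d_h[j] * x[i]
                w1[j][2] -= lr * d_h[j]
            w2[2] -= lr * d_o

before = mse()
train()
after = mse()
print(before, "->", after)  # the squared error shrinks as the weights learn
```

    Saving the weight arrays at intervals during `train`, as NETS does, is what gives the user precise control over (and the ability to resume) the learning process.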

  5. An Upgrade of the Aeroheating Software ''MINIVER''

    NASA Technical Reports Server (NTRS)

    Louderback, Pierce

    2013-01-01

    Detailed computational modeling: CFD is often used to create and execute computational domains. Complexity increases when moving from 2D to 3D geometries. Computational time increases as finer grids are used (accuracy). A strong tool, but it takes time to set up and run. MINIVER: Uses theoretical and empirical correlations. Orders of magnitude faster to set up and run. Not as accurate as CFD, but gives reasonable estimations. MINIVER's drawbacks: Rigid command-line interface. Lackluster, unorganized documentation. No central control; multiple versions exist and have diverged.

  6. Unit Testing for Command and Control Systems

    NASA Technical Reports Server (NTRS)

    Alexander, Joshua

    2018-01-01

    Unit tests were created to evaluate the functionality of a Data Generation and Publication tool for a command and control system. These unit tests are developed to constantly evaluate the tool and ensure it functions properly as the command and control system grows in size and scope. Unit tests are a crucial part of testing any software project and are especially instrumental in the development of a command and control system. They save the resources, time, and costs associated with testing, and catch issues before they become increasingly difficult and costly to fix. The unit tests produced for the Data Generation and Publication tool assure the users and stakeholders of its functionality, an assurance that is vital to launching spacecraft safely.

  7. PAPST, a User Friendly and Powerful Java Platform for ChIP-Seq Peak Co-Localization Analysis and Beyond.

    PubMed

    Bible, Paul W; Kanno, Yuka; Wei, Lai; Brooks, Stephen R; O'Shea, John J; Morasso, Maria I; Loganantharaj, Rasiah; Sun, Hong-Wei

    2015-01-01

    Comparative co-localization analysis of transcription factors (TFs) and epigenetic marks (EMs) in specific biological contexts is one of the most critical areas of ChIP-Seq data analysis beyond peak calling. Yet there is a significant lack of user-friendly and powerful tools geared towards exploratory research based on co-localization analysis. Most tools currently used for co-localization analysis are command line only and require extensive installation procedures and Linux expertise. Online tools partially address the usability issues of command line tools, but slow response times and few customization features make them unsuitable for rapid data-driven interactive exploratory research. We have developed PAPST: Peak Assignment and Profile Search Tool, a user-friendly yet powerful platform with a unique design, which integrates both gene-centric and peak-centric co-localization analysis into a single package. Most of PAPST's functions can be completed in less than five seconds, allowing quick cycles of data-driven hypothesis generation and testing. With PAPST, a researcher with or without computational expertise can perform sophisticated co-localization pattern analysis of multiple TFs and EMs, either against all known genes or a set of genomic regions obtained from public repositories or prior analysis. PAPST is a versatile, efficient, and customizable tool for genome-wide data-driven exploratory research. Creatively used, PAPST can be quickly applied to any genomic data analysis that involves a comparison of two or more sets of genomic coordinate intervals, making it a powerful tool for a wide range of exploratory genomic research. We first present PAPST's general purpose features and then apply it to several public ChIP-Seq data sets to demonstrate its rapid execution and potential for cutting-edge research with a case study in enhancer analysis.
To our knowledge, PAPST is the first software of its kind to provide efficient and sophisticated post peak-calling ChIP-Seq data analysis as an easy-to-use interactive application. PAPST is available at https://github.com/paulbible/papst and is a public domain work.

  8. PAPST, a User Friendly and Powerful Java Platform for ChIP-Seq Peak Co-Localization Analysis and Beyond

    PubMed Central

    Bible, Paul W.; Kanno, Yuka; Wei, Lai; Brooks, Stephen R.; O’Shea, John J.; Morasso, Maria I.; Loganantharaj, Rasiah; Sun, Hong-Wei

    2015-01-01

    Comparative co-localization analysis of transcription factors (TFs) and epigenetic marks (EMs) in specific biological contexts is one of the most critical areas of ChIP-Seq data analysis beyond peak calling. Yet there is a significant lack of user-friendly and powerful tools geared towards exploratory research based on co-localization analysis. Most tools currently used for co-localization analysis are command line only and require extensive installation procedures and Linux expertise. Online tools partially address the usability issues of command line tools, but slow response times and few customization features make them unsuitable for rapid data-driven interactive exploratory research. We have developed PAPST: Peak Assignment and Profile Search Tool, a user-friendly yet powerful platform with a unique design, which integrates both gene-centric and peak-centric co-localization analysis into a single package. Most of PAPST’s functions can be completed in less than five seconds, allowing quick cycles of data-driven hypothesis generation and testing. With PAPST, a researcher with or without computational expertise can perform sophisticated co-localization pattern analysis of multiple TFs and EMs, either against all known genes or a set of genomic regions obtained from public repositories or prior analysis. PAPST is a versatile, efficient, and customizable tool for genome-wide data-driven exploratory research. Creatively used, PAPST can be quickly applied to any genomic data analysis that involves a comparison of two or more sets of genomic coordinate intervals, making it a powerful tool for a wide range of exploratory genomic research. We first present PAPST’s general purpose features and then apply it to several public ChIP-Seq data sets to demonstrate its rapid execution and potential for cutting-edge research with a case study in enhancer analysis.
To our knowledge, PAPST is the first software of its kind to provide efficient and sophisticated post peak-calling ChIP-Seq data analysis as an easy-to-use interactive application. PAPST is available at https://github.com/paulbible/papst and is a public domain work. PMID:25970601
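    The core comparison PAPST performs - asking which genomic intervals in one set fall on or near intervals in another - can be illustrated with a short Python sketch. This is a naive O(n·m) toy with made-up coordinates, not PAPST's actual (Java, optimized) implementation:

```python
def overlaps(a, b):
    """True if two half-open intervals (chrom, start, end) share any base."""
    return a[0] == b[0] and a[1] < b[2] and b[1] < a[2]

def colocalized(peaks, reference, flank=0):
    """Return the peaks that fall within `flank` bases of any reference region."""
    hits = []
    for chrom, start, end in peaks:
        widened = (chrom, start - flank, end + flank)
        if any(overlaps(widened, r) for r in reference):
            hits.append((chrom, start, end))
    return hits

# Hypothetical ChIP-Seq peaks and enhancer regions, purely for illustration.
tf_peaks  = [("chr1", 100, 200), ("chr1", 5000, 5100), ("chr2", 40, 90)]
enhancers = [("chr1", 150, 400), ("chr2", 1000, 2000)]

print(colocalized(tf_peaks, enhancers))              # direct overlaps only
print(colocalized(tf_peaks, enhancers, flank=3000))  # overlaps within 3 kb
```

    Swapping the reference set for gene coordinates gives the gene-centric view, and swapping it for another peak set gives the peak-centric view - the two analyses PAPST unifies.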

  9. The neXtProt peptide uniqueness checker: a tool for the proteomics community.

    PubMed

    Schaeffer, Mathieu; Gateau, Alain; Teixeira, Daniel; Michel, Pierre-André; Zahn-Zabal, Monique; Lane, Lydie

    2017-11-01

    The neXtProt peptide uniqueness checker allows scientists to define which peptides can be used to validate the existence of human proteins, i.e., those that map uniquely versus multiply to human protein sequences, taking into account isobaric substitutions, alternative splicing, and single amino acid variants. The pepx program is available at https://github.com/calipho-sib/pepx and can be launched from the command line or through a CGI web interface. Indexing requires a sequence file in FASTA format. The peptide uniqueness checker tool is freely available on the web at https://www.nextprot.org/tools/peptide-uniqueness-checker and from the neXtProt API at https://api.nextprot.org/. lydie.lane@sib.swiss. © The Author(s) 2017. Published by Oxford University Press.
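    The uniqueness test the abstract describes can be illustrated with a toy sketch: a peptide supports a protein's existence only if it matches exactly one sequence, with isobaric residues collapsed before matching (here only I/L, which are indistinguishable by mass). This hypothetical fragment ignores the splice-variant and single-amino-acid-variant handling that pepx actually performs:

```python
def canon(seq):
    """Collapse isobaric residues: I and L have identical mass, so treat them alike."""
    return seq.replace("I", "L")

def matching_proteins(peptide, proteome):
    """Names of proteins whose sequence contains the peptide, I/L-insensitively."""
    needle = canon(peptide)
    return [name for name, seq in proteome.items() if needle in canon(seq)]

def is_unique(peptide, proteome):
    # A uniquely mapping peptide hits exactly one protein sequence.
    return len(matching_proteins(peptide, proteome)) == 1

# A two-entry toy "proteome"; the sequences are invented for illustration.
proteome = {
    "P1": "MKTAYIAKQRQISFVK",
    "P2": "GGLISQAVHAAHAEIN",
}
print(matching_proteins("QISFVK", proteome))
print(is_unique("QISFVK", proteome))
```

    Note that the peptide QISFVK also matches a protein containing QLSFVK: ignoring the I/L distinction makes the tool conservative, rejecting peptides whose apparent uniqueness rests on a mass-indistinguishable residue.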

  10. UNICON: A Powerful and Easy-to-Use Compound Library Converter.

    PubMed

    Sommer, Kai; Friedrich, Nils-Ole; Bietz, Stefan; Hilbig, Matthias; Inhester, Therese; Rarey, Matthias

    2016-06-27

    The accurate handling of different chemical file formats and the consistent conversion between them play important roles for calculations in complex cheminformatics workflows. Working with different cheminformatic tools often makes the conversion between file formats a mandatory step. Such a conversion might become a difficult task in cases where the information content substantially differs. This paper describes UNICON, an easy-to-use software tool for this task. The functionality of UNICON ranges from file conversion between standard formats SDF, MOL2, SMILES, PDB, and PDBx/mmCIF via the generation of 2D structure coordinates and 3D structures to the enumeration of tautomeric forms, protonation states, and conformer ensembles. For this purpose, UNICON bundles the key elements of the previously described NAOMI library in a single, easy-to-use command line tool.

  11. 32 CFR 700.902 - Eligibility for command at sea.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 32 National Defense 5 2012-07-01 2012-07-01 false Eligibility for command at sea. 700.902 Section... Present Contents § 700.902 Eligibility for command at sea. All officers of the line of the Navy, including... deck duties afloat, are eligible for command at sea. ...

  12. 32 CFR 700.902 - Eligibility for command at sea.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 32 National Defense 5 2011-07-01 2011-07-01 false Eligibility for command at sea. 700.902 Section... Present Contents § 700.902 Eligibility for command at sea. All officers of the line of the Navy, including... deck duties afloat, are eligible for command at sea. ...

  13. 32 CFR 700.902 - Eligibility for command at sea.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 32 National Defense 5 2013-07-01 2013-07-01 false Eligibility for command at sea. 700.902 Section... Present Contents § 700.902 Eligibility for command at sea. All officers of the line of the Navy, including... deck duties afloat, are eligible for command at sea. ...

  14. 32 CFR 700.902 - Eligibility for command at sea.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 32 National Defense 5 2014-07-01 2014-07-01 false Eligibility for command at sea. 700.902 Section... Present Contents § 700.902 Eligibility for command at sea. All officers of the line of the Navy, including... deck duties afloat, are eligible for command at sea. ...

  15. 32 CFR 700.902 - Eligibility for command at sea.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 32 National Defense 5 2010-07-01 2010-07-01 false Eligibility for command at sea. 700.902 Section... Present Contents § 700.902 Eligibility for command at sea. All officers of the line of the Navy, including... deck duties afloat, are eligible for command at sea. ...

  16. Sequence Segmentation with changeptGUI.

    PubMed

    Tasker, Edward; Keith, Jonathan M

    2017-01-01

    Many biological sequences have a segmental structure that can provide valuable clues to their content, structure, and function. The program changept is a tool for investigating the segmental structure of a sequence, and can also be applied to multiple sequences in parallel to identify a common segmental structure, thus providing a method for integrating multiple data types to identify functional elements in genomes. In the previous edition of this book, a command line interface for changept is described. Here we present a graphical user interface for this package, called changeptGUI. This interface also includes tools for pre- and post-processing of data and results to facilitate investigation of the number and characteristics of segment classes.
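    changept fits multi-class segmentations by Bayesian sampling; as a much simpler illustration of what "segmental structure" means, the toy function below (not the changept algorithm) places a single change point where the two resulting segments are most internally homogeneous, i.e. have the fewest symbols disagreeing with each segment's majority symbol:

```python
def best_changepoint(seq):
    """Toy single change-point finder for a symbol sequence.

    The cost of a segment is the number of symbols that differ from its
    majority symbol; the chosen split minimizes the summed cost."""
    def cost(s):
        return len(s) - max(s.count(c) for c in set(s))

    return min(range(1, len(seq)), key=lambda k: cost(seq[:k]) + cost(seq[k:]))

# An A-rich segment followed by a T-rich segment, with a little noise.
seq = "AAAAABAAATTTTTTATTT"
print(best_changepoint(seq))  # splits at index 9, between the two segments
```

    Real segmentation tools extend this idea to many change points, richer segment models, and multiple parallel sequences, which is why changept uses sampling rather than the exhaustive search shown here.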

  17. 33 CFR 83.27 - Vessels not under command or restricted in their ability to maneuver (Rule 27).

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 33 Navigation and Navigable Waters 1 2013-07-01 2013-07-01 false Vessels not under command or... not under command or restricted in their ability to maneuver (Rule 27). (a) Vessels not under command. A vessel not under command shall exhibit: (1) Two all-round red lights in a vertical line where they...

  18. 33 CFR 83.27 - Vessels not under command or restricted in their ability to maneuver (Rule 27).

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 33 Navigation and Navigable Waters 1 2012-07-01 2012-07-01 false Vessels not under command or... not under command or restricted in their ability to maneuver (Rule 27). (a) Vessels not under command. A vessel not under command shall exhibit: (1) Two all-round red lights in a vertical line where they...

  19. 33 CFR 83.27 - Vessels not under command or restricted in their ability to maneuver (Rule 27).

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 33 Navigation and Navigable Waters 1 2014-07-01 2014-07-01 false Vessels not under command or... not under command or restricted in their ability to maneuver (Rule 27). (a) Vessels not under command. A vessel not under command shall exhibit: (1) Two all-round red lights in a vertical line where they...

  20. 33 CFR 83.27 - Vessels not under command or restricted in their ability to maneuver (Rule 27).

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 33 Navigation and Navigable Waters 1 2011-07-01 2011-07-01 false Vessels not under command or... not under command or restricted in their ability to maneuver (Rule 27). (a) Vessels not under command. A vessel not under command shall exhibit: (1) Two all-round red lights in a vertical line where they...

  1. 33 CFR 83.27 - Vessels not under command or restricted in their ability to maneuver (Rule 27).

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 33 Navigation and Navigable Waters 1 2010-07-01 2010-07-01 false Vessels not under command or... not under command or restricted in their ability to maneuver (Rule 27). (a) Vessels not under command. A vessel not under command shall exhibit: (1) Two all-round red lights in a vertical line where they...

  2. Preliminary Assessment of Primary Flight Display Symbology for Electro- Optic Head-Down Displays

    DTIC Science & Technology

    1991-06-01

    information related to pitch and power; the vertical line provides information related to bank and heading. As a result of this geometrica ...steering bar are centered over the aircraft symbol. ... If the bars are centered, the aircraft is either correcting properly or is flying the desired... move to provide a new pitch command. Roll (heading correction) commands are seen as unbalanced line width, the low command bar side

  3. Resistance is Futile: STScI's Science Planning and Scheduling Team Switches From VMS to Unix Operations

    NASA Astrophysics Data System (ADS)

    Adler, D. S.

    2000-12-01

    The Science Planning and Scheduling Team (SPST) of the Space Telescope Science Institute (STScI) has historically operated exclusively under VMS. Due to diminished support for VMS-based platforms at STScI, SPST is in the process of transitioning to Unix operations. In the summer of 1999, SPST selected Python as the primary scripting language for the operational tools and began translation of the VMS DCL code. As of October 2000, SPST has installed a utility library of 16 modules consisting of 8000 lines of code and 80 Python tools consisting of 13000 lines of code. All tasks related to calendar generation have been switched to Unix operations. Current work focuses on translating the tools used to generate the Science Mission Specifications (SMS). The software required to generate the Mission Schedule and Command Loads (PASS), maintained by another team at STScI, will take longer to translate than the rest of the SPST operational code. SPST is planning on creating tools to access PASS from Unix in the short term. We are on schedule to complete the work needed to fully transition SPST to Unix operations (while remotely accessing PASS on VMS) by the fall of 2001.

  4. Use of Semi-Autonomous Tools for ISS Commanding and Monitoring

    NASA Technical Reports Server (NTRS)

    Brzezinski, Amy S.

    2014-01-01

    As the International Space Station (ISS) has moved into a utilization phase, operations have shifted to become more ground-based with fewer mission control personnel monitoring and commanding multiple ISS systems. This shift to fewer people monitoring more systems has prompted use of semi-autonomous console tools in the ISS Mission Control Center (MCC) to help flight controllers command and monitor the ISS. These console tools perform routine operational procedures while keeping the human operator "in the loop" to monitor and intervene when off-nominal events arise. Two such tools, the Pre-positioned Load (PPL) Loader and Automatic Operators Recorder Manager (AutoORM), are used by the ISS Communications RF Onboard Networks Utilization Specialist (CRONUS) flight control position. CRONUS is responsible for simultaneously commanding and monitoring the ISS Command & Data Handling (C&DH) and Communications and Tracking (C&T) systems. PPL Loader is used to uplink small pieces of frequently changed software data tables, called PPLs, to ISS computers to support different ISS operations. In order to uplink a PPL, a data load command must be built that contains multiple user-input fields. Next, a multiple step commanding and verification procedure must be performed to enable an onboard computer for software uplink, uplink the PPL, verify the PPL has incorporated correctly, and disable the computer for software uplink. PPL Loader provides different levels of automation in both building and uplinking these commands. In its manual mode, PPL Loader automatically builds the PPL data load commands but allows the flight controller to verify and save the commands for future uplink. In its auto mode, PPL Loader automatically builds the PPL data load commands for flight controller verification, but automatically performs the PPL uplink procedure by sending commands and performing verification checks while notifying CRONUS of procedure step completion. 
If an off-nominal condition occurs during procedure execution, PPL Loader notifies CRONUS through popup messages, allowing CRONUS to examine the situation and choose an option for how PPL Loader should proceed with the procedure. The use of PPL Loader to perform frequent, routine PPL uplinks offloads CRONUS to better monitor two ISS systems. It also reduces procedure performance time and decreases risk of command errors. AutoORM identifies ISS communication outage periods and builds commands to lock, playback, and unlock ISS Operations Recorder files. Operations Recorder files are circular buffer files of continually recorded ISS telemetry data. Sections of these files can be locked from further writing, be played back to capture telemetry data that occurred during an ISS loss of signal (LOS) period, and then be unlocked for future recording use. Downlinked Operations Recorder files are used by mission support teams for data analysis, especially if failures occur during LOS. The commands to lock, playback, and unlock Operations Recorder files are encompassed in three different operational procedures and contain multiple user-input fields. AutoORM provides different levels of automation for building and uplinking the commands to lock, playback, and unlock Operations Recorder files. In its automatic mode, AutoORM automatically detects ISS LOS periods, then generates and uplinks the commands to lock, playback, and unlock Operations Recorder files when MCC regains signal with ISS. AutoORM also features semi-autonomous and manual modes which integrate CRONUS more into the command verification and uplink process. AutoORM's ability to automatically detect ISS LOS periods and build the necessary commands to preserve, playback, and release recorded telemetry data greatly offloads CRONUS to perform more high-level cognitive tasks, such as mission planning and anomaly troubleshooting. 
Additionally, since Operations Recorder commands contain numerical time input fields which are tedious for a human to manually build, AutoORM's ability to automatically build commands reduces operational command errors. PPL Loader and AutoORM demonstrate principles of semi-autonomous operational tools that will benefit future space mission operations. Both tools employ different levels of automation to perform simple and routine procedures, thereby offloading human operators to perform higher-level cognitive tasks. Because both tools provide procedure execution status and highlight off-nominal indications, the flight controller is able to intervene during procedure execution if needed. Semi-autonomous tools and systems that can perform routine procedures, yet keep human operators informed of execution, will be essential in future long-duration missions where the onboard crew will be solely responsible for spacecraft monitoring and control.
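
The LOS-driven recorder sequence described above can be sketched in outline. This is a hypothetical illustration only (the function names, the gap threshold, and the command tuples are invented for this sketch, not the actual MCC tool): detect loss-of-signal gaps in a telemetry timestamp stream, then emit the lock/playback/unlock sequence covering each gap.

```python
def find_los_periods(timestamps, gap_threshold=5.0):
    """Return (start, end) pairs where consecutive telemetry
    timestamps are farther apart than gap_threshold seconds."""
    gaps = []
    for prev, cur in zip(timestamps, timestamps[1:]):
        if cur - prev > gap_threshold:
            gaps.append((prev, cur))
    return gaps

def recorder_commands(los_periods):
    """Build the three-step command sequence for each LOS period."""
    cmds = []
    for start, end in los_periods:
        cmds.append(("LOCK", start, end))      # protect the file region
        cmds.append(("PLAYBACK", start, end))  # downlink recorded data
        cmds.append(("UNLOCK", start, end))    # release for reuse
    return cmds

telemetry = [0.0, 1.0, 2.0, 9.5, 10.5, 11.5, 20.0]
los = find_los_periods(telemetry)
print(los)                        # [(2.0, 9.5), (11.5, 20.0)]
print(recorder_commands(los)[0])  # ('LOCK', 2.0, 9.5)
```

A semi-autonomous mode, as the abstract describes, would pause between these steps for flight-controller verification instead of uplinking the whole sequence automatically.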

  5. Networked sensors for the combat forces

    NASA Astrophysics Data System (ADS)

    Klager, Gene

    2004-11-01

    Real-time and detailed information is critical to the success of ground combat forces. Current manned reconnaissance, surveillance, and target acquisition (RSTA) capabilities are not sufficient to cover battlefield intelligence gaps, provide Beyond-Line-of-Sight (BLOS) targeting, and the ambush avoidance information necessary for combat forces operating in hostile situations, complex terrain, and conducting military operations in urban terrain. This paper describes a current US Army program developing advanced networked unmanned/unattended sensor systems to survey these gaps and provide the Commander with real-time, pertinent information. Networked Sensors for the Combat Forces plans to develop and demonstrate a new generation of low cost distributed unmanned sensor systems organic to the RSTA Element. Networked unmanned sensors will provide remote monitoring of gaps, will increase a unit's area of coverage, and will provide the commander organic assets to complete his Battlefield Situational Awareness (BSA) picture for direct and indirect fire weapons, early warning, and threat avoidance. Current efforts include developing sensor packages for unmanned ground vehicles, small unmanned aerial vehicles, and unattended ground sensors using advanced sensor technologies. These sensors will be integrated with robust networked communications and Battle Command tools for mission planning, intelligence "reachback", and sensor data management. The network architecture design is based on a model that identifies a three-part modular design: 1) standardized sensor message protocols, 2) Sensor Data Management, and 3) Service Oriented Architecture. This simple model provides maximum flexibility for data exchange, information management and distribution. Products include: Sensor suites optimized for unmanned platforms, stationary and mobile versions of the Sensor Data Management Center, Battle Command planning tools, networked communications, and sensor management software. 
Details of these products and recent test results will be presented.

  6. EasyModeller: A graphical interface to MODELLER

    PubMed Central

    2010-01-01

    Background MODELLER is a program for automated protein homology modeling. It is one of the most widely used tools for homology or comparative modeling of protein three-dimensional structures, but most users find it difficult to get started with MODELLER, as it is command-line based and requires knowledge of basic Python scripting to use efficiently. Findings The study was designed with the aim of developing the "EasyModeller" tool as a frontend graphical interface to MODELLER using Perl/Tk, which can be used as a standalone tool on the Windows platform with MODELLER and Python preinstalled. It helps inexperienced users to perform modeling, assessment, visualization, and optimization of protein models in a simple and straightforward way. Conclusion EasyModeller provides a straightforward graphical interface and functions as a standalone tool which can be used on a standard personal computer with Microsoft Windows as the operating system. PMID:20712861

  7. PyKE3: data analysis tools for NASA's Kepler, K2, and TESS missions

    NASA Astrophysics Data System (ADS)

    Hedges, Christina L.; Cardoso, Jose Vinicius De Miranda; Barentsen, Geert; Gully-Santiago, Michael A.; Cody, Ann Marie; Barclay, Thomas; Still, Martin; BAY AREA ENVIRONMENTAL RESEARCH IN

    2018-01-01

    The PyKE package is a set of easy to use tools for working with Kepler/K2 data. This includes tools to correct light curves for cotrending basis vectors, turn the raw Target Pixel File data into motion corrected light curves, check for exoplanet false positives and run new PSF photometry. We are now releasing PyKE 3, which is compatible with Python 3, is pip installable and no longer depends on PyRAF. Tools are available both as Python routines and from the command line. New tutorials are available and under construction for users to learn about Kepler and K2 data and how to best use it for their science goals. PyKE is open source and welcomes contributions from the community. Routines and more information are available on the PyKE repository on GitHub.

  8. Extraction and Analysis of Display Data

    NASA Technical Reports Server (NTRS)

    Land, Chris; Moye, Kathryn

    2008-01-01

    The Display Audit Suite is an integrated package of software tools that partly automates the detection of Portable Computer System (PCS) display errors. [PCS is a laptop computer used onboard the International Space Station (ISS).] The need for automation stems from the large quantity of PCS displays (6,000+, with 1,000,000+ lines of command and telemetry data). The Display Audit Suite includes data-extraction tools, automatic error detection tools, and database tools for generating analysis spreadsheets. These spreadsheets allow engineers to more easily identify many different kinds of possible errors. The Suite supports over 40 independent analyses and complements formal testing by being comprehensive (all displays can be checked) and by revealing errors that are difficult to detect via test. In addition, the Suite can be run early in the development cycle to find and correct errors in advance of testing.
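
The kind of cross-check such audit tools perform can be illustrated with a toy example. Everything below is hypothetical (the real display formats, dictionary, and checks are not described in the abstract): flag display lines that reference a telemetry identifier absent from the command/telemetry dictionary, the sort of error that is tedious to find by inspection.

```python
# Invented example dictionary; a real audit would load thousands of entries.
TELEMETRY_DICTIONARY = {"P1_TEMP", "P1_PRESS", "S0_VOLT"}

def audit_display(display_lines):
    """Return (line_number, identifier) pairs for every telemetry
    reference not found in the dictionary."""
    errors = []
    for lineno, line in enumerate(display_lines, start=1):
        for token in line.split():
            if token.startswith("TLM:"):
                ident = token[len("TLM:"):]
                if ident not in TELEMETRY_DICTIONARY:
                    errors.append((lineno, ident))
    return errors

display = [
    "label Temp value TLM:P1_TEMP",
    "label Volts value TLM:S0_VLT",   # typo: should be S0_VOLT
]
print(audit_display(display))  # [(2, 'S0_VLT')]
```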

  9. Mobilization Base Requirements Model (MOBREM) Study. Phases I-V.

    DTIC Science & Technology

    1984-08-01

    Department Health Services Command Base Mobilization Plan; DARCOM; Army Communications Command (ACC); Military Transportation Management Command...Chief of Staff. c. The major commands in CONUS are represented on the next line. FORSCOM, DARCOM, TRADOC, and Health Service Commands are the larger...specialized combat support and combat service support training. The general support force (GSF) units are non-deployable units supporting the CONUS

  10. TRAVEL WITH COMMANDER QUALICIA

    EPA Science Inventory

    Commander Qualicia is a cartoon character created for an on-line training course that describes the quality system for the National Exposure Research Laboratory. In the training, which was developed by the QA staff and graphics/IT support contractors, Commander Qualicia and the ...

  11. The Canonical Robot Command Language (CRCL).

    PubMed

    Proctor, Frederick M; Balakirsky, Stephen B; Kootbally, Zeid; Kramer, Thomas R; Schlenoff, Craig I; Shackleford, William P

    2016-01-01

    Industrial robots can perform motion with sub-millimeter repeatability when programmed using the teach-and-playback method. While effective, this method requires significant up-front time, tying up the robot and a person during the teaching phase. Off-line programming can be used to generate robot programs, but the accuracy of this method is poor unless supplemented with good calibration to remove systematic errors, feed-forward models to anticipate robot response to loads, and sensing to compensate for unmodeled errors. These increase the complexity and up-front cost of the system, but the payback in the reduction of recurring teach programming time can be worth the effort. This payback especially benefits small-batch, short-turnaround applications typical of small-to-medium enterprises, who need the agility afforded by off-line application development to be competitive against low-cost manual labor. To fully benefit from this agile application tasking model, a common representation of tasks should be used that is understood by all of the resources required for the job: robots, tooling, sensors, and people. This paper describes an information model, the Canonical Robot Command Language (CRCL), which provides a high-level description of robot tasks and associated control and status information.
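
As a rough illustration of such a high-level, controller-neutral task description, the sketch below builds a CRCL-like command program with Python's standard library. The element names and attributes here are simplified stand-ins chosen for this sketch, not the official CRCL XML schema.

```python
import xml.etree.ElementTree as ET

def move_to(cmd_id, x, y, z):
    """Build an illustrative move command with a target point."""
    cmd = ET.Element("MoveTo", CommandID=str(cmd_id))
    pt = ET.SubElement(cmd, "Point")
    for axis, val in zip(("X", "Y", "Z"), (x, y, z)):
        ET.SubElement(pt, axis).text = str(val)
    return cmd

# A two-step program: move to a pose, then close the gripper.
program = ET.Element("CommandProgram")
program.append(move_to(1, 0.10, 0.25, 0.05))
ET.SubElement(program, "SetEndEffector", CommandID="2", Setting="0.0")

print(ET.tostring(program, encoding="unicode"))
```

The point of such a representation is that robots, tooling, sensors, and people all interpret the same declarative task list, with each controller translating it into its own motion commands.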

  12. The Canonical Robot Command Language (CRCL)

    PubMed Central

    Proctor, Frederick M.; Balakirsky, Stephen B.; Kootbally, Zeid; Kramer, Thomas R.; Schlenoff, Craig I.; Shackleford, William P.

    2017-01-01

    Industrial robots can perform motion with sub-millimeter repeatability when programmed using the teach-and-playback method. While effective, this method requires significant up-front time, tying up the robot and a person during the teaching phase. Off-line programming can be used to generate robot programs, but the accuracy of this method is poor unless supplemented with good calibration to remove systematic errors, feed-forward models to anticipate robot response to loads, and sensing to compensate for unmodeled errors. These increase the complexity and up-front cost of the system, but the payback in the reduction of recurring teach programming time can be worth the effort. This payback especially benefits small-batch, short-turnaround applications typical of small-to-medium enterprises, who need the agility afforded by off-line application development to be competitive against low-cost manual labor. To fully benefit from this agile application tasking model, a common representation of tasks should be used that is understood by all of the resources required for the job: robots, tooling, sensors, and people. This paper describes an information model, the Canonical Robot Command Language (CRCL), which provides a high-level description of robot tasks and associated control and status information. PMID:28529393

  13. fgui: A Method for Automatically Creating Graphical User Interfaces for Command-Line R Packages

    PubMed Central

    Hoffmann, Thomas J.; Laird, Nan M.

    2009-01-01

    The fgui R package is designed for developers of R packages, to help rapidly, and sometimes fully automatically, create a graphical user interface for a command line R package. The interface is built upon the Tcl/Tk graphical interface included in R. The package further facilitates the developer by loading in the help files from the command line functions to provide context sensitive help to the user with no additional effort from the developer. Passing a function as the argument to the routines in the fgui package creates a graphical interface for the function, and further options are available to tweak this interface for those who want more flexibility. PMID:21625291
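
Although fgui is an R package, the underlying idea, deriving an input form from a function's signature by introspection, can be sketched in Python. This toy version only computes the field list a GUI builder would render; mapping the fields to Tcl/Tk widgets, as fgui does, is omitted, and the target function is invented for the example.

```python
import inspect

def gui_fields(func):
    """Introspect a function's signature and return one form-field
    description per parameter, with its default where one exists."""
    fields = []
    for name, param in inspect.signature(func).parameters.items():
        default = (None if param.default is inspect.Parameter.empty
                   else param.default)
        fields.append({"label": name, "default": default})
    return fields

def fit_model(data, iterations=100, verbose=False):
    """Toy target function standing in for a package's entry point."""

for field in gui_fields(fit_model):
    print(field)
```

Context-sensitive help, as the abstract describes, would additionally attach each parameter's documentation to its widget.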

  14. CINE: Comet INfrared Excitation

    NASA Astrophysics Data System (ADS)

    de Val-Borro, Miguel; Cordiner, Martin A.; Milam, Stefanie N.; Charnley, Steven B.

    2017-08-01

    CINE calculates infrared pumping efficiencies that can be applied to the most common molecules found in cometary comae such as water, hydrogen cyanide or methanol. One of the main mechanisms for molecular excitation in comets is the fluorescence by the solar radiation followed by radiative decay to the ground vibrational state. This command-line tool calculates the effective pumping rates for rotational levels in the ground vibrational state scaled by the heliocentric distance of the comet. Fluorescence coefficients are useful for modeling rotational emission lines observed in cometary spectra at sub-millimeter wavelengths. Combined with computational methods to solve the radiative transfer equations based, e.g., on the Monte Carlo algorithm, this model can retrieve production rates and rotational temperatures from the observed emission spectrum.
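
The heliocentric-distance scaling mentioned above follows from the inverse-square fall-off of the solar radiation flux driving the fluorescence. A minimal sketch of that scaling, with a made-up placeholder rate rather than actual CINE output:

```python
def scale_pumping_rate(g_1au, r_h_au):
    """Scale a pumping rate tabulated at 1 au to heliocentric
    distance r_h (in au), assuming the 1/r^2 solar-flux law."""
    return g_1au / r_h_au**2

g_1au = 2.0e-4  # illustrative pumping rate at 1 au, in s^-1
print(scale_pumping_rate(g_1au, 1.0))  # unchanged at 1 au
print(scale_pumping_rate(g_1au, 2.0))  # four times smaller at 2 au
```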

  15. Combat Service Support (CSS) Enabler Functional Assessment (CEFA)

    DTIC Science & Technology

    1998-07-01

    CDR), Combined Arms Support Command (CASCOM) with a tool to aid decision making related to mitigating E/I peacetime (programmatic) and wartime risks...not be fielded by Fiscal Year (FY) 10. Based on their estimates, any decisions, especially reductions in manpower, which rely on the existence of the E...Support (CSS) enablers/initiatives (E/I), thereby providing the Commander (CDR), Combined Arms Support Command (CASCOM) with a tool to aid decision

  16. MEGA X: Molecular Evolutionary Genetics Analysis across Computing Platforms.

    PubMed

    Kumar, Sudhir; Stecher, Glen; Li, Michael; Knyaz, Christina; Tamura, Koichiro

    2018-06-01

    The Molecular Evolutionary Genetics Analysis (Mega) software implements many analytical methods and tools for phylogenomics and phylomedicine. Here, we report a transformation of Mega to enable cross-platform use on Microsoft Windows and Linux operating systems. Mega X does not require virtualization or emulation software and provides a uniform user experience across platforms. Mega X has additionally been upgraded to use multiple computing cores for many molecular evolutionary analyses. Mega X is available in two interfaces (graphical and command line) and can be downloaded from www.megasoftware.net free of charge.

  17. Adaptive refinement tools for tetrahedral unstructured grids

    NASA Technical Reports Server (NTRS)

    Pao, S. Paul (Inventor); Abdol-Hamid, Khaled S. (Inventor)

    2011-01-01

    An exemplary embodiment providing one or more improvements includes software which is robust, efficient, and has a very fast run time for user directed grid enrichment and flow solution adaptive grid refinement. All user selectable options (e.g., the choice of functions, the choice of thresholds, etc.), other than a pre-marked cell list, can be entered on the command line. The ease of application is an asset for flow physics research and preliminary design CFD analysis where fast grid modification is often needed to deal with unanticipated development of flow details.
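
As an illustration of this style of interface, the sketch below shows how such "everything on the command line" option handling typically looks; the flag names are invented for the example, not the tool's actual options.

```python
import argparse

parser = argparse.ArgumentParser(description="grid refinement (sketch)")
parser.add_argument("--function", default="mach",
                    help="flow quantity driving the adaption")
parser.add_argument("--threshold", type=float, default=0.05,
                    help="refinement threshold on that quantity")

# Parse an explicit argument list instead of sys.argv for the demo.
args = parser.parse_args(["--function", "pressure", "--threshold", "0.1"])
print(args.function, args.threshold)  # pressure 0.1
```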

  18. MeV+R: using MeV as a graphical user interface for Bioconductor applications in microarray analysis

    PubMed Central

    Chu, Vu T; Gottardo, Raphael; Raftery, Adrian E; Bumgarner, Roger E; Yeung, Ka Yee

    2008-01-01

    We present MeV+R, an integration of the JAVA MultiExperiment Viewer program with Bioconductor packages. This integration of MultiExperiment Viewer and R is easily extensible to other R packages and provides users with point and click access to traditionally command line driven tools written in R. We demonstrate the ability to use MultiExperiment Viewer as a graphical user interface for Bioconductor applications in microarray data analysis by incorporating three Bioconductor packages, RAMA, BRIDGE and iterativeBMA. PMID:18652698

  19. 50 CFR 600.10 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... consisting of a float and one or more lines suspended therefrom. A hook or hooks are on the lines at or near... live fish on board a vessel. Center means one of the five NMFS Fisheries Science Centers. Charter boat... carry six or fewer passengers for hire. Coast Guard Commander means one of the commanding officers of...

  20. NHD, riverspill, and the development of the incident command tool for drinking water protection.

    Treesearch

    William B. Samuels; Rakesh Bahadur; Michael C. Monteith; David E. Amstutz; Jonathan M. Pickus; Katherine Parker; Douglas Ryan

    2006-01-01

    This project involved the development of an information tool that gives Incident Commanders the critical information they need to make informed decisions regarding the consequences of threats to public water supply intakes.

  1. Tools for automating spacecraft ground systems: The Intelligent Command and Control (ICC) approach

    NASA Technical Reports Server (NTRS)

    Stoffel, A. William; Mclean, David

    1996-01-01

    The practical application of scripting languages and World Wide Web tools to spacecraft ground system automation is reported on. The mission activities and the automation tools used at the Goddard Space Flight Center (MD) are reviewed. The use of the Tool Command Language (TCL) and the Practical Extraction and Report Language (PERL) scripting tools for automating mission operations is discussed together with the application of different tools for the Compton Gamma Ray Observatory ground system.

  2. deepTools2: a next generation web server for deep-sequencing data analysis.

    PubMed

    Ramírez, Fidel; Ryan, Devon P; Grüning, Björn; Bhardwaj, Vivek; Kilpert, Fabian; Richter, Andreas S; Heyne, Steffen; Dündar, Friederike; Manke, Thomas

    2016-07-08

    We present an update to our Galaxy-based web server for processing and visualizing deeply sequenced data. Its core tool set, deepTools, allows users to perform complete bioinformatic workflows ranging from quality controls and normalizations of aligned reads to integrative analyses, including clustering and visualization approaches. Since we first described our deepTools Galaxy server in 2014, we have implemented new solutions for many requests from the community and our users. Here, we introduce significant enhancements and new tools to further improve data visualization and interpretation. deepTools continues to be open to all users and freely available as a web service at deeptools.ie-freiburg.mpg.de. The new deepTools2 suite can be easily deployed within any Galaxy framework via the toolshed repository, and we also provide source code for command line usage under Linux and Mac OS X. A public and documented API for access to deepTools functionality is also available. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  3. NCBI BLAST+ integrated into Galaxy.

    PubMed

    Cock, Peter J A; Chilton, John M; Grüning, Björn; Johnson, James E; Soranzo, Nicola

    2015-01-01

    The NCBI BLAST suite has become ubiquitous in modern molecular biology and is used for small tasks such as checking capillary sequencing results of single PCR products, genome annotation or even larger scale pan-genome analyses. For early adopters of the Galaxy web-based biomedical data analysis platform, integrating BLAST into Galaxy was a natural step for sequence comparison workflows. The command line NCBI BLAST+ tool suite was wrapped for use within Galaxy. Appropriate datatypes were defined as needed. The integration of the BLAST+ tool suite into Galaxy has the goal of making common BLAST tasks easy and advanced tasks possible. This project is an informal international collaborative effort, and is deployed and used on Galaxy servers worldwide. Several examples of applications are described here.

  4. A Software Upgrade of the NASA Aeroheating Code "MINIVER"

    NASA Technical Reports Server (NTRS)

    Louderback, Pierce Mathew

    2013-01-01

    Computational Fluid Dynamics (CFD) is a powerful and versatile tool simulating fluid and thermal environments of launch and re-entry vehicles alike. Where it excels in power and accuracy, however, it lacks in speed. An alternative tool for this purpose is known as MINIVER, an aeroheating code widely used by NASA and within the aerospace industry. Capable of providing swift, reasonably accurate approximations of the fluid and thermal environment of launch vehicles, MINIVER is used where time is of the essence and accuracy need not be exact. However, MINIVER is an old, aging tool: running on a user-unfriendly, legacy command-line interface, it is difficult for it to keep pace with more modern software tools. Florida Institute of Technology was tasked with the construction of a new Graphical User Interface (GUI) that implemented the legacy version's capabilities and enhanced them with new tools and utilities. This thesis provides background to the legacy version of the program, the progression and final version of a modern user interface, and benchmarks to demonstrate its usefulness.

  5. Garrison Leadership: Enlisting Others

    DTIC Science & Technology

    2010-01-01

    days), the GC should conduct an organizational diagnosis to assess the culture and command climate using the existing IMCOM tools of the organizational self-assessment

  6. Mass Storage System - Gyrfalcon | High-Performance Computing | NREL

    Science.gov Websites

    At the command line of one of Peregrine's login nodes, enter one of the following commands to copy ... directory.tgz /mss/ ... Option 3: The rsync command compares one directory to another and makes ... Option 4: The simple Linux cp command can be used to copy a file from one directory to another

  7. Autonomous Reconfigurable Control Allocation (ARCA) for Reusable Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Hodel, A. S.; Callahan, Ronnie; Jackson, Scott (Technical Monitor)

    2002-01-01

    The role of control allocation (CA) in modern aerospace vehicles is to compute a command vector δ_c ∈ ℝ^(n_a) corresponding to commanded or desired body-frame torques (moments) τ_c = [L M N]^T on the vehicle, compensating for and/or responding to inaccuracies in off-line nominal control allocation calculations, actuator failures and/or degradations (reduced effectiveness), or actuator limitations (rate/position saturation). The command vector δ_c may govern the behavior of, e.g., aerosurfaces, reaction thrusters, engine gimbals, and/or thrust vectoring. Typically, the individual moments generated in response to each of the n_a commands do not lie strictly in the roll, pitch, or yaw axes, and so a common practice is to group or gang actuators so that a one-to-one mapping from torque commands τ_c to actuator commands δ_c may be achieved in an off-line computed CA function.
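
The ganging idea can be made concrete with a toy two-elevon example (a hypothetical vehicle for illustration, not the ARCA algorithm itself): each surface produces both pitch and roll moment, and ganging the pair gives a one-to-one map between torque commands and surface commands.

```python
def allocate(pitch_cmd, roll_cmd):
    """Invert the ganging: left/right elevon deflections from
    commanded pitch and roll torques (arbitrary consistent units)."""
    left = pitch_cmd + roll_cmd
    right = pitch_cmd - roll_cmd
    return left, right

def moments(left, right):
    """Forward map: ganged surface deflections back to torques."""
    pitch = (left + right) / 2
    roll = (left - right) / 2
    return pitch, roll

left, right = allocate(3, -1)
print((left, right))         # (2, 4)
print(moments(left, right))  # recovers the commanded (3.0, -1.0)
```

A reconfigurable allocator, as the abstract describes, would recompute this mapping online when an actuator fails or saturates instead of relying on the off-line ganging alone.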

  8. Gender Differences between Graphical User Interfaces and Command Line Interfaces in Computer Instruction.

    ERIC Educational Resources Information Center

    Barker, Dan L.

    This study focused primarily on two types of computer interfaces and the differences in academic performance that resulted from their use; it was secondarily designed to examine gender differences that may have existed before and after any change in interface. Much of the basic research in computer use was conducted with command line interface…

  9. Joint Interdiction

    DTIC Science & Technology

    2016-09-09

    law enforcement detachment (USCG) LEO law enforcement operations LOC line of communications MACCS Marine air command and control system MAS...enemy command and control [C2], intelligence, fires, reinforcing units, lines of communications [LOCs], logistics, and other operational- and tactical...enemy naval, engineering, and personnel resources to the tasks of repairing and recovering damaged equipment, facilities, and LOCs. It can draw the

  10. Open | SpeedShop: An Open Source Infrastructure for Parallel Performance Analysis

    DOE PAGES

    Schulz, Martin; Galarowicz, Jim; Maghrak, Don; ...

    2008-01-01

    Over the last decades a large number of performance tools have been developed to analyze and optimize high performance applications. Their acceptance by end users, however, has been slow: each tool alone is often limited in scope and comes with widely varying interfaces and workflow constraints, requiring different changes in the often complex build and execution infrastructure of the target application. We started the Open | SpeedShop project about 3 years ago to overcome these limitations and provide efficient, easy to apply, and integrated performance analysis for parallel systems. Open | SpeedShop has two different faces: it provides an interoperable tool set covering the most common analysis steps as well as a comprehensive plugin infrastructure for building new tools. In both cases, the tools can be deployed to large scale parallel applications using DPCL/Dyninst for distributed binary instrumentation. Further, all tools developed within or on top of Open | SpeedShop are accessible through multiple fully equivalent interfaces including an easy-to-use GUI as well as an interactive command line interface reducing the usage threshold for those tools.

  11. CellSegm - a MATLAB toolbox for high-throughput 3D cell segmentation

    PubMed Central

    2013-01-01

    The application of fluorescence microscopy in cell biology often generates a huge amount of imaging data. Automated whole cell segmentation of such data enables the detection and analysis of individual cells, where a manual delineation is often time consuming, or practically not feasible. Furthermore, compared to manual analysis, automation normally has a higher degree of reproducibility. CellSegm, the software presented in this work, is a MATLAB-based command-line software toolbox providing an automated whole cell segmentation of images showing surface stained cells, acquired by fluorescence microscopy. It has options for both fully automated and semi-automated cell segmentation. Major algorithmic steps are: (i) smoothing, (ii) Hessian-based ridge enhancement, (iii) marker-controlled watershed segmentation, and (iv) feature-based classification of cell candidates. Using a wide selection of image recordings and code snippets, we demonstrate that CellSegm has the ability to detect various types of surface stained cells in 3D. After detection and outlining of individual cells, the cell candidates can be subject to software based analysis, specified and programmed by the end-user, or they can be analyzed by other software tools. A segmentation of tissue samples with appropriate characteristics is also shown to be resolvable in CellSegm. The command-line interface of CellSegm facilitates scripting of the separate tools, all implemented in MATLAB, offering a high degree of flexibility and tailored workflows for the end-user. The modularity and scripting capabilities of CellSegm enable automated workflows and quantitative analysis of microscopic data, suited for high-throughput image based screening. PMID:23938087

  12. CellSegm - a MATLAB toolbox for high-throughput 3D cell segmentation.

    PubMed

    Hodneland, Erlend; Kögel, Tanja; Frei, Dominik Michael; Gerdes, Hans-Hermann; Lundervold, Arvid

    2013-08-09

    The application of fluorescence microscopy in cell biology often generates a huge amount of imaging data. Automated whole cell segmentation of such data enables the detection and analysis of individual cells, where a manual delineation is often time consuming, or practically not feasible. Furthermore, compared to manual analysis, automation normally has a higher degree of reproducibility. CellSegm, the software presented in this work, is a MATLAB-based command-line software toolbox providing an automated whole cell segmentation of images showing surface stained cells, acquired by fluorescence microscopy. It has options for both fully automated and semi-automated cell segmentation. Major algorithmic steps are: (i) smoothing, (ii) Hessian-based ridge enhancement, (iii) marker-controlled watershed segmentation, and (iv) feature-based classification of cell candidates. Using a wide selection of image recordings and code snippets, we demonstrate that CellSegm has the ability to detect various types of surface stained cells in 3D. After detection and outlining of individual cells, the cell candidates can be subject to software based analysis, specified and programmed by the end-user, or they can be analyzed by other software tools. A segmentation of tissue samples with appropriate characteristics is also shown to be resolvable in CellSegm. The command-line interface of CellSegm facilitates scripting of the separate tools, all implemented in MATLAB, offering a high degree of flexibility and tailored workflows for the end-user. The modularity and scripting capabilities of CellSegm enable automated workflows and quantitative analysis of microscopic data, suited for high-throughput image based screening.

  13. Gnuastro: GNU Astronomy Utilities

    NASA Astrophysics Data System (ADS)

    Akhlaghi, Mohammad

    2018-01-01

    Gnuastro (GNU Astronomy Utilities) manipulates and analyzes astronomical data. It is an official GNU package comprising a large collection of programs and C/C++ library functions. Command-line programs perform arithmetic operations on images, convert FITS images to common types like JPG or PDF, convolve an image with a given kernel or match kernels, perform cosmological calculations, crop parts of large images (possibly in multiple files), manipulate FITS extensions and keywords, and perform statistical operations. In addition, it contains programs to make catalogs from detection maps, add noise, make mock profiles with a variety of radial functions using Monte Carlo integration for their centers, match catalogs, and detect objects in an image, among many other operations. The command-line programs share the same basic command-line user interface for the comfort of both the users and developers. Gnuastro is written to comply fully with the GNU coding standards and integrates well with all Unix-like operating systems. This enables astronomers to expect a fully familiar experience in the source code, building, installing and command-line user interaction that they have seen in all the other GNU software that they use. Gnuastro's extensive library is included for users who want to build their own unique programs.

  14. Nephele: a cloud platform for simplified, standardized and reproducible microbiome data analysis.

    PubMed

    Weber, Nick; Liou, David; Dommer, Jennifer; MacMenamin, Philip; Quiñones, Mariam; Misner, Ian; Oler, Andrew J; Wan, Joe; Kim, Lewis; Coakley McCarthy, Meghan; Ezeji, Samuel; Noble, Karlynn; Hurt, Darrell E

    2018-04-15

    Widespread interest in the study of the microbiome has resulted in data proliferation and the development of powerful computational tools. However, many scientific researchers lack the time, training, or infrastructure to work with large datasets or to install and use command line tools. The National Institute of Allergy and Infectious Diseases (NIAID) has created Nephele, a cloud-based microbiome data analysis platform with standardized pipelines and a simple web interface for transforming raw data into biological insights. Nephele integrates common microbiome analysis tools as well as valuable reference datasets like the healthy human subjects cohort of the Human Microbiome Project (HMP). Nephele is built on the Amazon Web Services cloud, which provides centralized and automated storage and compute capacity, thereby reducing the burden on researchers and their institutions. Availability and implementation: https://nephele.niaid.nih.gov and https://github.com/niaid/Nephele. Contact: darrell.hurt@nih.gov.

  15. TabSQL: a MySQL tool to facilitate mapping user data to public databases.

    PubMed

    Xia, Xiao-Qin; McClelland, Michael; Wang, Yipeng

    2010-06-23

    With advances in high-throughput genomics and proteomics, it is challenging for biologists to deal with large data files and to map their data to annotations in public databases. We developed TabSQL, a MySQL-based application tool, for viewing, filtering and querying data files with large numbers of rows. TabSQL provides functions for downloading and installing table files from public databases including the Gene Ontology database (GO), the Ensembl databases, and genome databases from the UCSC genome bioinformatics site. Any other database that provides tab-delimited flat files can also be imported. The downloaded gene annotation tables can be queried together with users' data in TabSQL using either a graphic interface or command line. TabSQL allows queries across the user's data and public databases without programming. It is a convenient tool for biologists to annotate and enrich their data.
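    The kind of query TabSQL enables can be illustrated with Python's built-in sqlite3 (TabSQL itself is MySQL-based, and the table and column names below are invented for the example): a user's tab-delimited results are joined against a downloaded annotation table with ordinary SQL, no programming required:

```python
import sqlite3

# Hypothetical tables standing in for a user's data file and a downloaded
# GO annotation table of the kind TabSQL imports.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE user_data (gene_id TEXT, fold_change REAL)")
con.execute("CREATE TABLE go_annot (gene_id TEXT, go_term TEXT)")
con.executemany("INSERT INTO user_data VALUES (?, ?)",
                [("g1", 2.5), ("g2", 0.4), ("g3", 1.1)])
con.executemany("INSERT INTO go_annot VALUES (?, ?)",
                [("g1", "GO:0008150"), ("g2", "GO:0003674")])

# Annotate only the strongly changed genes by joining the two tables.
rows = con.execute(
    """SELECT u.gene_id, u.fold_change, a.go_term
       FROM user_data u JOIN go_annot a ON u.gene_id = a.gene_id
       WHERE u.fold_change > 1.0""").fetchall()
```

    In TabSQL the same join would be issued through its graphic interface or command line against MySQL tables populated from the public flat files.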

  16. TabSQL: a MySQL tool to facilitate mapping user data to public databases

    PubMed Central

    2010-01-01

    Background With advances in high-throughput genomics and proteomics, it is challenging for biologists to deal with large data files and to map their data to annotations in public databases. Results We developed TabSQL, a MySQL-based application tool, for viewing, filtering and querying data files with large numbers of rows. TabSQL provides functions for downloading and installing table files from public databases including the Gene Ontology database (GO), the Ensembl databases, and genome databases from the UCSC genome bioinformatics site. Any other database that provides tab-delimited flat files can also be imported. The downloaded gene annotation tables can be queried together with users' data in TabSQL using either a graphic interface or command line. Conclusions TabSQL allows queries across the user's data and public databases without programming. It is a convenient tool for biologists to annotate and enrich their data. PMID:20573251

  17. Nephele: a cloud platform for simplified, standardized and reproducible microbiome data analysis

    PubMed Central

    Weber, Nick; Liou, David; Dommer, Jennifer; MacMenamin, Philip; Quiñones, Mariam; Misner, Ian; Oler, Andrew J; Wan, Joe; Kim, Lewis; Coakley McCarthy, Meghan; Ezeji, Samuel; Noble, Karlynn; Hurt, Darrell E

    2018-01-01

    Abstract Motivation Widespread interest in the study of the microbiome has resulted in data proliferation and the development of powerful computational tools. However, many scientific researchers lack the time, training, or infrastructure to work with large datasets or to install and use command line tools. Results The National Institute of Allergy and Infectious Diseases (NIAID) has created Nephele, a cloud-based microbiome data analysis platform with standardized pipelines and a simple web interface for transforming raw data into biological insights. Nephele integrates common microbiome analysis tools as well as valuable reference datasets like the healthy human subjects cohort of the Human Microbiome Project (HMP). Nephele is built on the Amazon Web Services cloud, which provides centralized and automated storage and compute capacity, thereby reducing the burden on researchers and their institutions. Availability and implementation https://nephele.niaid.nih.gov and https://github.com/niaid/Nephele Contact darrell.hurt@nih.gov PMID:29028892

  18. Applicability of Performance Assessment Tools to Marine Corps Air Ground Task Force C4 System of Systems Performance Assessment

    DTIC Science & Technology

    2010-09-01

    application of existing assessment tools that may be applicable to Marine Air Ground Task Force (MAGTF) Command, Control, Communications and Computers (C4)...assessment tools and analysis concepts that may be extended to the Marine Corps’ C4 System of Systems assessment methodology as a means to obtain a

  19. GenomicTools: a computational platform for developing high-throughput analytics in genomics.

    PubMed

    Tsirigos, Aristotelis; Haiminen, Niina; Bilal, Erhan; Utro, Filippo

    2012-01-15

    Recent advances in sequencing technology have resulted in the dramatic increase of sequencing data, which, in turn, requires efficient management of computational resources, such as computing time, memory requirements as well as prototyping of computational pipelines. We present GenomicTools, a flexible computational platform, comprising both a command-line set of tools and a C++ API, for the analysis and manipulation of high-throughput sequencing data such as DNA-seq, RNA-seq, ChIP-seq and MethylC-seq. GenomicTools implements a variety of mathematical operations between sets of genomic regions thereby enabling the prototyping of computational pipelines that can address a wide spectrum of tasks ranging from pre-processing and quality control to meta-analyses. Additionally, the GenomicTools platform is designed to analyze large datasets of any size by minimizing memory requirements. In practical applications, where comparable, GenomicTools outperforms existing tools in terms of both time and memory usage. The GenomicTools platform (version 2.0.0) was implemented in C++. The source code, documentation, user manual, example datasets and scripts are available online at http://code.google.com/p/ibm-cbc-genomic-tools.
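    As a minimal illustration of the region-set operations such a platform composes into pipelines (a generic Python sketch, not the GenomicTools C++ API), the intersection of two sorted lists of genomic intervals can be computed in a single linear sweep:

```python
def intersect(a, b):
    """Pairwise overlap of two sorted, non-overlapping lists of
    (start, end) regions; a basic building block of region arithmetic."""
    out, i, j = [], 0, 0
    while i < len(a) and j < len(b):
        lo = max(a[i][0], b[j][0])
        hi = min(a[i][1], b[j][1])
        if lo < hi:                      # non-empty overlap
            out.append((lo, hi))
        if a[i][1] < b[j][1]:            # advance whichever ends first
            i += 1
        else:
            j += 1
    return out

peaks = [(100, 200), (300, 400)]   # e.g. ChIP-seq peaks
genes = [(150, 350), (390, 500)]   # e.g. gene bodies
overlaps = intersect(peaks, genes)
```

    Because both inputs are consumed in one pass, the sweep runs in linear time and constant extra memory, in the spirit of the memory-minimizing design described above.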

  20. Command History for 1990

    DTIC Science & Technology

    1991-05-01

    Marine Corps Tiaining Systems (CBESS) memorization training Inteligence Center, Dam Neck Threat memorization training Commander Tactical Wings, Atlantic...News Shipbuilding Technical training AEGIS Training Center, Dare Artificial Intelligence (Al) Tools Computerized firm-end analysis tools NETSCPAC...Technology Department and provides computational and electronic mail support for research in areas of artificial intelligence, computer-assisted instruction

  1. Using PSEA-Quant for Protein Set Enrichment Analysis of Quantitative Mass Spectrometry-Based Proteomics

    PubMed Central

    Lavallée-Adam, Mathieu

    2017-01-01

    PSEA-Quant analyzes quantitative mass spectrometry-based proteomics datasets to identify enrichments of annotations contained in repositories such as the Gene Ontology and Molecular Signature databases. It allows users to identify the annotations that are significantly enriched for reproducibly quantified high abundance proteins. PSEA-Quant is available on the web and as a command-line tool. It is compatible with all label-free and isotopic labeling-based quantitative proteomics methods. This protocol describes how to use PSEA-Quant and interpret its output. The importance of each parameter as well as troubleshooting approaches are also discussed. PMID:27010334
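    PSEA-Quant's own statistics are tailored to reproducibly quantified, high-abundance proteins, but the generic idea behind annotation enrichment can be illustrated with a standard hypergeometric over-representation test (a simplified stand-in, not PSEA-Quant's algorithm):

```python
from math import comb

def hypergeom_enrichment_p(k, n, K, N):
    """P(X >= k) when drawing n proteins from a background of N,
    of which K carry the annotation: the classic enrichment tail test."""
    return sum(comb(K, x) * comb(N - K, n - x)
               for x in range(k, min(n, K) + 1)) / comb(N, n)

# 8 of 10 selected proteins carry an annotation held by 20 of 100 overall.
p = hypergeom_enrichment_p(k=8, n=10, K=20, N=100)
```

    A small p-value indicates the annotation is over-represented in the selected set relative to the background, which is the qualitative conclusion such tools report.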

  2. KERNELHR: A program for estimating animal home ranges

    USGS Publications Warehouse

    Seaman, D.E.; Griffith, B.; Powell, R.A.

    1998-01-01

    Kernel methods are state of the art for estimating animal home-range area and utilization distribution (UD). The KERNELHR program was developed to provide researchers and managers a tool to implement this extremely flexible set of methods with many variants. KERNELHR runs interactively or from the command line on any personal computer (PC) running DOS. KERNELHR provides output of fixed and adaptive kernel home-range estimates, as well as density values in a format suitable for in-depth statistical and spatial analyses. An additional package of programs creates contour files for plotting in geographic information systems (GIS) and estimates core areas of ranges.
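    The fixed-kernel estimate at the heart of such programs is straightforward to sketch. The following Python snippet (an illustrative stand-in, not KERNELHR code) evaluates a Gaussian-kernel utilization distribution at a query point from a set of animal relocation fixes:

```python
from math import exp, pi

def kde2d(points, h, x, y):
    """Fixed-bandwidth Gaussian kernel density at (x, y); evaluating this
    over a grid yields the utilization distribution (UD)."""
    s = sum(exp(-((x - px) ** 2 + (y - py) ** 2) / (2 * h * h))
            for px, py in points)
    return s / (len(points) * 2 * pi * h * h)

fixes = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]  # relocations
density_center = kde2d(fixes, h=1.0, x=0.5, y=0.5)
density_far = kde2d(fixes, h=1.0, x=5.0, y=5.0)
```

    Adaptive-kernel variants, which KERNELHR also provides, let the bandwidth h vary with local point density rather than staying fixed.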

  3. proBAMconvert: A Conversion Tool for proBAM/proBed.

    PubMed

    Olexiouk, Volodimir; Menschaert, Gerben

    2017-07-07

    The introduction of new standard formats, proBAM and proBed, improves the integration of genomics and proteomics information, thus aiding proteogenomics applications. These novel formats enable peptide spectrum matches (PSM) to be stored, inspected, and analyzed within the context of the genome. However, an easy-to-use and transparent tool to convert mass spectrometry identification files to these new formats is indispensable. proBAMconvert enables the conversion of common identification file formats (mzIdentML, mzTab, and pepXML) to proBAM/proBed using an intuitive interface. Furthermore, proBAMconvert can output information at both the PSM and peptide levels and has a command line interface in addition to the graphical user interface. Detailed documentation and a completely worked-out tutorial are available at http://probam.biobix.be.

  4. Stackfile Database

    NASA Technical Reports Server (NTRS)

    deVarvalho, Robert; Desai, Shailen D.; Haines, Bruce J.; Kruizinga, Gerhard L.; Gilmer, Christopher

    2013-01-01

    This software provides storage, retrieval, and analysis functionality for managing satellite altimetry data. It improves the efficiency and analysis capabilities of existing database software with improved flexibility and documentation. It offers flexibility in the type of data that can be stored. There is efficient retrieval either across the spatial domain or the time domain. Built-in analysis tools are provided for frequently performed altimetry tasks. This software package is used for storing and manipulating satellite measurement data. It was developed with a focus on handling the requirements of repeat-track altimetry missions such as Topex and Jason. It was, however, designed to work with a wide variety of satellite measurement data (e.g., the Gravity Recovery And Climate Experiment, GRACE). The software consists of several command-line tools for importing, retrieving, and analyzing satellite measurement data.

  5. CAT 2 - An improved version of Cryogenic Analysis Tools for online and offline monitoring and analysis of large size cryostats

    NASA Astrophysics Data System (ADS)

    Pagliarone, C. E.; Uttaro, S.; Cappelli, L.; Fallone, M.; Kartal, S.

    2017-02-01

    CAT, Cryogenic Analysis Tools, is a software package developed using LabVIEW and ROOT environments to analyze the performance of large size cryostats, where many parameters, inputs, and control variables need to be acquired and studied at the same time. The present paper describes how CAT works and the main improvements achieved in the new version, CAT 2. New graphical user interfaces have been developed to make the full package more user-friendly, and a process of resource optimization has been carried out. Offline analysis of the full cryostat performance is available both through the ROOT command line interface and through the new graphical interfaces.

  6. Reaction Decoder Tool (RDT): extracting features from chemical reactions.

    PubMed

    Rahman, Syed Asad; Torrance, Gilliean; Baldacci, Lorenzo; Martínez Cuesta, Sergio; Fenninger, Franz; Gopal, Nimish; Choudhary, Saket; May, John W; Holliday, Gemma L; Steinbeck, Christoph; Thornton, Janet M

    2016-07-01

    Extracting chemical features like Atom-Atom Mapping (AAM), Bond Changes (BCs) and Reaction Centres from biochemical reactions helps us understand the chemical composition of enzymatic reactions. Reaction Decoder is a robust command line tool, which performs this task with high accuracy. It supports standard chemical input/output exchange formats, i.e., RXN/SMILES, computes AAM, highlights BCs and creates images of the mapped reaction. This aids in the analysis of metabolic pathways and the ability to perform comparative studies of chemical reactions based on these features. This software is implemented in Java, supported on Windows, Linux and Mac OSX, and freely available at https://github.com/asad/ReactionDecoder. Contact: asad@ebi.ac.uk or s9asad@gmail.com. © The Author 2016. Published by Oxford University Press.

  7. The Supreme Allied Commander’s Operational Approach

    DTIC Science & Technology

    2014-05-22

    the Seine below Paris, and then began converging toward the northern line of march just below the first. A third line broke off from the second in a...toward the Seine Basin and Paris.” Therefore, in the aftermath of Goodwood Eisenhower felt both disappointed and deceived. After Goodwood, Eisenhower...command. Simultaneously, Bradley’s armies were to “[capture] Brest, [protect] the southern flank of the

  8. msap: a tool for the statistical analysis of methylation-sensitive amplified polymorphism data.

    PubMed

    Pérez-Figueroa, A

    2013-05-01

    In this study msap, an R package which analyses methylation-sensitive amplified polymorphism (MSAP or MS-AFLP) data, is presented. The program provides a deep analysis of epigenetic variation starting from a binary data matrix indicating the banding pattern between the isoschizomeric endonucleases HpaII and MspI, with differential sensitivity to cytosine methylation. After comparing the restriction fragments, the program determines if each fragment is susceptible to methylation (representative of epigenetic variation) or if there is no evidence of methylation (representative of genetic variation). The package provides, in a user-friendly command line interface, a pipeline of different analyses of the variation (genetic and epigenetic) among user-defined groups of samples, as well as the classification of the methylation occurrences in those groups. Statistical testing provides support to the analyses. A comprehensive report of the analyses and several useful plots can help researchers to assess the epigenetic and genetic variation in their MSAP experiments. msap is downloadable from CRAN (http://cran.r-project.org/) and its own webpage (http://msap.r-forge.R-project.org/). The package is intended to be easy to use even for those people unfamiliar with the R command line environment. Advanced users may take advantage of the available source code to adapt msap to more complex analyses. © 2013 Blackwell Publishing Ltd.
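    The core call made for each fragment can be paraphrased as follows (a simplified Python sketch of the idea, not the package's exact criteria, and msap itself is written in R): if the HpaII and MspI banding patterns ever disagree across samples, the fragment is treated as methylation-susceptible; if they always agree, its variation is treated as genetic:

```python
def classify_fragment(hpaii, mspi):
    """Toy MSAP fragment call from parallel presence/absence (1/0) band
    vectors for the two isoschizomers across the same samples."""
    if any(h != m for h, m in zip(hpaii, mspi)):
        return "methylation-susceptible"   # epigenetic variation
    return "no methylation evidence"       # genetic variation

frag1 = classify_fragment(hpaii=[1, 0, 1, 0], mspi=[1, 1, 1, 1])
frag2 = classify_fragment(hpaii=[1, 0, 1, 0], mspi=[1, 0, 1, 0])
```

    The real package then pipes fragments of each class into separate downstream analyses of epigenetic versus genetic differentiation among the user-defined groups.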

  9. Telemetry-Enhancing Scripts

    NASA Technical Reports Server (NTRS)

    Maimone, Mark W.

    2009-01-01

    Scripts Providing a Cool Kit of Telemetry Enhancing Tools (SPACKLE) is a set of software tools that fill gaps in capabilities of other software used in processing downlinked data in the Mars Exploration Rovers (MER) flight and test-bed operations. SPACKLE tools have helped to accelerate the automatic processing and interpretation of MER mission data, enabling non-experts to understand and/or use MER query and data product command simulation software tools more effectively. SPACKLE has greatly accelerated some operations and provides new capabilities. The tools of SPACKLE are written, variously, in Perl or the C or C++ language. They perform a variety of search and shortcut functions that include the following: Generating text-only, Event Report-annotated, and Web-enhanced views of command sequences; Labeling integer enumerations with their symbolic meanings in text messages and engineering channels; Systematic detection of corruption within data products; Generating text-only displays of data-product catalogs, including downlink status; Validating and labeling commands related to data products; Performing convenient searches of detailed engineering data spanning multiple Martian solar days; Generating tables of initial conditions pertaining to engineering, health, and accountability data; Simplified construction and simulation of command sequences; and Fast time format conversions and sorting.
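    One of the simpler gap-filling functions listed above, labeling integer enumerations with their symbolic meanings, amounts to a lookup like the following (a Python sketch with an invented enumeration; SPACKLE's real mappings come from the MER command and telemetry dictionaries, and its tools are in Perl, C, or C++):

```python
# Hypothetical enumeration table for one engineering channel.
MOTOR_STATE = {0: "IDLE", 1: "DRIVING", 2: "FAULT"}

def label_channel(raw_values, enum):
    """Annotate raw integer telemetry with symbolic names so that
    non-experts can read the channel without the dictionary at hand."""
    return ["%d (%s)" % (v, enum.get(v, "UNKNOWN")) for v in raw_values]

labeled = label_channel([0, 2, 7], MOTOR_STATE)
```

    Unknown values are flagged rather than dropped, which also makes this kind of labeling a cheap first-pass corruption check.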

  10. QDENSITY—A Mathematica Quantum Computer simulation

    NASA Astrophysics Data System (ADS)

    Juliá-Díaz, Bruno; Burdis, Joseph M.; Tabakin, Frank

    2006-06-01

    This Mathematica 5.2 package is a simulation of a Quantum Computer. The program provides a modular, instructive approach for generating the basic elements that make up a quantum circuit. The main emphasis is on using the density matrix, although an approach using state vectors is also implemented in the package. The package commands are defined in Qdensity.m which contains the tools needed in quantum circuits, e.g., multiqubit kets, projectors, gates, etc. Selected examples of the basic commands are presented here and a tutorial notebook, Tutorial.nb is provided with the package (available on our website) that serves as a full guide to the package. Finally, application is made to a variety of relevant cases, including Teleportation, Quantum Fourier transform, Grover's search and Shor's algorithm, in separate notebooks: QFT.nb, Teleportation.nb, Grover.nb and Shor.nb where each algorithm is explained in detail. Finally, two examples of the construction and manipulation of cluster states, which are part of "one way computing" ideas, are included as an additional tool in the notebook Cluster.nb. A Mathematica palette containing most commands in QDENSITY is also included: QDENSpalette.nb.
    Program summary
    Title of program: QDENSITY
    Catalogue identifier: ADXH_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXH_v1_0
    Program available from: CPC Program Library, Queen's University of Belfast, N. Ireland
    Operating systems: Any which supports Mathematica; tested under Microsoft Windows XP, Macintosh OS X, and Linux FC4
    Programming language used: Mathematica 5.2
    No. of bytes in distributed program, including test data, etc.: 180 581
    No. of lines in distributed program, including test data, etc.: 19 382
    Distribution format: tar.gz
    Method of solution: A Mathematica package is provided which contains commands to create and analyze quantum circuits. Several Mathematica notebooks containing relevant examples (Teleportation, Shor's algorithm and Grover's search) are explained in detail. A tutorial, Tutorial.nb, is also enclosed. QDENSITY is available at http://www.pitt.edu/~tabakin/QDENSITY.
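    The density-matrix bookkeeping QDENSITY performs in Mathematica can be illustrated in a few lines of plain Python (a toy single-qubit example, not the package itself): start in |0⟩⟨0|, apply a Hadamard gate as ρ → HρH† (H is real and symmetric, so H† = H here), and read the measurement probabilities off the diagonal:

```python
import math

# Hadamard gate as a plain nested list (real entries only).
s = 1 / math.sqrt(2)
H = [[s, s],
     [s, -s]]

def matmul(a, b):
    """Naive matrix product, enough for 2x2 toy circuits."""
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

rho0 = [[1.0, 0.0], [0.0, 0.0]]        # density matrix |0><0|
rho = matmul(matmul(H, rho0), H)       # rho -> H rho H†
p0, p1 = rho[0][0], rho[1][1]          # measurement probabilities
```

    As expected for a Hadamard on |0⟩, both outcomes come out equally likely; QDENSITY builds multiqubit versions of exactly this kind of update.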

  11. BrEPS 2.0: Optimization of sequence pattern prediction for enzyme annotation.

    PubMed

    Dudek, Christian-Alexander; Dannheim, Henning; Schomburg, Dietmar

    2017-01-01

    The prediction of gene functions is crucial for a large number of different life science areas. Faster high throughput sequencing techniques generate more and larger datasets. The manual annotation by classical wet-lab experiments is not suitable for these large amounts of data. We showed earlier that the automatic sequence pattern-based BrEPS protocol, based on manually curated sequences, can be used for the prediction of enzymatic functions of genes. The growing sequence databases provide the opportunity for more reliable patterns, but are also a challenge for the implementation of automatic protocols. We reimplemented and optimized the BrEPS pattern generation to be applicable for larger datasets in an acceptable timescale. The primary improvement of the new BrEPS protocol is the enhanced data selection step. Manually curated annotations from Swiss-Prot are used as a reliable source for function prediction of enzymes observed on the protein level. The pool of sequences is extended by highly similar sequences from TrEMBL and Swiss-Prot. This allows us to restrict the selection of Swiss-Prot entries, without losing the diversity of sequences needed to generate significant patterns. Additionally, a supporting pattern type was introduced by extending the patterns at semi-conserved positions with highly similar amino acids. Extended patterns have an increased complexity, increasing the chance to match more sequences, without losing the essential structural information of the pattern. To enhance the usability of the database, we introduced enzyme function prediction based on consensus EC numbers and IUBMB enzyme nomenclature. BrEPS is part of the Braunschweig Enzyme Database (BRENDA) and is available on a completely redesigned website and as a download. The database can be downloaded and used with the BrEPScmd command line tool for large scale sequence analysis.
The BrEPS website and downloads for the database creation tool, command line tool and database are freely accessible at http://breps.tu-bs.de.
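    The "extended pattern" idea, widening semi-conserved positions to sets of highly similar amino acids, maps naturally onto regular-expression character classes. The sketch below illustrates only that general mechanism; the motif itself is invented and is not an actual BrEPS pattern:

```python
import re

# Fully conserved positions stay literal; semi-conserved positions are
# widened to classes of similar residues ([ILV] aliphatic, [DE] acidic).
pattern = re.compile(r"G[ILV]S[DE]G")

sequences = ["AGISDGK", "AGVSEGK", "AGASDGK"]
hits = [s for s in sequences if pattern.search(s)]
```

    The widened classes let one pattern match more homologous sequences while the invariant positions still carry the discriminating information.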

  12. BrEPS 2.0: Optimization of sequence pattern prediction for enzyme annotation

    PubMed Central

    Schomburg, Dietmar

    2017-01-01

    The prediction of gene functions is crucial for a large number of different life science areas. Faster high throughput sequencing techniques generate more and larger datasets. The manual annotation by classical wet-lab experiments is not suitable for these large amounts of data. We showed earlier that the automatic sequence pattern-based BrEPS protocol, based on manually curated sequences, can be used for the prediction of enzymatic functions of genes. The growing sequence databases provide the opportunity for more reliable patterns, but are also a challenge for the implementation of automatic protocols. We reimplemented and optimized the BrEPS pattern generation to be applicable for larger datasets in an acceptable timescale. The primary improvement of the new BrEPS protocol is the enhanced data selection step. Manually curated annotations from Swiss-Prot are used as a reliable source for function prediction of enzymes observed on the protein level. The pool of sequences is extended by highly similar sequences from TrEMBL and Swiss-Prot. This allows us to restrict the selection of Swiss-Prot entries, without losing the diversity of sequences needed to generate significant patterns. Additionally, a supporting pattern type was introduced by extending the patterns at semi-conserved positions with highly similar amino acids. Extended patterns have an increased complexity, increasing the chance to match more sequences, without losing the essential structural information of the pattern. To enhance the usability of the database, we introduced enzyme function prediction based on consensus EC numbers and IUBMB enzyme nomenclature. BrEPS is part of the Braunschweig Enzyme Database (BRENDA) and is available on a completely redesigned website and as a download. The database can be downloaded and used with the BrEPScmd command line tool for large scale sequence analysis.
The BrEPS website and downloads for the database creation tool, command line tool and database are freely accessible at http://breps.tu-bs.de. PMID:28750104

  13. Multilingual Speech and Language Processing

    DTIC Science & Technology

    2003-04-01

    client software handles the user end of the transaction. Historically, four clients were provided: e-mail, web, FrameMaker, and command line. By...command-line client and an API. The API allows integration of CyberTrans into a number of processes including word processing packages (FrameMaker...preservation and logging, and others. The available clients remain e-mail, Web and FrameMaker. Platforms include both Unix and PC for clients, with

  14. Serial Interface through Stream Protocol on EPICS Platform for Distributed Control and Monitoring

    NASA Astrophysics Data System (ADS)

    Das Gupta, Arnab; Srivastava, Amit K.; Sunil, S.; Khan, Ziauddin

    2017-04-01

    Remote operation of equipment and devices is implemented in distributed systems for the control and proper monitoring of process values. For such remote operations, the Experimental Physics and Industrial Control System (EPICS) is used as an important software tool for control and monitoring of a wide range of scientific parameters. A hardware interface was developed for implementation of the EPICS software so that different equipment such as data converters, power supplies, pump controllers, etc. could be remotely operated through a stream protocol. The EPICS base was set up on Windows as well as Linux operating systems for control and monitoring, while EPICS modules such as asyn and StreamDevice were used to interface the equipment with the standard RS-232/RS-485 protocol. StreamDevice communicates with the serial line through an interface to asyn drivers. The graphical user interface and alarm handling were implemented with the Motif Editor and Display Manager (MEDM) and Alarm Handler (ALH) command line channel access utility tools. This paper describes the developed application, which was tested with different equipment and devices serially interfaced to PCs on a distributed network.
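    A StreamDevice protocol file essentially describes ASCII out/in exchanges over the serial line. The framing idea can be sketched in Python; the command text below is a generic SCPI-style placeholder invented for the example, and a real protocol uses the device's documented syntax and terminators:

```python
def frame(cmd, terminator="\r\n"):
    """Append the line terminator and encode, as an 'out' message."""
    return (cmd + terminator).encode("ascii")

def parse_reply(raw, terminator="\r\n"):
    """Decode an 'in' message and strip the trailing terminator."""
    return raw.decode("ascii").rstrip(terminator)

msg = frame("MEAS:VOLT?")              # bytes to write to the serial port
value = parse_reply(b"+1.2345E+00\r\n")  # reply as read back from the device
```

    In the real system this framing is declared in the protocol file and executed by StreamDevice on top of the asyn serial driver, rather than hand-coded.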

  15. MicMac GIS application: free open source

    NASA Astrophysics Data System (ADS)

    Duarte, L.; Moutinho, O.; Teodoro, A.

    2016-10-01

    The use of a Remotely Piloted Aerial System (RPAS) for remote sensing applications is becoming more frequent, as on-board camera and platform technologies become a serious contender to satellite and airplane imagery. MicMac is a photogrammetric tool for image matching that can be used in different contexts. It is open source software and can be used from the command line or with a graphic interface (for each command). The main objective of this work was the integration of MicMac with QGIS, which is also open source software, in order to create a new open source tool applied to photogrammetry/remote sensing. The Python language was used to develop the application. This tool would be very useful in the manipulation and 3D modelling of a set of images. A further objective was to create a toolbar in QGIS exposing the basic functionalities through intuitive graphic interfaces. The toolbar is composed of three buttons: produce the point cloud, create the Digital Elevation Model (DEM), and produce the orthophoto of the study area. The application was tested with 35 photos, a subset of images acquired by an RPAS in the Aguda beach area, Porto, Portugal. They were used to create a 3D terrain model and, from this model, to obtain an orthophoto and the corresponding DEM. The code is open and can be modified according to the user's requirements. This integration would be very useful to the photogrammetry and remote sensing community, combined with GIS capabilities.
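    A plugin of this kind typically shells out to the photogrammetric tool by assembling an argument vector for subprocess. The sketch below is generic: the subcommand and arguments are placeholders invented for the example, not MicMac's documented interface:

```python
import shlex
import subprocess

def build_command(tool, subcommand, args):
    """Assemble an argument vector safe to pass to subprocess.run
    (no shell string interpolation, so filenames need no quoting)."""
    return [tool, subcommand] + [str(a) for a in args]

# "SomeStep" and the arguments are placeholders for a real MicMac command.
cmd = build_command("mm3d", "SomeStep", ["images/.*.jpg", "Out=dem.tif"])
# subprocess.run(cmd, check=True)  # executed only when the tool is installed
printable = shlex.join(cmd)        # for logging what would be run
```

    Passing a list (rather than a shell string) avoids quoting bugs when paths contain spaces, which matters for a GUI plugin that receives arbitrary user paths.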

  16. Dynamo Catalogue: Geometrical tools and data management for particle picking in subtomogram averaging of cryo-electron tomograms.

    PubMed

    Castaño-Díez, Daniel; Kudryashev, Mikhail; Stahlberg, Henning

    2017-02-01

    Cryo-electron tomography allows macromolecular complexes within vitrified, intact, thin cells or sections thereof to be visualized, and structural analysis to be performed in situ by averaging over multiple copies of the same molecules. Image processing for subtomogram averaging is specific and cumbersome, due to the large amount of data and its three-dimensional nature and anisotropic resolution. Here, we streamline data processing for subtomogram averaging by introducing an archiving system, Dynamo Catalogue. This system manages tomographic data from multiple tomograms and allows visual feedback during all processing steps, including particle picking, extraction, alignment and classification. The file structure of a processing project includes logfiles of performed operations, and can be backed up and shared between users. Command line commands, database queries and a set of GUIs give the user versatile control over the process. Here, we introduce a set of geometric tools that streamline particle picking from simple geometries (filaments, spheres, tubes, vesicles) and complex ones (arbitrary 2D surfaces, rare instances of proteins with geometric restrictions, and 2D and 3D crystals). Advanced functionality, such as manual alignment and subboxing, is useful when initial templates are generated for alignment and for project customization. Dynamo Catalogue is part of the open source package Dynamo and includes tools to ensure format compatibility with the subtomogram averaging functionalities of other packages, such as Jsubtomo, PyTom, PEET, EMAN2, XMIPP and Relion. Copyright © 2016. Published by Elsevier Inc.
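    For a simple geometry such as a spherical vesicle, seeding extraction points is just sampling the surface. The following Python sketch (a naive latitude/longitude scheme, not Dynamo's own sampling, and Dynamo itself is MATLAB-based) generates candidate positions at a fixed radius around a picked center:

```python
import math

def sphere_positions(center, radius, n_ring, n_per_ring):
    """Candidate subtomogram-extraction points on a spherical surface,
    arranged on latitude rings between the poles."""
    cx, cy, cz = center
    pts = []
    for i in range(1, n_ring + 1):
        theta = math.pi * i / (n_ring + 1)          # polar angle
        for j in range(n_per_ring):
            phi = 2 * math.pi * j / n_per_ring      # azimuth
            pts.append((cx + radius * math.sin(theta) * math.cos(phi),
                        cy + radius * math.sin(theta) * math.sin(phi),
                        cz + radius * math.cos(theta)))
    return pts

pts = sphere_positions((0.0, 0.0, 0.0), 100.0, n_ring=3, n_per_ring=8)
```

    Geometry-driven seeding like this also provides an initial orientation (the surface normal) for each particle, which is what makes such tools useful for initializing alignments.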

  17. Method and apparatus for characterizing and enhancing the functional performance of machine tools

    DOEpatents

    Barkman, William E; Babelay, Jr., Edwin F; Smith, Kevin Scott; Assaid, Thomas S; McFarland, Justin T; Tursky, David A; Woody, Bethany; Adams, David

    2013-04-30

    Disclosed are various systems and methods for assessing and improving the capability of a machine tool. The disclosure applies to machine tools having at least one slide configured to move along a motion axis. Various patterns of dynamic excitation commands are employed to drive the one or more slides, typically involving repetitive short distance displacements. A quantification of a measurable merit of machine tool response to the one or more patterns of dynamic excitation commands is typically derived for the machine tool. Examples of measurable merits of machine tool performance include workpiece surface finish, and the ability to generate chips of the desired length.

  18. Method and apparatus for characterizing and enhancing the dynamic performance of machine tools

    DOEpatents

    Barkman, William E; Babelay, Jr., Edwin F

    2013-12-17

    Disclosed are various systems and methods for assessing and improving the capability of a machine tool. The disclosure applies to machine tools having at least one slide configured to move along a motion axis. Various patterns of dynamic excitation commands are employed to drive the one or more slides, typically involving repetitive short distance displacements. A quantification of a measurable merit of machine tool response to the one or more patterns of dynamic excitation commands is typically derived for the machine tool. Examples of measurable merits of machine tool performance include dynamic one axis positional accuracy of the machine tool, dynamic cross-axis stability of the machine tool, and dynamic multi-axis positional accuracy of the machine tool.
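    The "repetitive short distance displacements" can be pictured as a simple command pattern. The sketch below is illustrative only, with an arbitrary step size and cycle count, and does not reproduce the patented excitation patterns:

```python
def excitation_pattern(step, cycles):
    """Target positions for one slide axis: repeated short
    forward/back displacements around the starting position."""
    moves = []
    pos = 0.0
    for _ in range(cycles):
        for delta in (step, -step):
            pos += delta
            moves.append(pos)
    return moves

# Three cycles of a 10-micron (0.01 mm) back-and-forth displacement.
cmds = excitation_pattern(0.01, 3)
```

    Driving the slide with such a pattern and measuring the achieved positions is what yields the quantified merits (positional accuracy, cross-axis stability) described above.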

  19. Review of Ground Systems Development and Operations (GSDO) Tools for Verifying Command and Control Software

    NASA Technical Reports Server (NTRS)

    Aguilar, Michael L.; Bonanne, Kevin H.; Favretto, Jeffrey A.; Jackson, Maddalena M.; Jones, Stephanie L.; Mackey, Ryan M.; Sarrel, Marc A.; Simpson, Kimberly A.

    2014-01-01

    The Exploration Systems Development (ESD) Standing Review Board (SRB) requested the NASA Engineering and Safety Center (NESC) conduct an independent review of the plan developed by Ground Systems Development and Operations (GSDO) for identifying models and emulators to create a tool(s) to verify their command and control software. The NESC was requested to identify any issues or weaknesses in the GSDO plan. This document contains the outcome of the NESC review.

  20. New information technology tools for a medical command system for mass decontamination.

    PubMed

    Fuse, Akira; Okumura, Tetsu; Hagiwara, Jun; Tanabe, Tomohide; Fukuda, Reo; Masuno, Tomohiko; Mimura, Seiji; Yamamoto, Kaname; Yokota, Hiroyuki

    2013-06-01

    In a mass decontamination during a nuclear, biological, or chemical (NBC) response, the capability to command, control, and communicate is crucial for the proper flow of casualties at the scene and their subsequent evacuation to definitive medical facilities. Information Technology (IT) tools can be used to strengthen medical control, command, and communication during such a response. Novel IT tools comprise a vehicle-based, remote video camera and communication network systems. During an on-site verification event, an image from a remote video camera system attached to the personal protective garment of a medical responder working in the warm zone was transmitted to the on-site Medical Commander for aid in decision making. Similarly, a communication network system was used for personnel at the following points: (1) the on-site Medical Headquarters; (2) the decontamination hot zone; (3) an on-site coordination office; and (4) a remote medical headquarters of a local government office. A specially equipped, dedicated vehicle was used for the on-site medical headquarters, and facilitated the coordination with other agencies. The use of these IT tools proved effective in assisting with the medical command and control of medical resources and patient transport decisions during a mass-decontamination exercise, but improvements are required to overcome transmission delays and camera direction settings, as well as network limitations in certain areas.

  1. G4RNA screener web server: User focused interface for RNA G-quadruplex prediction.

    PubMed

    Garant, Jean-Michel; Perreault, Jean-Pierre; Scott, Michelle S

    2018-06-06

    Though RNA G-quadruplexes became a focus of study over a decade ago, the identification of new potential G-quadruplexes remains a bottleneck. It slows the study of these non-canonical structures in nucleic acids, and thus the understanding of their significance. The G4RNA screener is an accurate tool for the prediction of RNA G-quadruplexes, but its deployment has brought to light an accessibility issue for G-quadruplex experts and biologists. The G4RNA screener web server is a platform that provides a much-needed interface to manage the input, parameters and result display of the main command-line-ready tool. It is accessible at http://scottgroup.med.usherbrooke.ca/G4RNA_screener/. Copyright © 2018. Published by Elsevier B.V.

  2. Annual Historical Review.

    DTIC Science & Technology

    1987-01-01

    MAJOR GENERAL FRED HISSONG, JR., Commanding General, US Army Armament, Munitions and Chemical Command (AMCCOM); AMCCOM Deputy Commanding Generals. Table of contents entries include: Materials Handling for Brake and Clutch Repair; Steam Cleaners; Tool Improvement Program Suggestions; Test Stand, Automotive Generator/Alternator.

  3. Ada software productivity prototypes: A case study

    NASA Technical Reports Server (NTRS)

    Hihn, Jairus M.; Habib-Agahi, Hamid; Malhotra, Shan

    1988-01-01

    A case study of the impact of Ada on a Command and Control project completed at the Jet Propulsion Laboratory (JPL) is given. The data for this study was collected as part of a general survey of software costs and productivity at JPL and other NASA sites. The task analyzed is a successful example of the use of rapid prototyping as applied to command and control for the U.S. Air Force and provides the U.S. Air Force Military Airlift Command with the ability to track aircraft, air crews and payloads worldwide. The task consists of a replicated database at several globally distributed sites. The local databases at each site can be updated within seconds after changes are entered at any one site. The system must be able to handle up to 400,000 activities per day. There are currently seven sites, each with a local area network of computers and a variety of user displays; the local area networks are tied together into a single wide area network. Using data obtained for eight modules, totaling approximately 500,000 source lines of code, researchers analyze the differences in productivity between subtasks. Factors considered are the percentage of Ada used in coding, years of programmer experience, and the use of Ada tools and modern programming practices. The principal findings are the following. Productivity is very sensitive to programmer experience. The use of Ada software tools and of modern programming practices is important; without them Ada is just a large, complex language that can cause productivity to decrease. The impact of Ada on development effort phases is consistent with earlier reports at the project level but not at the module level.

  4. MARTA: a suite of Java-based tools for assigning taxonomic status to DNA sequences.

    PubMed

    Horton, Matthew; Bodenhausen, Natacha; Bergelson, Joy

    2010-02-15

    We have created a suite of Java-based software to better provide taxonomic assignments to DNA sequences. We anticipate that the program will be useful for protistologists, virologists, mycologists and other microbial ecologists. The program relies on NCBI utilities, including the BLAST software and Taxonomy database, and is easily manipulated on the command line to specify a BLAST candidate's query-coverage or percent-identity requirements; other options include the ability to set minimal consensus requirements (%) for each of the eight major taxonomic ranks (Domain, Kingdom, Phylum, ...) and whether to consider lower-scoring candidates when the top hit lacks taxonomic classification.

  5. User's manual for EZPLOT version 5.5: A FORTRAN program for 2-dimensional graphic display of data

    NASA Technical Reports Server (NTRS)

    Garbinski, Charles; Redin, Paul C.; Budd, Gerald D.

    1988-01-01

    EZPLOT is a computer applications program that converts data resident on a file into a plot displayed on the screen of a graphics terminal. This program generates either time history or x-y plots in response to commands entered interactively from a terminal keyboard. Plot parameters consist of a single independent parameter and from one to eight dependent parameters. Various line patterns, symbol shapes, axis scales, text labels, and data modification techniques are available. This user's manual describes EZPLOT as it is implemented on the Ames Research Center, Dryden Research Facility ELXSI computer using DI-3000 graphics software tools.

  6. The PyRosetta Toolkit: a graphical user interface for the Rosetta software suite.

    PubMed

    Adolf-Bryfogle, Jared; Dunbrack, Roland L

    2013-01-01

    The Rosetta Molecular Modeling suite is a command-line-only collection of applications that enable high-resolution modeling and design of proteins and other molecules. Although extremely useful, Rosetta can be difficult to learn for scientists with little computational or programming experience. To that end, we have created a Graphical User Interface (GUI) for Rosetta, called the PyRosetta Toolkit, for creating and running protocols in Rosetta for common molecular modeling and protein design tasks and for analyzing the results of Rosetta calculations. The program is highly extensible so that developers can add new protocols and analysis tools to the PyRosetta Toolkit GUI.

  7. A Web Tool for Research in Nonlinear Optics

    NASA Astrophysics Data System (ADS)

    Prikhod'ko, Nikolay V.; Abramovsky, Viktor A.; Abramovskaya, Natalia V.; Demichev, Andrey P.; Kryukov, Alexandr P.; Polyakov, Stanislav P.

    2016-02-01

    This paper presents a project to develop a web platform called WebNLO for computer modeling of nonlinear optics phenomena. We discuss the general scheme of the platform and a model for interaction between the platform modules. The platform is built as a set of interacting RESTful web services (SaaS approach). Users can interact with the platform through a web browser or a command line interface. Such a resource has no analogue in the field of nonlinear optics; it will give researchers access to high-performance computing resources and thereby significantly reduce the cost of the research and development process.

  8. Afloat Surface Line Commanding Officer Leadership: A Comprehensive Study

    DTIC Science & Technology

    1992-05-01

    This thesis explored the leadership styles of Navy commanding officers of afloat commands to determine whether there were any differences in leadership styles and what effect, if any, rank, age, commissioning source, education, ethnicity, location, and ship community type had on leadership style. A review of the literature indicated that the Navy adopted the Situational Leadership Model in 1976. The Navy concurred with

  9. 2005 9th Annual Army Small Business Conference

    DTIC Science & Technology

    2005-11-03

    field commanders who conduct acquisitions. All the Army's major commands located in the United States will be represented. The conference... Manned systems and unmanned air vehicles include: Engineer Squad Vehicle; Mobile Gun System; Medical Evacuation Vehicle; Reconnaissance Vehicle; Mortar Carrier; Class I ARV-A (L); Small (Manpackable) UGV; Non-Line of Sight Cannon; Non-Line of Sight Mortar; Medical Treatment and

  10. Apollo 12 - Bean - Conrad - during geological field trip

    NASA Image and Video Library

    1969-10-24

    S69-55667 (10 Oct. 1969) --- Astronauts Charles Conrad Jr. and Alan L. Bean train for their upcoming Apollo 12 lunar landing mission. Here they are entering a simulated lunar surface area near Flagstaff, Arizona. Both are wearing lunar surface cameras strapped to their bodies. Conrad (left), the Apollo 12 mission commander, is carrying some of the tools from the geological tool container. The geological tool container, being carried here by Bean, the lunar module pilot, is similar to the one which will be used during scheduled extravehicular activity (EVA) periods on Nov. 19 and 20, 1969, on the lunar surface. While astronauts Conrad and Bean conduct their scheduled EVA on the moon's surface, astronaut Richard F. Gordon Jr., command module pilot, will man the Command and Service Modules (CSM) in lunar orbit.

  11. Solvation Structure and Thermodynamic Mapping (SSTMap): An Open-Source, Flexible Package for the Analysis of Water in Molecular Dynamics Trajectories.

    PubMed

    Haider, Kamran; Cruz, Anthony; Ramsey, Steven; Gilson, Michael K; Kurtzman, Tom

    2018-01-09

    We have developed SSTMap, a software package for mapping structural and thermodynamic water properties in molecular dynamics trajectories. The package introduces automated analysis and mapping of local measures of frustration and enhancement of water structure. The thermodynamic calculations are based on Inhomogeneous Fluid Solvation Theory (IST), which is implemented using both site-based and grid-based approaches. The package also extends the applicability of solvation analysis calculations to multiple molecular dynamics (MD) simulation programs by using existing cross-platform tools for parsing MD parameter and trajectory files. SSTMap is implemented in Python and contains both command-line tools and a Python module to facilitate flexibility in setting up calculations and for automated generation of large data sets involving analysis of multiple solutes. Output is generated in formats compatible with popular Python data science packages. This tool will be used by the molecular modeling community for computational analysis of water in problems of biophysical interest such as ligand binding and protein function.

  12. BioSigPlot: an opensource tool for the visualization of multi-channel biomedical signals with Matlab.

    PubMed

    Boudet, Samuel; Peyrodie, Laurent; Gallois, Philippe; de l'Aulnoit, Denis Houzé; Cao, Hua; Forzy, Gérard

    2013-01-01

    This paper presents a Matlab-based software package (MathWorks Inc.) called BioSigPlot for the visualization of multi-channel biomedical signals, particularly the EEG. This tool is designed for researchers in both engineering and medicine who collaborate to visualize and analyze signals. It aims to provide a highly customizable interface for signal processing experimentation, plotting several kinds of signals while integrating the tools physicians commonly use. The main advantages over other existing programs are multi-dataset display, synchronization with video and online processing. In addition, the program uses object-oriented programming, so the interface can be controlled both through graphic controls and from the command line. It can be used as an EEGlab plug-in but, since it is not limited to EEG, it is distributed separately. BioSigPlot is distributed free of charge (http://biosigplot.sourceforge.net) under the terms of the GNU Public License for non-commercial use and open source development.

  13. A flexible tool for diagnosing water, energy, and entropy budgets in climate models

    NASA Astrophysics Data System (ADS)

    Lembo, Valerio; Lucarini, Valerio

    2017-04-01

    We have developed a new flexible software tool for studying the global energy budget, the hydrological cycle, and the material entropy production of global climate models. The program receives as input radiative, latent and sensible energy fluxes, with the requirement that the variable names follow the Climate and Forecast (CF) conventions for NetCDF datasets. Annual mean maps, meridional sections and time series are computed by means of the Climate Data Operators (CDO) collection of command line operators developed at the Max Planck Institute for Meteorology (MPI-M). If a land-sea mask is provided, the program also computes the required quantities separately over the continents and the oceans. Depending on the user's choice, the program also calls MATLAB to compute meridional heat transports and the location and intensity of their peaks in the two hemispheres. We are currently planning to adapt the program for inclusion in the Earth System Model eValuation Tool (ESMValTool) community diagnostics.

  14. Stata companion.

    PubMed

    Brennan, Jennifer Sousa

    2010-01-01

    This chapter is an introductory reference guide highlighting some of the most common statistical topics, broken down into both command-line syntax and graphical interface point-and-click commands. This chapter serves to supplement more formal statistics lessons and expedite using Stata to compute basic analyses.

  15. GES DISC Data Recipes in Jupyter Notebooks

    NASA Astrophysics Data System (ADS)

    Li, A.; Banavige, B.; Garimella, K.; Rice, J.; Shen, S.; Liu, Z.

    2017-12-01

    The Earth Science Data and Information System (ESDIS) Project manages twelve Distributed Active Archive Centers (DAACs) which are geographically dispersed across the United States. The DAACs are responsible for ingesting, processing, archiving, and distributing Earth science data produced from various sources (satellites, aircraft, field measurements, etc.). In response to projections of an exponential increase in data production, there has been a recent effort to prototype various DAAC activities in the cloud computing environment. This, in turn, led to the creation of an initiative, called the Cloud Analysis Toolkit to Enable Earth Science (CATEES), to develop a Python software package in order to transition Earth science data processing to the cloud. This project, in particular, supports CATEES and has two primary goals. One, transition data recipes created by the Goddard Earth Science Data and Information Service Center (GES DISC) DAAC into an interactive and educational environment using Jupyter Notebooks. Two, acclimate Earth scientists to cloud computing. To accomplish these goals, we create Jupyter Notebooks to compartmentalize the different steps of data analysis and help users obtain and parse data from the command line. We also develop a Docker container, comprising Jupyter Notebooks, Python library dependencies, and command line tools, and configure it into an easy-to-deploy package. The end result is an end-to-end product that simulates the use case of end users working in the cloud computing environment.

  16. Computer Center Reference Manual. Volume 1

    DTIC Science & Technology

    1990-09-30

    with connection to INTERNET (host tables allow transfer to some other networks). OASYS - the DTRC Office Automation System. The following can be reached... and buffers, two windows, and some word processing commands. Advanced editing commands are entered through the use of a command line. EVE has its own

  17. The "grep" command but not FusionMap, FusionFinder or ChimeraScan captures the CIC-DUX4 fusion gene from whole transcriptome sequencing data on a small round cell tumor with t(4;19)(q35;q13).

    PubMed

    Panagopoulos, Ioannis; Gorunova, Ludmila; Bjerkehagen, Bodil; Heim, Sverre

    2014-01-01

    Whole transcriptome sequencing was used to study a small round cell tumor in which a t(4;19)(q35;q13) was part of the complex karyotype but where the initial reverse transcriptase PCR (RT-PCR) examination did not detect a CIC-DUX4 fusion transcript, previously described as the crucial gene-level outcome of this specific translocation. The RNA sequencing data were analysed using the FusionMap, FusionFinder, and ChimeraScan programs, which are specifically designed to identify fusion genes. FusionMap, FusionFinder, and ChimeraScan identified 1017, 102, and 101 fusion transcripts, respectively, but CIC-DUX4 was not among them. Since the RNA sequencing data are in the fastq text-based format, we searched the files using the "grep" command-line utility. The "grep" command searches text for a specific expression and displays, by default, the lines where matches occur. The "specific expression" was a sequence of 20 nucleotides from the coding part of the last exon, exon 20, of CIC (Reference Sequence: NM_015125.3), chosen because all CIC breakpoints reported so far have occurred there. Fifteen chimeric CIC-DUX4 cDNA sequences were captured and the fusion between the CIC and DUX4 genes was mapped precisely. New primer combinations were constructed based on these findings and were used together with a polymerase suitable for amplification of GC-rich DNA templates to amplify CIC-DUX4 cDNA fragments with the same fusion point found with "grep". In conclusion, FusionMap, FusionFinder, and ChimeraScan generated a plethora of fusion transcripts but did not detect the biologically important CIC-DUX4 chimeric transcript; they are generally useful but evidently suffer from imperfect sensitivity and specificity. The "grep" command is an excellent tool for capturing chimeric transcripts from RNA sequencing data when the pathological and/or cytogenetic information strongly indicates the presence of a specific fusion gene.
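    The search strategy described above can be reproduced with standard tools alone. In this minimal sketch, the FASTQ records, the file name reads.fastq, and the 20-nt query are toy stand-ins invented for illustration; they are not the actual reads or the CIC exon 20 sequence used in the study.

    ```shell
    # Build a toy FASTQ file; both reads and the query below are hypothetical.
    cat > reads.fastq <<'EOF'
    @read1
    TTTTAAAACCCCGGGGTTTTAAAACCCCGGGG
    +
    IIIIIIIIIIIIIIIIIIIIIIIIIIIIIIII
    @read2
    ACGTACGTACGTACGTACGTACGTACGTACGT
    +
    IIIIIIIIIIIIIIIIIIIIIIIIIIIIIIII
    EOF

    # -B 1 also prints the line before each match (here, the read ID);
    # -c reports only the number of matching lines instead.
    grep -B 1 'ACGTACGTACGTACGTACGT' reads.fastq
    grep -c 'ACGTACGTACGTACGTACGT' reads.fastq
    ```

    On large FASTQ files, fixed-string matching (grep -F) and a plain C locale (LC_ALL=C) speed up the scan considerably, since no regular-expression or multibyte processing is needed.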

  18. Astronaut John Young replaces tools in Lunar Roving Vehicle during EVA

    NASA Image and Video Library

    1972-04-22

    AS16-110-17960 (22 April 1972) --- Astronaut John W. Young, commander, replaces tools in the Apollo Lunar Hand Tool (ALHT) carrier at the aft end of the Lunar Roving Vehicle (LRV) during the second Apollo 16 extravehicular activity (EVA) on the high side of Stone Mountain at the Descartes landing site. Astronaut Charles M. Duke Jr., lunar module pilot, took this photograph near the conclusion of Station 4 activities. Smoky Mountain, with the large Ravine Crater on its flank, is in the left background. This view is looking northeast. While astronauts Young and Duke descended in the Apollo 16 Lunar Module (LM) "Orion" to explore the Descartes highlands landing site on the moon, astronaut Thomas K. Mattingly II, command module pilot, remained with the Command and Service Modules (CSM) "Casper" in lunar orbit.

  19. Perspectives of US military commanders on tobacco use and tobacco control policy.

    PubMed

    Poston, Walker S C; Haddock, Christopher K; Jahnke, Sara A; Jitnarin, Nattinee; Malone, Ruth E; Smith, Elizabeth A

    2017-05-01

    Tobacco use among members of the US military service is unacceptably high, resulting in substantial healthcare and personnel costs. Support of military command is critical to the success of tobacco control policies because line commanders are responsible for implementation and enforcement. This study is the first to examine US military line commanders' perspectives about current tobacco control policies and the impact of tobacco on readiness. We conducted key-informant interviews with 20 officers at the US Army's Command and General Staff College about military tobacco use and tobacco control policy. Participants identified the long-term impact of tobacco use on military members, but were unaware of proximal effects on health and readiness other than lost productivity due to smoke breaks. Officers also discussed nicotine addiction and the logistics of ensuring that an addicted population had access to tobacco. Regarding policy, most knew about regulations governing smoke-free areas and were open to stronger restrictions, but were unaware of current policies governing prevention, intervention and product sales. Findings suggest that strong policy that takes advantage of the hierarchical and disciplined nature of the military, supported by senior line and civilian leadership up to and including the secretaries of the services and the Secretary of Defense, will be critical to substantially diminishing tobacco use by military personnel. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  20. WARCProcessor: An Integrative Tool for Building and Management of Web Spam Corpora.

    PubMed

    Callón, Miguel; Fdez-Glez, Jorge; Ruano-Ordás, David; Laza, Rosalía; Pavón, Reyes; Fdez-Riverola, Florentino; Méndez, Jose Ramón

    2017-12-22

    In this work we present the design and implementation of WARCProcessor, a novel multiplatform integrative tool aimed at building scientific datasets that facilitate experimentation in web spam research. The application allows the user to specify multiple criteria that change the way in which new corpora are generated, while reducing the number of repetitive and error-prone tasks involved in maintaining existing corpora. To this end, WARCProcessor supports up to six commonly used data sources for web spam research and can store the output corpus in standard WARC format together with complementary metadata files. Additionally, the application facilitates the automatic and concurrent download of web sites from the Internet, with options for configuring the depth of the links to be followed and the behaviour when redirected URLs appear. WARCProcessor provides both an interactive GUI and a command line utility for execution in the background.

  1. Alview: Portable Software for Viewing Sequence Reads in BAM Formatted Files.

    PubMed

    Finney, Richard P; Chen, Qing-Rong; Nguyen, Cu V; Hsu, Chih Hao; Yan, Chunhua; Hu, Ying; Abawi, Massih; Bian, Xiaopeng; Meerzaman, Daoud M

    2015-01-01

    The name Alview is a contraction of the term Alignment Viewer. Alview is a software tool, compiled to native architecture, for visualizing the alignment of sequencing data. Inputs are files of short-read sequences aligned to a reference genome in the SAM/BAM format and files containing reference genome data. Outputs are visualizations of these aligned short reads. Alview is written in portable C with optional graphical user interface (GUI) code written in C, C++, and Objective-C. The application can run in three different ways: as a web server, as a command line tool, or as a native GUI program. Alview is compatible with Microsoft Windows, Linux, and Apple OS X. It is available as a web demo at https://cgwb.nci.nih.gov/cgi-bin/alview. The source code and Windows/Mac/Linux executables are available via https://github.com/NCIP/alview.

  2. Using Cloud Computing infrastructure with CloudBioLinux, CloudMan and Galaxy

    PubMed Central

    Afgan, Enis; Chapman, Brad; Jadan, Margita; Franke, Vedran; Taylor, James

    2012-01-01

    Cloud computing has revolutionized availability and access to computing and storage resources; making it possible to provision a large computational infrastructure with only a few clicks in a web browser. However, those resources are typically provided in the form of low-level infrastructure components that need to be procured and configured before use. In this protocol, we demonstrate how to utilize cloud computing resources to perform open-ended bioinformatics analyses, with fully automated management of the underlying cloud infrastructure. By combining three projects, CloudBioLinux, CloudMan, and Galaxy into a cohesive unit, we have enabled researchers to gain access to more than 100 preconfigured bioinformatics tools and gigabytes of reference genomes on top of the flexible cloud computing infrastructure. The protocol demonstrates how to setup the available infrastructure and how to use the tools via a graphical desktop interface, a parallel command line interface, and the web-based Galaxy interface. PMID:22700313

  3. PhySortR: a fast, flexible tool for sorting phylogenetic trees in R.

    PubMed

    Stephens, Timothy G; Bhattacharya, Debashish; Ragan, Mark A; Chan, Cheong Xin

    2016-01-01

    A frequent bottleneck in interpreting phylogenomic output is the need to screen often thousands of trees for features of interest, particularly robust clades of specific taxa, as evidence of monophyletic relationship and/or reticulated evolution. Here we present PhySortR, a fast, flexible R package for classifying phylogenetic trees. Unlike existing utilities, PhySortR allows for identification of both exclusive and non-exclusive clades uniting the target taxa based on tip labels (i.e., leaves) on a tree, with customisable options to assess clades within the context of the whole tree. Using simulated and empirical datasets, we demonstrate the potential and scalability of PhySortR in analysis of thousands of phylogenetic trees without a priori assumption of tree-rooting, and in yielding readily interpretable trees that unambiguously satisfy the query. PhySortR is a command-line tool that is freely available and easily automatable.

  4. Using cloud computing infrastructure with CloudBioLinux, CloudMan, and Galaxy.

    PubMed

    Afgan, Enis; Chapman, Brad; Jadan, Margita; Franke, Vedran; Taylor, James

    2012-06-01

    Cloud computing has revolutionized availability and access to computing and storage resources, making it possible to provision a large computational infrastructure with only a few clicks in a Web browser. However, those resources are typically provided in the form of low-level infrastructure components that need to be procured and configured before use. In this unit, we demonstrate how to utilize cloud computing resources to perform open-ended bioinformatic analyses, with fully automated management of the underlying cloud infrastructure. By combining three projects, CloudBioLinux, CloudMan, and Galaxy, into a cohesive unit, we have enabled researchers to gain access to more than 100 preconfigured bioinformatics tools and gigabytes of reference genomes on top of the flexible cloud computing infrastructure. The protocol demonstrates how to set up the available infrastructure and how to use the tools via a graphical desktop interface, a parallel command-line interface, and the Web-based Galaxy interface.

  5. Testing Separability and Independence of Perceptual Dimensions with General Recognition Theory: A Tutorial and New R Package (grtools).

    PubMed

    Soto, Fabian A; Zheng, Emily; Fonseca, Johnny; Ashby, F Gregory

    2017-01-01

    Determining whether perceptual properties are processed independently is an important goal in perceptual science, and tools to test independence should be widely available to experimental researchers. The best analytical tools to test for perceptual independence are provided by General Recognition Theory (GRT), a multidimensional extension of signal detection theory. Unfortunately, there is currently a lack of software implementing GRT analyses that is ready-to-use by experimental psychologists and neuroscientists with little training in computational modeling. This paper presents grtools, an R package developed with the explicit aim of providing experimentalists with the ability to perform full GRT analyses using only a couple of command lines. We describe the software and provide a practical tutorial on how to perform each of the analyses available in grtools. We also provide advice to researchers on best practices for experimental design and interpretation of results when applying GRT and grtools.

  6. WARCProcessor: An Integrative Tool for Building and Management of Web Spam Corpora

    PubMed Central

    Callón, Miguel; Fdez-Glez, Jorge; Ruano-Ordás, David; Laza, Rosalía; Pavón, Reyes; Méndez, Jose Ramón

    2017-01-01

    In this work we present the design and implementation of WARCProcessor, a novel multiplatform integrative tool aimed at building scientific datasets that facilitate experimentation in web spam research. The application allows the user to specify multiple criteria that change the way in which new corpora are generated, while reducing the number of repetitive and error-prone tasks involved in maintaining existing corpora. To this end, WARCProcessor supports up to six commonly used data sources for web spam research and can store the output corpus in standard WARC format together with complementary metadata files. Additionally, the application facilitates the automatic and concurrent download of web sites from the Internet, with options for configuring the depth of the links to be followed and the behaviour when redirected URLs appear. WARCProcessor provides both an interactive GUI and a command line utility for execution in the background. PMID:29271913

  7. BioPCD - A Language for GUI Development Requiring a Minimal Skill Set.

    PubMed

    Alvare, Graham Gm; Roche-Lima, Abiel; Fristensky, Brian

    2012-11-01

    BioPCD is a new language whose purpose is to simplify the creation of Graphical User Interfaces (GUIs) by biologists with minimal programming skills. The first step in developing BioPCD was to create a minimal superset of the language referred to as PCD (Pythonesque Command Description). PCD defines the core of terminals and high-level nonterminals required to describe data of almost any type. BioPCD adds to PCD the constructs necessary to describe GUI components and the syntax for executing system commands. BioPCD is implemented using JavaCC to convert the grammar into code. BioPCD is designed to be terse and readable and simple enough to be learned by copying and modifying existing BioPCD files. We demonstrate that BioPCD can easily be used to generate GUIs for existing command line programs. Although BioPCD was designed to make it easier to run bioinformatics programs, it could be used in any domain in which many useful command line programs exist that do not have GUI interfaces.

  8. Tools for Integrating Data Access from the IRIS DMC into Research Workflows

    NASA Astrophysics Data System (ADS)

    Reyes, C. G.; Suleiman, Y. Y.; Trabant, C.; Karstens, R.; Weertman, B. R.

    2012-12-01

    Web service interfaces at the IRIS Data Management Center (DMC) provide access to a vast archive of seismological and related geophysical data. These interfaces are designed to easily incorporate data access into data processing workflows. Examples of data that may be accessed include time series data, related metadata, and earthquake information. The DMC has developed command line scripts, MATLAB® interfaces and a Java library to support a wide variety of data access needs. Users of these interfaces do not need to concern themselves with web service details, networking, or even (in most cases) data conversion. Fetch scripts allow access to the DMC archive and are a comfortable fit for command line users. These scripts are written in Perl and are well suited for automation and integration into existing workflows on most operating systems. For metadata and event information, the Fetch scripts even parse the returned data into simple text summaries. The IRIS Java Web Services Library (IRIS-WS Library) enables Java developers to create programs that access the DMC archives seamlessly. By returning the data and information as native Java objects, the Library insulates the developer from data formats, network programming and web service details. The MATLAB interfaces leverage this library to allow users access to the DMC archive directly from within MATLAB (r2009b or newer), returning data into variables for immediate use. Data users and research groups are developing other toolkits that use the DMC's web services. Notably, the ObsPy framework developed at LMU Munich is a Python toolbox that allows seamless access to data and information via the DMC services. Another example is the MATLAB-based GISMO and Waveform Suite developments that can now access data via web services. In summary, there now exists a host of ways that researchers can bring IRIS DMC data directly into their workflows. MATLAB users can use irisFetch.m, command line users can use the various Fetch scripts, Java users can use the IRIS-WS library, and Python users may request data through ObsPy. To learn more about any of these clients see http://www.iris.edu/ws/wsclients/.
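    Under the hood, clients like the Fetch scripts turn a station/channel/time-window request into a web-service query URL. A minimal sketch, assuming the standard FDSN dataselect parameter names (net, sta, loc, cha, starttime, endtime) and the IRIS DMC's public endpoint; no network access is performed here.

```python
from urllib.parse import urlencode

# Build an FDSN dataselect query URL from a request description. This only
# constructs the URL; a real client would fetch it and parse the response.
def dataselect_url(net, sta, loc, cha, starttime, endtime,
                   base="https://service.iris.edu/fdsnws/dataselect/1/query"):
    params = {"net": net, "sta": sta, "loc": loc, "cha": cha,
              "starttime": starttime, "endtime": endtime}
    return base + "?" + urlencode(params)

URL = dataselect_url("IU", "ANMO", "00", "BHZ",
                     "2010-02-27T06:30:00", "2010-02-27T10:30:00")
```

    This is exactly the detail the scripts and libraries hide: the user supplies a station and time window, and the client handles URL construction, transport, and format conversion.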

  9. drPACS: A Simple UNIX Execution Pipeline

    NASA Astrophysics Data System (ADS)

    Teuben, P.

    2011-07-01

    We describe a very simple yet flexible and effective pipeliner for UNIX commands. It creates a Makefile to define a set of serially dependent commands. The commands in the pipeline share a common set of parameters by which they can communicate. Commands must follow a simple convention to retrieve and store parameters. Pipeline parameters can optionally be made persistent across multiple runs of the pipeline. Tools were added to simplify running a large series of pipelines, which can then also be run in parallel.
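    The convention described, serially dependent stages communicating through a shared, optionally persistent parameter set, can be sketched as follows. The stage names and key=value persistence format are invented for illustration; drPACS itself drives real UNIX commands through a generated Makefile.

```python
# Minimal sketch of pipeline stages sharing parameters. Each stage reads and
# writes a common dict; between runs the dict can be persisted as key=value
# lines, mimicking persistent pipeline parameters.
def load_params(text):
    return dict(line.split("=", 1) for line in text.splitlines() if line)

def dump_params(params):
    return "\n".join(f"{k}={v}" for k, v in sorted(params.items()))

def stage_count(params):           # first stage: stores a result
    params["nfiles"] = "3"

def stage_report(params):          # later stage: reads the earlier result
    params["summary"] = f"processed {params['nfiles']} files"

params = load_params("run=42")                # restore persisted parameters
for stage in (stage_count, stage_report):     # serial dependency, as in a Makefile
    stage(params)
PERSISTED = dump_params(params)               # persist for the next run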

  10. Airpower Leadership on the Front Line: Lt Gen George H. Brett and Combat Command

    DTIC Science & Technology

    2006-09-01

    insight into the makings of effective leadership and successful command. THOMAS HUGHES Associate Professor School...transformed impossibilities into tasks completed. My thesis reader, Dr. Thomas Hughes, lent his unerring sense of style and his gifted historical...project. The commandant, Col Thomas E. Griffith, provided papers pertaining to General Brett from his collection of historical documents. Dr. Harold R

  11. Galaxy tools and workflows for sequence analysis with applications in molecular plant pathology.

    PubMed

    Cock, Peter J A; Grüning, Björn A; Paszkiewicz, Konrad; Pritchard, Leighton

    2013-01-01

    The Galaxy Project offers the popular web browser-based platform Galaxy for running bioinformatics tools and constructing simple workflows. Here, we present a broad collection of additional Galaxy tools for large-scale analysis of gene and protein sequences. The motivating research theme is the identification of specific genes of interest in a range of non-model organisms, and our central example is the identification and prediction of "effector" proteins produced by plant pathogens in order to manipulate their host plant. This functional annotation of a pathogen's predicted capacity for virulence is a key step in translating sequence data into potential applications in plant pathology. This collection includes novel tools, and widely used third-party tools such as NCBI BLAST+ wrapped for use within Galaxy. Individual bioinformatics software tools are typically available separately as standalone packages, or in online browser-based form. The Galaxy framework enables the user to combine these and other tools to automate organism-scale analyses as workflows, without demanding familiarity with command line tools and scripting. Workflows created using Galaxy can be saved and are reusable, so may be distributed within and between research groups, facilitating the construction of a set of standardised, reusable bioinformatic protocols. The Galaxy tools and workflows described in this manuscript are open source and freely available from the Galaxy Tool Shed (http://usegalaxy.org/toolshed or http://toolshed.g2.bx.psu.edu).

  12. Embedded CLIPS for SDI BM/C3 simulation and analysis

    NASA Technical Reports Server (NTRS)

    Gossage, Brett; Nanney, Van

    1990-01-01

    Nichols Research Corporation is developing the BM/C3 Requirements Analysis Tool (BRAT) for the U.S. Army Strategic Defense Command. BRAT uses embedded CLIPS/Ada to model the decision making processes used by the human commander of a defense system. Embedding CLIPS/Ada in BRAT allows the user to explore the role of the human in Command and Control (C2) and the use of expert systems for automated C2. BRAT models assert facts about the current state of the system, the simulated scenario, and threat information into CLIPS/Ada. A user-defined rule set describes the decision criteria for the commander. We have extended CLIPS/Ada with user-defined functions that allow the firing of a rule to invoke a system action such as weapons release or a change in strategy. The use of embedded CLIPS/Ada will provide a powerful modeling tool for our customer at minimal cost.
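    The pattern of a rule firing to invoke a system action can be sketched in miniature. This is a toy illustration of the concept, not CLIPS or BRAT code; the fact strings and action names are invented.

```python
# Tiny forward-chaining sketch: facts describe system state, each rule pairs
# a condition over the fact set with an action callback, and firing the rule
# invokes the action -- mirroring how BRAT's user-defined functions let a
# CLIPS rule trigger, e.g., weapons release.
actions_taken = []

def release_weapons():
    actions_taken.append("weapons released")

rules = [
    (lambda facts: "threat detected" in facts and "weapons ready" in facts,
     release_weapons),
]

def run_engine(facts):
    for condition, action in rules:
        if condition(facts):
            action()

run_engine({"threat detected"})                    # condition not met, no action
run_engine({"threat detected", "weapons ready"})   # rule fires, action invoked
```

    A real CLIPS rule base adds pattern matching, working-memory updates, and conflict resolution; only the fire-an-action coupling is shown here.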

  13. Spacecraft command verification: The AI solution

    NASA Technical Reports Server (NTRS)

    Fesq, Lorraine M.; Stephan, Amy; Smith, Brian K.

    1990-01-01

    Recently, a knowledge-based approach was used to develop a system called the Command Constraint Checker (CCC) for TRW. CCC was created to automate the process of verifying spacecraft command sequences. To check command files by hand for timing and sequencing errors is a time-consuming and error-prone task. Conventional software solutions were rejected when it was estimated that it would require 36 man-months to build an automated tool to check constraints by conventional methods. Using rule-based representation to model the various timing and sequencing constraints of the spacecraft, CCC was developed and tested in only three months. By applying artificial intelligence techniques, CCC designers were able to demonstrate the viability of AI as a tool to transform difficult problems into easily managed tasks. The design considerations used in developing CCC are discussed and the potential impact of this system on future satellite programs is examined.
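    The rule-based idea behind CCC, encoding each timing or sequencing constraint as a declarative rule and collecting violations, can be sketched as follows. The rule, command names, and message format here are invented for illustration; TRW's actual system and its constraints are not public in this abstract.

```python
# Sketch of rule-based command-sequence checking: each rule is a predicate
# over an ordered list of (time, command) pairs; violations are collected
# rather than raised so every problem in a file is reported at once.
def min_separation(cmd, seconds):
    """Rule: consecutive occurrences of cmd must be >= seconds apart."""
    def check(sequence):
        times = [t for t, c in sequence if c == cmd]
        return [f"{cmd} repeated within {seconds}s at t={b}"
                for a, b in zip(times, times[1:]) if b - a < seconds]
    return check

RULES = [min_separation("THRUSTER_FIRE", 10)]

def check_sequence(sequence):
    violations = []
    for rule in RULES:
        violations.extend(rule(sequence))
    return violations

SEQ = [(0, "THRUSTER_FIRE"), (5, "THRUSTER_FIRE"), (30, "THRUSTER_FIRE")]
VIOLATIONS = check_sequence(SEQ)
```

    Adding a constraint means appending one rule rather than threading new logic through procedural checking code, which is the economy that made the knowledge-based approach so much faster to build.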

  14. Seten: a tool for systematic identification and comparison of processes, phenotypes, and diseases associated with RNA-binding proteins from condition-specific CLIP-seq profiles.

    PubMed

    Budak, Gungor; Srivastava, Rajneesh; Janga, Sarath Chandra

    2017-06-01

    RNA-binding proteins (RBPs) control the regulation of gene expression in eukaryotic genomes at the post-transcriptional level by binding to their cognate RNAs. Although several variants of CLIP (crosslinking and immunoprecipitation) protocols are currently available to study the global protein-RNA interaction landscape at single-nucleotide resolution in a cell, there are very few tools that can facilitate understanding and dissecting the functional associations of RBPs from the resulting binding maps. Here, we present Seten, a web-based and command line tool, which can identify and compare processes, phenotypes, and diseases associated with RBPs from condition-specific CLIP-seq profiles. Seten uses BED files resulting from most peak calling algorithms, which include scores reflecting the extent of binding of an RBP on the target transcript, to provide both traditional functional enrichment as well as gene set enrichment results for a number of gene set collections including BioCarta, KEGG, Reactome, Gene Ontology (GO), Human Phenotype Ontology (HPO), and MalaCards Disease Ontology for several organisms including fruit fly, human, mouse, rat, worm, and yeast. It also provides an option to dynamically compare the associated gene sets across data sets as bubble charts, to facilitate comparative analysis. Benchmarking of Seten using eCLIP data for IGF2BP1, SRSF7, and PTBP1 against their corresponding CRISPR RNA-seq in K562 cells as well as randomized negative controls, demonstrated that its gene set enrichment method outperforms functional enrichment, with scores significantly contributing to the discovery of true annotations. Comparative performance analysis using these CRISPR control data sets revealed significantly higher precision and comparable recall to that observed using ChIP-Enrich.
Seten's web interface currently provides precomputed results for about 200 CLIP-seq data sets and both command line as well as web interfaces can be used to analyze CLIP-seq data sets. We highlight several examples to show the utility of Seten for rapid profiling of various CLIP-seq data sets. Seten is available on http://www.iupui.edu/∼sysbio/seten/. © 2017 Budak et al.; Published by Cold Spring Harbor Laboratory Press for the RNA Society.
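    The classical functional-enrichment calculation that tools like Seten build on can be written in a few lines. This is the textbook hypergeometric test, not Seten's score-weighted gene set enrichment method; the counts below are invented for illustration.

```python
from math import comb

# Hypergeometric upper tail: given N annotated genes overall, K of them in a
# gene set, n genes bound by the RBP, and k bound genes falling in the set,
# this is the probability of an overlap at least this large by chance.
def enrichment_pvalue(N, K, n, k):
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / comb(N, n)

# Example: 4 of the 5 bound genes fall in a 10-gene set drawn from 100 genes.
P = enrichment_pvalue(N=100, K=10, n=5, k=4)
```

    Gene set enrichment methods differ by incorporating the per-gene binding scores instead of a hard bound/unbound cutoff, which is the distinction the benchmarking above evaluates.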

  15. Increasing Usability in Ocean Observing Systems

    NASA Astrophysics Data System (ADS)

    Chase, A. C.; Gomes, K.; O'Reilly, T.

    2005-12-01

    As observatory systems move to more advanced techniques for instrument configuration and data management, standardized frameworks are being developed to benefit from economies of scale. ACE (A Configuror and Editor) is a tool that was developed for SIAM (Software Infrastructure and Application for MOOS), a framework for the seamless integration of self-describing plug-and-work instruments into the Monterey Ocean Observing System. As a comprehensive solution, the SIAM infrastructure requires a number of processes to be run to configure an instrument for use within its framework. As solutions move from the lab to the field, the steps needed to implement the solution must be made bulletproof so that they may be used in the field with confidence. Loosely defined command line interfaces don't always provide enough user feedback, and business logic can be difficult to maintain over a series of scripts. ACE is a tool developed for guiding the user through a number of complicated steps, removing the reliance on command-line utilities and reducing the difficulty of completing the necessary steps, while also preventing operator error and enforcing system constraints. Utilizing the cross-platform nature of the Java programming language, ACE provides a complete solution for deploying an instrument within the SIAM infrastructure without depending on special software being installed on the user's computer. Requirements such as the installation of a Unix emulator for users running Windows machines, and the installation of, and ability to use, a CVS client, have all been removed by providing the equivalent functionality from within ACE.
In order to achieve a "one stop shop" for configuring instruments, ACE had to be written to handle a wide variety of functionality, including compiling Java code, interacting with a CVS server and maintaining client-side CVS information, editing XML, interacting with a server-side database, and negotiating serial port communications through Java. This paper will address the relative tradeoffs of including all the aforementioned functionality in a single tool, its effects on user adoption of the framework (SIAM) it provides access to, as well as further discussion of some of the functionality generally pertinent to data management (XML editing, source code management and compilation, etc.).

  16. TOAD Editor

    NASA Technical Reports Server (NTRS)

    Bingle, Bradford D.; Shea, Anne L.; Hofler, Alicia S.

    1993-01-01

    Transferable Output ASCII Data (TOAD) computer program (LAR-13755), implements format designed to facilitate transfer of data across communication networks and dissimilar host computer systems. Any data file conforming to TOAD format standard called TOAD file. TOAD Editor is interactive software tool for manipulating contents of TOAD files. Commonly used to extract filtered subsets of data for visualization of results of computation. Also offers such user-oriented features as on-line help, clear English error messages, startup file, macroinstructions defined by user, command history, user variables, UNDO features, and full complement of mathematical, statistical, and conversion functions. Companion program, TOAD Gateway (LAR-14484), converts data files from variety of other file formats to that of TOAD. TOAD Editor written in FORTRAN 77.

  17. Using PSEA-Quant for Protein Set Enrichment Analysis of Quantitative Mass Spectrometry-Based Proteomics.

    PubMed

    Lavallée-Adam, Mathieu; Yates, John R

    2016-03-24

    PSEA-Quant analyzes quantitative mass spectrometry-based proteomics datasets to identify enrichments of annotations contained in repositories such as the Gene Ontology and Molecular Signature databases. It allows users to identify the annotations that are significantly enriched for reproducibly quantified high abundance proteins. PSEA-Quant is available on the Web and as a command-line tool. It is compatible with all label-free and isotopic labeling-based quantitative proteomics methods. This protocol describes how to use PSEA-Quant and interpret its output. The importance of each parameter as well as troubleshooting approaches are also discussed. © 2016 by John Wiley & Sons, Inc.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blom, Philip Stephen; Marcillo, Omar Eduardo; Euler, Garrett Gene

    InfraPy is a Python-based analysis toolkit being developed at LANL. The algorithms are intended for ground-based nuclear detonation detection applications to detect, locate, and characterize explosive sources using infrasonic observations. The implementation is usable as a stand-alone Python library or as a command line driven tool operating directly on a database. With multiple scientists working on the project, we've begun using a LANL git repository for collaborative development and version control. Current and planned work on InfraPy focuses on the development of new algorithms and propagation models. Collaboration with Southern Methodist University (SMU) has helped identify bugs and limitations of the algorithms. Current usage development focuses on library imports and the CLI.

  19. Scripps Genome ADVISER: Annotation and Distributed Variant Interpretation SERver

    PubMed Central

    Pham, Phillip H.; Shipman, William J.; Erikson, Galina A.; Schork, Nicholas J.; Torkamani, Ali

    2015-01-01

    Interpretation of human genomes is a major challenge. We present the Scripps Genome ADVISER (SG-ADVISER) suite, which aims to fill the gap between data generation and genome interpretation by performing holistic, in-depth annotations and functional predictions on all variant types and effects. The SG-ADVISER suite includes a de-identification tool, a variant annotation web-server, and a user interface for inheritance and annotation-based filtration. SG-ADVISER allows users with no bioinformatics expertise to manipulate large volumes of variant data with ease – without the need to download large reference databases, install software, or use a command line interface. SG-ADVISER is freely available at genomics.scripps.edu/ADVISER. PMID:25706643

  20. Transferable Output ASCII Data (TOAD) editor version 1.0 user's guide

    NASA Technical Reports Server (NTRS)

    Bingel, Bradford D.; Shea, Anne L.; Hofler, Alicia S.

    1991-01-01

    The Transferable Output ASCII Data (TOAD) editor is an interactive software tool for manipulating the contents of TOAD files. The TOAD editor is specifically designed to work with tabular data. Selected subsets of data may be displayed to the user's screen, sorted, exchanged, duplicated, removed, replaced, inserted, or transferred to and from external files. It also offers a number of useful features including on-line help, macros, a command history, an 'undo' option, variables, and a full complement of mathematical functions and conversion factors. Written in ANSI FORTRAN 77 and completely self-contained, the TOAD editor is very portable and has already been installed on SUN, SGI/IRIS, and CONVEX hosts.

  1. Charming Users into Scripting CIAO with Python

    NASA Astrophysics Data System (ADS)

    Burke, D. J.

    2011-07-01

    The Science Data Systems group of the Chandra X-ray Center provides a number of scripts and Python modules that extend the capabilities of CIAO. Experience in converting the existing scripts—written in a variety of languages such as bash, csh/tcsh, Perl and S-Lang—to Python, and conversations with users, led to the development of the ciao_contrib.runtool module. This allows users to easily run CIAO tools from Python scripts, and utilizes the metadata provided by the parameter-file system to create an API that provides the flexibility and safety guarantees of the command-line. The module is provided to the user community and is being used within our group to create new scripts.
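    The approach of using parameter-file metadata to give Python callers the same safety the parameter system gives the command line can be sketched as follows. The metadata dict and the generated string are invented for illustration; ciao_contrib.runtool's actual API and the real dmcopy parameter file differ.

```python
# Sketch: tool parameter metadata drives validation of keyword arguments and
# construction of the command line that would be executed, so typos and
# missing parameters fail in Python before the tool ever runs.
PARAMS = {  # hypothetical metadata for a tool called "dmcopy"
    "infile": {"required": True},
    "outfile": {"required": True},
    "clobber": {"required": False, "default": "no"},
}

def build_call(tool, metadata, **kwargs):
    unknown = set(kwargs) - set(metadata)
    if unknown:
        raise TypeError(f"unknown parameter(s): {sorted(unknown)}")
    missing = [p for p, meta in metadata.items()
               if meta["required"] and p not in kwargs]
    if missing:
        raise TypeError(f"missing required parameter(s): {missing}")
    values = {p: kwargs.get(p, meta.get("default"))
              for p, meta in metadata.items()}
    return tool + " " + " ".join(f"{p}={v}" for p, v in values.items())

CALL = build_call("dmcopy", PARAMS, infile="evt2.fits", outfile="out.fits")
```

    Because the metadata is machine-readable, the same wrapper can be generated for every tool in the suite rather than written by hand.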

  2. InSilico DB genomic datasets hub: an efficient starting point for analyzing genome-wide studies in GenePattern, Integrative Genomics Viewer, and R/Bioconductor.

    PubMed

    Coletta, Alain; Molter, Colin; Duqué, Robin; Steenhoff, David; Taminau, Jonatan; de Schaetzen, Virginie; Meganck, Stijn; Lazar, Cosmin; Venet, David; Detours, Vincent; Nowé, Ann; Bersini, Hugues; Weiss Solís, David Y

    2012-11-18

    Genomics datasets are increasingly useful for gaining biomedical insights, with adoption in the clinic underway. However, multiple hurdles related to data management stand in the way of their efficient large-scale utilization. The solution proposed is a web-based data storage hub. Having clear focus, flexibility and adaptability, InSilico DB seamlessly connects genomics dataset repositories to state-of-the-art and free GUI and command-line data analysis tools. The InSilico DB platform is a powerful collaborative environment, with advanced capabilities for biocuration, dataset sharing, and dataset subsetting and combination. InSilico DB is available from https://insilicodb.org.

  3. dCLIP: a computational approach for comparative CLIP-seq analyses

    PubMed Central

    2014-01-01

    Although comparison of RNA-protein interaction profiles across different conditions has become increasingly important to understanding the function of RNA-binding proteins (RBPs), few computational approaches have been developed for quantitative comparison of CLIP-seq datasets. Here, we present an easy-to-use command line tool, dCLIP, for quantitative CLIP-seq comparative analysis. The two-stage method implemented in dCLIP, including a modified MA normalization method and a hidden Markov model, is shown to be able to effectively identify differential binding regions of RBPs in four CLIP-seq datasets, generated by HITS-CLIP, iCLIP and PAR-CLIP protocols. dCLIP is freely available at http://qbrc.swmed.edu/software/. PMID:24398258
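    The MA representation underlying dCLIP's first stage can be illustrated with the standard transform (dCLIP's own normalization is a modified variant; the counts below are invented). For paired read counts (x, y) from two conditions, M is the log ratio and A the average log intensity, and subtracting the median M removes a global scaling difference between the two libraries.

```python
from math import log2

def ma_transform(pairs):
    # M = log2(x/y), A = (log2 x + log2 y) / 2 for each paired count
    return [(log2(x) - log2(y), 0.5 * (log2(x) + log2(y))) for x, y in pairs]

def median(values):
    s = sorted(values)
    mid = len(s) // 2
    return s[mid] if len(s) % 2 else 0.5 * (s[mid - 1] + s[mid])

def normalize(pairs):
    ma = ma_transform(pairs)
    shift = median([m for m, _ in ma])  # global library-size offset
    return [(m - shift, a) for m, a in ma]

# Condition 1 is globally 2x condition 2; normalization removes the offset,
# so residual M values reflect genuine differential binding only.
NORM = normalize([(8, 4), (16, 8), (4, 2)])
```

    After this step, a hidden Markov model over the residual M values (as in dCLIP) segments the genome into regions of stronger, equal, or weaker binding.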

  4. Interactive full channel teletext system for cable television nets

    NASA Astrophysics Data System (ADS)

    Vandenboom, H. P. A.

    1984-08-01

    A demonstration set-up of an interactive full channel teletext (FCT) system for cable TV networks with two-way data communication possibilities was designed and realized. In FCT, all image lines are used as teletext data lines. The FCT decoder was placed in the mini-star, and the FCT encoder, which provides the FCT signal, was placed in the local center. From the FCT signal a number of data lines are selected using an extra FCT decoder. They are placed on the image lines reserved for teletext so that a normal TV receiver equipped with a teletext decoder can process the selected data lines. For texts not available in the FCT signal, a command can be sent to the local center via the data communication path. The result is a cheap and simple system in which the number of pages or books that can be requested is in principle unlimited, while the waiting time and channel capacity used are limited.

  5. Increases in efficiency and enhancements to the Mars Observer non-stored commanding process

    NASA Technical Reports Server (NTRS)

    Brooks, Robert N., Jr.; Torgerson, J. Leigh

    1994-01-01

    The Mars Observer team was, until the untimely loss of the spacecraft on August 21, 1993, performing flight operations with greater efficiency and speed than any previous JPL mission of its size. This level of throughput was made possible by a mission operations system which was composed of skilled personnel using sophisticated sequencing and commanding tools. During cruise flight operations, however, it was realized by the project that this commanding level was not going to be sufficient to support the activities planned for mapping operations. The project had committed to providing the science instrument principal investigators with a much higher level of commanding during mapping. Thus, the project began taking steps to enhance the capabilities of the flight team. One mechanism used by project management was a tool available from total quality management (TQM). This tool is known as a process action team (PAT). The Mars Observer PAT was tasked to increase the capacity of the flight team's nonstored commanding process by fifty percent with no increase in staffing and a minimal increase in risk. The outcome of this effort was, in fact, to increase the capacity by a factor of 2.5 rather than the desired fifty percent and actually reduce risk. The majority of these improvements came from the automation of the existing command process. These results required very few changes to the existing mission operations system. Rather, the PAT was able to take advantage of automation capabilities inherent in the existing system and make changes to the existing flight team procedures.

  6. Rover Sequencing and Visualization Program

    NASA Technical Reports Server (NTRS)

    Cooper, Brian; Hartman, Frank; Maxwell, Scott; Yen, Jeng; Wright, John; Balacuit, Carlos

    2005-01-01

    The Rover Sequencing and Visualization Program (RSVP) is the software tool for use in the Mars Exploration Rover (MER) mission for planning rover operations and generating command sequences for accomplishing those operations. RSVP combines three-dimensional (3D) visualization for immersive exploration of the operations area, stereoscopic image display for high-resolution examination of the downlinked imagery, and a sophisticated command-sequence editing tool for analysis and completion of the sequences. RSVP is linked with actual flight-code modules for operations rehearsal to provide feedback on the expected behavior of the rover prior to committing to a particular sequence. Playback tools allow for review of both rehearsed rover behavior and downlinked results of actual rover operations. These can be displayed simultaneously for comparison of rehearsed and actual activities for verification. The primary inputs to RSVP are downlink data products from the Operations Storage Server (OSS) and activity plans generated by the science team. The activity plans are high-level goals for the next day's activities. The downlink data products include imagery, terrain models, and telemetered engineering data on rover activities and state. The Rover Sequence Editor (RoSE) component of RSVP performs activity expansion to command sequences, command creation and editing with setting of command parameters, and viewing and management of rover resources. The HyperDrive component of RSVP performs 2D and 3D visualization of the rover's environment, graphical and animated review of rover-predicted and telemetered state, and creation and editing of command sequences related to mobility and Instrument Deployment Device (IDD) operations. Additionally, RoSE and HyperDrive together evaluate command sequences for potential violations of flight and safety rules. 
The products of RSVP include command sequences for uplink that are stored in the Distributed Object Manager (DOM) and predicted rover state histories stored in the OSS for comparison and validation of downlinked telemetry. The majority of components comprising RSVP utilize the MER command and activity dictionaries to automatically customize the system for MER activities. Thus, RSVP, being highly data driven, may be tailored to other missions with minimal effort. In addition, RSVP uses a distributed, message-passing architecture to allow multitasking, and collaborative visualization and sequence development by scattered team members.

  7. Update on Rover Sequencing and Visualization Program

    NASA Technical Reports Server (NTRS)

    Cooper, Brian; Hartman, Frank; Maxwell, Scott; Yen, Jeng; Wright, John; Balacuit, Carlos

    2005-01-01

    The Rover Sequencing and Visualization Program (RSVP) has been updated. RSVP was reported in Rover Sequencing and Visualization Program (NPO-30845), NASA Tech Briefs, Vol. 29, No. 4 (April 2005), page 38. To recapitulate: The Rover Sequencing and Visualization Program (RSVP) is the software tool to be used in the Mars Exploration Rover (MER) mission for planning rover operations and generating command sequences for accomplishing those operations. RSVP combines three-dimensional (3D) visualization for immersive exploration of the operations area, stereoscopic image display for high-resolution examination of the downlinked imagery, and a sophisticated command-sequence editing tool for analysis and completion of the sequences. RSVP is linked with actual flight code modules for operations rehearsal to provide feedback on the expected behavior of the rover prior to committing to a particular sequence. Playback tools allow for review of both rehearsed rover behavior and downlinked results of actual rover operations. These can be displayed simultaneously for comparison of rehearsed and actual activities for verification. The primary inputs to RSVP are downlink data products from the Operations Storage Server (OSS) and activity plans generated by the science team. The activity plans are high-level goals for the next day's activities. The downlink data products include imagery, terrain models, and telemetered engineering data on rover activities and state. The Rover Sequence Editor (RoSE) component of RSVP performs activity expansion to command sequences, command creation and editing with setting of command parameters, and viewing and management of rover resources. The HyperDrive component of RSVP performs 2D and 3D visualization of the rover's environment, graphical and animated review of rover-predicted and telemetered state, and creation and editing of command sequences related to mobility and Instrument Deployment Device (robotic arm) operations. 
Additionally, RoSE and HyperDrive together evaluate command sequences for potential violations of flight and safety rules. The products of RSVP include command sequences for uplink that are stored in the Distributed Object Manager (DOM) and predicted rover state histories stored in the OSS for comparison and validation of downlinked telemetry. The majority of components comprising RSVP utilize the MER command and activity dictionaries to automatically customize the system for MER activities.

  8. 46 CFR 42.13-5 - Strength of vessel.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... information to the Commandant. (b) Vessels built and maintained in conformity with the requirements of a classification society recognized by the Commandant are considered to possess adequate strength for the purpose... General Rules for Determining Load Lines § 42.13-5 Strength of vessel. (a) The assigning and issuing...

  9. Band of Brothers or Dysfunctional Family? A Military Perspective on Coalition Challenges During Stability Operations

    DTIC Science & Technology

    2011-01-01

    vehicle LCT landing craft, tank LO liaison officer LOO line of operation MIS Military Intelligence Service MND (SE) Multi-National Division South East...23, 2000. Their units then fell under the command of the peace- keeping force (PKF) commander, Lt Gen Jaime de los Santos of the Philippines, who was...rather than instantaneous transition of responsibility between his command and the PKF. He suggested to General de los Santos that the UNTAET staff

  10. Ascent stage of Apollo 10 Lunar Module seen from Command module

    NASA Image and Video Library

    1969-05-22

    AS10-34-5112 (26 May 1969) --- The ascent stage of the Apollo 10 Lunar Module (LM) is photographed from the Command Module prior to docking in lunar orbit. The LM is approaching the Command and Service Modules from below. The LM descent stage had already been jettisoned. The lunar surface in the background is near, but beyond the eastern limb of the moon as viewed from Earth (about 120 degrees east longitude). The red/blue diagonal line is the spacecraft window.

  11. Digital Control of the Czochralski Growth of Gallium Arsenide-Controller Software Reference Manual

    DTIC Science & Technology

    1987-07-15

    possible with regard to the format of the commands. Several help menus and extensive command prompts guide the operator. The dialog between the...single-zone heater is in use.) (2) Four tachometers which are...commands for the display of menus or auxiliary information. The scrolled portion shrinks to four lines if auxiliary data display is requested with the

  12. Remote Data Exploration with the Interactive Data Language (IDL)

    NASA Technical Reports Server (NTRS)

    Galloy, Michael

    2013-01-01

    A difficulty for many NASA researchers is that often the data to analyze is located remotely from the scientist and the data is too large to transfer for local analysis. Researchers have developed the Data Access Protocol (DAP) for accessing remote data. Presently one can use DAP from within IDL, but the IDL-DAP interface is both limited and cumbersome. A more powerful and user-friendly interface to DAP for IDL has been developed. Users are able to browse remote data sets graphically, select partial data to retrieve, import that data and make customized plots, and have an interactive IDL command line session simultaneous with the remote visualization. All of these IDL-DAP tools are easily and seamlessly usable by any IDL user. IDL and DAP are both widely used in science, but were not easily used together. The IDL DAP bindings were incomplete and had numerous bugs that prevented their serious use. For example, the existing bindings did not read DAP Grid data, which is the organization of nearly all NASA datasets currently served via DAP. This project uniquely provides a fully featured, user-friendly interface to DAP from IDL, both from the command line and a GUI application. The DAP Explorer GUI application makes browsing a dataset more user-friendly, while also providing the capability to run user-defined functions on specified data. Methods for running remote functions on the DAP server were investigated, and a technique for accomplishing this task was decided upon.
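    Selecting partial data over DAP works by appending a constraint expression to the dataset URL, giving each dimension a [start:stride:stop] index range. A minimal sketch of building such a request; the dataset URL and variable name below are hypothetical, and no network access is performed.

```python
# Build a DAP-style hyperslab request URL: variable name plus one
# [start:stride:stop] range per dimension, appended as a constraint.
def dap_constraint(base, var, ranges):
    hyperslab = "".join(f"[{start}:{stride}:{stop}]"
                        for start, stride, stop in ranges)
    return f"{base}.dods?{var}{hyperslab}"

# Request rows 0..10 and every other column 5..45 of a 2-D variable "sst".
URL = dap_constraint("http://example.org/data/sst", "sst",
                     [(0, 1, 10), (5, 2, 45)])
```

    This is the mechanism that lets a client retrieve only the selected subset instead of transferring the whole remote dataset.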

  13. Command-line cellular electrophysiology for conventional and real-time closed-loop experiments.

    PubMed

    Linaro, Daniele; Couto, João; Giugliano, Michele

    2014-06-15

    Current software tools for electrophysiological experiments are limited in flexibility and rarely offer adequate support for advanced techniques such as dynamic clamp and hybrid experiments, which are therefore limited to laboratories with significant expertise in neuroinformatics. We have developed lcg, a software suite based on a command-line interface (CLI) that supports both standard and advanced electrophysiological experiments. Stimulation protocols for classical voltage and current clamp experiments are defined by a concise and flexible meta description that represents complex waveforms as a piece-wise parametric decomposition of elementary sub-waveforms, abstracting the stimulation hardware. To perform complex experiments, lcg provides a set of elementary building blocks that can be interconnected to yield a large variety of experimental paradigms. We present various cellular electrophysiological experiments in which lcg has been employed, ranging from the automated application of current clamp protocols for characterizing basic electrophysiological properties of neurons, to dynamic clamp, response clamp, and hybrid experiments. We finally show how the scripting capabilities behind a CLI are suited for integrating experimental trials into complex workflows in which the experiment itself, online data analysis, and computational modeling are seamlessly integrated. We compare lcg with two open source toolboxes, RTXI and RELACS. We believe that lcg will greatly contribute to the standardization and reproducibility of both simple and complex experiments. Additionally, in the long run the increased efficiency of a CLI will prove a great benefit for the experimental community. Copyright © 2014 Elsevier B.V. All rights reserved.
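
    The piece-wise parametric description of stimuli that lcg uses can be illustrated with a small sketch. This is not lcg's actual meta description format (which also abstracts the stimulation hardware); it is a minimal Python analogue, with names of our own choosing, in which a stimulus is a list of (duration, function) segments rendered into one sample array:

    ```python
    import numpy as np

    def render_stimulus(segments, srate):
        """Render a piece-wise stimulus description into one sample array.

        segments: list of (duration_s, func) pairs; func maps a time vector
        (starting at 0 within the segment) to amplitudes.
        """
        parts = []
        for duration, func in segments:
            t = np.arange(int(round(duration * srate))) / srate
            parts.append(np.asarray(func(t), dtype=float))
        return np.concatenate(parts)

    # A 100 ms baseline, a 200 ms DC step, then a 100 ms ramp back to zero.
    stim = render_stimulus(
        [(0.1, lambda t: np.zeros_like(t)),
         (0.2, lambda t: np.full_like(t, 50.0)),      # 50 pA step
         (0.1, lambda t: 50.0 * (1 - t / 0.1))],      # linear ramp down
        srate=10_000)
    ```

    Because each segment is parametric rather than a raw sample list, the same description can be re-rendered at whatever rate the acquisition hardware requires.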

  14. A Maple package for improved global mapping forecast

    NASA Astrophysics Data System (ADS)

    Carli, H.; Duarte, L. G. S.; da Mota, L. A. C. P.

    2014-03-01

    We present a Maple implementation of the well-known global approach to time series analysis, together with some further developments designed to improve the computational efficiency of the forecasting capabilities of the approach. This global approach can be summarized as a reconstruction of the phase space based on a time-ordered series of data obtained from the system. After that, using the reconstructed vectors, a portion of this space is used to produce a mapping (a polynomial fitting obtained through a minimization procedure) that represents the system and can be employed to forecast further entries for the series. In the present implementation, we introduce a set of commands (tools) to perform these tasks. For example, the command VecTS deals mainly with the reconstruction of the vectors in the phase space. The command GfiTS produces the minimization and the fitting. ForecasTS uses all these and produces the prediction of the next entries. For the non-standard algorithms, we present two commands: IforecasTS and NiforecasTS, which deal, respectively, with one-step and N-step forecasting. Finally, we introduce two further tools to aid the forecasting. The commands GfiTS and AnalysTS perform an analysis of the behavior of each portion of a series with respect to the settings used in the commands mentioned above.
    Catalogue identifier: AERW_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AERW_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 3001
    No. of bytes in distributed program, including test data, etc.: 95018
    Distribution format: tar.gz
    Programming language: Maple 14
    Computer: Any capable of running Maple
    Operating system: Any capable of running Maple. Tested on Windows ME, Windows XP, Windows 7.
    RAM: 128 MB
    Classification: 4.3, 4.9, 5
    Nature of problem: Time series analysis and improving forecast capability.
    Solution method: Partially based on a result published in [1].
    Restrictions: If the time series being analyzed presents a great amount of noise, or if the dynamical system behind the time series is of high dimensionality (Dim≫3), the method may not work well.
    Unusual features: In cases where the dynamics behind the time series has low dimensionality, our implementation can greatly improve the forecast.
    Running time: Depends strongly on the command being used.
    References: [1] Barbosa, L.M.C.R., Duarte, L.G.S., Linhares, C.A. and da Mota, L.A.C.P., Improving the global fitting method on nonlinear time series analysis, Phys. Rev. E 74, 026702 (2006).
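
    The global-mapping idea (delay embedding of the series, then fitting a map that predicts the next entry) can be sketched outside Maple. The following Python sketch uses a degree-1 (affine) map rather than the general polynomial fitting of GfiTS/ForecasTS, and the function names are our own:

    ```python
    import numpy as np

    def delay_embed(series, dim, tau=1):
        """Phase-space reconstruction: row t is [x_t, x_{t+tau}, ..., x_{t+(dim-1)tau}]."""
        n = len(series) - (dim - 1) * tau
        return np.column_stack([series[i * tau : i * tau + n] for i in range(dim)])

    def forecast_next(series, dim=2, tau=1):
        """Fit an affine map from each embedded vector to the value that follows it,
        then apply the map to the last vector to forecast one step ahead."""
        series = np.asarray(series, dtype=float)
        X = delay_embed(series, dim, tau)
        y = series[(dim - 1) * tau + 1 :]              # value following each vector
        A = np.column_stack([X[:-1], np.ones(len(X) - 1)])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        return float(np.append(X[-1], 1.0) @ coef)

    # Damped-oscillation test series: x_{t+1} = 1.6 x_t - 0.9 x_{t-1} + 0.5
    xs = [1.0, 2.0]
    for _ in range(40):
        xs.append(1.6 * xs[-1] - 0.9 * xs[-2] + 0.5)
    pred = forecast_next(xs[:-1], dim=2)               # forecast the held-out last value
    ```

    For this linear recurrence the affine fit is exact, so the held-out value is recovered to numerical precision; real data would require the higher-order polynomial mappings the package provides.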

  15. Power User Interface

    NASA Technical Reports Server (NTRS)

    Pfister, Robin; McMahon, Joe

    2006-01-01

    Power User Interface 5.0 (PUI) is a system of middleware, written for expert users in the Earth-science community, that enables expedited ordering of data granules on the basis of specific granule-identifying information that the users already know or can assemble. PUI also enables expert users to perform quick searches for orderable-granule information for use in preparing orders. PUI 5.0 is available in two versions (note: PUI 6.0 has a command-line mode only): a Web-based application program and a UNIX command-line-mode client program. Both versions include modules that perform data-granule-ordering functions in conjunction with external systems. The Web-based version works with the Earth Observing System Clearing House (ECHO) metadata catalog and order-entry services and with an open-source order-service broker server component, called the Mercury Shopping Cart, that is provided separately by Oak Ridge National Laboratory through the Department of Energy. The command-line version works with the ECHO metadata and order-entry process service. Both versions of PUI ultimately use ECHO to process an order to be sent to a data provider. Ordered data are provided through means outside the PUI software system.

  16. Software for Improved Extraction of Data From Tape Storage

    NASA Technical Reports Server (NTRS)

    Cheng, Chiu-Fu

    2003-01-01

    A computer program has been written to replace the original software of Racal Storeplex Delta tape recorders, which are used at Stennis Space Center. The original software could be activated by a command-line interface only; the present software offers the option of a command-line or graphical user interface. The present software also offers the option of batch-file operation (activation by a file that contains command lines for operations performed consecutively). The present software is also more reliable than was the original software: The original software was plagued by several deficiencies that made it difficult to execute, modify, and test. In addition, when using the original software to extract data that had been recorded within specified intervals of time, the resolution with which one could control starting and stopping times was no finer than about a second (or, in some cases, several seconds). In contrast, the present software is capable of controlling playback times to within 1/100 second of times specified by the user, assuming that the tape-recorder clock is accurate to within 1/100 second.

  17. Software for Improved Extraction of Data From Tape Storage

    NASA Technical Reports Server (NTRS)

    Cheng, Chiu-Fu

    2002-01-01

    A computer program has been written to replace the original software of Racal Storeplex Delta tape recorders, which are still used at Stennis Space Center but have been discontinued by the manufacturer. Whereas the original software could be activated by a command-line interface only, the present software offers the option of a command-line or graphical user interface. The present software also offers the option of batch-file operation (activation by a file that contains command lines for operations performed consecutively). The present software is also more reliable than was the original software: The original software was plagued by several deficiencies that made it difficult to execute, modify, and test. In addition, when using the original software to extract data that had been recorded within specified intervals of time, the resolution with which one could control starting and stopping times was no finer than about a second (or, in some cases, several seconds). In contrast, the present software is capable of controlling playback times to within 1/100 second of times specified by the user, assuming that the tape-recorder clock is accurate to within 1/100 second.

  18. 50 CFR 600.10 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... fishing gear consisting of a float and one or more lines suspended therefrom. A hook or hooks are on the... live fish on board a vessel. Center means one of the five NMFS Fisheries Science Centers. Charter boat... carry six or fewer passengers for hire. Coast Guard Commander means one of the commanding officers of...

  19. 50 CFR 600.10 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... fishing gear consisting of a float and one or more lines suspended therefrom. A hook or hooks are on the... live fish on board a vessel. Center means one of the five NMFS Fisheries Science Centers. Charter boat... carry six or fewer passengers for hire. Coast Guard Commander means one of the commanding officers of...

  20. 50 CFR 600.10 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... fishing gear consisting of a float and one or more lines suspended therefrom. A hook or hooks are on the... live fish on board a vessel. Center means one of the five NMFS Fisheries Science Centers. Charter boat... carry six or fewer passengers for hire. Coast Guard Commander means one of the commanding officers of...

  1. 50 CFR 600.10 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... fishing gear consisting of a float and one or more lines suspended therefrom. A hook or hooks are on the... live fish on board a vessel. Center means one of the five NMFS Fisheries Science Centers. Charter boat... carry six or fewer passengers for hire. Coast Guard Commander means one of the commanding officers of...

  2. The Proximity Principle: Army Chaplains on the Fighting Line in Doctrine and History

    DTIC Science & Technology

    2014-12-12

    U.S. Army Command and General Staff College in partial fulfillment of the requirements for the degree MASTER OF MILITARY ART AND SCIENCE...and do not necessarily represent the views of the U.S. Army Command and General Staff College or any other governmental agency. (References to this

  3. Primer for the Transportable Applications Executive

    NASA Technical Reports Server (NTRS)

    Carlson, P. A.; Emmanuelli, C. A.; Harris, E. L.; Perkins, D. C.

    1984-01-01

    The Transportable Applications Executive (TAE), an interactive multipurpose executive that provides commonly required functions for scientific analysis systems, is discussed. The concept of an executive is discussed and the various components of TAE are presented. These include on-line help information, the use of menus or commands to access analysis programs, and TAE command procedures.

  4. Dynamic sample size detection in learning command line sequence for continuous authentication.

    PubMed

    Traore, Issa; Woungang, Isaac; Nakkabi, Youssef; Obaidat, Mohammad S; Ahmed, Ahmed Awad E; Khalilian, Bijan

    2012-10-01

    Continuous authentication (CA) consists of authenticating the user repetitively throughout a session with the goal of detecting and protecting against session hijacking attacks. While the accuracy of the detector is central to the success of CA, the detection delay, or length of an individual authentication period, is important as well, since it measures the system's window of vulnerability. However, high accuracy and small detection delay are conflicting requirements that need to be balanced for optimum detection. In this paper, we propose the use of a sequential sampling technique to achieve optimum detection by trading off adequately between detection delay and accuracy in the CA process. We illustrate our approach through CA based on user command line sequences and a naïve Bayes classification scheme. Experimental evaluation using the Greenberg data set yields encouraging results consisting of a false acceptance rate (FAR) of 11.78% and a false rejection rate (FRR) of 1.33%, with an average command sequence length (i.e., detection delay) of 37 commands. When using the Schonlau (SEA) data set, we obtain FAR = 4.28% and FRR = 12%.
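
    The core scoring step behind such a detector — comparing how likely an observed block of commands is under a user's profile versus an alternative — can be sketched with a Laplace-smoothed multinomial ("naïve Bayes over command frequencies") model. This is a simplified illustration with hypothetical data, not the authors' exact detector, and it omits the sequential-sampling stopping rule the paper adds on top:

    ```python
    import math
    from collections import Counter

    def train_profile(commands):
        """Multinomial profile of a user's command usage: counts and total."""
        counts = Counter(commands)
        return counts, sum(counts.values())

    def log_likelihood(profile, block, vocab_size, alpha=1.0):
        """Laplace-smoothed log-likelihood of a command block under a profile."""
        counts, total = profile
        return sum(math.log((counts.get(c, 0) + alpha) / (total + alpha * vocab_size))
                   for c in block)

    user_a = train_profile(["ls", "cd", "grep", "ls", "cd", "ls", "cat", "grep"] * 5)
    user_b = train_profile(["rm", "chmod", "sudo", "rm", "mv"] * 5)
    vocab = 8                             # distinct commands across both profiles
    block = ["ls", "cd", "ls", "grep"]    # an observed session fragment
    score_a = log_likelihood(user_a, block, vocab)
    score_b = log_likelihood(user_b, block, vocab)
    ```

    A sequential sampling rule would keep observing commands until the log-likelihood ratio between the genuine-user profile and the alternative crosses a decision threshold, trading detection delay against accuracy.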

  5. BioPCD - A Language for GUI Development Requiring a Minimal Skill Set

    PubMed Central

    Alvare, Graham GM; Roche-Lima, Abiel; Fristensky, Brian

    2016-01-01

    BioPCD is a new language whose purpose is to simplify the creation of Graphical User Interfaces (GUIs) by biologists with minimal programming skills. The first step in developing BioPCD was to create a minimal superset of the language referred to as PCD (Pythonesque Command Description). PCD defines the core of terminals and high-level nonterminals required to describe data of almost any type. BioPCD adds to PCD the constructs necessary to describe GUI components and the syntax for executing system commands. BioPCD is implemented using JavaCC to convert the grammar into code. BioPCD is designed to be terse, readable, and simple enough to be learned by copying and modifying existing BioPCD files. We demonstrate that BioPCD can easily be used to generate GUIs for existing command line programs. Although BioPCD was designed to make it easier to run bioinformatics programs, it could be used in any domain in which many useful command line programs exist that do not have GUI interfaces. PMID:27818582

  6. Perspectives of U.S. military commanders on tobacco use and tobacco control policy

    PubMed Central

    Poston, Walker S.C.; Haddock, Christopher K.; Jahnke, Sara A.; Jitnarin, Nattinee; Malone, Ruth E.; Smith, Elizabeth A.

    2016-01-01

    Background Tobacco use among U.S. military service members is unacceptably high, resulting in substantial health care and personnel costs. Support of military command is critical to the success of tobacco control policies because line commanders are responsible for implementation and enforcement. This study is the first to examine U.S. military line commanders' perspectives on current tobacco control policies and the impact of tobacco on readiness. Methods We conducted key-informant interviews with 20 officers at the U.S. Army's Command and General Staff College about military tobacco use and tobacco control policy. Results Participants identified the long-term impact of tobacco use on military members, but were unaware of proximal effects on health and readiness other than lost productivity due to smoke breaks. Officers also discussed nicotine addiction and the logistics of ensuring that an addicted population had access to tobacco. Regarding policy, most knew about regulations governing smoke-free areas and were open to stronger restrictions, but were unaware of current policies governing prevention, intervention, and product sales. Conclusions Findings suggest that strong policy that takes advantage of the hierarchical and disciplined nature of the military, supported by senior line and civilian leadership up to and including the Secretaries of the services and the Secretary of Defense, will be critical to substantially diminishing tobacco use by military personnel. PMID:27084960

  7. CRISPR Primer Designer: Design primers for knockout and chromosome imaging CRISPR-Cas system.

    PubMed

    Yan, Meng; Zhou, Shi-Rong; Xue, Hong-Wei

    2015-07-01

    The clustered regularly interspaced short palindromic repeats (CRISPR)-associated system enables biologists to edit genomes precisely and provides a powerful tool for perturbing endogenous gene regulation, modulation of epigenetic markers, and genome architecture. However, there are concerns about the specificity of the system, especially when it is used to knock out a gene. Previous design tools were mostly web-based or ran as command-line programs; none ran locally while offering a user-friendly interface. In addition, with the development of CRISPR-derived systems, such as chromosome imaging, there were still no tools helping users to generate specific end-user spacers. We herein present CRISPR Primer Designer for researchers to design primers for CRISPR applications. The program has a user-friendly interface, can analyze BLAST results using multiple parameters, score each candidate spacer, and generate the primers for a given plasmid. In addition, CRISPR Primer Designer runs locally, can be used to search spacer clusters, and exports primers for the CRISPR-Cas system-based chromosome imaging system. © 2014 Institute of Botany, Chinese Academy of Sciences.
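
    The basic spacer-search step behind such tools — scanning a sequence for fixed-length protospacers immediately 5' of an NGG PAM — can be sketched in a few lines. This illustrates the concept only (forward strand, no reverse-complement scan, no off-target BLAST scoring; the function name is hypothetical):

    ```python
    import re

    def find_spacers(seq, spacer_len=20):
        """Return (start, spacer, pam) for every NGG PAM on the forward strand
        that has a full-length spacer immediately 5' of it."""
        seq = seq.upper()
        hits = []
        for m in re.finditer(r'(?=([ACGT]GG))', seq):   # lookahead catches overlapping PAMs
            pam_start = m.start(1)
            if pam_start >= spacer_len:
                hits.append((pam_start - spacer_len,
                             seq[pam_start - spacer_len:pam_start],
                             m.group(1)))
        return hits
    ```

    A real design tool would then BLAST each candidate spacer against the genome and score mismatches to rank specificity.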

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ditzler, Gregory; Morrison, J. Calvin; Lan, Yemin

    Background: Some of the current software tools for comparative metagenomics provide ecologists with the ability to investigate and explore bacterial communities using α- & β-diversity. Feature subset selection — a sub-field of machine learning — can also provide a unique insight into the differences between metagenomic or 16S phenotypes. In particular, feature subset selection methods can obtain the operational taxonomic units (OTUs), or functional features, that have a high level of influence on the condition being studied. For example, in a previous study we have used information-theoretic feature selection to understand the differences between protein family abundances that best discriminate between age groups in the human gut microbiome. Results: We have developed a new Python command line tool, which is compatible with the widely adopted BIOM format, for microbial ecologists that implements information-theoretic subset selection methods for biological data formats. We demonstrate the software tool's capabilities on publicly available datasets. Conclusions: We have made the software implementation of Fizzy available to the public under the GNU GPL license. The standalone implementation can be found at http://github.com/EESI/Fizzy.

  9. "Just Another Tool for Online Studies” (JATOS): An Easy Solution for Setup and Management of Web Servers Supporting Online Studies

    PubMed Central

    Lange, Kristian; Kühn, Simone; Filevich, Elisa

    2015-01-01

    We present here “Just Another Tool for Online Studies” (JATOS): an open source, cross-platform web application with a graphical user interface (GUI) that greatly simplifies setting up and communicating with a web server to host online studies that are written in JavaScript. JATOS is easy to install on all three major platforms (Microsoft Windows, Mac OS X, and Linux), and seamlessly pairs with a database for secure data storage. It can be installed on a server or locally, allowing researchers to try the application and test the feasibility of their studies within a browser environment before engaging in setting up a server. All communication with the JATOS server takes place via a GUI (with no need to use a command line interface), making JATOS an especially accessible tool for researchers without a strong IT background. We describe JATOS’ main features and implementation and provide a detailed tutorial along with example studies to help interested researchers to set up their online studies. JATOS can be found under the Internet address: www.jatos.org. PMID:26114751

  10. Fizzy: feature subset selection for metagenomics.

    PubMed

    Ditzler, Gregory; Morrison, J Calvin; Lan, Yemin; Rosen, Gail L

    2015-11-04

    Some of the current software tools for comparative metagenomics provide ecologists with the ability to investigate and explore bacterial communities using α- & β-diversity. Feature subset selection--a sub-field of machine learning--can also provide a unique insight into the differences between metagenomic or 16S phenotypes. In particular, feature subset selection methods can obtain the operational taxonomic units (OTUs), or functional features, that have a high-level of influence on the condition being studied. For example, in a previous study we have used information-theoretic feature selection to understand the differences between protein family abundances that best discriminate between age groups in the human gut microbiome. We have developed a new Python command line tool, which is compatible with the widely adopted BIOM format, for microbial ecologists that implements information-theoretic subset selection methods for biological data formats. We demonstrate the software tools capabilities on publicly available datasets. We have made the software implementation of Fizzy available to the public under the GNU GPL license. The standalone implementation can be found at http://github.com/EESI/Fizzy.
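
    The underlying idea — ranking taxa or functional features by their statistical dependence on the study condition — can be sketched with plain mutual information. Fizzy itself wraps several information-theoretic criteria and reads BIOM tables; this sketch is a bare MI ranking on already-discretized abundance values, with names of our own choosing:

    ```python
    import math
    from collections import Counter

    def mutual_information(xs, ys):
        """I(X;Y) in nats from paired discrete observations."""
        n = len(xs)
        pxy = Counter(zip(xs, ys))
        px, py = Counter(xs), Counter(ys)
        return sum((c / n) * math.log((c * n) / (px[x] * py[y]))
                   for (x, y), c in pxy.items())

    def rank_features(table, labels):
        """table: features x samples of discrete (e.g. presence/absence) values.
        Returns feature indices ordered by decreasing MI with the labels."""
        scores = [mutual_information(row, labels) for row in table]
        return sorted(range(len(table)), key=lambda i: -scores[i])

    labels = [0, 0, 0, 1, 1, 1]              # e.g. two age groups
    table = [[0, 0, 0, 1, 1, 1],             # OTU perfectly predictive of the label
             [1, 1, 1, 1, 1, 1],             # OTU present everywhere: zero information
             [0, 1, 0, 1, 0, 1]]             # OTU weakly related to the label
    order = rank_features(table, labels)
    ```

    Criteria such as JMI or mRMR extend this by penalizing redundancy among already-selected features, which matters when many OTUs co-vary.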

  11. India Allele Finder: a web-based annotation tool for identifying common alleles in next-generation sequencing data of Indian origin.

    PubMed

    Zhang, Jimmy F; James, Francis; Shukla, Anju; Girisha, Katta M; Paciorkowski, Alex R

    2017-06-27

    We built India Allele Finder, an online searchable database and command-line tool that gives researchers access to variant frequencies of Indian Telugu individuals, using publicly available fastq data from the 1000 Genomes Project. Access to appropriate population-based genomic variant annotation can accelerate the interpretation of genomic sequencing data. In particular, exome analysis of individuals of Indian descent will identify population variants not reflected in European exomes, complicating genomic analysis for such individuals. India Allele Finder offers improved ease of use to investigators seeking to identify and annotate sequencing data from Indian populations. We describe the use of India Allele Finder to identify common population variants in a disease-quartet whole-exome dataset, reducing the number of candidate single nucleotide variants from 84 to 7. India Allele Finder is freely available to investigators to annotate genomic sequencing data from Indian populations. Use of India Allele Finder allows efficient identification of population variants in genomic sequencing data, and is an example of a population-specific annotation tool that simplifies analysis and encourages international collaboration in genomics research.
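
    The annotation step this supports — removing candidate variants that are common in the reference population — amounts to a frequency filter. A minimal sketch with hypothetical variant keys and frequencies (not the India Allele Finder API):

    ```python
    def filter_common(candidates, pop_freq, max_af=0.01):
        """Keep only variants that are rare (or absent) in the reference population.

        candidates: variant keys such as 'chr1:12345:A>G'
        pop_freq:   variant key -> allele frequency in the population panel
        """
        return [v for v in candidates if pop_freq.get(v, 0.0) <= max_af]

    # Hypothetical data: one common population variant, one rare, one novel.
    freqs = {"chr1:12345:A>G": 0.35, "chr2:67890:C>T": 0.004}
    kept = filter_common(["chr1:12345:A>G", "chr2:67890:C>T", "chr3:11111:G>A"],
                         freqs)
    ```

    Filtering against the matched population panel, rather than a European one, is what shrinks the candidate list (84 to 7 in the quartet example above).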

  12. Fizzy. Feature subset selection for metagenomics

    DOE PAGES

    Ditzler, Gregory; Morrison, J. Calvin; Lan, Yemin; ...

    2015-11-04

    Background: Some of the current software tools for comparative metagenomics provide ecologists with the ability to investigate and explore bacterial communities using α- & β-diversity. Feature subset selection — a sub-field of machine learning — can also provide a unique insight into the differences between metagenomic or 16S phenotypes. In particular, feature subset selection methods can obtain the operational taxonomic units (OTUs), or functional features, that have a high level of influence on the condition being studied. For example, in a previous study we have used information-theoretic feature selection to understand the differences between protein family abundances that best discriminate between age groups in the human gut microbiome. Results: We have developed a new Python command line tool, which is compatible with the widely adopted BIOM format, for microbial ecologists that implements information-theoretic subset selection methods for biological data formats. We demonstrate the software tool's capabilities on publicly available datasets. Conclusions: We have made the software implementation of Fizzy available to the public under the GNU GPL license. The standalone implementation can be found at http://github.com/EESI/Fizzy.

  13. SICLE: a high-throughput tool for extracting evolutionary relationships from phylogenetic trees.

    PubMed

    DeBlasio, Dan F; Wisecaver, Jennifer H

    2016-01-01

    We present the phylogeny analysis software SICLE (Sister Clade Extractor), an easy-to-use, high-throughput tool to describe the nearest neighbors to a node of interest in a phylogenetic tree as well as the support value for the relationship. The application is a command line utility that can be embedded into a phylogenetic analysis pipeline or can be used as a subroutine within another C++ program. As a test case, we applied this new tool to the published phylome of Salinibacter ruber, a species of halophilic Bacteroidetes, identifying 13 unique sister relationships to S. ruber across the 4,589 gene phylogenies. S. ruber grouped with bacteria, most often other Bacteroidetes, in the majority of phylogenies, but 91 phylogenies showed a branch-supported sister association between S. ruber and Archaea, an evolutionarily intriguing relationship indicative of horizontal gene transfer. This test case demonstrates how SICLE makes it possible to summarize the phylogenetic information produced by automated phylogenetic pipelines to rapidly identify and quantify the possible evolutionary relationships that merit further investigation. SICLE is available for free for noncommercial use at http://eebweb.arizona.edu/sicle/.
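
    The core operation SICLE performs — reporting the sister clade of a node of interest — can be illustrated on a toy binary tree. SICLE itself parses Newick trees and reports support values; this sketch uses nested Python tuples, ignores support, and the function name is our own:

    ```python
    def sister_clade(tree, target):
        """Return the sister subtree of the leaf `target` in a binary tree
        given as nested 2-tuples of leaf-name strings; None if absent."""
        if not isinstance(tree, tuple):
            return None
        left, right = tree
        for side, other in ((left, right), (right, left)):
            if side == target:
                return other            # sister is the other child of the parent
            found = sister_clade(side, target)
            if found is not None:
                return found
        return None

    # ((S_ruber, Bacteroidetes_sp), (E_coli, B_subtilis))
    tree = (("S_ruber", "Bacteroidetes_sp"), ("E_coli", "B_subtilis"))
    ```

    Run over thousands of gene trees, tallying these sister labels is what yields summaries like "13 unique sister relationships across 4,589 phylogenies."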

  14. ColorTree: a batch customization tool for phylogenic trees

    PubMed Central

    Chen, Wei-Hua; Lercher, Martin J

    2009-01-01

    Background Genome sequencing projects and comparative genomics studies typically aim to trace the evolutionary history of large gene sets, often requiring human inspection of hundreds of phylogenetic trees. If trees are checked for compatibility with an explicit null hypothesis (e.g., the monophyly of certain groups), this daunting task is greatly facilitated by an appropriate coloring scheme. Findings In this note, we introduce ColorTree, a simple yet powerful batch customization tool for phylogenic trees. Based on pattern matching rules, ColorTree applies a set of customizations to an input tree file, e.g., coloring labels or branches. The customized trees are saved to an output file, which can then be viewed and further edited by Dendroscope (a freely available tree viewer). ColorTree runs on any Perl installation as a stand-alone command line tool, and its application can thus be easily automated. This way, hundreds of phylogenic trees can be customized for easy visual inspection in a matter of minutes. Conclusion ColorTree allows efficient and flexible visual customization of large tree sets through the application of a user-supplied configuration file to multiple tree files. PMID:19646243

  15. ColorTree: a batch customization tool for phylogenic trees.

    PubMed

    Chen, Wei-Hua; Lercher, Martin J

    2009-07-31

    Genome sequencing projects and comparative genomics studies typically aim to trace the evolutionary history of large gene sets, often requiring human inspection of hundreds of phylogenetic trees. If trees are checked for compatibility with an explicit null hypothesis (e.g., the monophyly of certain groups), this daunting task is greatly facilitated by an appropriate coloring scheme. In this note, we introduce ColorTree, a simple yet powerful batch customization tool for phylogenic trees. Based on pattern matching rules, ColorTree applies a set of customizations to an input tree file, e.g., coloring labels or branches. The customized trees are saved to an output file, which can then be viewed and further edited by Dendroscope (a freely available tree viewer). ColorTree runs on any Perl installation as a stand-alone command line tool, and its application can thus be easily automated. This way, hundreds of phylogenic trees can be customized for easy visual inspection in a matter of minutes. ColorTree allows efficient and flexible visual customization of large tree sets through the application of a user-supplied configuration file to multiple tree files.

  16. KBWS: an EMBOSS associated package for accessing bioinformatics web services.

    PubMed

    Oshita, Kazuki; Arakawa, Kazuharu; Tomita, Masaru

    2011-04-29

    The availability of bioinformatics web-based services is rapidly proliferating, for their interoperability and ease of use. The next challenge is in the integration of these services in the form of workflows, and several projects are already underway, standardizing the syntax, semantics, and user interfaces. In order to deploy the advantages of web services with locally installed tools, here we describe a collection of proxy client tools for 42 major bioinformatics web services in the form of European Molecular Biology Open Software Suite (EMBOSS) UNIX command-line tools. EMBOSS provides sophisticated means for discoverability and interoperability for hundreds of tools, and our package, named the Keio Bioinformatics Web Service (KBWS), adds functionalities of local and multiple alignment of sequences, phylogenetic analyses, and prediction of cellular localization of proteins and RNA secondary structures. This software implemented in C is available under GPL from http://www.g-language.org/kbws/ and GitHub repository http://github.com/cory-ko/KBWS. Users can utilize the SOAP services implemented in Perl directly via WSDL file at http://soap.g-language.org/kbws.wsdl (RPC Encoded) and http://soap.g-language.org/kbws_dl.wsdl (Document/literal).

  17. Galaxy tools and workflows for sequence analysis with applications in molecular plant pathology

    PubMed Central

    Grüning, Björn A.; Paszkiewicz, Konrad; Pritchard, Leighton

    2013-01-01

    The Galaxy Project offers the popular web browser-based platform Galaxy for running bioinformatics tools and constructing simple workflows. Here, we present a broad collection of additional Galaxy tools for large scale analysis of gene and protein sequences. The motivating research theme is the identification of specific genes of interest in a range of non-model organisms, and our central example is the identification and prediction of “effector” proteins produced by plant pathogens in order to manipulate their host plant. This functional annotation of a pathogen’s predicted capacity for virulence is a key step in translating sequence data into potential applications in plant pathology. This collection includes novel tools, and widely-used third-party tools such as NCBI BLAST+ wrapped for use within Galaxy. Individual bioinformatics software tools are typically available separately as standalone packages, or in online browser-based form. The Galaxy framework enables the user to combine these and other tools to automate organism scale analyses as workflows, without demanding familiarity with command line tools and scripting. Workflows created using Galaxy can be saved and are reusable, so may be distributed within and between research groups, facilitating the construction of a set of standardised, reusable bioinformatic protocols. The Galaxy tools and workflows described in this manuscript are open source and freely available from the Galaxy Tool Shed (http://usegalaxy.org/toolshed or http://toolshed.g2.bx.psu.edu). PMID:24109552

  18. Use of Spacecraft Command Language for Advanced Command and Control Applications

    NASA Technical Reports Server (NTRS)

    Mims, Tikiela L.

    2008-01-01

    The purpose of this work is to evaluate the use of SCL in building and monitoring command and control applications in order to determine its fitness for space operations. Approximately 24,325 lines of PCG2 code were converted to SCL, yielding a 90% reduction in the number of lines of code, as many of the functions and scripts utilized in SCL could be ported and reused. Automated standalone testing, simulating the actual production environment, was performed in order to generalize and gauge the relative time it takes for SCL to update and write a given display. The use of SCL rules, functions, and scripts allowed the creation of several test cases permitting the detection of the amount of time it takes to update a given set of measurements given the change in a globally existing CUI or CUI. It took the SCL system an average of 926.09 ticks to update the entire display of 323 measurements.

  19. SIG: a general-purpose signal processing program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lager, D.; Azevedo, S.

    1986-02-01

    SIG is a general-purpose signal processing, analysis, and display program. Its main purpose is to perform manipulations on time- and frequency-domain signals. It also accommodates other representations for data such as transfer function polynomials. Signal processing operations include digital filtering, auto/cross spectral density, transfer function/impulse response, convolution, Fourier transform, and inverse Fourier transform. Graphical operations provide display of signals and spectra, including plotting, cursor zoom, families of curves, and multiple viewport plots. SIG provides two user interfaces: a menu mode for occasional users and a command mode for more experienced users. Capability exists for multiple commands per line, command files with arguments, commenting lines, defining commands, automatic execution for each item in a repeat sequence, etc. SIG is presently available for VAX (VMS), VAX (BERKELEY 4.2 UNIX), SUN (BERKELEY 4.2 UNIX), DEC-20 (TOPS-20), LSI-11/23 (TSX), and DEC PRO 350 (TSX). 4 refs., 2 figs.

  20. Command and Control Warfare. Putting Another Tool in the War-Fighter’s Data Base

    DTIC Science & Technology

    1994-09-01

    information dominance, friendly commanders will be able to work inside the enemy commander’s decision-making cycle, forcing him to be reactive and thus cede the initiative and advantage to friendly forces. In any conflict, from large-scale transregional to small-scale, localized counter-insurgency, a joint or coalition team drawn together from the capabilities of each service and orchestrated by the joint force or theater-level commander will execute the responses of the United States armed forces. Units should perform their specific roles in accordance with the

  1. Command in a field hospital.

    PubMed

    Bricknell, M C M

    2003-03-01

    This paper examines the challenges involved in commanding a field hospital. There are frequent, dynamic tensions between the military culture that is based on a task-focussed, hierarchical structure and the clinical culture that is based on flat, process-focussed, multidisciplinary teams. The paper outlines the cultural environment of the field hospital and then examines the deployment sequence whereby a functioning clinical facility may be created from a group of disparate individuals. There are a number of tools that may assist with this including the personality of the Commanding Officer, individual skills, the creation of an organizational identity and the choice of command structure.

  2. MEA-Tools: an open source toolbox for the analysis of multi-electrode data with MATLAB.

    PubMed

    Egert, U; Knott, Th; Schwarz, C; Nawrot, M; Brandt, A; Rotter, S; Diesmann, M

    2002-05-30

    Recent advances in electrophysiological techniques have created new tools for the acquisition and storage of neuronal activity recorded simultaneously with numerous electrodes. These techniques support the analysis of the function as well as the structure of individual electrogenic cells in the context of the surrounding neuronal or cardiac network. Commercially available tools for the analysis of such data, however, cannot be easily adapted to newly emerging requirements for data analysis and visualization, and cross-compatibility between them is limited. In this report we introduce a free open source toolbox called microelectrode array tools (MEA-Tools) for the analysis of multi-electrode data based on the common data analysis environment MATLAB (version 5.3-6.1, The Mathworks, Natick, MA). The toolbox itself is platform independent. The file interface currently supports files recorded with MCRack (Multi Channel Systems, Reutlingen, Germany) under Microsoft Windows 95, 98, NT, and 2000, but can be adapted to other data acquisition systems. Functions are controlled via command line input and graphical user interfaces, and support common requirements for the analysis of local field potentials, extracellular spike activity, and continuous recordings, in addition to supplementary data acquired by additional instruments, e.g. intracellular amplifiers. Data may be processed as continuous recordings or time windows triggered to some event.
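    The event-triggered time windows mentioned at the end of this record are a common analysis primitive; the following generic sketch (an illustration, not MEA-Tools code) cuts fixed-length windows from a continuous recording around a list of event sample indices:

    ```python
    def peri_event_windows(signal, event_indices, pre, post):
        """Cut windows [i - pre, i + post) around each event index.

        Events too close to the edges of the recording are skipped.
        """
        windows = []
        for i in event_indices:
            if i - pre >= 0 and i + post <= len(signal):
                windows.append(signal[i - pre:i + post])
        return windows

    # Example: a 10-sample recording with events at samples 3 and 8.
    rec = [0, 0, 1, 5, 1, 0, 0, 2, 6, 2]
    print(peri_event_windows(rec, [3, 8], pre=1, post=2))  # [[1, 5, 1], [2, 6, 2]]
    ```

    The same cutting step applies whether the events are stimulus triggers or detected spikes.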

  3. Integrated design, execution, and analysis of arrayed and pooled CRISPR genome-editing experiments.

    PubMed

    Canver, Matthew C; Haeussler, Maximilian; Bauer, Daniel E; Orkin, Stuart H; Sanjana, Neville E; Shalem, Ophir; Yuan, Guo-Cheng; Zhang, Feng; Concordet, Jean-Paul; Pinello, Luca

    2018-05-01

    CRISPR (clustered regularly interspaced short palindromic repeats) genome-editing experiments offer enormous potential for the evaluation of genomic loci using arrayed single guide RNAs (sgRNAs) or pooled sgRNA libraries. Numerous computational tools are available to help design sgRNAs with optimal on-target efficiency and minimal off-target potential. In addition, computational tools have been developed to analyze deep-sequencing data resulting from genome-editing experiments. However, these tools are typically developed in isolation and oftentimes are not readily translatable into laboratory-based experiments. Here, we present a protocol that describes in detail both the computational and benchtop implementation of an arrayed and/or pooled CRISPR genome-editing experiment. This protocol provides instructions for sgRNA design with CRISPOR (computational tool for the design, evaluation, and cloning of sgRNA sequences), experimental implementation, and analysis of the resulting high-throughput sequencing data with CRISPResso (computational tool for analysis of genome-editing outcomes from deep-sequencing data). This protocol allows for design and execution of arrayed and pooled CRISPR experiments in 4-5 weeks by non-experts, as well as computational data analysis that can be performed in 1-2 d by both computational and noncomputational biologists alike using web-based and/or command-line versions.
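    As a highly simplified illustration of the sgRNA design step (not CRISPOR's actual search or scoring), the following scans the plus strand of a sequence for 20-nt protospacers followed by an NGG PAM, the motif required by SpCas9:

    ```python
    def find_sgRNA_sites(seq, guide_len=20):
        """Return (start, protospacer, PAM) for each NGG PAM on the + strand."""
        seq = seq.upper()
        sites = []
        for i in range(guide_len, len(seq) - 2):
            pam = seq[i:i + 3]
            if pam[1:] == "GG":          # NGG PAM for SpCas9
                sites.append((i - guide_len, seq[i - guide_len:i], pam))
        return sites

    demo = "A" * 20 + "TGGCC"
    print(find_sgRNA_sites(demo))  # [(0, 'AAAAAAAAAAAAAAAAAAAA', 'TGG')]
    ```

    Real design tools additionally score on-target efficiency and search the genome for off-target matches; this sketch shows only the site enumeration.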

  4. A gimbal platform stabilization for topographic applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mangiameli, Michele, E-mail: michele.mangiameli@dica.unict.it; Mussumeci, Giuseppe

    2015-03-10

    The aim of this work is the stabilization of a Gimbal platform for optical sensor acquisitions in topographic applications using mobile vehicles. The stabilization of the line of sight (LOS) consists in tracking the command velocity in the presence of nonlinear noise due to the external environment. The hardware architecture is characterized by an Ardupilot platform that allows the control of both the mobile device and the Gimbal. Here we developed a new approach to stabilize the Gimbal platform, which is based on a neural network. For the control system, we considered a plant that represents the transfer function of the servo system control model for an inertially stabilized Gimbal platform. The transducer used in the feedback line control is characterized by the Rate Gyro transfer function installed on board the Ardupilot. For the simulation and investigation of the system performance, we used the Simulink tool of Matlab. Results show that the hardware/software approach is efficient, reliable and cheap for direct photogrammetry, as well as for general-purpose applications using mobile vehicles.
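    The velocity-tracking loop described here can be illustrated with a classical discrete-time PI controller driving a first-order servo plant, with the measured rate fed back as the gyro would provide it. This is a baseline sketch, not the paper's neural-network controller, and the gains and plant time constant are arbitrary:

    ```python
    def simulate_pi_tracking(setpoint, steps=200, dt=0.01, kp=2.0, ki=5.0, tau=0.05):
        """Discrete PI loop: first-order plant dv/dt = (u - v)/tau, with v fed back."""
        v, integral = 0.0, 0.0
        for _ in range(steps):
            error = setpoint - v            # rate-gyro measurement in the feedback line
            integral += error * dt
            u = kp * error + ki * integral  # PI velocity command
            v += (u - v) / tau * dt         # plant response (forward Euler)
        return v

    # The tracked rate converges toward the commanded setpoint of 1.0 rad/s.
    print(simulate_pi_tracking(1.0))
    ```

    A neural-network controller would replace the fixed PI law with a learned mapping, but the feedback structure around the plant is the same.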

  5. Building oceanographic and atmospheric observation networks by composition: unmanned vehicles, communication networks, and planning and execution control frameworks

    NASA Astrophysics Data System (ADS)

    Sousa, J. T.; Pinto, J.; Martins, R.; Costa, M.; Ferreira, F.; Gomes, R.

    2014-12-01

    The problem of developing mobile oceanographic and atmospheric observation networks (MOAO) with coordinated air and ocean vehicles is discussed in the framework of the communications and control software tool chain developed at Underwater Systems and Technologies Laboratory (LSTS) from Porto University. This is done with reference to field experiments to illustrate key capabilities and to assess future MOAO operations. First, the motivation for building MOAO by "composition" of air and ocean vehicles, communication networks, and planning and execution control frameworks is discussed - in networked vehicle systems information and commands are exchanged among multiple vehicles and operators, and the roles, relative positions, and dependencies of these vehicles and operators change during operations. Second, the planning and execution control framework developed at LSTS for multi-vehicle systems is discussed with reference to key concepts such as autonomy, mixed-initiative interactions, and layered organization. Third, the LSTS software tool chain is presented to show how to develop MOAO by composition. The tool chain comprises the Neptus command and control framework for mixed initiative interactions, the underlying IMC messaging protocol, and the DUNE on-board software. Fourth, selected LSTS operational deployments illustrate MOAO capability building. In 2012 we demonstrated the use of UAS to "ferry" data from UUVs located beyond line of sight (BLOS). In 2013 we demonstrated coordinated observations of coastal fronts with small UAS and UUVs, "bent" BLOS through the use of UAS as communication relays, and UAS tracking of juvenile hammerhead sharks. In 2014 we demonstrated UUV adaptive sampling with the closed loop controller of the UUV residing on a UAS; this was done with the help of a Wave Glider ASV with a communications gateway. The results from these experiments provide a background for assessing potential future UAS operations in a compositional MOAO.

  6. MySQL/PHP web database applications for IPAC proposal submission

    NASA Astrophysics Data System (ADS)

    Crane, Megan K.; Storrie-Lombardi, Lisa J.; Silbermann, Nancy A.; Rebull, Luisa M.

    2008-07-01

    The Infrared Processing and Analysis Center (IPAC) is NASA's multi-mission center of expertise for long-wavelength astrophysics. Proposals for various IPAC missions and programs are ingested via MySQL/PHP web database applications. Proposers use web forms to enter coversheet information and upload PDF files related to the proposal. Upon proposal submission, a unique directory is created on the webserver into which all of the uploaded files are placed. The coversheet information is converted into a PDF file using a PHP extension called FPDF. The files are concatenated into one PDF file using the command-line tool pdftk and then forwarded to the review committee. This work was performed at the California Institute of Technology under contract to the National Aeronautics and Space Administration.
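    The pdftk concatenation step can be scripted; this sketch (file names hypothetical) builds the `pdftk ... cat output ...` command that merges a generated coversheet with the uploaded proposal files in order:

    ```python
    def merge_pdfs_cmd(coversheet, uploads, output):
        """Build the pdftk command that concatenates the PDFs in order.

        Execute with subprocess.run(cmd, check=True) where pdftk is installed.
        """
        return ["pdftk", coversheet, *uploads, "cat", "output", output]

    cmd = merge_pdfs_cmd("coversheet.pdf", ["science_case.pdf", "budget.pdf"], "proposal.pdf")
    print(" ".join(cmd))  # pdftk coversheet.pdf science_case.pdf budget.pdf cat output proposal.pdf
    ```

    Keeping the command as an argument list (rather than a shell string) avoids quoting problems with unusual file names.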

  7. WCSTools 3.0: More Tools for Image Astrometry and Catalog Searching

    NASA Astrophysics Data System (ADS)

    Mink, Douglas J.

    For five years, WCSTools has provided image astrometry for astronomers who need accurate positions for objects they wish to observe. Other functions have been added and improved since the package was first released. Support has been added for new catalogs, such as the GSC-ACT, 2MASS Point Source Catalog, and GSC II, as they have been published. A simple command line interface can search any supported catalog, returning information in several standard formats, whether the catalog is on a local disk or searchable over the World Wide Web. The catalog searching routine can be located on either end (or both ends!) of such a web connection, and the output from one catalog search can be used as the input to another search.

  8. Gromita: a fully integrated graphical user interface to gromacs 4.

    PubMed

    Sellis, Diamantis; Vlachakis, Dimitrios; Vlassi, Metaxia

    2009-09-07

    Gromita is a fully integrated and efficient graphical user interface (GUI) to the recently updated molecular dynamics suite Gromacs, version 4. Gromita is a cross-platform, perl/tcl-tk based, interactive front end designed to break the command line barrier and introduce a new user-friendly environment to run molecular dynamics simulations through Gromacs. Our GUI features a novel workflow interface that guides the user through each logical step of the molecular dynamics setup process, making it accessible to both advanced and novice users. This tool provides a seamless interface to the Gromacs package, while providing enhanced functionality by speeding up and simplifying the task of setting up molecular dynamics simulations of biological systems. Gromita can be freely downloaded from http://bio.demokritos.gr/gromita/.

  9. Sequence History Update Tool

    NASA Technical Reports Server (NTRS)

    Khanampompan, Teerapat; Gladden, Roy; Fisher, Forest; DelGuercio, Chris

    2008-01-01

    The Sequence History Update Tool performs Web-based sequence statistics archiving for Mars Reconnaissance Orbiter (MRO). Using a single UNIX command, the software takes advantage of sequencing conventions to automatically extract the needed statistics from multiple files. This information is then used to populate a database, which is then seamlessly formatted into a dynamic PHP Web page. This tool replaces a previous tedious and error-prone process of manually editing HTML code to construct a Web-based table. Because the tool manages all of the statistics gathering and file delivery to and from multiple data sources spread across multiple servers, there is also a considerable time and effort savings. With the Sequence History Update Tool, what previously took minutes is now done in less than 30 seconds, and the tool provides a more accurate archival record of the sequence commanding for MRO.

  10. User guide to a command and control system; a part of a prelaunch wind monitoring program

    NASA Technical Reports Server (NTRS)

    Cowgill, G. R.

    1976-01-01

    This user guide describes a set of programs called the Command and Control System (CCS) and the operation of CCS by the personnel supporting the wind-monitoring portion of the launch mission. Wind data obtained by tracking balloons are sent electronically over telephone lines to other locations. A system called ADDJUST computes steering commands for the on-board computer and relays this data. Data are received and automatically stored in a microprocessor, then transferred via a real-time program to the UNIVAC 1100/40 computer. At this point the data are available for use by the Command and Control System.

  11. 46 CFR 45.11 - Issue of load line certificate.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 2 2010-10-01 2010-10-01 false Issue of load line certificate. 45.11 Section 45.11... § 45.11 Issue of load line certificate. (a) A vessel 79 feet in length and more, and 150 gross tons or... issue of a load line certificate under this part by the Commandant or his authorized representative. (c...

  12. 2006 NASA Range Safety Annual Report

    NASA Technical Reports Server (NTRS)

    TenHaken, Ron; Daniels, B.; Becker, M.; Barnes, Zack; Donovan, Shawn; Manley, Brenda

    2007-01-01

    Throughout 2006, Range Safety was involved in a number of exciting and challenging activities and events, from developing, implementing, and supporting Range Safety policies and procedures-such as the Space Shuttle Launch and Landing Plans, the Range Safety Variance Process, and the Expendable Launch Vehicle Safety Program procedures-to evaluating new technologies. Range Safety training development is almost complete with the last course scheduled to go on line in mid-2007. Range Safety representatives took part in a number of panels and councils, including the newly formed Launch Constellation Range Safety Panel, the Range Commanders Council and its subgroups, the Space Shuttle Range Safety Panel, and the unmanned aircraft systems working group. Space based range safety demonstration and certification (formerly STARS) and the autonomous flight safety system were successfully tested. The enhanced flight termination system will be tested in early 2007 and the joint advanced range safety system mission analysis software tool is nearing operational status. New technologies being evaluated included a processor for real-time compensation in long range imaging, automated range surveillance using radio interferometry, and a space based range command and telemetry processor. Next year holds great promise as we continue ensuring safety while pursuing our quest beyond the Moon to Mars.

  13. Analysis and Visualization of ChIP-Seq and RNA-Seq Sequence Alignments Using ngs.plot.

    PubMed

    Loh, Yong-Hwee Eddie; Shen, Li

    2016-01-01

    The continual maturation and increasing applications of next-generation sequencing technology in scientific research have yielded ever-increasing amounts of data that need to be effectively and efficiently analyzed and innovatively mined for new biological insights. We have developed ngs.plot, a quick and easy-to-use bioinformatics tool that performs visualizations of the spatial relationships between sequencing alignment enrichment and specific genomic features or regions. More importantly, ngs.plot is customizable beyond the use of standard genomic feature databases to allow the analysis and visualization of user-specified regions of interest generated by the user's own hypotheses. In this protocol, we demonstrate and explain the use of ngs.plot using command line executions, as well as a web-based workflow on the Galaxy framework. We replicate the underlying commands used in the analysis of a true biological dataset that we had reported and published earlier and demonstrate how ngs.plot can easily generate publication-ready figures. With ngs.plot, users are able to efficiently and innovatively mine their own datasets without having to be involved in the technical aspects of sequence coverage calculations and genomic databases.
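    The core computation behind such enrichment plots, averaging per-base coverage in windows centred on features such as transcription start sites, can be sketched generically (an illustration of the idea, not ngs.plot's implementation):

    ```python
    def average_profile(coverage, centers, flank):
        """Mean coverage in [c - flank, c + flank] across all centers.

        `coverage` is per-base read depth; centers too near an edge are skipped.
        """
        width = 2 * flank + 1
        total, n = [0.0] * width, 0
        for c in centers:
            if c - flank >= 0 and c + flank < len(coverage):
                n += 1
                for k in range(width):
                    total[k] += coverage[c - flank + k]
        return [t / n for t in total] if n else total

    # Two toy "features" at positions 2 and 7 with identical local enrichment.
    cov = [0, 1, 4, 1, 0, 0, 1, 4, 1, 0]
    print(average_profile(cov, [2, 7], flank=1))  # [1.0, 4.0, 1.0]
    ```

    Plotting the returned profile against position relative to the feature centre gives the familiar meta-gene enrichment curve.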

  14. BASTet

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bowen, Benjamin; Ruebel, Oliver; Fischer, Curt R.

    BASTet is an advanced software library written in Python. BASTet serves as the analysis and storage library for the OpenMSI project. BASTet is an integrated framework for: i) storage of spectral imaging data, ii) storage of derived analysis data, iii) provenance of analyses, and iv) integration and execution of analyses via complex workflows. BASTet implements the API for the HDF5 storage format used by OpenMSI. Analyses that are developed using BASTet benefit from direct integration with the storage format, automatic tracking of provenance, and direct integration with command-line and workflow execution tools. BASTet also defines interfaces to enable developers to directly integrate their analysis with OpenMSI's web-based viewing infrastructure without having to know OpenMSI. BASTet also provides numerous helper classes and tools to assist with the conversion of data files, ease parallel implementation of analysis algorithms, ease interaction with web-based functions, and describe methods for data reduction. BASTet also includes detailed developer documentation, user tutorials, iPython notebooks, and other supporting documents.

  15. Phyx: phylogenetic tools for unix.

    PubMed

    Brown, Joseph W; Walker, Joseph F; Smith, Stephen A

    2017-06-15

    The ease with which phylogenomic data can be generated has drastically escalated the computational burden for even routine phylogenetic investigations. To address this, we present phyx: a collection of programs written in C++ to explore, manipulate, analyze and simulate phylogenetic objects (alignments, trees and MCMC logs). Modelled after Unix/GNU/Linux command line tools, individual programs perform a single task and operate on standard I/O streams that can be piped to quickly and easily form complex analytical pipelines. Because of the stream-centric paradigm, memory requirements are minimized (often only a single tree or sequence in memory at any instance), and hence phyx is capable of efficiently processing very large datasets. phyx runs on POSIX-compliant operating systems. Source code, installation instructions, documentation and example files are freely available under the GNU General Public License at https://github.com/FePhyFoFum/phyx. Contact: eebsmith@umich.edu. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press.
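    The stream-centric paradigm described here is the classic Unix filter pattern: read records from stdin, transform, write to stdout, so that programs compose into pipelines. A minimal stand-alone example of such a filter (not one of the phyx programs) that uppercases FASTA sequence lines:

    ```python
    import sys

    def filter_stream(lines):
        """Pass FASTA headers through unchanged; normalise sequence lines to uppercase."""
        for line in lines:
            yield line if line.startswith(">") else line.upper()

    # In a real filter this would be: sys.stdout.writelines(filter_stream(sys.stdin)),
    # letting the script compose in a pipeline: cat seqs.fa | python upcase.py | next_tool
    print(list(filter_stream([">seq1\n", "acgt\n"])))
    ```

    Because the generator holds only one line at a time, memory use stays constant regardless of input size, which is exactly the property the phyx authors exploit for large datasets.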

  16. Phyx: phylogenetic tools for unix

    PubMed Central

    Brown, Joseph W.; Walker, Joseph F.; Smith, Stephen A.

    2017-01-01

    Abstract Summary: The ease with which phylogenomic data can be generated has drastically escalated the computational burden for even routine phylogenetic investigations. To address this, we present phyx: a collection of programs written in C++ to explore, manipulate, analyze and simulate phylogenetic objects (alignments, trees and MCMC logs). Modelled after Unix/GNU/Linux command line tools, individual programs perform a single task and operate on standard I/O streams that can be piped to quickly and easily form complex analytical pipelines. Because of the stream-centric paradigm, memory requirements are minimized (often only a single tree or sequence in memory at any instance), and hence phyx is capable of efficiently processing very large datasets. Availability and Implementation: phyx runs on POSIX-compliant operating systems. Source code, installation instructions, documentation and example files are freely available under the GNU General Public License at https://github.com/FePhyFoFum/phyx. Contact: eebsmith@umich.edu. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28174903

  17. Inselect: Automating the Digitization of Natural History Collections

    PubMed Central

    Hudson, Lawrence N.; Blagoderov, Vladimir; Heaton, Alice; Holtzhausen, Pieter; Livermore, Laurence; Price, Benjamin W.; van der Walt, Stéfan; Smith, Vincent S.

    2015-01-01

    The world’s natural history collections constitute an enormous evidence base for scientific research on the natural world. To facilitate these studies and improve access to collections, many organisations are embarking on major programmes of digitization. This requires automated approaches to mass-digitization that support rapid imaging of specimens and associated data capture, in order to process the tens of millions of specimens common to most natural history collections. In this paper we present Inselect—a modular, easy-to-use, cross-platform suite of open-source software tools that supports the semi-automated processing of specimen images generated by natural history digitization programmes. The software is made up of a Windows, Mac OS X, and Linux desktop application, together with command-line tools that are designed for unattended operation on batches of images. Blending image visualisation algorithms that automatically recognise specimens together with workflows to support post-processing tasks such as barcode reading, label transcription and metadata capture, Inselect fills a critical gap to increase the rate of specimen digitization. PMID:26599208

  18. Inselect: Automating the Digitization of Natural History Collections.

    PubMed

    Hudson, Lawrence N; Blagoderov, Vladimir; Heaton, Alice; Holtzhausen, Pieter; Livermore, Laurence; Price, Benjamin W; van der Walt, Stéfan; Smith, Vincent S

    2015-01-01

    The world's natural history collections constitute an enormous evidence base for scientific research on the natural world. To facilitate these studies and improve access to collections, many organisations are embarking on major programmes of digitization. This requires automated approaches to mass-digitization that support rapid imaging of specimens and associated data capture, in order to process the tens of millions of specimens common to most natural history collections. In this paper we present Inselect, a modular, easy-to-use, cross-platform suite of open-source software tools that supports the semi-automated processing of specimen images generated by natural history digitization programmes. The software is made up of a Windows, Mac OS X, and Linux desktop application, together with command-line tools that are designed for unattended operation on batches of images. Blending image visualisation algorithms that automatically recognise specimens together with workflows to support post-processing tasks such as barcode reading, label transcription and metadata capture, Inselect fills a critical gap to increase the rate of specimen digitization.

  19. Performance management of multiple access communication networks

    NASA Astrophysics Data System (ADS)

    Lee, Suk; Ray, Asok

    1993-12-01

    This paper focuses on conceptual design, development, and implementation of a performance management tool for computer communication networks to serve large-scale integrated systems. The objective is to improve the network performance in handling various types of messages by on-line adjustment of protocol parameters. The techniques of perturbation analysis of Discrete Event Dynamic Systems (DEDS), stochastic approximation (SA), and learning automata have been used in formulating the algorithm of performance management. The efficacy of the performance management tool has been demonstrated on a network testbed. The conceptual design presented in this paper offers a step forward to bridging the gap between management standards and users' demands for efficient network operations since most standards such as ISO (International Standards Organization) and IEEE address only the architecture, services, and interfaces for network management. The proposed concept of performance management can also be used as a general framework to assist design, operation, and management of various DEDS such as computer integrated manufacturing and battlefield C³ (Command, Control, and Communications).
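    The on-line parameter adjustment described above can be illustrated with a textbook Robbins-Monro stochastic-approximation update; the decaying gain and the noisy performance-gradient estimator here are invented for illustration, not taken from the paper:

    ```python
    import random

    def tune_parameter(theta, grad_estimate, steps=500, a=1.0):
        """Robbins-Monro update: theta <- theta - (a / n) * noisy gradient estimate."""
        for n in range(1, steps + 1):
            theta -= (a / n) * grad_estimate(theta)
        return theta

    random.seed(0)
    # A noisy gradient of the cost (theta - 3)^2, as a perturbation-analysis
    # estimator observed on a running network might provide.
    noisy_grad = lambda t: 2 * (t - 3) + random.gauss(0, 0.5)
    print(round(tune_parameter(0.0, noisy_grad), 2))
    ```

    The 1/n gain sequence satisfies the standard SA conditions (gains sum to infinity, squared gains are summable), so the parameter converges to the cost minimum despite the measurement noise.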

  20. sbml-diff: A Tool for Visually Comparing SBML Models in Synthetic Biology.

    PubMed

    Scott-Brown, James; Papachristodoulou, Antonis

    2017-07-21

    We present sbml-diff, a tool that is able to read a model of a biochemical reaction network in SBML format and produce a range of diagrams showing different levels of detail. Each diagram type can be used to visualize a single model or to visually compare two or more models. The default view depicts species as ellipses, reactions as rectangles, rules as parallelograms, and events as diamonds. A cartoon view replaces the symbols used for reactions on the basis of the associated Systems Biology Ontology terms. An abstract view represents species as ellipses and draws edges between them to indicate whether a species increases or decreases the production or degradation of another species. sbml-diff is freely licensed under the three-clause BSD license and can be downloaded from https://github.com/jamesscottbrown/sbml-diff and used as a python package called from other software, as a free-standing command-line application, or online using the form at http://sysos.eng.ox.ac.uk/tebio/upload.
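    The species-level comparison can be illustrated with stdlib XML parsing; this is a simplified sketch of the idea (minimal hand-written SBML-like fragments, not sbml-diff's own code):

    ```python
    import xml.etree.ElementTree as ET

    def species_ids(sbml_text):
        """Collect the id attribute of every <species> element, ignoring namespaces."""
        root = ET.fromstring(sbml_text)
        return {el.get("id") for el in root.iter() if el.tag.endswith("species")}

    m1 = '<sbml><model><listOfSpecies><species id="A"/><species id="B"/></listOfSpecies></model></sbml>'
    m2 = '<sbml><model><listOfSpecies><species id="B"/><species id="C"/></listOfSpecies></model></sbml>'
    # Symmetric difference: species present in one model but not the other.
    print(sorted(species_ids(m1) ^ species_ids(m2)))  # ['A', 'C']
    ```

    A visual diff tool builds on exactly this kind of set comparison, then renders shared and model-specific elements with distinct colours or shapes.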

  1. XTCE (XML Telemetric and Command Exchange) Standard Making It Work at NASA. Can It Work For You?

    NASA Technical Reports Server (NTRS)

    Munoz-Fernandez, Michela; Smith, Danford S.; Rice, James K.; Jones, Ronald A.

    2017-01-01

    The XML Telemetric and Command Exchange (XTCE) standard is intended as a way to describe telemetry and command databases to be exchanged across centers and space agencies. XTCE usage has the potential to lead to consolidation of the Mission Operations Center (MOC) Monitor and Control displays for mission cross-support, reducing equipment and configuration costs, as well as a decrease in the turnaround time for telemetry and command modifications during all the mission phases. The adoption of XTCE will reduce software maintenance costs by reducing the variation between our existing mission dictionaries. The main objective of this poster is to show how powerful XTCE is in terms of interoperability across centers and missions. We will provide results for a use case where two centers can use their local tools to process and display the same mission telemetry in their MOC independently of one another. In our use case we have first quantified the ability for XTCE to capture the telemetry definitions of the mission by use of our suite of support tools (Conversion, Validation, and Compliance measurement). The next step was to show processing and monitoring of the same telemetry in two mission centers. Once the database was converted to XTCE using our tool, the XTCE file became our primary database and was shared among the various tool chains through their XTCE importers and ultimately configured to ingest the telemetry stream and display or capture the telemetered information in similar ways. Summary results include the ability to take a real mission database and real mission telemetry and display them on various tools from two centers, as well as using commercially free COTS.

  2. Mixed-Initiative Constraint-Based Activity Planning for Mars Exploration Rovers

    NASA Technical Reports Server (NTRS)

    Bresina, John; Jonsson, Ari K.; Morris, Paul H.; Rajan, Kanna

    2004-01-01

    In January, 2004, two NASA rovers, named Spirit and Opportunity, successfully landed on Mars, starting an unprecedented exploration of the Martian surface. Power and thermal concerns constrained the duration of this mission, leading to an aggressive plan for commanding both rovers every day. As part of the process for generating these command loads, the MAPGEN tool provides engineers and scientists an intelligent activity planning tool that allows them to more effectively generate complex plans that maximize the science return each day. The key to the effectiveness of the MAPGEN tool is an underlying artificial intelligence plan and constraint reasoning engine. In this paper we outline the design and functionality of the MAPGEN tool and focus on some of the key capabilities it offers to the MER mission engineers.

  3. SIG. Signal Processing, Analysis, & Display

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hernandez, J.; Lager, D.; Azevedo, S.

    1992-01-22

    SIG is a general-purpose signal processing, analysis, and display program. Its main purpose is to perform manipulations on time- and frequency-domain signals. However, it has been designed to ultimately accommodate other representations for data such as multiplexed signals and complex matrices. Two user interfaces are provided in SIG: a menu mode for the unfamiliar user and a command mode for more experienced users. In both modes errors are detected as early as possible and are indicated by friendly, meaningful messages. An on-line HELP package is also included. A variety of operations can be performed on time- and frequency-domain signals, including operations on the samples of a signal, operations on the entire signal, and operations on two or more signals. Signal processing operations that can be performed are digital filtering (median, Bessel, Butterworth, and Chebychev), ensemble average, resample, auto and cross spectral density, transfer function and impulse response, trend removal, convolution, Fourier transform and inverse, window functions (Hamming, Kaiser-Bessel), simulation (ramp, sine, pulsetrain, random), and read/write signals. User-definable signal processing algorithms are also featured. SIG has many options including multiple commands per line, command files with arguments, commenting lines, defining commands, and automatic execution for each item in a repeat sequence. Graphical operations on signals and spectra include: x-y plots of time signals; real, imaginary, magnitude, and phase plots of spectra; scaling of spectra for continuous or discrete domain; cursor zoom; families of curves; and multiple viewports.
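    One of the operations listed above, median filtering, is simple to sketch generically (an illustration of the technique, not SIG's code); it replaces each sample with the median of its neighbourhood, which suppresses impulse noise while preserving edges:

    ```python
    def median_filter(signal, width=3):
        """Sliding-window median; samples near the edges are left unfiltered."""
        half = width // 2
        out = list(signal)
        for i in range(half, len(signal) - half):
            window = sorted(signal[i - half:i + half + 1])
            out[i] = window[half]  # middle element of the sorted odd-length window
        return out

    # The isolated spike of 9 is removed; the genuine step to 1 survives.
    print(median_filter([0, 0, 9, 0, 0, 1, 1]))  # [0, 0, 0, 0, 0, 1, 1]
    ```

    This edge-preserving behaviour is why median filtering is listed alongside the linear Bessel, Butterworth, and Chebychev filters rather than as a variant of them.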

  5. Signal Processing, Analysis, & Display

    DOE Office of Scientific and Technical Information (OSTI.GOV)

Lager, Darrell; Azevedo, Stephen

    1986-06-01

SIG is a general-purpose signal processing, analysis, and display program. Its main purpose is to perform manipulations on time- and frequency-domain signals. However, it has been designed to ultimately accommodate other representations for data such as multiplexed signals and complex matrices. Two user interfaces are provided in SIG: a menu mode for the unfamiliar user and a command mode for more experienced users. In both modes errors are detected as early as possible and are indicated by friendly, meaningful messages. An on-line HELP package is also included. A variety of operations can be performed on time- and frequency-domain signals, including operations on the samples of a signal, operations on the entire signal, and operations on two or more signals. Signal processing operations that can be performed are digital filtering (median, Bessel, Butterworth, and Chebychev), ensemble average, resample, auto and cross spectral density, transfer function and impulse response, trend removal, convolution, Fourier transform and inverse, window functions (Hamming, Kaiser-Bessel), simulation (ramp, sine, pulsetrain, random), and read/write signals. User-definable signal processing algorithms are also featured. SIG has many options including multiple commands per line, command files with arguments, commenting lines, defining commands, and automatic execution for each item in a repeat sequence. Graphical operations on signals and spectra include: x-y plots of time signals; real, imaginary, magnitude, and phase plots of spectra; scaling of spectra for continuous or discrete domain; cursor zoom; families of curves; and multiple viewports.

  7. Scalable Unix tools on parallel processors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gropp, W.; Lusk, E.

    1994-12-31

The introduction of parallel processors that run a separate copy of Unix on each processor has introduced new problems in managing the user's environment. This paper discusses some generalizations of common Unix commands for managing files (e.g., ls) and processes (e.g., ps) that are convenient and scalable. These basic tools, just like their Unix counterparts, are text-based. We also discuss a way to use them with a graphical user interface (GUI). Some notes on the implementation are provided. Prototypes of these commands are publicly available.
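
The core idea of such scalable tools is fanning one command out over many nodes in parallel and collecting the per-node output. A minimal sketch (hypothetical, not the authors' code) using a thread pool, with `echo` standing in locally for a remote shell such as `rsh`/`ssh`:

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor

def run_everywhere(hosts, argv):
    """Run `argv` once per host, concurrently, returning {host: stdout}.

    A real implementation would prefix `argv` with `ssh <host>`; here the
    host name is simply appended so the sketch runs on one machine.
    """
    def run(host):
        result = subprocess.run(argv + [host], capture_output=True, text=True)
        return host, result.stdout.strip()

    with ThreadPoolExecutor(max_workers=8) as pool:
        return dict(pool.map(run, hosts))

out = run_everywhere(["node1", "node2", "node3"], ["echo", "up:"])
print(out)  # {'node1': 'up: node1', 'node2': 'up: node2', 'node3': 'up: node3'}
```

Keeping the collected output as plain text per host mirrors the paper's point that the scalable tools stay text-based, so a GUI can be layered on top without changing them.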

  8. The Integration of COTS/GOTS within NASA's HST Command and Control System

    NASA Technical Reports Server (NTRS)

    Pfarr, Thomas; Reis, James E.; Obenschain, Arthur F. (Technical Monitor)

    2001-01-01

NASA's mission-critical Hubble Space Telescope (HST) command and control system has been re-engineered with COTS/GOTS and minimal custom code. This paper focuses on the design of this new HST Control Center System (CCS) and the lessons learned throughout its development. CCS currently utilizes 31 COTS/GOTS products with an additional 12 million lines of custom glueware code; the new CCS exceeds the capabilities of the original system while significantly reducing the lines of custom code by more than 50%. The lifecycle of COTS/GOTS products will be examined, including the package selection process, evaluation process, and integration process. The advantages, disadvantages, issues, concerns, and lessons learned for integrating COTS/GOTS into NASA's mission-critical HST CCS will be examined in detail. Command and control systems designed with traditional custom code development efforts will be compared with command and control systems designed with new development techniques relying heavily on COTS/GOTS integration. This paper will reveal the many hidden costs of COTS/GOTS solutions when compared to traditional custom code development efforts, including training expenses, consulting fees, and long-term maintenance expenses.

  9. AMO EXPRESS: A Command and Control Experiment for Crew Autonomy Onboard the International Space Station

    NASA Technical Reports Server (NTRS)

    Cornelius, Randy; Frank, Jeremy; Garner, Larry; Haddock, Angie; Stetson, Howard; Wang, Lui

    2015-01-01

The Autonomous Mission Operations project is investigating crew autonomy capabilities and tools for deep space missions. Team members at Ames Research Center, Johnson Space Center and Marshall Space Flight Center are using their experience with ISS payload operations and TIMELINER to: move Earth-based command and control assets on board for crew access; safely merge core and payload command procedures; give the crew single-action intelligent operations; and investigate crew interface requirements.

  10. ProphTools: general prioritization tools for heterogeneous biological networks.

    PubMed

    Navarro, Carmen; Martínez, Victor; Blanco, Armando; Cano, Carlos

    2017-12-01

    Networks have been proven effective representations for the analysis of biological data. As such, there exist multiple methods to extract knowledge from biological networks. However, these approaches usually limit their scope to a single biological entity type of interest or they lack the flexibility to analyze user-defined data. We developed ProphTools, a flexible open-source command-line tool that performs prioritization on a heterogeneous network. ProphTools prioritization combines a Flow Propagation algorithm similar to a Random Walk with Restarts and a weighted propagation method. A flexible model for the representation of a heterogeneous network allows the user to define a prioritization problem involving an arbitrary number of entity types and their interconnections. Furthermore, ProphTools provides functionality to perform cross-validation tests, allowing users to select the best network configuration for a given problem. ProphTools core prioritization methodology has already been proven effective in gene-disease prioritization and drug repositioning. Here we make ProphTools available to the scientific community as flexible, open-source software and perform a new proof-of-concept case study on long noncoding RNAs (lncRNAs) to disease prioritization. ProphTools is robust prioritization software that provides the flexibility not present in other state-of-the-art network analysis approaches, enabling researchers to perform prioritization tasks on any user-defined heterogeneous network. Furthermore, the application to lncRNA-disease prioritization shows that ProphTools can reach the performance levels of ad hoc prioritization tools without losing its generality. © The Authors 2017. Published by Oxford University Press.
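
The Flow Propagation scoring described above can be sketched as a plain random walk with restarts on a toy heterogeneous network. This is an illustrative stdlib-Python reimplementation of the general idea, not ProphTools' actual code; the node names, graph, and restart probability are made up.

```python
def random_walk_with_restarts(adj, seeds, restart=0.3, iters=100):
    """Score nodes by iterating a walk that restarts at the seed set.

    `adj` maps node -> list of neighbours (a small undirected graph);
    at each step a node spreads its score evenly to its neighbours,
    while `restart` of the probability mass returns to the seeds.
    """
    nodes = list(adj)
    p0 = {n: (1.0 / len(seeds) if n in seeds else 0.0) for n in nodes}
    p = dict(p0)
    for _ in range(iters):
        nxt = {n: restart * p0[n] for n in nodes}
        for n in nodes:
            share = (1 - restart) * p[n] / len(adj[n])
            for m in adj[n]:
                nxt[m] += share
        p = nxt
    return p

# Toy gene-disease network; seeding at "dis1" ranks its neighbourhood highly.
adj = {
    "geneA": ["dis1"],
    "geneB": ["dis1", "dis2"],
    "dis1": ["geneA", "geneB"],
    "dis2": ["geneB"],
}
scores = random_walk_with_restarts(adj, seeds={"dis1"})
```

The scores form a probability distribution over nodes, so ranking entities of one type (e.g. candidate genes) against a seed set of another type (e.g. a disease) falls out directly, which is the prioritization setting ProphTools generalizes.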

  11. The Ocean Observatories Initiative: Data Acquisition Functions and Its Built-In Automated Python Modules

    NASA Astrophysics Data System (ADS)

    Smith, M. J.; Vardaro, M.; Crowley, M. F.; Glenn, S. M.; Schofield, O.; Belabbassi, L.; Garzio, L. M.; Knuth, F.; Fram, J. P.; Kerfoot, J.

    2016-02-01

The Ocean Observatories Initiative (OOI), funded by the National Science Foundation, provides users with access to long-term datasets from a variety of oceanographic sensors. The Endurance Array in the Pacific Ocean consists of two separate lines off the coasts of Oregon and Washington. The Oregon line consists of 7 moorings, two cabled benthic experiment packages and 6 underwater gliders. The Washington line comprises 6 moorings and 6 gliders. Each mooring is outfitted with a variety of instrument packages. The raw data from these instruments are sent to shore via satellite communication and, in some cases, via fiber optic cable. The raw data are then sent to the cyberinfrastructure (CI) group at Rutgers, where they are aggregated, parsed into thousands of different data streams, and integrated into a software package called uFrame. The OOI CI delivers the data to the general public via a web interface that outputs data into commonly used scientific data file formats such as JSON, netCDF, and CSV. The Rutgers data management team has developed a series of command-line Python tools that streamline data acquisition in order to facilitate the QA/QC review process. The first step in the process is querying the uFrame database for a list of all available platforms. From this list, a user can choose a specific platform and automatically download all available datasets from the specified platform. The downloaded dataset is plotted using a generalized Python netCDF plotting routine that utilizes a data visualization toolbox called matplotlib. This routine loads each netCDF file separately and outputs plots for each available parameter. These Python tools have been uploaded to a GitHub repository that is openly available to help facilitate OOI data access and visualization.
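
As an illustration of the per-parameter step, a CSV export of the kind described above can be split into one series per parameter, ready for a plotting loop. This stdlib-only sketch assumes hypothetical column names (`time`, `temperature`, `salinity`); real OOI streams have their own parameter names and would typically arrive as netCDF.

```python
import csv
import io

def columns(csv_text):
    """Split a CSV export into {parameter: [float values]}.

    The `time` column is skipped, leaving one numeric series per
    parameter, which is roughly what a plot-per-parameter loop needs.
    """
    reader = csv.DictReader(io.StringIO(csv_text))
    series = {}
    for row in reader:
        for name, value in row.items():
            if name != "time":
                series.setdefault(name, []).append(float(value))
    return series

sample = "time,temperature,salinity\n0,10.1,35.0\n1,10.3,34.9\n"
print(columns(sample))  # {'temperature': [10.1, 10.3], 'salinity': [35.0, 34.9]}
```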

  12. The Army's Use of the Advanced Communications Technology Satellite

    NASA Technical Reports Server (NTRS)

    Ilse, Kenneth

    1996-01-01

    Tactical operations require military commanders to be mobile and have a high level of independence in their actions. Communications capabilities providing intelligence and command orders in these tactical situations have been limited to simple voice communications or low-rate narrow bandwidth communications because of the need for immediate reliable connectivity. The Advanced Communications Technology Satellite (ACTS) has brought an improved communications tool to the tactical commander giving the ability to gain access to a global communications system using high data rates and wide bandwidths. The Army has successfully tested this new capability of bandwidth-on-demand and high data rates for commanders in real-world conditions during Operation UPHOLD DEMOCRACY in Haiti during the fall and winter of 1994. This paper examines ACTS use by field commanders and details the success of the ACTS system in support of a wide variety of field condition command functions.

  13. Collaborative Tool for Command and Control Team Effectiveness Studies: Experimental Test of Interventions to Improve Performance in Command and Control

    DTIC Science & Technology

    2008-11-01

weapons system in the United States Air Force (USAF) inventory (Garrity, Morley, Rodriguez, & Tossell, 2004). It is important for potential AOC operators...Research Division in Mesa, AZ. Additionally, cadets were able to participate in two conferences. Please see conference report at Attachment 3

  14. The next generation of command post computing

    NASA Astrophysics Data System (ADS)

    Arnold, Ross D.; Lieb, Aaron J.; Samuel, Jason M.; Burger, Mitchell A.

    2015-05-01

    The future of command post computing demands an innovative new solution to address a variety of challenging operational needs. The Command Post of the Future is the Army's primary command and control decision support system, providing situational awareness and collaborative tools for tactical decision making, planning, and execution management from Corps to Company level. However, as the U.S. Army moves towards a lightweight, fully networked battalion, disconnected operations, thin client architecture and mobile computing become increasingly essential. The Command Post of the Future is not designed to support these challenges in the coming decade. Therefore, research into a hybrid blend of technologies is in progress to address these issues. This research focuses on a new command and control system utilizing the rich collaboration framework afforded by Command Post of the Future coupled with a new user interface consisting of a variety of innovative workspace designs. This new system is called Tactical Applications. This paper details a brief history of command post computing, presents the challenges facing the modern Army, and explores the concepts under consideration for Tactical Applications that meet these challenges in a variety of innovative ways.

  15. The missing graphical user interface for genomics.

    PubMed

    Schatz, Michael C

    2010-01-01

The Galaxy package empowers regular users to perform rich DNA sequence analysis through a much-needed and user-friendly graphical web interface. See research article: http://genomebiology.com/2010/11/8/R86 RESEARCH HIGHLIGHT: With the advent of affordable and high-throughput DNA sequencing, sequencing is becoming an essential component in nearly every genetics lab. These data are being generated to probe sequence variations, to understand transcribed, regulated or methylated DNA elements, and to explore a host of other biological features across the tree of life and across a range of environments and conditions. Given this deluge of data, novices and experts alike are facing the daunting challenge of trying to analyze the raw sequence data computationally. With so many tools available and so many assays to analyze, how can one be expected to stay current with the state of the art? How can one be expected to learn to use each tool and construct robust end-to-end analysis pipelines, all while ensuring that input formats, command-line options, sequence databases and program libraries are set correctly? Finally, once the analysis is complete, how does one ensure the results are reproducible and transparent for others to scrutinize and study? In an article published in Genome Biology, Jeremy Goecks, Anton Nekrutenko, James Taylor and the rest of the Galaxy Team (Goecks et al. 1) make a great advance towards resolving these critical questions with the latest update to their Galaxy Project. The ambitious goal of Galaxy is to empower regular users to carry out their own computational analysis without having to be an expert in computational biology or computer science. Galaxy adds a desperately needed graphical user interface to genomics research, making data analysis universally accessible in a web browser, and freeing users from the minutiae of archaic command-line parameters, data formats and scripting languages.
Data inputs and computational steps are selected from dynamic graphical menus, and the results are displayed in intuitive plots and summaries that encourage interactive workflows and the exploration of hypotheses. The underlying data analysis tools can be almost any piece of software, written in any language, but all their complexity is neatly hidden inside of Galaxy, allowing users to focus on scientific rather than technical questions.

  16. Time-Of-Travel Tool Protects Drinking Water

    EPA Pesticide Factsheets

    The Lower Susquehanna Source Water Protection (SWP) Partnership utilizes the Incident Command Tool for Drinking Water Protection (ICWater) to support the Pennsylvania Department of Environmental Protection (PADEP) with real-time spill tracking information.

  17. Integrated Information Support System (IISS). Volume 8. User Interface Subsystem. Part 10. Graph Support System Unit Test Plan

    DTIC Science & Technology

    1990-09-30


  18. Empowering Globally Integrated Operations and Mission Command: Revisiting Key West

    DTIC Science & Technology

    2013-03-01

    depended on coordination between commanders. Examples include Captain Thomas McDonough’s 4 U.S...Hedgehog Concept In his book, Good to Great, which describes how good organizations become great ones, Jim Collins borrows an example from an Isaiah ...that it executes with perfection—curling into a ball with its spikes outward. 62 In the words of Archilochus (first line in quotes), Isaiah Berlin

  19. Free DICOM de-identification tools in clinical research: functioning and safety of patient privacy.

    PubMed

    Aryanto, K Y E; Oudkerk, M; van Ooijen, P M A

    2015-12-01

    To compare non-commercial DICOM toolkits for their de-identification ability in removing a patient's personal health information (PHI) from a DICOM header. Ten DICOM toolkits were selected for de-identification tests. Tests were performed by using the system's default de-identification profile and, subsequently, the tools' best adjusted settings. We aimed to eliminate fifty elements considered to contain identifying patient information. The tools were also examined for their respective methods of customization. Only one tool was able to de-identify all required elements with the default setting. Not all of the toolkits provide a customizable de-identification profile. Six tools allowed changes by selecting the provided profiles, giving input through a graphical user interface (GUI) or configuration text file, or providing the appropriate command-line arguments. Using adjusted settings, four of those six toolkits were able to perform full de-identification. Only five tools could properly de-identify the defined DICOM elements, and in four cases, only after careful customization. Therefore, free DICOM toolkits should be used with extreme care to prevent the risk of disclosing PHI, especially when using the default configuration. In case optimal security is required, one of the five toolkits is proposed. • Free DICOM toolkits should be carefully used to prevent patient identity disclosure. • Each DICOM tool produces its own specific outcomes from the de-identification process. • In case optimal security is required, using one DICOM toolkit is proposed.
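
The default-profile-versus-customization distinction the study tests can be sketched with a plain dictionary standing in for a DICOM header. This is purely illustrative: no real DICOM toolkit (e.g. pydicom) is used, and the profile and element names are examples, not the fifty elements the study defined.

```python
# A toy "default profile": elements blanked without any customization.
DEFAULT_PROFILE = {"PatientName", "PatientID", "PatientBirthDate"}

def deidentify(header, extra_elements=()):
    """Return a copy of `header` with profile elements blanked.

    `extra_elements` mimics the per-tool customization the study found
    necessary: the default profile alone rarely removed everything.
    """
    to_blank = DEFAULT_PROFILE | set(extra_elements)
    return {k: ("" if k in to_blank else v) for k, v in header.items()}

header = {
    "PatientName": "DOE^JANE",
    "PatientID": "123",
    "InstitutionName": "X",
    "Modality": "CT",
}
print(deidentify(header, extra_elements=["InstitutionName"]))
```

Note that blanking values while keeping keys mirrors how de-identified DICOM files usually remain structurally valid; the study's point is that which elements end up in `to_blank` varies widely between tools and settings.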

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Favorite, Jeffrey A.

SENSMG is a tool for computing first-order sensitivities of neutron reaction rates, reaction-rate ratios, leakage, keff, and α using the PARTISN multigroup discrete-ordinates code. SENSMG computes sensitivities to all of the transport cross sections and data (total, fission, nu, chi, and all scattering moments), two edit cross sections (absorption and capture), and the density for every isotope and energy group. It also computes sensitivities to the mass density for every material and derivatives with respect to all interface locations. The tool can be used for one-dimensional spherical (r) and two-dimensional cylindrical (r-z) geometries. The tool can be used for fixed-source and eigenvalue problems. The tool implements Generalized Perturbation Theory (GPT) as discussed by Williams and Stacey. Section II of this report describes the theory behind adjoint-based sensitivities, gives the equations that SENSMG solves, and defines the sensitivities that are output. Section III describes the user interface, including the input file and command line options. Section IV describes the output. Section V gives some notes about the coding that may be of interest. Section VI discusses verification, which is ongoing. Section VII lists needs and ideas for future work. Appendix A lists all of the input files whose results are presented in Sec. VI.

  1. The BioExtract Server: a web-based bioinformatic workflow platform

    PubMed Central

    Lushbough, Carol M.; Jennewein, Douglas M.; Brendel, Volker P.

    2011-01-01

    The BioExtract Server (bioextract.org) is an open, web-based system designed to aid researchers in the analysis of genomic data by providing a platform for the creation of bioinformatic workflows. Scientific workflows are created within the system by recording tasks performed by the user. These tasks may include querying multiple, distributed data sources, saving query results as searchable data extracts, and executing local and web-accessible analytic tools. The series of recorded tasks can then be saved as a reproducible, sharable workflow available for subsequent execution with the original or modified inputs and parameter settings. Integrated data resources include interfaces to the National Center for Biotechnology Information (NCBI) nucleotide and protein databases, the European Molecular Biology Laboratory (EMBL-Bank) non-redundant nucleotide database, the Universal Protein Resource (UniProt), and the UniProt Reference Clusters (UniRef) database. The system offers access to numerous preinstalled, curated analytic tools and also provides researchers with the option of selecting computational tools from a large list of web services including the European Molecular Biology Open Software Suite (EMBOSS), BioMoby, and the Kyoto Encyclopedia of Genes and Genomes (KEGG). The system further allows users to integrate local command line tools residing on their own computers through a client-side Java applet. PMID:21546552
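
The record-and-replay idea behind BioExtract workflows can be sketched in a few lines of Python. This is an illustrative model of the concept only, not the BioExtract API; the class, method, and step names are invented.

```python
class Workflow:
    """Record analysis steps as they are performed, then replay them
    with the original or modified inputs."""

    def __init__(self):
        self.steps = []

    def record(self, func, **params):
        # Run the task now and remember how it was invoked.
        self.steps.append((func, params))
        return func(**params)

    def replay(self, **overrides):
        # Re-execute every recorded step, optionally overriding inputs.
        return [func(**{**params, **overrides}) for func, params in self.steps]

def gc_count(seq):
    """Toy analysis step: count G and C bases in a sequence."""
    return seq.count("G") + seq.count("C")

wf = Workflow()
first = wf.record(gc_count, seq="GATTACA")  # runs immediately; first == 2
print(wf.replay(seq="GGCC"))                # rerun with a modified input: [4]
```

Storing each step as (callable, parameters) is what makes the recorded workflow both reproducible (replay unchanged) and sharable with different inputs, which is the behaviour the abstract describes.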

  2. EXPERIMENT - APOLLO 16 (UV)

    NASA Image and Video Library

    1972-06-06

S72-40820 (21 April 1972) --- A color enhancement of a photograph taken in ultraviolet light showing the spectrum of the upper atmosphere of Earth and the geocorona. The bright horizontal line is far-ultraviolet emission (1216 angstroms) of hydrogen extending 10 degrees (40,000 miles) either side of Earth. The knobby vertical line shows several ultraviolet emissions from Earth's sunlit atmosphere, each "lump" being produced by one type of gas (oxygen, nitrogen, helium, etc.). The spectral dispersion is about 10 angstroms per millimeter on this enlargement. The UV camera/spectrograph was operated on the lunar surface by astronaut John W. Young, commander of the Apollo 16 lunar landing mission. It was designed and built at the Naval Research Laboratory, Washington, D.C. While astronauts Young and Charles M. Duke Jr., lunar module pilot, descended in the Lunar Module (LM) "Orion" to explore the Descartes highlands region of the moon, astronaut Thomas K. Mattingly II, command module pilot, remained with the Command and Service Modules (CSM) "Casper" in lunar orbit.

  3. A voice-actuated wind tunnel model leak checking system

    NASA Technical Reports Server (NTRS)

    Larson, William E.

    1989-01-01

    A computer program has been developed that improves the efficiency of wind tunnel model leak checking. The program uses a voice recognition unit to relay a technician's commands to the computer. The computer, after receiving a command, can respond to the technician via a voice response unit. Information about the model pressure orifice being checked is displayed on a gas-plasma terminal. On command, the program records up to 30 seconds of pressure data. After the recording is complete, the raw data and a straight line fit of the data are plotted on the terminal. This allows the technician to make a decision on the integrity of the orifice being checked. All results of the leak check program are stored in a database file that can be listed on the line printer for record keeping purposes or displayed on the terminal to help the technician find unchecked orifices. This program allows one technician to check a model for leaks instead of the two or three previously required.

  4. Integration of cloud-based storage in BES III computing environment

    NASA Astrophysics Data System (ADS)

    Wang, L.; Hernandez, F.; Deng, Z.

    2014-06-01

We present ongoing work that aims to evaluate the suitability of cloud-based storage as a supplement to the Lustre file system for storing experimental data for the BES III physics experiment and as a backend for storing files belonging to individual members of the collaboration. In particular, we discuss our findings regarding the support of cloud-based storage in the software stack of the experiment. We report on our development work that improves the support of CERN's ROOT data analysis framework and allows efficient remote access to data through several cloud storage protocols. We also present our efforts providing the experiment with efficient command line tools for navigating and interacting with cloud storage-based data repositories, both from interactive sessions and grid jobs.

5. ViennaNGS: A toolbox for building efficient next-generation sequencing analysis pipelines

    PubMed Central

    Wolfinger, Michael T.; Fallmann, Jörg; Eggenhofer, Florian; Amman, Fabian

    2015-01-01

Recent achievements in next-generation sequencing (NGS) technologies have led to a high demand for reusable software components to easily compile customized analysis workflows for big genomics data. We present ViennaNGS, an integrated collection of Perl modules focused on building efficient pipelines for NGS data processing. It comes with functionality for extracting and converting features from common NGS file formats, computation and evaluation of read mapping statistics, as well as normalization of RNA abundance. Moreover, ViennaNGS provides software components for identification and characterization of splice junctions from RNA-seq data, parsing and condensing sequence motif data, automated construction of Assembly and Track Hubs for the UCSC genome browser, as well as wrapper routines for a set of commonly used NGS command line tools. PMID:26236465

  6. Recent developments in the CCP-EM software suite.

    PubMed

    Burnley, Tom; Palmer, Colin M; Winn, Martyn

    2017-06-01

    As part of its remit to provide computational support to the cryo-EM community, the Collaborative Computational Project for Electron cryo-Microscopy (CCP-EM) has produced a software framework which enables easy access to a range of programs and utilities. The resulting software suite incorporates contributions from different collaborators by encapsulating them in Python task wrappers, which are then made accessible via a user-friendly graphical user interface as well as a command-line interface suitable for scripting. The framework includes tools for project and data management. An overview of the design of the framework is given, together with a survey of the functionality at different levels. The current CCP-EM suite has particular strength in the building and refinement of atomic models into cryo-EM reconstructions, which is described in detail.
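
The dual command-line/GUI access pattern described above can be sketched as a Python task wrapper whose entry point accepts an explicit argument list, so the same task is callable from a GUI, a script, or the shell. The task itself is a hypothetical stand-in, not a CCP-EM program.

```python
import argparse

def run_task(map_file, threshold=0.5):
    """Stand-in compute task; in a real wrapper the external program
    would be invoked here. Returns a result dict a GUI could display."""
    return {"input": map_file, "threshold": threshold, "status": "ok"}

def main(argv=None):
    """Command-line entry point; passing argv explicitly keeps the
    same function scriptable and testable without a real shell."""
    parser = argparse.ArgumentParser(description="Example task wrapper")
    parser.add_argument("map_file")
    parser.add_argument("--threshold", type=float, default=0.5)
    args = parser.parse_args(argv)
    return run_task(args.map_file, args.threshold)

print(main(["volume.mrc", "--threshold", "0.7"]))
```

Because `main` returns the task result rather than printing and exiting, a GUI layer can call it directly while shell users get the usual argparse interface.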

  8. WHAM!: a web-based visualization suite for user-defined analysis of metagenomic shotgun sequencing data.

    PubMed

    Devlin, Joseph C; Battaglia, Thomas; Blaser, Martin J; Ruggles, Kelly V

    2018-06-25

Exploration of large data sets, such as shotgun metagenomic sequence or expression data, by biomedical experts and medical professionals remains a major bottleneck in the scientific discovery process. Although tools for this purpose exist for 16S ribosomal RNA sequencing analysis, there is a growing but still insufficient number of user-friendly interactive visualization workflows for easy data exploration and figure generation. The development of such platforms for this purpose is necessary to accelerate and streamline microbiome laboratory research. We developed the Workflow Hub for Automated Metagenomic Exploration (WHAM!) as a web-based interactive tool capable of user-directed data visualization and statistical analysis of annotated shotgun metagenomic and metatranscriptomic data sets. WHAM! includes exploratory and hypothesis-based gene and taxa search modules for visualizing differences in microbial taxa and gene family expression across experimental groups, and for creating publication-quality figures without the need for a command line interface or in-house bioinformatics. WHAM! is an interactive and customizable tool for downstream metagenomic and metatranscriptomic analysis providing a user-friendly interface allowing for easy data exploration by microbiome and ecological experts to facilitate discovery in multi-dimensional and large-scale data sets.

  9. TopoMS: Comprehensive topological exploration for molecular and condensed-matter systems.

    PubMed

    Bhatia, Harsh; Gyulassy, Attila G; Lordi, Vincenzo; Pask, John E; Pascucci, Valerio; Bremer, Peer-Timo

    2018-06-15

    We introduce TopoMS, a computational tool enabling detailed topological analysis of molecular and condensed-matter systems, including the computation of atomic volumes and charges through the quantum theory of atoms in molecules, as well as the complete molecular graph. With roots in techniques from computational topology, and using a shared-memory parallel approach, TopoMS provides scalable, numerically robust, and topologically consistent analysis. TopoMS can be used as a command-line tool or with a GUI (graphical user interface), where the latter also enables an interactive exploration of the molecular graph. This paper presents algorithmic details of TopoMS and compares it with state-of-the-art tools: Bader charge analysis v1.0 (Arnaldsson et al., 01/11/17) and molecular graph extraction using Critic2 (Otero-de-la-Roza et al., Comput. Phys. Commun. 2014, 185, 1007). TopoMS not only combines the functionality of these individual codes but also demonstrates up to 4× performance gain on a standard laptop, faster convergence to fine-grid solution, robustness against lattice bias, and topological consistency. TopoMS is released publicly under BSD License. © 2018 Wiley Periodicals, Inc.

  10. Software Comparison for Renewable Energy Deployment in a Distribution Network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gao, David Wenzhong; Muljadi, Eduard; Tian, Tian

    The main objective of this report is to evaluate different software options for performing robust distributed generation (DG) power system modeling. The features and capabilities of four simulation tools, OpenDSS, GridLAB-D, CYMDIST, and PowerWorld Simulator, are compared to analyze their effectiveness in analyzing distribution networks with DG. OpenDSS and GridLAB-D, two open-source software packages, have the capability to simulate networks with fluctuating data values. These packages allow a simulation to be run at each time instant by iterating only the main script file. CYMDIST, a commercial software package, allows for time-series simulation to study variations in network controls. PowerWorld Simulator, another commercial tool, has a batch-mode simulation function through the 'Time Step Simulation' tool, which obtains solutions for a list of specified time points. PowerWorld Simulator is intended for analysis of transmission-level systems, while the other three are designed for distribution systems. CYMDIST and PowerWorld Simulator feature easy-to-use graphical user interfaces (GUIs). OpenDSS and GridLAB-D, on the other hand, are based on command-line programs, which increases the time necessary to become familiar with the software packages.

  11. Parsley: a Command-Line Parser for Astronomical Applications

    NASA Astrophysics Data System (ADS)

    Deich, William

    Parsley is a sophisticated keyword + value parser, packaged as a library of routines that offers an easy method for providing command-line arguments to programs. It makes it easy for the user to enter values, and it makes it easy for the programmer to collect and validate the user's entries. Parsley is tuned for astronomical applications: for example, dates entered in Julian, Modified Julian, calendar, or several other formats are all recognized without special effort by the user or by the programmer; angles can be entered using decimal degrees or dd:mm:ss; time-like intervals as decimal hours, hh:mm:ss, or a variety of other units. Vectors of data are accepted as readily as scalars.
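
    The flexible value recognition described above — one parameter, several accepted formats — can be illustrated with a small sketch. The function below is hypothetical and is not part of Parsley itself:

    ```python
    def parse_angle(text):
        """Parse an angle given either as decimal degrees ('12.5')
        or in sexagesimal dd:mm:ss form ('12:30:00').
        Returns decimal degrees. (Illustrative sketch, not Parsley's API.)
        """
        if ":" in text:
            parts = [float(p) for p in text.split(":")]
            dd, mm, ss = (parts + [0.0, 0.0])[:3]
            sign = -1.0 if text.lstrip().startswith("-") else 1.0
            return sign * (abs(dd) + mm / 60.0 + ss / 3600.0)
        return float(text)

    print(parse_angle("12:30:00"))  # 12.5
    print(parse_angle("-12.5"))     # -12.5
    ```

    A parser in this spirit lets the user type whichever format is natural while the program always receives one canonical numeric value; Parsley applies the same idea to dates, times, and other astronomical quantities.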

  12. Astronaut Eugene Cernan drives the Lunar Roving Vehicle during first EVA

    NASA Image and Video Library

    1972-12-10

    AS17-147-22526 (11 Dec. 1972) --- Astronaut Eugene A. Cernan, commander, makes a short checkout of the Lunar Roving Vehicle (LRV) during the early part of the first Apollo 17 extravehicular activity (EVA) at the Taurus-Littrow landing site. This view of the "stripped down" LRV is prior to loading up. Equipment later loaded onto the LRV included the ground-controlled television assembly, the lunar communications relay unit, hi-gain antenna, low-gain antenna, aft tool pallet, lunar tools and scientific gear. This photograph was taken by scientist-astronaut Harrison H. Schmitt, lunar module pilot. The mountain in the right background is the east end of South Massif. While astronauts Cernan and Schmitt descended in the Lunar Module (LM) "Challenger" to explore the moon, astronaut Ronald E. Evans, command module pilot, remained with the Command and Service Modules (CSM) "America" in lunar orbit.

  13. Running SINDA '85/FLUINT interactive on the VAX

    NASA Technical Reports Server (NTRS)

    Simmonds, Boris

    1992-01-01

    Computer software used as engineering tools is typically run in three modes: batch, demand, and interactive. The first two are the most popular in the SINDA world. The third is less popular, probably because users lack access to the command procedure files for running SINDA '85, or are unfamiliar with the SINDA '85 execution process (pre-processing, processing, compilation, linking, execution, and all of the file assignments, creations, deletions, and de-assignments). Interactive is the mode that makes thermal analysis with SINDA '85 a real-time design tool. This paper explains a sufficient command procedure (the minimum modifications required in an existing demand command procedure) to run SINDA '85 on the VAX in an interactive mode. To exercise the procedure, a sample problem is presented that exemplifies the mode, plus additional programming capabilities available in SINDA '85. Following the same guidelines, the process can be extended to other computer platforms on which SINDA '85 resides.

  14. MOPEX: a software package for astronomical image processing and visualization

    NASA Astrophysics Data System (ADS)

    Makovoz, David; Roby, Trey; Khan, Iffat; Booth, Hartley

    2006-06-01

    We present MOPEX - a software package for astronomical image processing and display. The package is a combination of command-line driven image processing software written in C/C++ with a Java-based GUI. The main image processing capabilities include creating mosaic images, image registration, background matching, and point source extraction, as well as a number of minor image processing tasks. The combination of the image processing and display capabilities allows for a much more intuitive and efficient way of performing image processing. The GUI allows the control over image processing and display to be closely intertwined. Parameter setting, validation, and specific processing options are entered by the user through a set of intuitive dialog boxes. Visualization feeds back into further processing by providing prompt feedback on the processing results. The GUI also allows for further analysis by accessing and displaying data from existing image and catalog servers using a virtual observatory approach. Even though originally designed for the Spitzer Space Telescope mission, much of the functionality is of general usefulness and can be used for working with existing astronomical data and for new missions. The software used in the package has undergone intensive testing and benefited greatly from effective software reuse. The visualization part has been used for observation planning for both the Spitzer and Herschel Space Telescopes as part of the tool Spot. The visualization capabilities of Spot have been enhanced and integrated with the image processing functionality of the command-line driven MOPEX. The image processing software is used in the Spitzer automated pipeline processing, which has been in operation for nearly 3 years. The image processing capabilities have also been tested in off-line processing by numerous astronomers at various institutions around the world. The package is multi-platform and includes automatic update capabilities. The software package has been developed by a small group of software developers and scientists at the Spitzer Science Center. It is available for distribution at the Spitzer Science Center web page.

  15. Slow: A Tool for Reporting and Diagnosing Performance Problems

    NASA Technical Reports Server (NTRS)

    Root, Darrell; Liviero, Belinda; Lasinski, Tom (Technical Monitor)

    1998-01-01

    Slow is a Bourne shell script meant to be run by workstation users who are experiencing performance problems. It collects a snapshot of performance data using previously published and publicly available diagnostic commands. This paper discusses how to interpret the output of those commands to identify the root cause of UNIX workstation performance problems.
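
    The snapshot idea — gather several standard diagnostics at one moment and leave interpretation to the reader — can be sketched with Python's standard library. This is a loose analogue of the approach, not the Slow script itself:

    ```python
    import os
    import shutil
    import time

    def performance_snapshot(path="/"):
        """Collect a small snapshot of system state, in the spirit of
        running several diagnostic commands at once. (Illustrative sketch.)"""
        snap = {"timestamp": time.time()}
        # Load averages over 1, 5 and 15 minutes (POSIX systems only).
        try:
            snap["loadavg"] = os.getloadavg()
        except (AttributeError, OSError):
            snap["loadavg"] = None
        # Free-space fraction for the given filesystem.
        usage = shutil.disk_usage(path)
        snap["disk_free_fraction"] = usage.free / usage.total
        return snap

    snap = performance_snapshot()
    print(sorted(snap))  # ['disk_free_fraction', 'loadavg', 'timestamp']
    ```

    As with Slow, the value is not in any single number but in capturing them together at the moment the machine feels slow, so cause and symptom line up.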

  16. TBI Endpoints Development

    DTIC Science & Technology

    2015-10-01

    Medical Research and Materiel Command, Fort Detrick, Maryland 21702-5012. DISTRIBUTION STATEMENT: Approved for Public Release; Distribution Unlimited. ...U.S. Army Medical Research and Materiel Command, Fort Detrick...DDT) and Medical Device Development Tool (MDDT) programs with case study presentations and question and answer opportunities. Expert Working Groups

  17. 46 CFR 171.010 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... vessel operates. (k) Passenger space means a space which is provided for the accommodation and use of... this chapter. (i) Machinery space means, unless otherwise prescribed by the Commandant for unusual arrangements, the space extending from the molded base line to the margin line and between the main transverse...

  18. 46 CFR 171.010 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... vessel operates. (k) Passenger space means a space which is provided for the accommodation and use of... this chapter. (i) Machinery space means, unless otherwise prescribed by the Commandant for unusual arrangements, the space extending from the molded base line to the margin line and between the main transverse...

  19. 46 CFR 171.010 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... vessel operates. (k) Passenger space means a space which is provided for the accommodation and use of... this chapter. (i) Machinery space means, unless otherwise prescribed by the Commandant for unusual arrangements, the space extending from the molded base line to the margin line and between the main transverse...

  20. Sample Analysis at Mars Instrument Simulator

    NASA Technical Reports Server (NTRS)

    Benna, Mehdi; Nolan, Tom

    2013-01-01

    The Sample Analysis at Mars Instrument Simulator (SAMSIM) is a numerical model dedicated to planning and validating operations of the Sample Analysis at Mars (SAM) instrument on the surface of Mars. The SAM instrument suite, currently operating on the Mars Science Laboratory (MSL), is an analytical laboratory designed to investigate the chemical and isotopic composition of the atmosphere and of volatiles extracted from solid samples. SAMSIM was developed using the Matlab and Simulink libraries of MathWorks Inc. to provide MSL mission planners with accurate predictions of the instrument's electrical, thermal, mechanical, and fluid responses to scripted commands. This tool is a first example of multi-purpose, full-scale numerical modeling of a flight instrument with the purpose of supplementing, or even eliminating entirely, the need for a hardware engineering model during instrument development and operation. SAMSIM simulates the complex interactions that occur between the instrument's Command and Data Handling unit (C&DH) and all subsystems during the execution of experiment sequences. A typical SAM experiment takes many hours to complete and involves hundreds of components. During the simulation, the electrical, mechanical, thermal, and gas dynamics states of each hardware component are accurately modeled and propagated within the simulation environment at faster than real time. This allows the simulation, in just a few minutes, of experiment sequences that take many hours to execute on the real instrument. The SAMSIM model is divided into five distinct but interacting modules: software, mechanical, thermal, gas flow, and electrical. The software module simulates the instrument C&DH by executing a customized version of the instrument flight software in a Matlab environment. The inputs and outputs of this synthetic C&DH are mapped to virtual sensors and command lines that mimic, in their structure and connectivity, the layout of the instrument harnesses. This module executes, and thus validates, complex command scripts prior to their uplink to the SAM instrument. As an output, this module generates synthetic data and message logs at a rate similar to that of the actual instrument.

  1. Robust Inversion and Data Compression in Control Allocation

    NASA Technical Reports Server (NTRS)

    Hodel, A. Scottedward

    2000-01-01

    We present an off-line computational method for control allocation design. The control allocation function δ = F(z)τ + δ₀(z), mapping commanded body-frame torques to actuator commands, is implicitly specified by the trim condition δ₀(z) and by a robust pseudo-inverse problem ‖I − G(z)F(z)‖ < ε(z), where G(z) is a system Jacobian evaluated at operating point z, ẑ is an estimate of z, and ε(z) < 1 is a specified error tolerance. The allocation function F(z) = Σᵢ ψᵢ(z) Fᵢ is computed using a heuristic technique for selecting the wavelet basis functions ψ and a constrained least-squares criterion for selecting the allocation matrices Fᵢ. The method is applied to entry-trajectory control allocation for a reusable launch vehicle (X-33).

  2. Future Years Defense Program (FYDP) Structure

    DTIC Science & Technology

    2004-04-01

    JC - United States Central Command DoD 7045.7-H, April 2004 12 JCA - CJCS Controlled Activities JE - United States European Command JFC - United...Codes ARMY TITLECODE TITLECODE(H) = Historical (H) = Historical 1291 Line of Sight Anti-Tank (LOSAT) Battalion 1295 Armored Cavalry Squadrons (ACR) 1296...TRI-TAC) 0208010N Joint Tactical Communications Program (TRI-TAC) 0208011A CJCS Exercise Program 0208011F CJCS Exercise Program 0208011J CJCS Exercise

  3. Does the Fast Patrol Boat Have a Future in the Navy?

    DTIC Science & Technology

    2002-05-31

    Admiral Dennis Blair (Commander in Chief, United States Pacific Command) testified to Congress “countering terrorism, weapons proliferation...United States Navy. Blair, Dennis C., Admiral, USN. 2001a. Interview by Maria Ressa, CNN Jakarta Bureau, December 1. Interview transcript on-line...Available from http://www.pacom.mil/speeches/sst2001/011201blairCNN.htm. Internet accessed 3 March 2002. Blair, Dennis C., Admiral, USN. 2001b

  4. Defending Air Bases in an Age of Insurgency

    DTIC Science & Technology

    2014-05-01

    2007, http://www.defence.gov.au/raaf/adg/; WGCDR John Leo, RAAF, commanding officer, 2AFDS, e-mail correspondence with the author, 1 March–9 April...2007; and WGCDR John Leo, RAAF, commanding officer, 2AFDS, “Airfield Defence Squadron Operations & Support” [PowerPoint presentation, 22 February...installations throughout the country: The initial mission of these forces is to secure the base and its internal LOCs [lines of communication

  5. General Mission Analysis Tool (GMAT) User's Guide (Draft)

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P.

    2007-01-01

    The General Mission Analysis Tool (GMAT) is a space trajectory optimization and mission analysis system. This document is a draft of the user's guide for the tool. Included in the guide is information about Configuring Objects/Resources, Object Fields: Quick Look-up Tables, and Commands and Events.

  6. CoVaCS: a consensus variant calling system.

    PubMed

    Chiara, Matteo; Gioiosa, Silvia; Chillemi, Giovanni; D'Antonio, Mattia; Flati, Tiziano; Picardi, Ernesto; Zambelli, Federico; Horner, David Stephen; Pesole, Graziano; Castrignanò, Tiziana

    2018-02-05

    The advent and ongoing development of next-generation sequencing (NGS) technologies has led to a rapid increase in the rate of human genome re-sequencing data, paving the way for personalized genomics and precision medicine. The body of genome resequencing data is progressively increasing, underlining the need for accurate and time-effective bioinformatics systems for genotyping - a crucial prerequisite for identification of candidate causal mutations in diagnostic screens. Here we present CoVaCS, a fully automated, highly accurate system with a web-based graphical interface for genotyping and variant annotation. Extensive tests on a gold-standard benchmark data set - the NA12878 Illumina platinum genome - confirm that call-sets based on our consensus strategy are completely in line with those attained by similar command-line based approaches, and far more accurate than call-sets from any individual tool. Importantly, our system exhibits better sensitivity and higher specificity than equivalent commercial software. CoVaCS offers optimized pipelines integrating state-of-the-art tools for variant calling and annotation for whole-genome sequencing (WGS), whole-exome sequencing (WES) and target-gene sequencing (TGS) data. The system is currently hosted at Cineca, and offers the speed of an HPC facility, a crucial consideration when large numbers of samples must be analysed. Importantly, all the analyses are performed automatically, allowing high reproducibility of the results. As such, we believe that CoVaCS can be a valuable tool for the analysis of human genome resequencing studies. CoVaCS is available at: https://bioinformatics.cineca.it/covacs.
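
    The consensus idea — keep only variants supported by an agreed-upon number of callers — can be sketched as follows. This is a simplified illustration of a consensus strategy, not CoVaCS's actual algorithm:

    ```python
    from collections import Counter

    def consensus_calls(callsets, min_support=2):
        """Return variants reported by at least `min_support` of the
        given call-sets. Variants are (chrom, pos, ref, alt) tuples.
        (Simplified sketch of a consensus strategy.)"""
        counts = Counter(v for callset in callsets for v in set(callset))
        return {v for v, n in counts.items() if n >= min_support}

    # Hypothetical call-sets from three different variant callers.
    tool_a = [("chr1", 100, "A", "G"), ("chr1", 200, "C", "T")]
    tool_b = [("chr1", 100, "A", "G")]
    tool_c = [("chr1", 100, "A", "G"), ("chr2", 50, "G", "A")]
    print(consensus_calls([tool_a, tool_b, tool_c]))  # {('chr1', 100, 'A', 'G')}
    ```

    Requiring agreement between callers trades a little sensitivity for a large gain in specificity, which is the behaviour the benchmark comparison above reports.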

  7. Simplifying and enhancing the use of PyMOL with horizontal scripts

    PubMed Central

    2016-01-01

    Abstract Scripts are used in PyMOL to exert precise control over the appearance of the output and to ease remaking similar images at a later time. We developed horizontal scripts to ease script development. A horizontal script makes a complete scene in PyMOL like a traditional vertical script. The commands in a horizontal script are separated by semicolons. These scripts are edited interactively on the command line with no need for an external text editor. This simpler workflow accelerates script development. In PyMOL, the illustration of a molecular scene requires an 18‐element matrix of viewport settings. The default format spans several lines and is laborious to manually reformat onto one line. This default format prevents the fast assembly of horizontal scripts that can reproduce a molecular scene. We solved this problem by writing a function that displays the settings on one line in a compact format suitable for horizontal scripts. We also demonstrate the mapping of aliases to horizontal scripts. Many aliases can be defined in a single script file, which can be useful for applying custom molecular representations to any structure. We also redefined horizontal scripts as Python functions to enable the use of the help function to print documentation about an alias to the command history window. We discuss how these methods of using horizontal scripts both simplify and enhance the use of PyMOL in research and education. PMID:27488983
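
    The reformatting problem described — PyMOL's 18-element view matrix printed across several lines versus the one-line form a horizontal script needs — can be illustrated with a small helper. This is a hypothetical sketch, not the paper's actual function:

    ```python
    def compact_view(matrix):
        """Format an 18-element PyMOL-style view matrix as a single
        set_view command suitable for a one-line horizontal script.
        (Hypothetical helper, not the paper's implementation.)"""
        if len(matrix) != 18:
            raise ValueError("PyMOL view matrices have 18 elements")
        body = ", ".join(f"{x:.3f}" for x in matrix)
        return f"set_view ({body});"

    one_line = compact_view([0.0] * 17 + [1.0])
    print(one_line.count(","))  # 17 commas separate the 18 values
    ```

    With the settings compacted like this, the whole scene — fetch, style, and view — fits on one semicolon-separated line that can be recalled and edited at the PyMOL prompt.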

  8. Using the Generic Mapping Tools From Within the MATLAB, Octave and Julia Computing Environments

    NASA Astrophysics Data System (ADS)

    Luis, J. M. F.; Wessel, P.

    2016-12-01

    The Generic Mapping Tools (GMT) is a widely used software infrastructure tool set for analyzing and displaying geoscience data. Its power to analyze and process data and produce publication-quality graphics has made it one of several standard processing toolsets used by a large segment of the Earth and Ocean Sciences. GMT's strengths lie in superior publication-quality vector graphics, geodetic-quality map projections, robust data processing algorithms scalable to enormous data sets, and ability to run under all common operating systems. The GMT tool chest offers over 120 modules sharing a common set of command options, file structures, and documentation. GMT modules are command line tools that accept input and write output, and this design allows users to write scripts in which one module's output becomes another module's input, creating highly customized GMT workflows. With the release of GMT 5, these modules are high-level functions with a C API, potentially allowing users access to high-level GMT capabilities from any programmable environment. Many scientists who use GMT also use other computational tools, such as MATLAB® and its clone Octave. We have built a MATLAB/Octave interface on top of the GMT 5 C API. Thus, MATLAB or Octave now has full access to all GMT modules as well as fundamental input/output of GMT data objects via a MEX function. Internally, the GMT/MATLAB C API defines six high-level composite data objects that handle input and output of data via individual GMT modules. These are data tables, grids, text tables (text/data mixed records), color palette tables, raster images (1-4 color bands), and PostScript. The API is responsible for translating between the six GMT objects and the corresponding native MATLAB objects. References to data arrays are passed if transposing of matrices is not required. 
The GMT and MATLAB/Octave combination is extremely flexible, letting the user harvest the general numerical and graphical capabilities of both systems, and represents a giant step forward in interoperability between GMT and other software packages. We will present examples of the symbiotic benefits of combining these platforms. Two other extensions are also in the works: a nearly finished Julia wrapper and an embryonic Python module. Publication supported by FCT project UID/GEO/50019/2013 - Instituto D. Luiz

  9. 46 CFR 46.10-60 - Control.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 2 2011-10-01 2011-10-01 false Control. 46.10-60 Section 46.10-60 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) LOAD LINES SUBDIVISION LOAD LINES FOR PASSENGER VESSELS Administration § 46.10-60 Control. (a) The District Director of Customs or the Coast Guard District Commander may...

  10. 46 CFR 46.10-60 - Control.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 2 2010-10-01 2010-10-01 false Control. 46.10-60 Section 46.10-60 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) LOAD LINES SUBDIVISION LOAD LINES FOR PASSENGER VESSELS Administration § 46.10-60 Control. (a) The District Director of Customs or the Coast Guard District Commander may...

  11. 46 CFR 46.10-60 - Control.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 2 2013-10-01 2013-10-01 false Control. 46.10-60 Section 46.10-60 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) LOAD LINES SUBDIVISION LOAD LINES FOR PASSENGER VESSELS Administration § 46.10-60 Control. The Director, Field Operations (DFO) or the Coast Guard District Commander may...

  12. 46 CFR 46.10-60 - Control.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 2 2014-10-01 2014-10-01 false Control. 46.10-60 Section 46.10-60 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) LOAD LINES SUBDIVISION LOAD LINES FOR PASSENGER VESSELS Administration § 46.10-60 Control. The Director, Field Operations (DFO) or the Coast Guard District Commander may...

  13. 46 CFR 46.10-60 - Control.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 2 2012-10-01 2012-10-01 false Control. 46.10-60 Section 46.10-60 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) LOAD LINES SUBDIVISION LOAD LINES FOR PASSENGER VESSELS Administration § 46.10-60 Control. The Director, Field Operations (DFO) or the Coast Guard District Commander may...

  14. GANGA: A tool for computational-task management and easy access to Grid resources

    NASA Astrophysics Data System (ADS)

    Mościcki, J. T.; Brochu, F.; Ebke, J.; Egede, U.; Elmsheuser, J.; Harrison, K.; Jones, R. W. L.; Lee, H. C.; Liko, D.; Maier, A.; Muraru, A.; Patrick, G. N.; Pajchel, K.; Reece, W.; Samset, B. H.; Slater, M. W.; Soroko, A.; Tan, C. L.; van der Ster, D. C.; Williams, M.

    2009-11-01

    In this paper, we present the computational task-management tool GANGA, which allows for the specification, submission, bookkeeping and post-processing of computational tasks on a wide set of distributed resources. GANGA has been developed to solve a problem increasingly common in scientific projects, which is that researchers must regularly switch between different processing systems, each with its own command set, to complete their computational tasks. GANGA provides a homogeneous environment for processing data on heterogeneous resources. We give examples from High Energy Physics, demonstrating how an analysis can be developed on a local system and then transparently moved to a Grid system for processing of all available data. GANGA has an API that can be used via an interactive interface, in scripts, or through a GUI. Specific knowledge about types of tasks or computational resources is provided at run-time through a plugin system, making new developments easy to integrate. We give an overview of the GANGA architecture, give examples of current use, and demonstrate how GANGA can be used in many different areas of science. Catalogue identifier: AEEN_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEEN_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GPL No. of lines in distributed program, including test data, etc.: 224 590 No. of bytes in distributed program, including test data, etc.: 14 365 315 Distribution format: tar.gz Programming language: Python Computer: personal computers, laptops Operating system: Linux/Unix RAM: 1 MB Classification: 6.2, 6.5 Nature of problem: Management of computational tasks for scientific applications on heterogeneous distributed systems, including local machines, batch farms, opportunistic clusters and Grids. Solution method: High-level job management interface, including command line, scripting and GUI components. 
Restrictions: Access to the distributed resources depends on the installed third-party software, such as a batch system client or a Grid user interface.
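
    The run-time plugin idea — specific knowledge of task types or back-ends supplied by registered classes rather than hard-coded into the core — can be sketched like this. It is an illustrative pattern only, not GANGA's actual plugin API:

    ```python
    # Minimal registry pattern: back-end plugins register themselves
    # and are looked up by name at run time. (Illustrative sketch only.)
    BACKENDS = {}

    def register_backend(name):
        def decorator(cls):
            BACKENDS[name] = cls
            return cls
        return decorator

    @register_backend("local")
    class LocalBackend:
        def submit(self, job):
            return f"running {job} locally"

    @register_backend("batch")
    class BatchBackend:
        def submit(self, job):
            return f"queued {job} on the batch farm"

    # The calling code needs no knowledge of specific back-ends, so a job
    # developed locally can be resubmitted elsewhere by changing one name.
    backend = BACKENDS["batch"]()
    print(backend.submit("analysis-1"))  # queued analysis-1 on the batch farm
    ```

    Because new back-ends only need to register themselves, the core tool never changes when a new resource type is added, which is what makes "develop locally, run on the Grid" transparent.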

  15. A new relational database structure and online interface for the HITRAN database

    NASA Astrophysics Data System (ADS)

    Hill, Christian; Gordon, Iouli E.; Rothman, Laurence S.; Tennyson, Jonathan

    2013-11-01

    A new format for the HITRAN database is proposed. By storing the line-transition data in a number of linked tables described by a relational database schema, it is possible to overcome the limitations of the existing format, which have become increasingly apparent over the last few years as new and more varied data are being used by radiative-transfer models. Although the database in the new format can be searched using the well-established Structured Query Language (SQL), a web service, HITRANonline, has been deployed to allow users to make most common queries of the database using a graphical user interface in a web page. The advantages of the relational form of the database to ensuring data integrity and consistency are explored, and the compatibility of the online interface with the emerging standards of the Virtual Atomic and Molecular Data Centre (VAMDC) project is discussed. In particular, the ability to access HITRAN data using a standard query language from other websites, command line tools and from within computer programs is described.
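
    The relational approach can be illustrated with a toy schema: line transitions stored in one table and retrieved with ordinary SQL. The table and column names below are invented for illustration and are not HITRAN's actual schema:

    ```python
    import sqlite3

    # Toy line-transition table; the schema is invented for illustration
    # and is not HITRAN's actual relational schema.
    conn = sqlite3.connect(":memory:")
    conn.execute("""
        CREATE TABLE transitions (
            molecule TEXT,
            wavenumber REAL,   -- cm^-1
            intensity REAL
        )
    """)
    conn.executemany(
        "INSERT INTO transitions VALUES (?, ?, ?)",
        [("H2O", 1603.2, 2.1e-20), ("H2O", 3657.1, 1.0e-19), ("CO2", 667.4, 3.5e-19)],
    )

    # A typical query: all H2O lines in a wavenumber window.
    rows = conn.execute(
        "SELECT wavenumber FROM transitions "
        "WHERE molecule = ? AND wavenumber BETWEEN ? AND ? "
        "ORDER BY wavenumber",
        ("H2O", 1000.0, 4000.0),
    ).fetchall()
    print([r[0] for r in rows])  # [1603.2, 3657.1]
    ```

    The same SELECT could equally be issued from a website, a command-line tool, or inside a radiative-transfer program, which is the access pattern the relational form makes possible.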

  16. A Pipeline for 3D Digital Optical Phenotyping Plant Root System Architecture

    NASA Astrophysics Data System (ADS)

    Davis, T. W.; Shaw, N. M.; Schneider, D. J.; Shaff, J. E.; Larson, B. G.; Craft, E. J.; Liu, Z.; Kochian, L. V.; Piñeros, M. A.

    2017-12-01

    This work presents a new pipeline for 3D digital optical phenotyping of the root system architecture of agricultural crops. The pipeline begins with a 3D root-system imaging apparatus for hydroponically grown crop lines of interest. The apparatus acts as a self-contained darkroom, which includes an imaging tank, a motorized rotating bearing, and a digital camera. The pipeline continues with the Plant Root Imaging and Data Acquisition (PRIDA) software, which is responsible for image capture and storage. Once root images have been captured, image post-processing is performed using the Plant Root Imaging Analysis (PRIA) command-line tool, which extracts root pixels from color images. Following the pre-processing binarization of digital root images, 3D trait characterization is performed using the next-generation RootReader3D software. RootReader3D measures global root system architecture traits, such as total root system volume and length, total number of roots, and maximum rooting depth and width. While designed to work together, the four stages of the phenotyping pipeline are modular and stand-alone, which provides flexibility and adaptability for various research endeavors.
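
    The binarization step — separating candidate root pixels from background before trait extraction — can be sketched with a simple global threshold. This is a simplified stand-in for the idea; PRIA's actual extraction works on color images:

    ```python
    def binarize(image, threshold=128):
        """Convert a grayscale image (list of rows of 0-255 values) to a
        binary mask: 1 for candidate root pixels, 0 for background.
        (A simplified global threshold; PRIA works on color images.)"""
        return [[1 if px >= threshold else 0 for px in row] for row in image]

    # Tiny synthetic image: bright root pixels on a dark background.
    gray = [
        [10, 200, 15],
        [12, 220, 180],
    ]
    print(binarize(gray))  # [[0, 1, 0], [0, 1, 1]]
    ```

    Downstream software such as RootReader3D then works only on the binary mask, so the quality of this step bounds the accuracy of every measured trait.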

  17. Human Dimensions in Future Battle Command Systems: A Workshop Report

    DTIC Science & Technology

    2008-04-01

    information processing). These dimensions can best be described anecdotally and metaphorically as: • Battle command is a human-centric...enhance information visualization techniques in the decision tools, including multimodal platforms: video, graphics, symbols, etc. This should be...organization members. Each dimension can metaphorically represent the spatial location of individuals and group thinking in a trajectory of social norms

  18. Virtual Machine Language

    NASA Technical Reports Server (NTRS)

    Grasso, Christopher; Page, Dennis; O'Reilly, Taifun; Fteichert, Ralph; Lock, Patricia; Lin, Imin; Naviaux, Keith; Sisino, John

    2005-01-01

    Virtual Machine Language (VML) is a mission-independent, reusable software system for programming spacecraft operations. Features of VML include a rich set of data types, named functions, parameters, IF and WHILE control structures, polymorphism, and on-the-fly creation of spacecraft commands from calculated values. Spacecraft functions can be abstracted into named blocks that reside in files aboard the spacecraft. These named blocks accept parameters and execute in a repeatable fashion. The sizes of uplink products are minimized by the ability to call blocks that implement most of the command steps. This block approach also enables some autonomous operations aboard the spacecraft, such as aerobraking, telemetry conditional monitoring, and anomaly response, without developing autonomous flight software. Operators on the ground write blocks and command sequences in a concise, high-level, human-readable programming language (also called VML). A compiler translates the human-readable blocks and command sequences into binary files (the operations products). The flight portion of VML interprets the uplinked binary files. The ground subsystem of VML also includes an interactive sequence-execution tool hosted on workstations, which runs sequences at several thousand times real-time speed, affords debugging, and generates reports. This tool enables iterative development of blocks and sequences within times on the order of seconds.
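
    The named-block idea — reusable, parameterized command sequences stored by name and invoked with arguments — can be sketched conceptually in Python. The block names and commands below are invented for illustration; this is not VML syntax:

    ```python
    # Conceptual sketch of named, parameterized command blocks.
    # Block names and commands are invented; this is not VML syntax.
    BLOCKS = {}

    def define_block(name, func):
        BLOCKS[name] = func

    def call_block(name, **params):
        return BLOCKS[name](**params)

    def warmup(duration, heater):
        # Expands into a concrete command sequence each time it is called.
        return [f"HEATER_ON {heater}", f"WAIT {duration}", f"HEATER_OFF {heater}"]

    define_block("warmup", warmup)

    # The same block executes repeatably with different parameters, so the
    # uplink only needs to carry the call, not the expanded command steps.
    print(call_block("warmup", duration=30, heater="H1"))
    # ['HEATER_ON H1', 'WAIT 30', 'HEATER_OFF H1']
    ```

    Since the block body already resides on board, only the short call with its parameters must be uplinked, which is how this approach keeps uplink products small.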

  19. Molgenis-impute: imputation pipeline in a box.

    PubMed

    Kanterakis, Alexandros; Deelen, Patrick; van Dijk, Freerk; Byelas, Heorhiy; Dijkstra, Martijn; Swertz, Morris A

    2015-08-19

    Genotype imputation is an important procedure in current genomic analyses such as genome-wide association studies, meta-analyses and fine mapping. Although high-quality tools are available that perform the steps of this process, considerable effort and expertise are required to set up and run a best-practice imputation pipeline, particularly for larger genotype datasets, where imputation has to scale out in parallel on computer clusters. Here we present MOLGENIS-impute, an 'imputation in a box' solution that seamlessly and transparently automates the setup and running of all the steps of the imputation process. These steps include genome build liftover, genotype phasing with SHAPEIT2, quality control, sample and chromosomal chunking/merging, and imputation with IMPUTE2. MOLGENIS-impute builds on MOLGENIS-compute, a simple pipeline management platform for submission and monitoring of bioinformatics tasks in High Performance Computing (HPC) environments like local/cloud servers, clusters and grids. All the required tools, data and scripts are downloaded and installed in a single step. Researchers with diverse backgrounds and expertise have tested MOLGENIS-impute in different locations and have imputed over 30,000 samples so far using the 1,000 Genomes Project and new Genome of the Netherlands data as the imputation reference. The tests have been performed on PBS/SGE clusters, cloud VMs and in a grid HPC environment. MOLGENIS-impute gives priority to the ease of setting up, configuring and running an imputation. It has minimal dependencies and wraps the pipeline in a simple command-line interface, without sacrificing the flexibility to adapt or limiting the options of the underlying imputation tools. It does not require knowledge of a workflow system or programming, and is targeted at researchers who just want to apply best practices in imputation via simple commands. It is built on the MOLGENIS compute workflow framework to enable customization with additional computational steps, or it can be included in other bioinformatics pipelines. It is available as open source from: https://github.com/molgenis/molgenis-imputation.
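    The chromosomal chunking step mentioned above is easy to picture: a chromosome is cut into fixed-size windows with a small overlap so that imputed chunks can later be merged without edge effects. A minimal sketch, with window and overlap sizes chosen for illustration rather than taken from MOLGENIS-impute:

```python
# Cut a chromosome into overlapping windows for parallel imputation.
# Window and overlap sizes are illustrative only.

def chunk_chromosome(length_bp, chunk_bp=5_000_000, overlap_bp=250_000):
    """Return (start, end) base-pair windows covering 1..length_bp."""
    chunks = []
    start = 1
    while start <= length_bp:
        end = min(start + chunk_bp - 1, length_bp)
        # extend each window leftward so neighbours overlap
        chunks.append((max(1, start - overlap_bp), end))
        start = end + 1
    return chunks

print(chunk_chromosome(12_345_678))
```

    Each window can then be imputed as an independent cluster job, which is what lets the pipeline scale out on PBS/SGE clusters and grids.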

  20. DASTCOM5: A Portable and Current Database of Asteroid and Comet Orbit Solutions

    NASA Astrophysics Data System (ADS)

    Giorgini, Jon D.; Chamberlin, Alan B.

    2014-11-01

    A portable direct-access database containing all NASA/JPL asteroid and comet orbit solutions, along with the software to access it, is available for download (ftp://ssd.jpl.nasa.gov/pub/xfr/dastcom5.zip; unzip -ao dastcom5.zip). DASTCOM5 contains the latest heliocentric IAU76/J2000 ecliptic osculating orbital elements for all known asteroids and comets, as determined by a least-squares best fit to ground-based optical, spacecraft, and radar astrometric measurements. Other physical, dynamical, and covariance parameters are included when known. A total of 142 parameters per object are supported within DASTCOM5. This information is suitable for initializing high-precision numerical integrations, assessing orbit geometry, computing trajectory uncertainties and visual magnitudes, and summarizing physical characteristics of the body. The DASTCOM5 distribution is updated as often as hourly to include newly discovered objects or orbit-solution updates. It includes an ASCII index of objects that supports look-ups based on name, current or past designation, SPK ID, MPC packed designation, or record number. DASTCOM5 is the database used by the NASA/JPL Horizons ephemeris system. It is a subset exported from a larger MySQL-based relational Small-Body Database ("SBDB") maintained at JPL. The DASTCOM5 distribution is intended for programmers comfortable with UNIX/LINUX/MacOSX command-line usage who need to develop stand-alone applications. The goal of the implementation is to provide small, fast, portable, and flexible programmatic access to JPL comet and asteroid orbit solutions. The supplied software library, examples, and application programs have been verified under gfortran, Lahey, Intel, and Sun 32/64-bit Linux/UNIX FORTRAN compilers. A command-line tool ("dxlook") is provided to enable database access from shell or script environments.
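    The ASCII index makes simple look-ups possible from any language, not just the supplied FORTRAN library. A sketch of the idea in Python, using an invented two-column record-number/name layout; the real index format is documented in the DASTCOM5 distribution:

```python
# Hypothetical record-number/name index; the real DASTCOM5 index
# layout differs and is described in the distribution.
INDEX = """\
1 Ceres
4 Vesta
101955 Bennu
"""

def lookup(index_text, key):
    """Return the record number for a name or designation, else None."""
    for line in index_text.splitlines():
        recno, name = line.split(maxsplit=1)
        if key == recno or key.lower() == name.lower():
            return int(recno)
    return None

print(lookup(INDEX, "Bennu"))
```

    A resolved record number is what a direct-access read (or a "dxlook" call) would then use to fetch the object's orbit solution.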

  1. GeneSCF: a real-time based functional enrichment tool with support for multiple organisms.

    PubMed

    Subhash, Santhilal; Kanduri, Chandrasekhar

    2016-09-13

    High-throughput technologies such as ChIP-sequencing, RNA-sequencing, DNA sequencing and quantitative metabolomics generate a huge volume of data. Researchers often rely on functional enrichment tools to interpret the biological significance of the affected genes from these high-throughput studies. However, currently available functional enrichment tools need to be updated frequently to adapt to new entries from the functional database repositories. Hence there is a need for a simplified tool that can perform functional enrichment analysis using updated information directly from source databases such as KEGG, Reactome or Gene Ontology. In this study, we focused on designing a command-line tool called GeneSCF (Gene Set Clustering based on Functional annotations) that can predict the functionally relevant biological information for a set of genes in a real-time updated manner. It is designed to handle information on more than 4000 organisms from freely available prominent functional databases like KEGG, Reactome and Gene Ontology. We successfully employed our tool on two published datasets to predict the biologically relevant functional information. The core features of this tool were tested on Linux machines without requiring the installation of additional dependencies. GeneSCF is more reliable than other enrichment tools because of its ability to use reference functional databases in real time to perform enrichment analysis. It is easy to integrate with other pipelines available for downstream analysis of high-throughput data. More importantly, GeneSCF can run multiple gene lists simultaneously on different organisms, thereby saving time for users. Since the tool is designed to be ready-to-use, there is no need for any complex compilation and installation procedures.
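    At the heart of any such enrichment analysis is a hypergeometric tail probability: given N annotated genes of which K belong to a pathway, how likely is it that a list of n genes contains k or more pathway members by chance? A stdlib-only sketch of that calculation (the general statistic, not GeneSCF's own code):

```python
from math import comb

def enrichment_pvalue(N, K, n, k):
    """P(X >= k) for X ~ Hypergeometric(N, K, n): the chance of seeing
    at least k pathway genes in a list of n, out of N total genes of
    which K are in the pathway."""
    total = comb(N, n)
    upper = min(K, n)
    return sum(comb(K, i) * comb(N - K, n - i) for i in range(k, upper + 1)) / total

# 5 hits from a 50-gene pathway in a 20-gene list (1,000-gene universe)
p = enrichment_pvalue(N=1000, K=50, n=20, k=5)
print(p)
```

    Under these numbers the expected hit count is only 1, so five hits yields a small p-value and the pathway would be reported as enriched.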

  2. Small satellite debris catalog maintenance issues

    NASA Technical Reports Server (NTRS)

    Jackson, Phoebe A.

    1991-01-01

    The United States Space Command (USSPACECOM) is a unified command of the Department of Defense, and one of its tasks is to detect, track, identify, and maintain a catalog of all man-made objects in Earth orbit. This task is called space surveillance, and the most important tool for space surveillance is the satellite catalog. The command's reasons for performing satellite catalog maintenance are presented. A satellite catalog is described, and small satellite-debris catalog-maintenance issues are identified. The underlying rationale is to describe the catalog maintenance services so that the members of the community can use them with assurance.

  3. RNA-SeQC: RNA-seq metrics for quality control and process optimization.

    PubMed

    DeLuca, David S; Levin, Joshua Z; Sivachenko, Andrey; Fennell, Timothy; Nazaire, Marc-Danie; Williams, Chris; Reich, Michael; Winckler, Wendy; Getz, Gad

    2012-06-01

    RNA-seq, the application of next-generation sequencing to RNA, provides transcriptome-wide characterization of cellular activity. Assessment of sequencing performance and library quality is critical to the interpretation of RNA-seq data, yet few tools exist to address this issue. We introduce RNA-SeQC, a program which provides key measures of data quality. These metrics include yield, alignment and duplication rates; GC bias, rRNA content, regions of alignment (exon, intron and intragenic), continuity of coverage, 3'/5' bias and count of detectable transcripts, among others. The software provides multi-sample evaluation of library construction protocols, input materials and other experimental parameters. The modularity of the software enables pipeline integration and the routine monitoring of key measures of data quality such as the number of alignable reads, duplication rates and rRNA contamination. RNA-SeQC allows investigators to make informed decisions about sample inclusion in downstream analysis. In summary, RNA-SeQC provides quality control measures critical to experiment design, process optimization and downstream computational analysis. See www.genepattern.org to run online, or www.broadinstitute.org/rna-seqc/ for a command line tool.
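    Two of the metrics listed above, duplication rate and GC content, have simple definitions that can be sketched over a toy list of read sequences; RNA-SeQC itself computes them from aligned BAM files:

```python
from collections import Counter

def duplication_rate(reads):
    """Fraction of reads that are extra copies of an already-seen sequence."""
    counts = Counter(reads)
    return sum(c - 1 for c in counts.values()) / len(reads)

def gc_content(reads):
    """Fraction of G/C bases across all reads."""
    bases = "".join(reads)
    return sum(b in "GC" for b in bases) / len(bases)

reads = ["ACGT", "ACGT", "GGCC", "ATAT"]
print(duplication_rate(reads))  # 0.25: one of four reads is a duplicate
print(gc_content(reads))        # 0.5
```

    Tracking such per-sample numbers across a batch is what lets an investigator spot a poorly constructed library before downstream analysis.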

  4. Lessons learned: the switch from VMS to UNIX operations for the STScI's Science and Mission Scheduling Branch

    NASA Astrophysics Data System (ADS)

    Adler, David S.; Workman, William M., III; Chance, Don

    2004-09-01

    The Science and Mission Scheduling Branch (SMSB) of the Space Telescope Science Institute (STScI) historically operated exclusively under VMS. Due to diminished support for VMS-based platforms at STScI, SMSB recently transitioned to Unix operations. No additional resources were available to the group; the project was SMSB's to design, develop, and implement. Early decisions included the choice of Python as the primary scripting language; adoption of Object-Oriented Design in the development of base utilities; and the development of a Python utility to interact directly with the Sybase database. The project was completed in January 2004 with the implementation of a GUI to generate the Command Loads that are uplinked to HST. The current tool suite consists of 31 utilities and 271 tools comprising over 60,000 lines of code. In this paper, we summarize the decision-making process used to determine the primary scripting language, database interface, and code management library. We also describe the finished product and summarize lessons learned along the way to completing the project.

  5. GOssTo: a stand-alone application and a web tool for calculating semantic similarities on the Gene Ontology.

    PubMed

    Caniza, Horacio; Romero, Alfonso E; Heron, Samuel; Yang, Haixuan; Devoto, Alessandra; Frasca, Marco; Mesiti, Marco; Valentini, Giorgio; Paccanaro, Alberto

    2014-08-01

    We present GOssTo, the Gene Ontology semantic similarity Tool, a user-friendly software system for calculating semantic similarities between gene products according to the Gene Ontology. GOssTo is bundled with six semantic similarity measures, including both term- and graph-based measures, and has extension capabilities to allow the user to add new similarities. Importantly, for any measure, GOssTo can also calculate the Random Walk Contribution, which has been shown to greatly improve the accuracy of similarity measures. GOssTo is very fast, easy to use, and allows the calculation of similarities on a genomic scale in a few minutes on a regular desktop machine. Contact: alberto@cs.rhul.ac.uk. GOssTo is available both as a stand-alone application running on GNU/Linux, Windows and MacOS from www.paccanarolab.org/gossto and as a web application from www.paccanarolab.org/gosstoweb. The stand-alone application features a simple and concise command-line interface for easy integration into high-throughput data processing pipelines. © The Author 2014. Published by Oxford University Press.
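    Graph-based measures of the kind GOssTo bundles score two GO terms by comparing their positions in the ontology DAG. As a minimal sketch, assuming a tiny mock ontology, one can take the Jaccard overlap of the two terms' ancestor sets; this is illustrative only and is not one of GOssTo's six published measures:

```python
PARENTS = {           # child -> parents in a tiny mock ontology DAG
    "root": [],
    "a": ["root"], "b": ["root"],
    "a1": ["a"], "a2": ["a"], "ab": ["a", "b"],
}

def ancestors(term):
    """All terms reachable upward from `term`, including itself."""
    seen, stack = set(), [term]
    while stack:
        t = stack.pop()
        if t not in seen:
            seen.add(t)
            stack.extend(PARENTS[t])
    return seen

def similarity(t1, t2):
    """Jaccard overlap of the two ancestor sets."""
    a1, a2 = ancestors(t1), ancestors(t2)
    return len(a1 & a2) / len(a1 | a2)

print(similarity("a1", "a2"))  # siblings share {a, root}: 2/4 = 0.5
```

    Real measures additionally weight terms by information content or edge distance, which is where the bundled term-based measures differ.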

  6. Predictive optimal control of sewer networks using CORAL tool: application to Riera Blanca catchment in Barcelona.

    PubMed

    Puig, V; Cembrano, G; Romera, J; Quevedo, J; Aznar, B; Ramón, G; Cabot, J

    2009-01-01

    This paper deals with the global control of the Riera Blanca catchment in the Barcelona sewer network using a predictive optimal control approach. The catchment has been modelled using a conceptual modelling approach based on decomposing it into subcatchments and representing them as virtual tanks. This conceptual modelling approach allows real-time model calibration and control of the sewer network. The global control problem of the Riera Blanca catchment is solved using an optimal/predictive control algorithm. To implement the predictive optimal control of the Riera Blanca catchment, a software tool named CORAL is used. The on-line control is simulated by interfacing CORAL with a high-fidelity simulator of sewer networks (MOUSE). CORAL interchanges limnimeter readings and gate commands with MOUSE as if it were connected to the real SCADA system. Finally, the global control results obtained using the predictive optimal control are presented and compared against the results obtained using the current local control system. The results obtained using the global control are very satisfactory compared to those obtained using the local control.
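    The virtual-tank abstraction reduces each subcatchment to a storage volume obeying a mass balance, dV/dt = inflow - outflow. A forward-Euler sketch with a linear outflow law, where the rate coefficient and inflow series are invented for illustration:

```python
def simulate_tank(inflow, k=0.3, v0=0.0, dt=1.0):
    """Forward-Euler mass balance dV/dt = inflow - k*V for one tank."""
    volumes, v = [v0], v0
    for q_in in inflow:
        v = v + dt * (q_in - k * v)   # linear outflow law: q_out = k*V
        volumes.append(v)
    return volumes

# two time steps of rain, then dry weather; the tank fills, then drains
print(simulate_tank([10, 10, 0, 0]))
```

    Chaining such tanks, with one tank's outflow feeding the next one's inflow, gives the kind of fast conceptual network model a predictive controller can optimize over in real time.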

  7. GUIdock-VNC: using a graphical desktop sharing system to provide a browser-based interface for containerized software.

    PubMed

    Mittal, Varun; Hung, Ling-Hong; Keswani, Jayant; Kristiyanto, Daniel; Lee, Sung Bong; Yeung, Ka Yee

    2017-04-01

    Software container technology such as Docker can be used to package and distribute bioinformatics workflows consisting of multiple software implementations and dependencies. However, Docker is a command line-based tool, and many bioinformatics pipelines consist of components that require a graphical user interface. We present a container tool called GUIdock-VNC that uses a graphical desktop sharing system to provide a browser-based interface for containerized software. GUIdock-VNC uses the Virtual Network Computing protocol to render the graphics within most commonly used browsers. We also present a minimal image builder that can add our proposed graphical desktop sharing system to any Docker packages, with the end result that any Docker packages can be run using a graphical desktop within a browser. In addition, GUIdock-VNC uses the OAuth2 authentication protocol when deployed on the cloud. As a proof-of-concept, we demonstrated the utility of GUIdock-VNC in gene network inference. We benchmarked our container implementation on various operating systems and showed that our solution creates minimal overhead. © The Authors 2017. Published by Oxford University Press.

  8. Evaluation of a Tool for Airborne-Managed In-Trail Approach Spacing

    DOT National Transportation Integrated Search

    2005-08-01

    The Advanced Terminal Area Approach Spacing (ATAAS) tool uses Automatic Dependent Surveillance-Broadcast aircraft state data to compute a speed command for an ATAAS-equipped aircraft to follow and obtain a required time interval behind another aircra...

  9. 46 CFR 153.558 - Special requirements for phosphoric acid.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... CARGOES SHIPS CARRYING BULK LIQUID, LIQUEFIED GAS, OR COMPRESSED GAS HAZARDOUS MATERIALS Design and... containment system must be: (a) Lined with natural rubber or neoprene; (b) Lined with a material approved for phosphoric acid tanks by the Commandant (CG-522); or (c) Made of a stainless steel that resists corrosion by...

  10. English Collocation Learning through Corpus Data: On-Line Concordance and Statistical Information

    ERIC Educational Resources Information Center

    Ohtake, Hiroshi; Fujita, Nobuyuki; Kawamoto, Takeshi; Morren, Brian; Ugawa, Yoshihiro; Kaneko, Shuji

    2012-01-01

    We developed an English Collocations On Demand system offering on-line corpus and concordance information to help Japanese researchers acquire a better command of English collocation patterns. The Life Science Dictionary Corpus consists of approximately 90,000,000 words collected from life science related research papers published in academic…

  11. Simulation Data Management - Requirements and Design Specification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clay, Robert L.; Friedman-Hill, Ernest J.; Gibson, Marcus J.

    Simulation Data Management (SDM), the ability to securely organize, archive, and share analysis models and the artifacts used to create them, is a fundamental requirement for modern engineering analysis based on computational simulation. We have worked separately to provide secure, networked SDM services to engineers and scientists at our respective laboratories for over a decade. We propose to leverage our experience and lessons learned to help develop and deploy a next-generation SDM service as part of a multi-laboratory team. This service will be portable across multiple sites and platforms, and will be accessible via a range of command-line tools and well-documented APIs. In this document, we'll review our high-level and low-level requirements for such a system, review one existing system, and briefly discuss our proposed implementation.

  12. mrtailor: a tool for PDB-file preparation for the generation of external restraints.

    PubMed

    Gruene, Tim

    2013-09-01

    Model building starting from, for example, a molecular-replacement solution with low sequence similarity introduces model bias, which can be difficult to detect, especially at low resolution. The program mrtailor removes low-similarity regions from a template PDB file according to sequence similarity between the target sequence and the template sequence and maps the target sequence onto the PDB file. The modified PDB file can be used to generate external restraints for low-resolution refinement with reduced model bias and can be used as a starting point for model building and refinement. The program can call ProSMART [Nicholls et al. (2012), Acta Cryst. D68, 404-417] directly in order to create external restraints suitable for REFMAC5 [Murshudov et al. (2011), Acta Cryst. D67, 355-367]. Both a command-line version and a GUI exist.
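    The core operation, dropping template regions whose local similarity to the target is low, can be sketched as a windowed per-residue identity filter over a gap-free alignment. The window size and threshold below are made up for illustration and are not mrtailor's defaults:

```python
def keep_mask(target, template, window=3, min_identity=0.5):
    """True where the windowed identity between aligned sequences meets
    the threshold; False marks residues that trimming would remove."""
    n = len(target)
    mask = []
    for i in range(n):
        lo, hi = max(0, i - window // 2), min(n, i + window // 2 + 1)
        ident = sum(a == b for a, b in zip(target[lo:hi], template[lo:hi]))
        mask.append(ident / (hi - lo) >= min_identity)
    return mask

print(keep_mask("ACDEFG", "ACDQQQ"))  # the divergent tail is dropped
```

    Restraints generated only from the surviving positions carry less bias from the poorly matching parts of the template.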

  13. ZED- A LINE EDITOR FOR THE DEC VAX

    NASA Technical Reports Server (NTRS)

    Scott, P. J.

    1994-01-01

    The ZED editor for the DEC VAX is a simple yet powerful line editor for text, program source code, and non-binary data. Line editors can be superior to screen editors in some cases, such as executing complex multiple or conditional commands, or editing via slow modem lines. ZED excels in the area of text processing by using procedure files. For example, such procedures can reformat a file of addresses or remove all comment lines from a FORTRAN program. In addition to command files, ZED also features versatile search qualifiers, global changes, conditionals, on-line help, hexadecimal mode, space compression, looping, logical combinations of search strings, journaling, visible control characters, and automatic detabbing. The ZED editor was originally developed at Cambridge University and has been continuously enhanced since 1976. Users of the Cambridge implementation have devised such elaborate ZED procedures as chess games, calculators, and programs for evaluating Pi. This implementation of ZED strives to maintain the characteristics of the Cambridge editor. A complete ZED manual is included on the tape. ZED is written entirely in C for either batch or interactive execution on the DEC VAX under VMS 4.X and requires 80,896 bytes of memory. This program was released in 1988 and updated in 1989.

  14. PANTHER-PSEP: predicting disease-causing genetic variants using position-specific evolutionary preservation.

    PubMed

    Tang, Haiming; Thomas, Paul D

    2016-07-15

    PANTHER-PSEP is a new software tool for predicting non-synonymous genetic variants that may play a causal role in human disease. Several previous variant pathogenicity prediction methods have been proposed that quantify evolutionary conservation among homologous proteins from different organisms. PANTHER-PSEP employs a related but distinct metric based on 'evolutionary preservation': homologous proteins are used to reconstruct the likely sequences of ancestral proteins at nodes in a phylogenetic tree, and the history of each amino acid can be traced back in time from its current state to estimate how long that state has been preserved in its ancestors. Here, we describe the PSEP tool, and assess its performance on standard benchmarks for distinguishing disease-associated from neutral variation in humans. On these benchmarks, PSEP outperforms not only previous tools that utilize evolutionary conservation, but also several highly used tools that include multiple other sources of information as well. For predicting pathogenic human variants, the trace back of course starts with a human 'reference' protein sequence, but the PSEP tool can also be applied to predicting deleterious or pathogenic variants in reference proteins from any of the ∼100 other species in the PANTHER database. PANTHER-PSEP is freely available on the web at http://pantherdb.org/tools/csnpScoreForm.jsp. Users can also download the command-line tool at ftp://ftp.pantherdb.org/cSNP_analysis/PSEP/. Contact: pdthomas@usc.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
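    The preservation metric can be pictured as a walk from the current residue back through reconstructed ancestral states, accumulating branch length until the state first differs. A sketch with an invented lineage (states listed newest-first, each paired with the branch length to its parent); PSEP's actual reconstruction and scoring are more involved:

```python
def preservation_time(lineage):
    """lineage: [(amino_acid_state, branch_length), ...], newest first.
    Sum branch lengths while the ancestral state matches the current one."""
    current = lineage[0][0]
    total = 0.0
    for state, length in lineage:
        if state != current:
            break
        total += length
    return total

# 'R' preserved through two ancestors; an older ancestor had 'K'
print(preservation_time([("R", 0.1), ("R", 0.4), ("R", 0.8), ("K", 0.3)]))
```

    The longer a state has been preserved, the more likely a substitution at that position is deleterious, which is the intuition the benchmark results above support.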

  15. Programmatic access to data and information at the IRIS DMC via web services

    NASA Astrophysics Data System (ADS)

    Weertman, B. R.; Trabant, C.; Karstens, R.; Suleiman, Y. Y.; Ahern, T. K.; Casey, R.; Benson, R. B.

    2011-12-01

    The IRIS Data Management Center (DMC) has developed a suite of web services that provide access to the DMC's time series holdings, their related metadata and earthquake catalogs. In addition, services are available to perform simple, on-demand time series processing at the DMC prior to being shipped to the user. The primary goal is to provide programmatic access to data and processing services in a manner usable by and useful to the research community. The web services are relatively simple to understand and use and will form the foundation on which future DMC access tools will be built. Based on standard Web technologies, they can be accessed programmatically with a wide range of programming languages (e.g. Perl, Python, Java), command line utilities such as wget and curl or with any web browser. We anticipate these services being used for everything from simple command line access, used in shell scripts and higher programming languages, to being integrated within complex data processing software. In addition to improving access to our data by the seismological community, the web services will also make our data more accessible to other disciplines. The web services available from the DMC include ws-bulkdataselect for the retrieval of large volumes of miniSEED data, ws-timeseries for the retrieval of individual segments of time series data in a variety of formats (miniSEED, SAC, ASCII, audio WAVE, and PNG plots) with optional signal processing, ws-station for station metadata in StationXML format, ws-resp for the retrieval of instrument response in RESP format, ws-sacpz for the retrieval of sensor response in the SAC poles and zeros convention and ws-event for the retrieval of earthquake catalogs. To make the services even easier to use, the DMC is developing a library that allows Java programmers to seamlessly retrieve and integrate DMC information into their own programs. The library will handle all aspects of dealing with the services and will parse the returned data. By using this library a developer will not need to learn the details of the service interfaces or understand the data formats returned. This library will be used to build the software bridge needed to request data and information from within MATLAB®. We also provide several client scripts written in Perl for the retrieval of waveform data, metadata and earthquake catalogs using command line programs. For more information on the DMC's web services please visit http://www.iris.edu/ws/
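    Because the services are plain HTTP, any language's URL tooling can drive them. The sketch below only builds a query URL; the service path and parameter names are illustrative stand-ins, not the DMC's documented interface:

```python
from urllib.parse import urlencode

def build_query(base, **params):
    """Assemble a GET query URL; sorting keeps URLs reproducible."""
    return base + "?" + urlencode(sorted(params.items()))

# service path and parameter names are illustrative stand-ins
url = build_query(
    "http://www.iris.edu/ws/timeseries/query",
    net="IU", sta="ANMO", starttime="2011-01-01", duration=600,
)
print(url)
```

    The same pattern works from wget, curl, or a browser address bar, which is what makes such services usable well beyond seismology.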

  16. A Historical Study of Operational Command: A Resource for Researchers

    DTIC Science & Technology

    2005-03-01

    PJHQ) in the United Kingdom. Control, the impact of command arrangements on the proper functioning of operational level headquarters is shown...might interact in a joint environment, identifying the different types of communication and social networks that exist, determining the influence of...introduction of information technology and its tools has spawned ideas such as Network Centric Warfare (NCW) or Network Enabled Warfare (NEW), there is

  17. Automated Design Tools for Integrated Mixed-Signal Microsystems (NeoCAD)

    DTIC Science & Technology

    2005-02-01

    method, Model Order Reduction (MOR) tools, system-level, mixed-signal circuit synthesis and optimization tools, and parasitic extraction tools. A unique...Mission Area: Command and Control mixed signal circuit simulation parasitic extraction time-domain simulation IC design flow model order reduction... Extraction 1.2 Overall Program Milestones CHAPTER 2 FAST TIME DOMAIN MIXED-SIGNAL CIRCUIT SIMULATION 2.1 HAARSPICE Algorithms 2.1.1 Mathematical Background

  18. Space surveillance satellite catalog maintenance

    NASA Astrophysics Data System (ADS)

    Jackson, Phoebe A.

    1990-04-01

    The United States Space Command (USSPACECOM) is a Unified Command of the Department of Defense with headquarters at Peterson Air Force Base, Colorado Springs, Co. One of the responsibilities of USSPACECOM is to detect, track, identify, and maintain a catalog of all manmade objects in earth orbit. This satellite catalog is the most important tool for space surveillance. The purpose of this paper is threefold. First, to identify why the command does the job of satellite catalog maintenance. Second, to describe what the satellite catalog is and how it is maintained. Third, and finally, to identify the questions that must be addressed if this command is to track small space object debris. This paper's underlying rationale is to describe our catalog maintenance services so that the members of our community can use them with assurance.

  19. Irresistable: Service Masks, Goldwater-Nichols, and Overcoming Service Barriers to JFACC

    DTIC Science & Technology

    2016-06-10

    Air Forces IFR In Flight Refueling JCS Joint Chiefs of Staff JFACC Joint Force Air Component Commander JFC Joint Force Commander LOC Lines of...culture, diplomacy, and beyond.7 The focus is on the personalities that build and develop the technology and thus their impact on history. This broad...embarked Air Group 5 contained propeller and first-generation jets. In the days before in- flight-refueling ( IFR ), these aircraft could only manage a

  20. On-command on/off switching of progenitor cell and cancer cell polarized motility and aligned morphology via a cytocompatible shape memory polymer scaffold.

    PubMed

    Wang, Jing; Quach, Andy; Brasch, Megan E; Turner, Christopher E; Henderson, James H

    2017-09-01

    In vitro biomaterial models have enabled advances in understanding the role of extracellular matrix (ECM) architecture in the control of cell motility and polarity. Most models are, however, static and cannot mimic dynamic aspects of in vivo ECM remodeling and function. To address this limitation, we present an electrospun shape memory polymer scaffold that can change fiber alignment on command under cytocompatible conditions. Cellular response was studied using the human fibrosarcoma cell line HT-1080 and the murine mesenchymal stem cell line C3H/10T1/2. The results demonstrate successful on-command on/off switching of cell polarized motility and alignment. Decrease in fiber alignment causes a change from polarized motility along the direction of fiber alignment to non-polarized motility and from aligned to unaligned morphology, while increase in fiber alignment causes a change from non-polarized to polarized motility along the direction of fiber alignment and from unaligned to aligned morphology. In addition, the findings are consistent with the hypothesis that increased fiber alignment causes increased cell velocity, while decreased fiber alignment causes decreased cell velocity. On-command on/off switching of cell polarized motility and alignment is anticipated to enable new study of directed cell motility in tumor metastasis, in cell homing, and in tissue engineering. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. Tools virtualization for command and control systems

    NASA Astrophysics Data System (ADS)

    Piszczek, Marek; Maciejewski, Marcin; Pomianek, Mateusz; Szustakowski, Mieczysław

    2017-10-01

    Information management is an inseparable part of the command process. The result is that the person making decisions at the command post interacts with data-providing devices in various ways. The tools virtualization process can introduce a number of significant modifications in the design of solutions for management and command. The general idea involves replacing a physical device's user interface with its digital representation (a so-called virtual instrument). A more advanced level of system "digitalization" is to use mixed-reality environments. In solutions using augmented reality (AR), a customized HMI is displayed to the operator as he approaches each device. Devices are identified by image recognition of photo codes. Visualization is achieved by an (optical) see-through head-mounted display (HMD). Control can be done, for example, by means of a handheld touch panel. Using an immersive virtual environment, the command center can be digitally reconstructed. A workstation then requires only a VR system (HMD) and access to the information network. The operator can interact with devices just as he would in the real world (for example, with virtual hands). Because of their procedures (analysis of central vision, eye tracking), MR systems offer another useful feature: reduced system data-throughput requirements, since at any moment attention is focused on a single device. Experiments carried out using the Moverio BT-200 and SteamVR systems, and the results of experimental application testing, clearly indicate the ability to create a fully functional information system with the use of mixed-reality technology.

  2. Technological evaluation of gesture and speech interfaces for enabling dismounted soldier-robot dialogue

    NASA Astrophysics Data System (ADS)

    Kattoju, Ravi Kiran; Barber, Daniel J.; Abich, Julian; Harris, Jonathan

    2016-05-01

    With the increasing necessity for intuitive Soldier-robot communication in military operations and advancements in interactive technologies, autonomous robots have transitioned from assistance tools to functional and operational teammates able to service an array of military operations. Despite improvements in gesture and speech recognition technologies, their effectiveness in supporting Soldier-robot communication is still uncertain. The purpose of the present study was to evaluate the performance of gesture and speech interface technologies to facilitate Soldier-robot communication during a spatial-navigation task with an autonomous robot. Semantically based gesture and speech spatial-navigation commands leveraged existing lexicons for visual and verbal communication from the U.S. Army field manual for visual signaling and a previously established Squad Level Vocabulary (SLV). Speech commands were recorded by a lapel microphone and a Microsoft Kinect, and classified by commercial off-the-shelf automatic speech recognition (ASR) software. Visual signals were captured and classified using a custom wireless gesture glove and software. Participants in the experiment commanded a robot to complete a simulated ISR mission in a scaled-down urban scenario by delivering a sequence of gesture and speech commands, both individually and simultaneously, to the robot. The performance and reliability of the gesture and speech hardware interfaces and recognition tools were analyzed and reported. Analysis of the experimental results demonstrated that the employed gesture technology has significant potential for enabling bidirectional Soldier-robot team dialogue, based on the high classification accuracy and minimal training required to perform gesture commands.

  3. Next generation tools for genomic data generation, distribution, and visualization

    PubMed Central

    2010-01-01

    Background With the rapidly falling cost and growing availability of high throughput sequencing and microarray technologies, the bottleneck for effectively using genomic analysis in the laboratory and clinic is shifting to one of effectively managing, analyzing, and sharing genomic data. Results Here we present three open-source, platform independent, software tools for generating, analyzing, distributing, and visualizing genomic data. These include a next generation sequencing/microarray LIMS and analysis project center (GNomEx); an application for annotating and programmatically distributing genomic data using the community vetted DAS/2 data exchange protocol (GenoPub); and a standalone Java Swing application (GWrap) that makes cutting edge command line analysis tools available to those who prefer graphical user interfaces. Both GNomEx and GenoPub use the rich client Flex/Flash web browser interface to interact with Java classes and a relational database on a remote server. Both employ a public-private user-group security model enabling controlled distribution of patient and unpublished data alongside public resources. As such, they function as genomic data repositories that can be accessed manually or programmatically through DAS/2-enabled client applications such as the Integrated Genome Browser. Conclusions These tools have gained wide use in our core facilities, research laboratories and clinics and are freely available for non-profit use. See http://sourceforge.net/projects/gnomex/, http://sourceforge.net/projects/genoviz/, and http://sourceforge.net/projects/useq. PMID:20828407
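    The GWrap approach, putting a graphical front end on a command line analysis tool, reduces at its core to assembling an argument list and invoking the tool as a subprocess. A minimal sketch of that core step in Python (GWrap itself is a Java Swing application; the function name and `--key=value` option format below are illustrative assumptions, demonstrated with `echo` as a stand-in tool):

```python
import subprocess

def run_cli_tool(executable, options, input_path):
    """Build an argument list and run a command-line tool, capturing its output.

    'executable', 'options', and 'input_path' are hypothetical placeholders;
    real tools differ in how they expect options to be spelled.
    """
    cmd = [executable] + [f"--{k}={v}" for k, v in options.items()] + [input_path]
    result = subprocess.run(cmd, capture_output=True, text=True)
    if result.returncode != 0:
        raise RuntimeError(f"{executable} failed: {result.stderr}")
    return result.stdout

# Using 'echo' as a stand-in for an analysis tool:
print(run_cli_tool("echo", {}, "hello"))
```

A GUI wrapper then only needs to collect the option values from form widgets and display the captured stdout.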

  4. PanWeb: A web interface for pan-genomic analysis.

    PubMed

    Pantoja, Yan; Pinheiro, Kenny; Veras, Allan; Araújo, Fabrício; Lopes de Sousa, Ailton; Guimarães, Luis Carlos; Silva, Artur; Ramos, Rommel T J

    2017-01-01

    With increased production of genomic data since the advent of next-generation sequencing (NGS), there has been a need to develop new bioinformatics tools and areas, such as comparative genomics. In comparative genomics, the genetic material of an organism is directly compared to that of another organism to better understand biological species. Moreover, the exponentially growing number of deposited prokaryote genomes has enabled the investigation of several genomic characteristics that are intrinsic to certain species. Thus, a new approach to comparative genomics, termed pan-genomics, was developed. In pan-genomics, various organisms of the same species or genus are compared. Currently, there are many tools that can perform pan-genomic analyses, such as PGAP (Pan-Genome Analysis Pipeline), Panseq (Pan-Genome Sequence Analysis Program) and PGAT (Prokaryotic Genome Analysis Tool). Among these software tools, PGAP was developed in the Perl scripting language and its reliance on UNIX platform terminals and its requirement for an extensive parameterized command line can become a problem for users without previous computational knowledge. Thus, the aim of this study was to develop a web application, known as PanWeb, that serves as a graphical interface for PGAP. In addition, using the output files of the PGAP pipeline, the application generates graphics using custom-developed scripts in the R programming language. PanWeb is freely available at http://www.computationalbiology.ufpa.br/panweb.

  5. Manananggal - a novel viewer for alternative splicing events.

    PubMed

    Barann, Matthias; Zimmer, Ralf; Birzele, Fabian

    2017-02-21

    Alternative splicing is an important cellular mechanism that can be analyzed by RNA sequencing. However, identification of splicing events in an automated fashion is error-prone. Thus, further validation is required to select reliable instances of alternative splicing events (ASEs). There are only a few tools specifically designed for interactive inspection of ASEs, and available visualization approaches can be significantly improved. Here, we present Manananggal, an application specifically designed for the identification of splicing events in next generation sequencing data. Manananggal includes a web application for visual inspection and a command line tool that allows for ASE detection. We compare the sashimi plots available in the IGV Viewer, the DEXSeq splicing plots and SpliceSeq to the Manananggal interface and discuss the advantages and drawbacks of these tools. We show that sashimi plots (such as those used by the IGV Viewer and SpliceSeq) offer a practical solution for simple ASEs, but also exhibit shortcomings for highly complex genes. Manananggal is an interactive web application that offers functions specifically tailored to the identification of alternative splicing events that other tools lack. The ability to select a subset of isoforms allows an easier interpretation of complex alternative splicing events. In contrast to SpliceSeq and the DEXSeq splicing plot, Manananggal does not obscure the gene structure by showing full transcript models, which makes it easier to determine which isoforms are expressed and which are not.

  6. The plant leaf movement analyzer (PALMA): a simple tool for the analysis of periodic cotyledon and leaf movement in Arabidopsis thaliana.

    PubMed

    Wagner, Lucas; Schmal, Christoph; Staiger, Dorothee; Danisman, Selahattin

    2017-01-01

    The analysis of circadian leaf movement rhythms is a simple yet effective method to study effects of treatments or gene mutations on the circadian clock of plants. Currently, leaf movements are analysed using time lapse photography and subsequent bioinformatics analyses of the movements. Programs used for this purpose either perform a single function (i.e. leaf tip detection or rhythm analysis) or are limited to specific computational environments. We developed a leaf movement analysis tool, PALMA, that works on the command line and combines image extraction with rhythm analysis using fast Fourier transformation and non-linear least squares fitting. We validated PALMA on both simulated time series and experiments using the known short period mutant sensitivity to red light reduced 1 (srr1-1). We compared PALMA with two established leaf movement analysis tools and found it to perform equally well. Finally, we tested the effect of reduced iron conditions on the leaf movement rhythms of wild type plants. Here, we found that PALMA successfully detected period lengthening under reduced iron conditions. PALMA correctly estimated the period of both simulated and real-life leaf movement experiments. As a platform-independent console program that unites both functions needed for the analysis of circadian leaf movements, it is a valid alternative to existing leaf movement analysis tools.
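    The spectral step of this kind of rhythm analysis can be illustrated in a few lines: take the FFT of the leaf-angle time series and read the dominant period off the peak frequency. This is a simplified sketch, not PALMA's actual code, and it omits the non-linear least squares refinement the tool adds:

```python
import numpy as np

def estimate_period(signal, dt_hours):
    """Estimate the dominant period (in hours) of a time series via FFT."""
    x = signal - np.mean(signal)            # remove the DC component
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=dt_hours)
    peak = np.argmax(spectrum[1:]) + 1      # skip the zero-frequency bin
    return 1.0 / freqs[peak]

# A synthetic 24 h rhythm sampled every 0.5 h for 10 days:
rng = np.random.default_rng(0)
t = np.arange(0, 240, 0.5)
leaf_angle = np.cos(2 * np.pi * t / 24) + 0.1 * rng.standard_normal(len(t))
print(round(estimate_period(leaf_angle, 0.5), 1))  # prints 24.0
```

The FFT peak gives a coarse estimate quantized to the frequency grid; fitting a sinusoid by least squares around that peak is what refines it to a continuous period value.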

  7. Evolutionary Telemetry and Command Processor (TCP) architecture

    NASA Technical Reports Server (NTRS)

    Schneider, John R.

    1992-01-01

    A low cost, modular, high performance, and compact Telemetry and Command Processor (TCP) is being built as the foundation of command and data handling subsystems for the next generation of satellites. The TCP product line will support command and telemetry requirements for small to large spacecraft and from low to high rate data transmission. It is compatible with the latest TDRSS, STDN and SGLS transponders and provides CCSDS protocol communications in addition to standard TDM formats. Its high performance computer provides computing resources for hosted flight software. Layered and modular software provides common services using standardized interfaces to applications, thereby enhancing software re-use, transportability, and interoperability. The TCP architecture is based on existing standards, distributed networking, distributed and open system computing, and packet technology. The first TCP application is planned for the '94 SDIO SPAS 3 mission. The architecture enables rapid tailoring of functions, thereby reducing the costs and schedules of development for individual spacecraft missions.

  8. MPST Software: grl_pef_check

    NASA Technical Reports Server (NTRS)

    Call, Jared A.; Kwok, John H.; Fisher, Forest W.

    2013-01-01

    This innovation is a tool used to verify and validate spacecraft sequences at the predicted events file (PEF) level for the GRAIL (Gravity Recovery and Interior Laboratory, see http://www.nasa.gov/mission_pages/grail/main/index.html) mission as part of the Multi-Mission Planning and Sequencing Team (MPST) operations process, to reduce the possibility for errors. This tool is used to catch any sequence-related errors or issues immediately after the seqgen modeling to streamline downstream processes. This script verifies and validates the seqgen modeling for the GRAIL MPST process. A PEF is provided as input, and dozens of checks are performed on it to verify and validate the command products, including command content, command ordering, flight-rule violations, modeling boundary consistency, resource limits, and ground commanding consistency. By performing as many checks as early in the process as possible, grl_pef_check streamlines the MPST task of generating GRAIL command and modeled products on an aggressive schedule. By enumerating each check being performed, and clearly stating the criteria and assumptions made at each step, grl_pef_check can be used as a manual checklist as well as an automated tool. This helper script was written with a focus on giving users the information they need to evaluate a sequence quickly and efficiently, while still keeping them informed and active in the overall sequencing process. grl_pef_check verifies and validates the modeling and sequence content before any more effort is invested in the build. There are dozens of items in the modeling run that need to be checked, which is a time-consuming and error-prone task. Currently, no other software exists that provides this functionality. Compared to a manual process, this script reduces human error and saves considerable man-hours by automating and streamlining the mission planning and sequencing task for the GRAIL mission.
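    The checklist structure described above, each check enumerated by name so the list doubles as a manual checklist and an automated pass, can be sketched generically. The event fields and the two example checks below are invented placeholders: the real PEF format and the GRAIL flight rules are mission-specific and not public in this record.

```python
def check_ordering(events):
    """Commands must be time-ordered (a simplified stand-in for one PEF check)."""
    times = [e["time"] for e in events]
    return all(a <= b for a, b in zip(times, times[1:]))

def check_duplicates(events):
    """No command may be issued twice at the same time stamp."""
    seen = set()
    for e in events:
        key = (e["time"], e["command"])
        if key in seen:
            return False
        seen.add(key)
    return True

# Enumerated checks: the name doubles as the manual-checklist entry.
CHECKS = [("command ordering", check_ordering),
          ("duplicate commands", check_duplicates)]

def run_checks(events):
    """Run every enumerated check and report pass/fail, checklist-style."""
    return {name: fn(events) for name, fn in CHECKS}

events = [{"time": 0, "command": "PWR_ON"}, {"time": 5, "command": "MODE_SCI"}]
print(run_checks(events))
```

Keeping each check as a small named function is what makes the report enumerable and lets a user audit the criteria one line at a time.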

  9. Arlequin suite ver 3.5: a new series of programs to perform population genetics analyses under Linux and Windows.

    PubMed

    Excoffier, Laurent; Lischer, Heidi E L

    2010-05-01

    We present here a new version of the Arlequin program available under three different forms: a Windows graphical version (Winarl35), a console version of Arlequin (arlecore), and a specific console version to compute summary statistics (arlsumstat). The command-line versions run under both Linux and Windows. The main innovations of the new version include enhanced outputs in XML format, the possibility to embed graphics displaying computation results directly into output files, and the implementation of a new method to detect loci under selection from genome scans. Command-line versions are designed to handle large series of files, and arlsumstat can be used to generate summary statistics from simulated data sets within an Approximate Bayesian Computation framework. © 2010 Blackwell Publishing Ltd.

  10. STS-47 crew during JSC fire fighting exercises in the Fire Training Pit

    NASA Technical Reports Server (NTRS)

    1992-01-01

    STS-47 Endeavour, Orbiter Vehicle (OV) 105, crewmembers line up along water hoses during JSC fire fighting exercises held at JSC's Fire Training Pit. In the foreground are (left to right) Pilot Curtis L. Brown, Jr, holding the hose nozzle, Mission Specialist (MS) N. Jan Davis, MS and Payload Commander (PLC) Mark C. Lee, and backup Payload Specialist Stan Koszelak, partially visible at the end of the line. In the background, manning a second hose are backup Payload Specialist Takao Doi, MS Jerome Apt, and Commander Robert L. Gibson. A veteran fire fighter (behind Brown) stands between the two hoses giving instructions. The Fire Training Pit is located across from the Gilruth Center Bldg 207. Doi represents Japan's National Space Development Agency (NASDA).

  11. A Beginner's Sequence of Programming Activities.

    ERIC Educational Resources Information Center

    Slesnick, Twila

    1984-01-01

    Presents various programing activities using the BASIC and LOGO programing languages. Activities are included in separate sections with a title indicating the nature of the activities and the "tools" (commands) needed. For example, "Old-fashioned drawing" requires several tools (PRINT, LIST, RUN, GOTO) to make drawings using…

  12. Joint Mission Command Implementation

    DTIC Science & Technology

    2016-01-22

    choose. The paper finds that trust is strongly influenced by the subconscious brain and treating it like a tool ignores biology and results in... bias for action and empowerment.14 Since then, the services have evaluated their own concepts of command assessing them against Dempsey’s vision. Lt...understanding, intent, and trust, only trust is strongly influenced by the subconscious brain. Treating trust like it can be taught, or a behavior that

  13. 75 FR 1447 - Alaska Railroad Corporation-Construction and Operation Exemption-Rail Line Between North Pole and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-11

    ... Railroad Corporation--Construction and Operation Exemption--Rail Line Between North Pole and Delta Junction... of North Pole (located just south of Fairbanks) to the southern side of the community of Delta... Railroad Administration, U.S. Air Force 354th Fighter Wing Command from Eielson Air Force Base, U.S. Army...

  14. History of the Fifteenth United States Army. 21 August 1944 to 11 July 1945

    DTIC Science & Technology

    1946-12-13

    depots together with complete inventories. c. Complete rosters by unit, grade and organization of all members of his command. d. A roster of all...and weapons will be neatly stacked by front line troops so that rapid inventory is possible. Inventory will be made by the German Fortress Command...in centrally located warehouses or field dumps at battalion level, inventoried and located on overlays, copies of which will be delivered as directed

  15. Journal of Special Operations Medicine, Volume 8, Edition 2

    DTIC Science & Technology

    2008-01-01

    NUMBER 6. AUTHOR(S) 5d. PROJECT NUMBER 5e. TASK NUMBER 5f. WORK UNIT NUMBER 7. PERFORMING ORGANIZATION NAME(S) AND ADDRESS(ES) Joint Special...Order Desk — orders@gpo.gov. 4) The JSOM is on-line through the Joint Special Operations University’s new SOF Medical Gateway; it is available to all...From the Command Surgeon WARNER D. “Rocky” FARR COLONEL, U.S. ARMY Command Surgeon HQ USSOCOM • Recommended, and all concurred, that we need a Joint

  16. Crossing the Line of Departure. Battle Command on the Move A Historical Perspective

    DTIC Science & Technology

    2006-01-01

    controlled Alsace into Germany (French ally Württemberg) with an army of 200,000 organized into seven army corps and a cavalry corps, with corps varying...three to five days and daily horse-mounted messengers to and from the corps commanders. After some delays because of a shortage of bridges, his...assembled his army in central Germany when the Prussians invaded the buffer state of Saxony and forcibly made it an ally. Figure 4. The Advance on

  17. The DISAM Journal of International Security Assistance Management. Volume 23, Number 1, Fall 2000

    DTIC Science & Technology

    2000-01-01

    Security Assistance Command Figure 1 The USASAC, including OPM-SANG, is staffed by 621 men and women, of whom 104 are military. These professionals are...by program managers. These program managers are like “front-line entrepreneurs” delivering products and services to their customers. They have been...NATO history was to be commanded by a Polish general in June 1988. The brigade of some 3000 men and women was composed of five national battalions

  18. Command and Control of Civilian Contract Manned Navy Fleet Support and Military Sealift Command Ships.

    DTIC Science & Technology

    1983-12-01

    58 APPENDIX B: COVER LETTERS RECEIVED FROM LABOR AND SHIPPING ORGANIZATIONS ................................ 70 LIST OF...result of a merger between the Coast Seamen’s Union and the Pacific Steamship Sailors’ Union. The SUP was under the leadership of Mr. Andrew Furuseth... leadership to emerge on the West Coast. As this emerging leadership tried to make new gains on the East Coast, it began to come into conflict with the old-line

  19. The Supreme Allied Commander Mediterranean to the Combined Chiefs of Staff on the Operations in Southern France

    DTIC Science & Technology

    1944-08-01

    his temporary frustration of our plans and executed on 22 January, 1944. At the Christmas was purchased at excessive cost, and left him at our...provision had been made for a ment, the Bataillon de Choc and a commando battalion. French Army Command to direct the two separate He also emphasized the...designed at the time of execution. Enemy sufficient to maintain the plausibility of a threat to that lines of communication received the heaviest

  20. Bias Correction of Satellite Precipitation Products (SPPs) using a User-friendly Tool: A Step in Enhancing Technical Capacity

    NASA Astrophysics Data System (ADS)

    Rushi, B. R.; Ellenburg, W. L.; Adams, E. C.; Flores, A.; Limaye, A. S.; Valdés-Pineda, R.; Roy, T.; Valdés, J. B.; Mithieu, F.; Omondi, S.

    2017-12-01

    SERVIR, a joint NASA-USAID initiative, works to build capacity in Earth observation technologies in developing countries for improved environmental decision making in the areas of weather and climate, water and disasters, food security, and land use/land cover. SERVIR partners with leading regional organizations in Eastern and Southern Africa, the Hindu Kush-Himalaya, the Mekong region, and West Africa to achieve its objectives. SERVIR develops hydrological applications to address specific needs articulated by key stakeholders, and daily rainfall estimates are a vital input for these applications. Satellite-derived rainfall is subject to systematic biases which need to be corrected before it can be used for any hydrologic application, such as real-time or seasonal forecasting. SERVIR and the SWAAT team at the University of Arizona have co-developed an open-source and user-friendly tool implementing rainfall bias correction approaches for SPPs. The bias correction tools were developed based on linear scaling and quantile mapping techniques. A set of SPPs, such as PERSIANN-CCS, TMPA-RT, and CMORPH, are bias corrected using Climate Hazards Group InfraRed Precipitation with Station (CHIRPS) data, which incorporates ground-based precipitation observations. The tool also contains a component to improve the monthly mean of CHIRPS using precipitation products from the Global Surface Summary of the Day (GSOD) database developed by the National Climatic Data Center (NCDC). The tool takes its input from the command line, which makes it user-friendly and applicable on any operating platform without prior programming skills. This presentation will focus on this bias-correction tool for SPPs, including application scenarios.
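    The two correction techniques named in the abstract are both standard. Linear scaling rescales the satellite series so its mean matches the reference; empirical quantile mapping replaces each satellite value with the reference value at the same empirical quantile. A minimal NumPy illustration of both (the SERVIR/SWAAT tool's actual algorithms, options, and data handling are not shown; the toy arrays are invented):

```python
import numpy as np

def linear_scale(satellite, reference):
    """Linear scaling: rescale so the satellite mean matches the reference mean."""
    return satellite * (reference.mean() / satellite.mean())

def quantile_map(satellite, reference):
    """Empirical quantile mapping: map each satellite value to the reference
    value at the same empirical quantile."""
    ranks = np.searchsorted(np.sort(satellite), satellite, side="right")
    quantiles = np.clip(ranks / len(satellite), 0.0, 1.0)
    return np.quantile(reference, quantiles)

sat = np.array([0.0, 2.0, 4.0, 6.0, 8.0])   # biased (too wet) daily estimates
ref = np.array([0.0, 1.0, 2.0, 3.0, 4.0])   # gauge-adjusted reference
print(linear_scale(sat, ref))
print(quantile_map(sat, ref))
```

Linear scaling corrects only the mean; quantile mapping additionally corrects the shape of the distribution, which matters for extremes in hydrologic forecasting.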

  1. Toward an Attention-Based Diagnostic Tool for Patients With Locked-in Syndrome.

    PubMed

    Lesenfants, Damien; Habbal, Dina; Chatelle, Camille; Soddu, Andrea; Laureys, Steven; Noirhomme, Quentin

    2018-03-01

    Electroencephalography (EEG) has been proposed as a supplemental tool for reducing clinical misdiagnosis in severely brain-injured populations, helping to distinguish conscious from unconscious patients. We studied the use of spectral entropy as a measure of focal attention in order to develop a motor-independent, portable, and objective diagnostic tool for patients with locked-in syndrome (LIS), addressing the issues of accuracy and training requirements. Data from 20 healthy volunteers, 6 LIS patients, and 10 patients with a vegetative state/unresponsive wakefulness syndrome (VS/UWS) were included. Spectral entropy was computed during a gaze-independent 2-class (attention vs rest) paradigm and compared with classification based on EEG rhythms (delta, theta, alpha, and beta). Spectral entropy classification during the attention-rest paradigm showed 93% and 91% accuracy in healthy volunteers and LIS patients, respectively. VS/UWS patients were at chance level. EEG rhythm classification reached a lower accuracy than spectral entropy. Resting-state EEG spectral entropy could not distinguish individual VS/UWS patients from LIS patients. The present study provides evidence that an EEG-based measure of attention could detect command-following in patients with severe motor disabilities. The entropy system detected a response to command in all healthy subjects and LIS patients, while none of the VS/UWS patients showed a response to command using this system.
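    Spectral entropy itself is a standard quantity: the Shannon entropy of the normalized power spectrum, usually scaled by the log of the number of frequency bins so it lies in [0, 1] (near 1 for broadband, noise-like activity; near 0 for a single dominant rhythm). A generic NumPy sketch of the measure only; the study's EEG preprocessing, electrode selection, and classifier are not reproduced here:

```python
import numpy as np

def spectral_entropy(signal):
    """Normalized spectral entropy: Shannon entropy of the power spectral
    density, divided by log(number of bins) so the result lies in [0, 1]."""
    psd = np.abs(np.fft.rfft(signal)) ** 2
    p = psd / psd.sum()
    p = p[p > 0]                   # 0 * log(0) is taken as 0
    return float(-np.sum(p * np.log(p)) / np.log(len(psd)))

rng = np.random.default_rng(0)
noise = rng.standard_normal(1024)                         # broadband: entropy near 1
tone = np.sin(2 * np.pi * 50 * np.arange(1024) / 1024)    # narrowband: entropy near 0
print(round(spectral_entropy(noise), 2), round(spectral_entropy(tone), 2))
```

Attention tends to reorganize the spectrum (e.g. suppressing a dominant resting rhythm), which is why a scalar like this can separate attention from rest epochs.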

  2. Integrated modeling: a look back

    NASA Astrophysics Data System (ADS)

    Briggs, Clark

    2015-09-01

    This paper discusses applications and implementation approaches used for integrated modeling of structural systems with optics over the past 30 years. While much of the development work focused on control system design, significant contributions were made in system modeling and computer-aided design (CAD) environments. Early work appended handmade line-of-sight models to traditional finite element models, such as the optical spacecraft concept from the ACOSS program. The IDEAS2 computational environment built in support of Space Station collected a wider variety of existing tools around a parametric database. Later, IMOS supported interferometer and large telescope mission studies at JPL with MATLAB modeling of structural dynamics, thermal analysis, and geometric optics. IMOS's predecessor was a simple FORTRAN command line interpreter for LQG controller design with additional functions that built state-space finite element models. Specialized language systems such as CAESY were formulated and prototyped to provide more complex object-oriented functions suited to control-structure interaction. A more recent example of optical modeling directly in mechanical CAD is used to illustrate possible future directions. While the value of directly posing the optical metric in system dynamics terms is well understood today, the potential payoff is illustrated briefly via project-based examples. It is quite likely that integrated structure thermal optical performance (STOP) modeling could be accomplished in a commercial off-the-shelf (COTS) tool set. The work flow could be adopted, for example, by a team developing a small high-performance optical or radio frequency (RF) instrument.

  3. Object-Oriented Multi-Disciplinary Design, Analysis, and Optimization Tool

    NASA Technical Reports Server (NTRS)

    Pak, Chan-gi

    2011-01-01

    An Object-Oriented Optimization (O3) tool was developed that leverages existing tools and practices, and allows the easy integration and adoption of new state-of-the-art software. At the heart of the O3 tool is the Central Executive Module (CEM), which can integrate disparate software packages in a cross platform network environment so as to quickly perform optimization and design tasks in a cohesive, streamlined manner. This object-oriented framework can integrate the analysis codes for multiple disciplines instead of relying on one code to perform the analysis for all disciplines. The CEM was written in FORTRAN and the script commands for each performance index were submitted through the use of the FORTRAN Call System command. In this CEM, the user chooses an optimization methodology, defines objective and constraint functions from performance indices, and provides starting and side constraints for continuous as well as discrete design variables. The structural analysis modules such as computations of the structural weight, stress, deflection, buckling, and flutter and divergence speeds have been developed and incorporated into the O3 tool to build an object-oriented Multidisciplinary Design, Analysis, and Optimization (MDAO) tool.

  4. Tech-Knowledgy & Diverse Learners

    ERIC Educational Resources Information Center

    Suh, Jennifer M.

    2010-01-01

    "Cognitive" technology tools have been described as "technologies that help transcend the limitation of the mind... in thinking, learning and problem solving activities" (Pea 1985, p. 168). These tools also respond to a user's commands and make mathematical actions more overtly apparent (Zbiek et al. 2007). By definition, these cognitive…

  5. Model specification and bootstrapping for multiply imputed data: An application to count models for the frequency of alcohol use

    PubMed Central

    Comulada, W. Scott

    2015-01-01

    Stata’s mi commands provide powerful tools to conduct multiple imputation in the presence of ignorable missing data. In this article, I present Stata code to extend the capabilities of the mi commands to address two areas of statistical inference where results are not easily aggregated across imputed datasets. First, mi commands are restricted to covariate selection. I show how to address model fit to correctly specify a model. Second, the mi commands readily aggregate model-based standard errors. I show how standard errors can be bootstrapped for situations where model assumptions may not be met. I illustrate model specification and bootstrapping on frequency counts for the number of times that alcohol was consumed in data with missing observations from a behavioral intervention. PMID:26973439
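    The aggregation step that mi automates for model-based inference is Rubin's rules: average the point estimates across the m imputed datasets, then combine the within- and between-imputation variance. A compact Python sketch of just that combining step (the Stata workflow and the article's bootstrapping procedure are not reproduced; the numbers are invented):

```python
import math

def rubins_rules(estimates, variances):
    """Pool point estimates and variances from m imputed datasets using
    Rubin's rules; returns the pooled estimate and its standard error."""
    m = len(estimates)
    q_bar = sum(estimates) / m                               # pooled point estimate
    u_bar = sum(variances) / m                               # within-imputation variance
    b = sum((q - q_bar) ** 2 for q in estimates) / (m - 1)   # between-imputation variance
    total = u_bar + (1 + 1 / m) * b
    return q_bar, math.sqrt(total)

# Three imputed datasets yielding slightly different coefficients:
est, se = rubins_rules([2.1, 1.9, 2.0], [0.04, 0.05, 0.04])
print(round(est, 2), round(se, 3))
```

The `(1 + 1/m)` factor inflates the between-imputation variance to account for using a finite number of imputations, which is why the pooled standard error exceeds the naive average.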

  6. NETS - A NEURAL NETWORK DEVELOPMENT TOOL, VERSION 3.0 (MACINTOSH VERSION)

    NASA Technical Reports Server (NTRS)

    Phillips, T. A.

    1994-01-01

    NETS, A Tool for the Development and Evaluation of Neural Networks, provides a simulation of Neural Network algorithms plus an environment for developing such algorithms. Neural Networks are a class of systems modeled after the human brain. Artificial Neural Networks are formed from hundreds or thousands of simulated neurons, connected to each other in a manner similar to brain neurons. Problems which involve pattern matching readily fit the class of problems which NETS is designed to solve. NETS uses the back propagation learning method for all of the networks which it creates. The nodes of a network are usually grouped together into clumps called layers. Generally, a network will have an input layer through which the various environment stimuli are presented to the network, and an output layer for determining the network's response. The number of nodes in these two layers is usually tied to some features of the problem being solved. Other layers, which form intermediate stops between the input and output layers, are called hidden layers. NETS allows the user to customize the patterns of connections between layers of a network. NETS also provides features for saving the weight values of a network during the learning process, which allows for more precise control over the learning process. NETS is an interpreter. Its method of execution is the familiar "read-evaluate-print" loop found in interpreted languages such as BASIC and LISP. The user is presented with a prompt which is the simulator's way of asking for input. After a command is issued, NETS will attempt to evaluate the command, which may produce more prompts requesting specific information or an error if the command is not understood. The typical process involved when using NETS consists of translating the problem into a format which uses input/output pairs, designing a network configuration for the problem, and finally training the network with input/output pairs until an acceptable error is reached. 
NETS allows the user to generate C code to implement the network loaded into the system. This permits the placement of networks as components, or subroutines, in other systems. In short, once a network performs satisfactorily, the Generate C Code option provides the means for creating a program separate from NETS to run the network. Other features: files may be stored in binary or ASCII format; multiple input propagation is permitted; bias values may be included; capability to scale data without writing scaling code; quick interactive testing of network from the main menu; and several options that allow the user to manipulate learning efficiency. NETS is written in ANSI standard C language to be machine independent. The Macintosh version (MSC-22108) includes code for both a graphical user interface version and a command line interface version. The machine independent version (MSC-21588) only includes code for the command line interface version of NETS 3.0. The Macintosh version requires a Macintosh II series computer and has been successfully implemented under System 7. Four executables are included on these diskettes, two for floating point operations and two for integer arithmetic. It requires Think C 5.0 to compile. A minimum of 1Mb of RAM is required for execution. Sample input files and executables for both the command line version and the Macintosh user interface version are provided on the distribution medium. The Macintosh version is available on a set of three 3.5 inch 800K Macintosh format diskettes. The machine independent version has been successfully implemented on an IBM PC series compatible running MS-DOS, a DEC VAX running VMS, a SunIPC running SunOS, and a CRAY Y-MP running UNICOS. Two executables for the IBM PC version are included on the MS-DOS distribution media, one compiled for floating point operations and one for integer arithmetic. 
The machine independent version is available on a set of three 5.25 inch 360K MS-DOS format diskettes (standard distribution medium) or a .25 inch streaming magnetic tape cartridge in UNIX tar format. NETS was developed in 1989 and updated in 1992. IBM PC is a registered trademark of International Business Machines. MS-DOS is a registered trademark of Microsoft Corporation. DEC, VAX, and VMS are trademarks of Digital Equipment Corporation. SunIPC and SunOS are trademarks of Sun Microsystems, Inc. CRAY Y-MP and UNICOS are trademarks of Cray Research, Inc.
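    The back propagation procedure the abstract describes (a forward pass through input, hidden, and output layers; the error propagated backward; weights adjusted until an acceptable error is reached on the input/output pairs) can be sketched for a tiny network. This is a generic NumPy illustration, not NETS's C implementation; the XOR task, layer size, iteration count, and learning rate are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Input/output pairs for XOR, a classic test problem for back propagation.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer between the input and output layers.
W1 = rng.standard_normal((2, 8)); b1 = np.zeros(8)
W2 = rng.standard_normal((8, 1)); b2 = np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for _ in range(20000):                      # train until the error is acceptable
    h = sigmoid(X @ W1 + b1)                # forward pass: hidden layer
    out = sigmoid(h @ W2 + b2)              # forward pass: output layer
    d_out = (out - Y) * out * (1 - out)     # backward pass: output-layer delta
    d_h = (d_out @ W2.T) * h * (1 - h)      # backward pass: hidden-layer delta
    W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(axis=0)

pred = np.round(sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)).ravel()
print(pred)
```

The "translate the problem into input/output pairs, pick a configuration, train until the error is acceptable" workflow NETS describes is exactly the loop above, just wrapped in an interactive read-evaluate-print interface.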

  7. NETS - A NEURAL NETWORK DEVELOPMENT TOOL, VERSION 3.0 (MACHINE INDEPENDENT VERSION)

    NASA Technical Reports Server (NTRS)

    Baffes, P. T.

    1994-01-01

    NETS, A Tool for the Development and Evaluation of Neural Networks, provides a simulation of Neural Network algorithms plus an environment for developing such algorithms. Neural Networks are a class of systems modeled after the human brain. Artificial Neural Networks are formed from hundreds or thousands of simulated neurons, connected to each other in a manner similar to brain neurons. Problems which involve pattern matching readily fit the class of problems which NETS is designed to solve. NETS uses the back propagation learning method for all of the networks which it creates. The nodes of a network are usually grouped together into clumps called layers. Generally, a network will have an input layer through which the various environment stimuli are presented to the network, and an output layer for determining the network's response. The number of nodes in these two layers is usually tied to some features of the problem being solved. Other layers, which form intermediate stops between the input and output layers, are called hidden layers. NETS allows the user to customize the patterns of connections between layers of a network. NETS also provides features for saving the weight values of a network during the learning process, which allows for more precise control over the learning process. NETS is an interpreter. Its method of execution is the familiar "read-evaluate-print" loop found in interpreted languages such as BASIC and LISP. The user is presented with a prompt which is the simulator's way of asking for input. After a command is issued, NETS will attempt to evaluate the command, which may produce more prompts requesting specific information or an error if the command is not understood. The typical process involved when using NETS consists of translating the problem into a format which uses input/output pairs, designing a network configuration for the problem, and finally training the network with input/output pairs until an acceptable error is reached. 
NETS allows the user to generate C code to implement the network loaded into the system. This permits the placement of networks as components, or subroutines, in other systems. In short, once a network performs satisfactorily, the Generate C Code option provides the means for creating a program separate from NETS to run the network. Other features: files may be stored in binary or ASCII format; multiple input propagation is permitted; bias values may be included; capability to scale data without writing scaling code; quick interactive testing of network from the main menu; and several options that allow the user to manipulate learning efficiency. NETS is written in ANSI standard C language to be machine independent. The Macintosh version (MSC-22108) includes code for both a graphical user interface version and a command line interface version. The machine independent version (MSC-21588) only includes code for the command line interface version of NETS 3.0. The Macintosh version requires a Macintosh II series computer and has been successfully implemented under System 7. Four executables are included on these diskettes, two for floating point operations and two for integer arithmetic. It requires Think C 5.0 to compile. A minimum of 1Mb of RAM is required for execution. Sample input files and executables for both the command line version and the Macintosh user interface version are provided on the distribution medium. The Macintosh version is available on a set of three 3.5 inch 800K Macintosh format diskettes. The machine independent version has been successfully implemented on an IBM PC series compatible running MS-DOS, a DEC VAX running VMS, a SunIPC running SunOS, and a CRAY Y-MP running UNICOS. Two executables for the IBM PC version are included on the MS-DOS distribution media, one compiled for floating point operations and one for integer arithmetic. 
The machine independent version is available on a set of three 5.25 inch 360K MS-DOS format diskettes (standard distribution medium) or a .25 inch streaming magnetic tape cartridge in UNIX tar format. NETS was developed in 1989 and updated in 1992. IBM PC is a registered trademark of International Business Machines. MS-DOS is a registered trademark of Microsoft Corporation. DEC, VAX, and VMS are trademarks of Digital Equipment Corporation. SunIPC and SunOS are trademarks of Sun Microsystems, Inc. CRAY Y-MP and UNICOS are trademarks of Cray Research, Inc.
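The NETS record above describes layered networks (input, hidden, and output layers) trained on input/output pairs by back propagation until an acceptable error is reached. That loop can be sketched in plain Python; the network shape, learning rate, iteration count, and XOR training set below are illustrative choices for the sketch, not values taken from NETS itself.

```python
import math
import random

random.seed(1)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Illustrative network shape: 2 inputs, 4 hidden nodes, 1 output.
n_in, n_hid, n_out = 2, 4, 1
w1 = [[random.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_hid)]
w2 = [[random.uniform(-1, 1) for _ in range(n_hid)] for _ in range(n_out)]

# Training data as input/output pairs (here: XOR, a classic small example).
pairs = [([0, 0], [0]), ([0, 1], [1]), ([1, 0], [1]), ([1, 1], [0])]

def forward(x):
    h = [sigmoid(sum(w * xi for w, xi in zip(row, x))) for row in w1]
    o = [sigmoid(sum(w * hi for w, hi in zip(row, h))) for row in w2]
    return h, o

def total_error():
    return sum((t[0] - forward(x)[1][0]) ** 2 for x, t in pairs)

before = total_error()
lr = 0.5  # learning rate (illustrative)
for _ in range(5000):
    for x, t in pairs:
        h, o = forward(x)
        # Output-layer deltas, then back-propagate them to the hidden layer.
        d_out = [(t[k] - o[k]) * o[k] * (1 - o[k]) for k in range(n_out)]
        d_hid = [h[j] * (1 - h[j]) * sum(d_out[k] * w2[k][j] for k in range(n_out))
                 for j in range(n_hid)]
        for k in range(n_out):
            for j in range(n_hid):
                w2[k][j] += lr * d_out[k] * h[j]
        for j in range(n_hid):
            for i in range(n_in):
                w1[j][i] += lr * d_hid[j] * x[i]

after = total_error()  # squared error over the pairs drops as training proceeds
```

NETS additionally lets the user save weight values during learning and generate standalone C code for a trained network; this sketch shows only the core train-until-acceptable-error cycle.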

  8. DYNAMIC PROJECT COLLABORATION TOOLS FOR UNEXPLODED ORDNANCE (UXO) REMOVAL Case Study: Jefferson Proving Ground UXO Removal Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Daffron, James Y.

    2003-02-27

    Unexploded Ordnance (UXO) removal and investigation projects typically involve multiple organizations, including Government entities, private contractors, and technical experts. Resources are split into functional "teams" who perform the work and interface with the clients. The projects typically generate large amounts of data that must be shared among the project team members, the clients, and the public. The ability to efficiently communicate and control information is essential to project success. Web-based project collaboration is an effective management and communication tool when applied to ordnance and explosives (OE) projects. During a recent UXO/OE removal project at the Jefferson Proving Ground (JPG) in Madison, IN, American Technologies, Inc. (ATI) successfully used the Project Commander® (www.ProCommander.com) project collaboration website as a dynamic project and information management tool.

  9. Generic command interpreter for robot controllers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Werner, J.

    1991-04-09

    Generic command interpreter programs have been written for robot controllers at Sandia National Laboratories (SNL). Each interpreter program resides on a robot controller and interfaces the controller with a supervisory program on another (host) computer. We call these interpreter programs monitors because they wait, monitoring a communication line, for commands from the supervisory program. These monitors are designed to interface with the object-oriented software structure of the supervisory programs. The functions of the monitor programs are written in each robot controller's native language but reflect the object-oriented functions of the supervisory programs. These functions and other specifics of the monitor programs written for three different robots at SNL will be discussed. 4 refs., 4 figs.

  10. DOD Inventory of Contracted Services: Actions Needed to Help Ensure Inventory Data Are Complete and Accurate

    DTIC Science & Technology

    2015-11-01

    for Personnel and Readiness NAVSEA Naval Sea Systems Command OFPP Office of Federal Procurement Policy OMB Office of Management and Budget PDC ... Documentation of Contractors (PDC) process is delegated to the manpower and programming functions at the commands. The PDC process collects information from ... review results. Army's PDC tool, used to inform the inventory review, tracks by location and functional requirement, such as administrative or

  11. ascii2gdocs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nightingale, Trever

    2011-11-30

    Enables UNIX and Mac OS X command line users to put local ASCII files (individually or in batch mode) into Google Documents, where the ASCII is converted to Google Document format with formatting the user can specify.

  12. SplicePlot: a utility for visualizing splicing quantitative trait loci.

    PubMed

    Wu, Eric; Nance, Tracy; Montgomery, Stephen B

    2014-04-01

    RNA sequencing has provided unprecedented resolution of alternative splicing and splicing quantitative trait loci (sQTL). However, there are few tools available for visualizing the genotype-dependent effects of splicing at a population level. SplicePlot is a simple command line utility that produces intuitive visualizations of sQTLs and their effects. SplicePlot takes mapped RNA sequencing reads in BAM format and genotype data in VCF format as input and outputs publication-quality Sashimi plots, hive plots and structure plots, enabling better investigation and understanding of the role of genetics in alternative splicing and transcript structure. Source code and detailed documentation are available at http://montgomerylab.stanford.edu/spliceplot/index.html under Resources and on GitHub. SplicePlot is implemented in Python and is supported on Linux and Mac OS. A VirtualBox virtual machine running Ubuntu with SplicePlot already installed is also available.

  13. Dugong: a Docker image, based on Ubuntu Linux, focused on reproducibility and replicability for bioinformatics analyses.

    PubMed

    Menegidio, Fabiano B; Jabes, Daniela L; Costa de Oliveira, Regina; Nunes, Luiz R

    2018-02-01

    This manuscript introduces and describes Dugong, a Docker image based on Ubuntu 16.04, which automates installation of more than 3500 bioinformatics tools (along with their respective libraries and dependencies), in alternative computational environments. The software operates through a user-friendly XFCE4 graphic interface that allows software management and installation by users not fully familiarized with the Linux command line and provides the Jupyter Notebook to assist in the delivery and exchange of consistent and reproducible protocols and results across laboratories, assisting in the development of open science projects. Source code and instructions for local installation are available at https://github.com/DugongBioinformatics, under the MIT open source license. Luiz.nunes@ufabc.edu.br. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com

  14. Joseph Lovell, MD (1788-1836): First US army surgeon general.

    PubMed

    Craig, Stephen C

    2016-08-01

    Joseph Lovell, trained in medicine at Harvard and in military medicine/surgery by the War of 1812, became the first Surgeon General to sit on the reorganised army staff, at the tender age of 29, in 1818. With a keen intellect, medical acumen, and wartime experiences for his tools, and a close supporting relationship with Commanding General Jacob Jennings Brown and Secretary of War John C Calhoun (1782-1850), Lovell constructed an efficient and effective organisational and administrative framework for the new Medical Department of the US Army. Moreover, he not only redefined the role of the American military physician but also established the professional dignity, respectability and value of the medical officer among line officers and staff. Lovell's 18-year tenure came to an abrupt end, but the operational framework he created became both foundation and legacy for his successors. © Author(s) 2016.

  15. A statistical method for assessing peptide identification confidence in accurate mass and time tag proteomics

    PubMed Central

    Stanley, Jeffrey R.; Adkins, Joshua N.; Slysz, Gordon W.; Monroe, Matthew E.; Purvine, Samuel O.; Karpievitch, Yuliya V.; Anderson, Gordon A.; Smith, Richard D.; Dabney, Alan R.

    2011-01-01

    Current algorithms for quantifying peptide identification confidence in the accurate mass and time (AMT) tag approach assume that the AMT tags themselves have been correctly identified. However, there is uncertainty in the identification of AMT tags, as this is based on matching LC-MS/MS fragmentation spectra to peptide sequences. In this paper, we incorporate confidence measures for the AMT tag identifications into the calculation of probabilities for correct matches to an AMT tag database, resulting in a more accurate overall measure of identification confidence for the AMT tag approach. The method is referred to as Statistical Tools for AMT tag Confidence (STAC). STAC additionally provides a Uniqueness Probability (UP) to help distinguish between multiple matches to an AMT tag and a method to calculate an overall false discovery rate (FDR). STAC is freely available for download as both a command line and a Windows graphical application. PMID:21692516
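The STAC abstract above does not reproduce the tool's statistical model, but the bookkeeping idea it names, turning per-match confidence probabilities into an overall false discovery rate, can be illustrated with a minimal sketch. The probability values and cutoffs below are invented for the example and are not STAC's own.

```python
# Hypothetical confidence probabilities that each AMT tag database match is
# correct; STAC's actual model (and its Uniqueness Probability) is not
# reproduced here, only the idea of summarizing matches into an overall FDR.
probs = [0.99, 0.97, 0.95, 0.90, 0.80, 0.60, 0.40]

def fdr_at_cutoff(probs, cutoff):
    """Expected fraction of incorrect matches among the matches kept at a
    given confidence cutoff: the mean of (1 - p) over the accepted set."""
    accepted = [p for p in probs if p >= cutoff]
    if not accepted:
        return 0.0
    return sum(1.0 - p for p in accepted) / len(accepted)

strict = fdr_at_cutoff(probs, 0.90)  # keep only high-confidence matches
loose = fdr_at_cutoff(probs, 0.50)   # keep almost everything
```

Tightening the cutoff shrinks the accepted set and lowers the estimated FDR, which is the trade-off a user of such a tool navigates when choosing a confidence threshold.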

  16. MOST: a software environment for constraint-based metabolic modeling and strain design.

    PubMed

    Kelley, James J; Lane, Anatoliy; Li, Xiaowei; Mutthoju, Brahmaji; Maor, Shay; Egen, Dennis; Lun, Desmond S

    2015-02-15

    MOST (Metabolic Optimization and Simulation Tool) is a software package that implements GDBB (genetic design through branch and bound) in an intuitive, user-friendly interface with Excel-like editing functionality, as well as implementing FBA (flux balance analysis) and supporting Systems Biology Markup Language and comma-separated values files. GDBB is currently the fastest algorithm for finding gene knockouts predicted by FBA to increase production of desired products, but until the release of MOST, GDBB had only been available through a command line interface, which is difficult to use for those without programming knowledge. MOST is distributed for free under the GNU General Public License. The software and full documentation are available at http://most.ccib.rutgers.edu/. dslun@rutgers.edu. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  17. Research Techniques Made Simple: Bioinformatics for Genome-Scale Biology.

    PubMed

    Foulkes, Amy C; Watson, David S; Griffiths, Christopher E M; Warren, Richard B; Huber, Wolfgang; Barnes, Michael R

    2017-09-01

    High-throughput biology presents unique opportunities and challenges for dermatological research. Drawing on a small handful of exemplary studies, we review some of the major lessons of these new technologies. We caution against several common errors and introduce helpful statistical concepts that may be unfamiliar to researchers without experience in bioinformatics. We recommend specific software tools that can aid dermatologists at varying levels of computational literacy, including platforms with command line and graphical user interfaces. The future of dermatology lies in integrative research, in which clinicians, laboratory scientists, and data analysts come together to plan, execute, and publish their work in open forums that promote critical discussion and reproducibility. In this article, we offer guidelines that we hope will steer researchers toward best practices for this new and dynamic era of data intensive dermatology. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  18. Registered File Support for Critical Operations Files at (Space Infrared Telescope Facility) SIRTF

    NASA Technical Reports Server (NTRS)

    Turek, G.; Handley, Tom; Jacobson, J.; Rector, J.

    2001-01-01

    The SIRTF Science Center's (SSC) Science Operations System (SOS) has to contend with nearly one hundred critical operations files via comprehensive file management services. The management is accomplished via the registered file system (otherwise known as TFS) which manages these files in a registered file repository composed of a virtual file system accessible via a TFS server and a file registration database. The TFS server provides controlled, reliable, and secure file transfer and storage by registering all file transactions and meta-data in the file registration database. An API is provided for application programs to communicate with TFS servers and the repository. A command line client implementing this API has been developed as a client tool. This paper describes the architecture, current implementation, but more importantly, the evolution of these services based on evolving community use cases and emerging information system technology.

  19. The semantic measures library and toolkit: fast computation of semantic similarity and relatedness using biomedical ontologies.

    PubMed

    Harispe, Sébastien; Ranwez, Sylvie; Janaqi, Stefan; Montmain, Jacky

    2014-03-01

    The Semantic Measures Library and Toolkit are robust, open-source and easy-to-use software solutions dedicated to semantic measures. They can be used for large-scale computations and analyses of semantic similarities between terms/concepts defined in terminologies and ontologies. The comparison of entities (e.g. genes) annotated by concepts is also supported. A large collection of measures is available. Not limited to a specific application context, the library and the toolkit can be used with various controlled vocabularies and ontology specifications (e.g. Open Biomedical Ontologies, Resource Description Framework). The project targets both designers and practitioners of semantic measures, providing a Java library as well as a command-line tool that can be used on personal computers or computer clusters. Downloads, documentation, tutorials, evaluation and support are available at http://www.semantic-measures-library.org.

  20. Automated Sanger Analysis Pipeline (ASAP): A Tool for Rapidly Analyzing Sanger Sequencing Data with Minimum User Interference.

    PubMed

    Singh, Aditya; Bhatia, Prateek

    2016-12-01

    Sanger sequencing platforms, such as Applied Biosystems instruments, generate chromatogram files. Generally, for one region of a sequence, we use both forward and reverse primers to sequence that area; in that way, we have two sequences that need to be aligned and a consensus generated before mutation detection studies. This work is cumbersome and takes time, especially if the gene is large with many exons. Hence, we devised a rapid automated command system to filter, build, and align consensus sequences and also optionally extract exonic regions, translate them in all frames, and perform an amino acid alignment, starting from raw sequence data, within a very short time. At full capability, the Automated Sanger Analysis Pipeline (ASAP) is able to read "*.ab1" chromatogram files through a command line interface, convert them to the FASTQ format, trim the low-quality regions, reverse-complement the reverse sequence, create a consensus sequence, extract the exonic regions using a reference exonic sequence, translate the sequence in all frames, and align the nucleic acid and amino acid sequences to reference nucleic acid and amino acid sequences, respectively. All files are created and can be used for further analysis. ASAP is available as a Python 3.x executable at https://github.com/aditya-88/ASAP. The version described in this paper is 0.28.
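Two of the pipeline steps the ASAP abstract names, reverse-complementing the reverse read and building a consensus, can be sketched in Python. The naive consensus rule (disagreeing positions become N) and the example sequences below are illustrative simplifications, not ASAP's actual implementation, which works on quality-trimmed .ab1/FASTQ data.

```python
# Base-complement table for A/C/G/T in either case.
COMP = str.maketrans("ACGTacgt", "TGCAtgca")

def reverse_complement(seq):
    """Complement each base, then reverse the string."""
    return seq.translate(COMP)[::-1]

def consensus(fwd, rev):
    """Naive consensus of two aligned, equal-length reads: the reverse read
    is reverse-complemented first; agreeing bases are kept, disagreements
    become the IUPAC unknown base N."""
    rc = reverse_complement(rev)
    return "".join(a if a == b else "N" for a, b in zip(fwd, rc))

fwd = "ACGTTGCA"
rev = reverse_complement("ACGTTGCA")  # a perfect reverse read of the same region
print(consensus(fwd, rev))  # -> ACGTTGCA (all bases agree)
```

A real pipeline would align the two reads before calling a consensus and would weigh per-base quality scores when the reads disagree; this sketch shows only the string-level core of the two steps.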

  1. A free interactive matching program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    J.-F. Ostiguy

    1999-04-16

    For physicists and engineers involved in the design and analysis of beamlines (transfer lines or insertions), the lattice function matching problem is central and can be time-consuming because it involves constrained nonlinear optimization. For such problems, convergence can be difficult to obtain without expert human intervention. Over the years, powerful codes have been developed to assist beamline designers. The canonical example is MAD (Methodical Accelerator Design), developed at CERN by Christophe Iselin. MAD, through a specialized command language, allows one to solve a wide variety of problems, including matching problems. Although in principle the MAD command interpreter can be run interactively, in practice the solution of a matching problem involves a sequence of independent trial runs. Unfortunately, but perhaps not surprisingly, there still exist relatively few tools exploiting the resources offered by modern environments to assist lattice designers with this routine and repetitive task. In this paper, we describe a fully interactive lattice matching program, written in C++ and assembled using freely available software components. An important feature of the code is that the evolution of the lattice functions during the nonlinear iterative process can be graphically monitored in real time; the user can dynamically interrupt the iterations at will to introduce new variables, freeze existing ones at their current values, and/or modify constraints. The program runs under both UNIX and Windows NT.

  2. Decision generation tools and Bayesian inference

    NASA Astrophysics Data System (ADS)

    Jannson, Tomasz; Wang, Wenjian; Forrester, Thomas; Kostrzewski, Andrew; Veeris, Christian; Nielsen, Thomas

    2014-05-01

    Digital Decision Generation (DDG) tools are important software sub-systems of Command and Control (C2) systems and technologies. In this paper, we present a special type of DDGs based on Bayesian Inference, related to adverse (hostile) networks, including such important applications as terrorism-related networks and organized crime ones.

  3. STS-133 crew members Lindsey, Boe and Drew during Tool/Repair Kits training with instructor

    NASA Image and Video Library

    2010-01-26

    JSC2010-E-014266 (26 Jan. 2010) --- NASA astronauts Steve Lindsey (right), STS-133 commander; and Eric Boe, pilot, participate in an ISS tools and repair kits training session in the Space Vehicle Mock-up Facility at NASA's Johnson Space Center.

  4. 78 FR 76822 - 36(b)(1) Arms Sales Notification

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-19

    ... missiles, containers, spare and repair parts, support equipment, tools and test equipment, publications and... missiles, 7 Fly-to-Buy TOW2A missiles, containers, spare and repair parts, support equipment, tools and... thermal) for the launcher to track and guide the missile in flight. Guidance commands from the launcher...

  5. Development and Command-Control Tools for Many-Robot Systems

    DTIC Science & Technology

    2005-01-01

    been components such as pressure sensors and accelerometers for the automobile market. In fact, robots of any size have yet to appear in our daily ... mode, so that the target hardware is neither reprogrammable nor rechargeable. The goal of this paper is to propose some generic tools that the

  6. Applications of Multiconductor Transmission Line Theory to the Prediction of Cable Coupling. Volume I. Multiconductor Transmission Line Theory.

    DTIC Science & Technology

    1976-04-01

    Compatibility AFAL-TR-65-142, Wright Patterson AFB, Ohio, The Boeing Company, May 1965. [34] J. Bogdanor, M. Siegel and G. Weinstock, Intra...Reliability Evaluation Laboratory, U.S. Army Missile Command, Contract DAA HOI-69-C-1381, GTE Sylvania Incorporated, June 1972. [37] J. L. Bogdanor, R. A

  7. 46 CFR 44.01-5 - Administration; special service.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... unless another society has been specifically approved by the Commandant as a load line assigning authority. In the latter case application shall be made to the society so approved. Applications shall state...

  8. 46 CFR 44.01-5 - Administration; special service.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... unless another society has been specifically approved by the Commandant as a load line assigning authority. In the latter case application shall be made to the society so approved. Applications shall state...

  9. Line drawing of Apollo 14 Command/Service Modules

    NASA Image and Video Library

    1971-01-12

    S71-16823 (January 1971) --- A line drawing illustrating a cutaway view of the Apollo 14 Command and Service Modules, showing the engineering changes in the CSM which were recommended by the Apollo 13 Review Board. (The Apollo 13 abort was caused by a short circuit and wiring overheating in one of the SM cryogenic oxygen tanks.) The major changes to the Apollo 14 CSM include adding a third cryogenic oxygen tank installed in a heretofore empty bay (in sector one) of the SM, addition of an auxiliary battery in the SM as a backup in case of fuel cell failure, and removal of destratification fans in the cryogenic oxygen tanks and removal of thermostat switches from the oxygen tank heater circuits. Provision for stowage of an emergency five-gallon supply of drinking water has been added to the CM.

  10. GLINT: a user-friendly toolset for the analysis of high-throughput DNA-methylation array data.

    PubMed

    Rahmani, Elior; Yedidim, Reut; Shenhav, Liat; Schweiger, Regev; Weissbrod, Omer; Zaitlen, Noah; Halperin, Eran

    2017-06-15

    GLINT is a user-friendly command-line toolset for fast analysis of genome-wide DNA methylation data generated using the Illumina human methylation arrays. GLINT, which does not require any programming proficiency, allows easy execution of an epigenome-wide association study analysis pipeline under different models while accounting for known confounders in methylation data. GLINT is a command-line software package, freely available at https://github.com/cozygene/glint/releases . It requires Python 2.7 and several freely available Python packages. Further information and documentation, as well as a quick-start tutorial, are available at http://glint-epigenetics.readthedocs.io . elior.rahmani@gmail.com or ehalperin@cs.ucla.edu. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com

  11. Off-line programming motion and process commands for robotic welding of Space Shuttle main engines

    NASA Technical Reports Server (NTRS)

    Ruokangas, C. C.; Guthmiller, W. A.; Pierson, B. L.; Sliwinski, K. E.; Lee, J. M. F.

    1987-01-01

    The off-line-programming software and hardware being developed for robotic welding of the Space Shuttle main engine are described and illustrated with diagrams, drawings, graphs, and photographs. The menu-driven workstation-based interactive programming system is designed to permit generation of both motion and process commands for the robotic workcell by weld engineers (with only limited knowledge of programming or CAD systems) on the production floor. Consideration is given to the user interface, geometric-sources interfaces, overall menu structure, weld-parameter data base, and displays of run time and archived data. Ongoing efforts to address limitations related to automatic-downhand-configuration coordinated motion, a lack of source codes for the motion-control software, CAD data incompatibility, interfacing with the robotic workcell, and definition of the welding data base are discussed.

  12. Cliffbot Maestro

    NASA Technical Reports Server (NTRS)

    Norris, Jeffrey S.; Powell, Mark W.; Fox, Jason M.; Crockett, Thomas M.; Joswig, Joseph C.

    2009-01-01

    Cliffbot Maestro permits teleoperation of remote rovers for field testing in extreme environments. The application user interface provides two sets of tools for operations: stereo image browsing and command generation.

  13. FastScript3D - A Companion to Java 3D

    NASA Technical Reports Server (NTRS)

    Koenig, Patti

    2005-01-01

    FastScript3D is a computer program, written in the Java 3D(TM) programming language, that establishes an alternative language that helps users who lack expertise in Java 3D to use Java 3D for constructing three-dimensional (3D)-appearing graphics. The FastScript3D language provides a set of simple, intuitive, one-line text-string commands for creating, controlling, and animating 3D models. The first word in a string is the name of a command; the rest of the string contains the data arguments for the command. The commands can also be used as an aid to learning Java 3D. Developers can extend the language by adding custom text-string commands. The commands can define new 3D objects or load representations of 3D objects from files in formats compatible with such other software systems as X3D. The text strings can be easily integrated into other languages. FastScript3D facilitates communication between scripting languages [which enable programming of hyper-text markup language (HTML) documents to interact with users] and Java 3D. The FastScript3D language can be extended and customized on both the scripting side and the Java 3D side.
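The command format FastScript3D describes, where the first word of a text string names the command and the rest of the string carries its data arguments, with developers able to extend the language by adding custom commands, can be sketched with a small dispatch table. The `sphere` command and its handler below are invented for illustration; they are not part of FastScript3D's actual vocabulary.

```python
def parse(line):
    """First word is the command name; the rest of the string is its data."""
    name, _, rest = line.strip().partition(" ")
    return name, rest.split()

commands = {}

def command(name):
    """Register a handler under a command name, mirroring how such a
    text-string language can be extended with custom commands."""
    def register(fn):
        commands[name] = fn
        return fn
    return register

@command("sphere")  # hypothetical command, for the sketch only
def make_sphere(args):
    label, radius = args[0], float(args[1])
    return f"created sphere {label} r={radius}"

def run(line):
    """Dispatch one command string to its registered handler."""
    name, args = parse(line)
    if name not in commands:
        return f"unknown command: {name}"
    return commands[name](args)

print(run("sphere ball 2.5"))  # -> created sphere ball r=2.5
```

Because each command is a single text string, strings like these are easy to generate from other languages, which is the interoperability property the abstract emphasizes for scripting HTML documents against Java 3D.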

  14. Software for Automated Testing of Mission-Control Displays

    NASA Technical Reports Server (NTRS)

    OHagan, Brian

    2004-01-01

    MCC Display Cert Tool is a set of software tools for automated testing of computer-terminal displays in spacecraft mission-control centers, including those of the space shuttle and the International Space Station. This software makes it possible to perform tests that are more thorough, take less time, and are less likely to lead to erroneous results, relative to tests performed manually. This software enables comparison of two sets of displays to report command and telemetry differences, generates test scripts for verifying telemetry and commands, and generates a documentary record containing display information, including version and corrective-maintenance data. At the time of reporting the information for this article, work was continuing to add a capability for validation of display parameters against a reconfiguration file.

  15. Distributed environmental control

    NASA Technical Reports Server (NTRS)

    Cleveland, Gary A.

    1992-01-01

    We present an architecture of distributed, independent control agents designed to work with the Computer Aided System Engineering and Analysis (CASE/A) simulation tool. CASE/A simulates behavior of Environmental Control and Life Support Systems (ECLSS). We describe a lattice of agents capable of distributed sensing and overcoming certain sensor and effector failures. We address how the architecture can achieve the coordinating functions of a hierarchical command structure while maintaining the robustness and flexibility of independent agents. These agents work between the time steps of the CASE/A simulation tool to arrive at command decisions based on the state variables maintained by CASE/A. Control is evaluated according to both effectiveness (e.g., how well temperature was maintained) and resource utilization (the amount of power and materials used).

  16. Graphics simulation and training aids for advanced teleoperation

    NASA Technical Reports Server (NTRS)

    Kim, Won S.; Schenker, Paul S.; Bejczy, Antal K.

    1993-01-01

    Graphics displays can be of significant aid in accomplishing a teleoperation task throughout all three phases of off-line task analysis and planning, operator training, and online operation. In the first phase, graphics displays provide substantial aid to investigate work cell layout, motion planning with collision detection and with possible redundancy resolution, and planning for camera views. In the second phase, graphics displays can serve as very useful tools for introductory training of operators before training them on actual hardware. In the third phase, graphics displays can be used for previewing planned motions and monitoring actual motions in any desired viewing angle, or, when communication time delay prevails, for providing predictive graphics overlay on the actual camera view of the remote site to show the non-time-delayed consequences of commanded motions in real time. This paper addresses potential space applications of graphics displays in all three operational phases of advanced teleoperation. Possible applications are illustrated with techniques developed and demonstrated in the Advanced Teleoperation Laboratory at JPL. The examples described include task analysis and planning of a simulated Solar Maximum Satellite Repair task, a novel force-reflecting teleoperation simulator for operator training, and preview and predictive displays for on-line operations.

  17. Command Center Training Tool (C2T2)

    NASA Technical Reports Server (NTRS)

    Jones, Phillip; Drucker, Nich; Mathews, Reejo; Stanton, Laura; Merkle, Ed

    2012-01-01

    This abstract presents the training approach taken to create a management-centered, experiential learning solution for the Virginia Port Authority's Port Command Center. The resultant tool, called the Command Center Training Tool (C2T2), follows a holistic approach integrated across the training management cycle and within a single environment. The approach allows a single training manager to progress from training design through execution and after-action review (AAR). The approach starts with modeling the training organization, identifying the organizational elements and their individual and collective performance requirements, including organization-specific performance scoring ontologies. Next, the developer specifies conditions, the problems and constructs that compose exercises and drive experiential learning. These conditions are defined by incidents, each of which denotes a single multimedia datum, and scenarios, which are stories told through incidents. To these layered, modular components, previously developed metadata is attached, including associated performance requirements. The components are then stored in a searchable library. An event developer can create a training event by searching the library based on metadata and then selecting and loading the resultant modular pieces. This loading process brings into the training event all the previously associated task and teamwork material as well as AAR preparation materials. The approach includes tools within an integrated management environment that place these materials at the fingertips of the event facilitator so that, in real time, the facilitator can track training-audience performance and modify the training event accordingly. The approach also supports the concentrated knowledge management requirements for rapid preparation of an extensive AAR.
This approach supports the integrated training cycle and provides a management-based perspective and advanced tools, through which a complex, thorough training event can be developed.

  18. Pattern Recognition Control Design

    NASA Technical Reports Server (NTRS)

    Gambone, Elisabeth A.

    2018-01-01

    Spacecraft control algorithms must know the expected vehicle response to any command to the available control effectors, such as reaction thrusters or torque devices. Spacecraft control system design approaches have traditionally relied on the estimated vehicle mass properties to determine the desired force and moment, as well as knowledge of the effector performance, to efficiently control the spacecraft. A pattern recognition approach was used to investigate the relationship between the control effector commands and spacecraft responses. Instead of supplying the approximated vehicle properties and the thruster performance characteristics, a database of information relating the thruster ring commands to the desired vehicle response was used for closed-loop control. A Monte Carlo simulation data set of the spacecraft dynamic response to effector commands was analyzed to establish the influence a command has on the behavior of the spacecraft. A tool developed at NASA Johnson Space Center to analyze flight dynamics Monte Carlo data sets through pattern recognition methods was used to perform this analysis. Once a comprehensive data set relating spacecraft responses with commands was established, it was used in place of traditional control methods and gain sets. This pattern recognition approach was compared with traditional control algorithms to determine its potential benefits and uses.
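
The core idea of substituting a command/response database for a dynamics model can be sketched as a nearest-neighbor lookup. This is a hedged, one-axis toy, not the NASA tool: the command values, recorded rate changes, and function names below are illustrative assumptions.

```python
# Sketch of database-lookup control: store simulated (command, response)
# pairs from a Monte Carlo data set and, in closed loop, select the command
# whose recorded response best matches the desired response. The 1-D
# "thruster command -> rate change (deg/s)" data here is invented.
database = [(-1.0, -0.52), (-0.5, -0.26), (0.0, 0.0), (0.5, 0.27), (1.0, 0.51)]

def select_command(desired_response):
    """Return the stored command whose recorded response is nearest to the target."""
    return min(database, key=lambda pair: abs(pair[1] - desired_response))[0]

# Ask for a 0.3 deg/s rate change; the nearest recorded response is 0.27.
print(select_command(0.3))   # -> 0.5
```

A real implementation would interpolate over a high-dimensional command space rather than pick the single nearest sample, but the lookup replaces the mass-properties model in the same way.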

  19. The Influence of Future Command, Control, Communications, and Computers (C4) on Doctrine and the Operational Commander's Decision-Making Process

    NASA Technical Reports Server (NTRS)

    Mayer, Michael G.

    1996-01-01

    Future C4 systems will alter the traditional balance between force and information, having a profound influence on doctrine and the operational commander's decision-making process. The Joint Staff's future vision of C4 is conceptualized in 'C4I for the Warrior,' which envisions a joint C4I architecture providing timely sensor-to-shooter information directly to the warfighter. C4 systems must manage and filter an overwhelming amount of information; deal with interoperability issues; overcome technological limitations; meet emerging security requirements; and protect against 'Information Warfare.' Severe budget constraints necessitate unified control of C4 systems under singular leadership for the common good of all the services. In addition, acquisition policy and procedures must be revamped to allow new technologies to be fielded quickly, and the commercial marketplace will become the preferred starting point for modernization. Flatter command structures are recommended in this environment, where information is available instantaneously and new decision-making responsibilities are created at lower levels. Commanders will have to strike a balance between exerting greater control and allowing subordinates enough flexibility to maintain initiative. Clearly, the commander's intent remains the most important tool in striking this balance.

  20. An intelligent planning and scheduling system for the HST servicing missions

    NASA Technical Reports Server (NTRS)

    Johnson, Jay; Bogovich, Lynn; Tuchman, Alan; Kispert, Andrew; Page, Brenda; Burkhardt, Christian; Littlefield, Ronald; Mclean, David; Potter, William; Ochs, William

    1993-01-01

    A new, intelligent planning and scheduling system has been delivered to NASA Goddard Space Flight Center (GSFC) to provide support for the upcoming Hubble Space Telescope (HST) Servicing Missions. This new system is the Servicing Mission Planning and Replanning Tool (SM/PART). SM/PART is written in C and runs on a UNIX-based workstation (IBM RS/6000) under Motif. SM/PART effectively automates the complex task of building or rebuilding the integrated timelines and command plans required by HST Servicing Mission personnel at their consoles during the missions. SM/PART is able to quickly build or rebuild timelines based on information stored in a Knowledge Base (KB) by using an Artificial Intelligence (AI) tool called the Planning And Resource Reasoning (PARR) shell. After a timeline has been built in batch mode, it can be displayed and edited in an interactive mode with help from the PARR shell. Finally, a detailed command plan is generated. The capability to quickly build or rebuild timelines and command plans provides an additional safety factor for the HST, Shuttle, and crew.
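
The kind of resource-aware timeline building PARR performs can be illustrated with a minimal greedy scheduler. This is a sketch under invented names and data, not SM/PART's actual algorithm: each activity is placed at the earliest time its required resource is free, so a timeline can be rebuilt quickly from the activity list alone.

```python
# Illustrative sketch of knowledge-based timeline building: schedule each
# activity at the earliest start time at which its single required resource
# (e.g. crew, comm link) becomes available. Activity names are hypothetical.
def build_timeline(activities):
    """activities: list of (name, resource, duration) tuples.
    Returns a list of (name, start, end) scheduled greedily per resource."""
    busy_until = {}                       # resource -> time it becomes free
    timeline = []
    for name, resource, duration in activities:
        start = busy_until.get(resource, 0)
        end = start + duration
        busy_until[resource] = end
        timeline.append((name, start, end))
    return timeline

plan = build_timeline([
    ("unstow-tools", "crew", 30),
    ("uplink-commands", "comm-link", 10),
    ("replace-gyro", "crew", 90),         # waits for the crew resource
])
print(plan)
# -> [('unstow-tools', 0, 30), ('uplink-commands', 0, 10), ('replace-gyro', 30, 120)]
```

A real planner also reasons over multiple resources per activity and ordering constraints from the knowledge base, but rebuilding after a change is the same operation: rerun the scheduler over the updated activity list.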

  1. CLIPS 6.0 - C LANGUAGE INTEGRATED PRODUCTION SYSTEM, VERSION 6.0 (UNIX VERSION)

    NASA Technical Reports Server (NTRS)

    Donnell, B.

    1994-01-01

    CLIPS, the C Language Integrated Production System, is a complete environment for developing expert systems -- programs which are specifically intended to model human expertise or knowledge. It is designed to allow artificial intelligence research, development, and delivery on conventional computers. CLIPS 6.0 provides a cohesive tool for handling a wide variety of knowledge with support for three different programming paradigms: rule-based, object-oriented, and procedural. Rule-based programming allows knowledge to be represented as heuristics, or "rules-of-thumb" which specify a set of actions to be performed for a given situation. Object-oriented programming allows complex systems to be modeled as modular components (which can be easily reused to model other systems or create new components). The procedural programming capabilities provided by CLIPS 6.0 allow CLIPS to represent knowledge in ways similar to those allowed in languages such as C, Pascal, Ada, and LISP. Using CLIPS 6.0, one can develop expert system software using only rule-based programming, only object-oriented programming, only procedural programming, or combinations of the three. CLIPS provides extensive features to support the rule-based programming paradigm including seven conflict resolution strategies, dynamic rule priorities, and truth maintenance. CLIPS 6.0 supports more complex nesting of conditional elements in the if portion of a rule ("and", "or", and "not" conditional elements can be placed within a "not" conditional element). In addition, there is no longer a limitation on the number of multifield slots that a deftemplate can contain. The CLIPS Object-Oriented Language (COOL) provides object-oriented programming capabilities. Features supported by COOL include classes with multiple inheritance, abstraction, encapsulation, polymorphism, dynamic binding, and message passing with message-handlers. 
CLIPS 6.0 supports tight integration of the rule-based programming features of CLIPS with COOL (that is, a rule can pattern match on objects created using COOL). CLIPS 6.0 provides the capability to define functions, overloaded functions, and global variables interactively. In addition, CLIPS can be embedded within procedural code, called as a subroutine, and integrated with languages such as C, FORTRAN and Ada. CLIPS can be easily extended by a user through the use of several well-defined protocols. CLIPS provides several delivery options for programs including the ability to generate stand-alone executables or to load programs from text or binary files. CLIPS 6.0 provides support for the modular development and execution of knowledge bases with the defmodule construct. CLIPS modules allow a set of constructs to be grouped together such that explicit control can be maintained over restricting the access of the constructs by other modules. This type of control is similar to global and local scoping used in languages such as C or Ada. By restricting access to deftemplate and defclass constructs, modules can function as blackboards, permitting only certain facts and instances to be seen by other modules. Modules are also used by rules to provide execution control. The CRSV (Cross-Reference, Style, and Verification) utility included with previous versions of CLIPS is no longer supported. The capabilities provided by this tool are now available directly within CLIPS 6.0 to aid in the development, debugging, and verification of large rule bases. COSMIC offers four distribution versions of CLIPS 6.0: UNIX (MSC-22433), VMS (MSC-22434), MACINTOSH (MSC-22429), and IBM PC (MSC-22430). Executable files, source code, utilities, documentation, and examples are included on the program media. All distribution versions include identical source code for the command line version of CLIPS 6.0. This source code should compile on any platform with an ANSI C compiler.
Each distribution version of CLIPS 6.0, except that for the Macintosh platform, includes an executable for the command line version. For the UNIX version of CLIPS 6.0, the command line interface has been successfully implemented on a Sun4 running SunOS, a DECstation running DEC RISC ULTRIX, an SGI Indigo Elan running IRIX, a DEC Alpha AXP running OSF/1, and an IBM RS/6000 running AIX. Command line interface executables are included for Sun4 computers running SunOS 4.1.1 or later and for the DEC RISC ULTRIX platform. The makefiles may have to be modified slightly to be used on other UNIX platforms. The UNIX, Macintosh, and IBM PC versions of CLIPS 6.0 each have a platform specific interface. Source code, a makefile, and an executable for the Windows 3.1 interface version of CLIPS 6.0 are provided only on the IBM PC distribution diskettes. Source code, a makefile, and an executable for the Macintosh interface version of CLIPS 6.0 are provided only on the Macintosh distribution diskettes. Likewise, for the UNIX version of CLIPS 6.0, only source code and a makefile for an X-Windows interface are provided. The X-Windows interface requires MIT's X Window System, Version 11, Release 4 (X11R4), the Athena Widget Set, and the Xmu library. The source code for the Athena Widget Set is provided on the distribution medium. The X-Windows interface has been successfully implemented on a Sun4 running SunOS 4.1.2 with the MIT distribution of X11R4 (not OpenWindows), an SGI Indigo Elan running IRIX 4.0.5, and a DEC Alpha AXP running OSF/1 1.2. The VAX version of CLIPS 6.0 comes only with the generic command line interface. ASCII makefiles for the command line version of CLIPS are provided on all the distribution media for UNIX, VMS, and DOS. 
Four executables are provided with the IBM PC version: a windowed interface executable for Windows 3.1 built using Borland C++ v3.1, an editor for use with the windowed interface, a command line version of CLIPS for Windows 3.1, and a 386 command line executable for DOS built using Zortech C++ v3.1. All four executables are capable of utilizing extended memory and require an 80386 CPU or better. Users needing an 8086/8088 or 80286 executable must recompile the CLIPS source code themselves. Users who wish to recompile the DOS executable using Borland C++ or Microsoft C must use a DOS extender program to produce an executable capable of using extended memory. The version of CLIPS 6.0 for IBM PC compatibles requires DOS v3.3 or later and/or Windows 3.1 or later. It is distributed on a set of three 1.4Mb 3.5 inch diskettes. A hard disk is required. The Macintosh version is distributed in compressed form on two 3.5 inch 1.4Mb Macintosh format diskettes, and requires System 6.0.5, or higher, and 1Mb RAM. The version for DEC VAX/VMS is available in VAX BACKUP format on a 1600 BPI 9-track magnetic tape (standard distribution medium) or a TK50 tape cartridge. The UNIX version is distributed in UNIX tar format on a .25 inch streaming magnetic tape cartridge (Sun QIC-24). For the UNIX version, alternate distribution media and formats are available upon request. The CLIPS 6.0 documentation includes a User's Guide and a three volume Reference Manual consisting of Basic and Advanced Programming Guides and an Interfaces Guide. An electronic version of the documentation is provided on the distribution medium for each version: in Microsoft Word format for the Macintosh and PC versions of CLIPS, and in both PostScript format and Microsoft Word for Macintosh format for the UNIX and DEC VAX versions of CLIPS. CLIPS was developed in 1986 and Version 6.0 was released in 1993.
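
The rule-based paradigm the abstract describes (actions fired by rules whose conditions match the current facts) can be illustrated with a tiny forward-chaining loop. CLIPS rules are written in its own LISP-like defrule syntax; the Python below is only a rough sketch of the idea, with invented facts and rules, and omits conflict resolution, rule priorities, and truth maintenance.

```python
# Minimal forward-chaining sketch of rule-based inference: each rule is a
# (condition-facts, conclusion-fact) pair; rules fire until no rule can add
# a new fact. This mirrors the paradigm CLIPS supports, not its engine.
def run(facts, rules):
    """Fire rules until quiescence; return the final fact set."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for condition, conclusion in rules:
            # Fire when every condition fact is present and the conclusion is new.
            if condition <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

rules = [
    ({"engine-hot", "coolant-low"}, "add-coolant"),
    ({"add-coolant"}, "recheck-temperature"),
]
result = run({"engine-hot", "coolant-low"}, rules)
print(sorted(result))
# -> ['add-coolant', 'coolant-low', 'engine-hot', 'recheck-temperature']
```

Note how the second rule fires only because the first rule asserted a new fact; chaining of this kind is what lets heuristics, or "rules-of-thumb," compose into larger expert-system behavior.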

  2. CLIPS 6.0 - C LANGUAGE INTEGRATED PRODUCTION SYSTEM, VERSION 6.0 (IBM PC VERSION)

    NASA Technical Reports Server (NTRS)

    Donnell, B.

    1994-01-01


  3. CLIPS 6.0 - C LANGUAGE INTEGRATED PRODUCTION SYSTEM, VERSION 6.0 (MACINTOSH VERSION)

    NASA Technical Reports Server (NTRS)

    Riley, G.

    1994-01-01


  4. CLIPS 6.0 - C LANGUAGE INTEGRATED PRODUCTION SYSTEM, VERSION 6.0 (DEC VAX VMS VERSION)

    NASA Technical Reports Server (NTRS)

    Donnell, B.

    1994-01-01

    CLIPS, the C Language Integrated Production System, is a complete environment for developing expert systems -- programs which are specifically intended to model human expertise or knowledge. It is designed to allow artificial intelligence research, development, and delivery on conventional computers. CLIPS 6.0 provides a cohesive tool for handling a wide variety of knowledge with support for three different programming paradigms: rule-based, object-oriented, and procedural. Rule-based programming allows knowledge to be represented as heuristics, or "rules-of-thumb" which specify a set of actions to be performed for a given situation. Object-oriented programming allows complex systems to be modeled as modular components (which can be easily reused to model other systems or create new components). The procedural programming capabilities provided by CLIPS 6.0 allow CLIPS to represent knowledge in ways similar to those allowed in languages such as C, Pascal, Ada, and LISP. Using CLIPS 6.0, one can develop expert system software using only rule-based programming, only object-oriented programming, only procedural programming, or combinations of the three. CLIPS provides extensive features to support the rule-based programming paradigm including seven conflict resolution strategies, dynamic rule priorities, and truth maintenance. CLIPS 6.0 supports more complex nesting of conditional elements in the if portion of a rule ("and", "or", and "not" conditional elements can be placed within a "not" conditional element). In addition, there is no longer a limitation on the number of multifield slots that a deftemplate can contain. The CLIPS Object-Oriented Language (COOL) provides object-oriented programming capabilities. Features supported by COOL include classes with multiple inheritance, abstraction, encapsulation, polymorphism, dynamic binding, and message passing with message-handlers. 
CLIPS 6.0 supports tight integration of the rule-based programming features of CLIPS with COOL (that is, a rule can pattern match on objects created using COOL). CLIPS 6.0 provides the capability to define functions, overloaded functions, and global variables interactively. In addition, CLIPS can be embedded within procedural code, called as a subroutine, and integrated with languages such as C, FORTRAN and Ada. CLIPS can be easily extended by a user through the use of several well-defined protocols. CLIPS provides several delivery options for programs including the ability to generate stand alone executables or to load programs from text or binary files. CLIPS 6.0 provides support for the modular development and execution of knowledge bases with the defmodule construct. CLIPS modules allow a set of constructs to be grouped together such that explicit control can be maintained over restricting the access of the constructs by other modules. This type of control is similar to global and local scoping used in languages such as C or Ada. By restricting access to deftemplate and defclass constructs, modules can function as blackboards, permitting only certain facts and instances to be seen by other modules. Modules are also used by rules to provide execution control. The CRSV (Cross-Reference, Style, and Verification) utility included with previous version of CLIPS is no longer supported. The capabilities provided by this tool are now available directly within CLIPS 6.0 to aid in the development, debugging, and verification of large rule bases. COSMIC offers four distribution versions of CLIPS 6.0: UNIX (MSC-22433), VMS (MSC-22434), MACINTOSH (MSC-22429), and IBM PC (MSC-22430). Executable files, source code, utilities, documentation, and examples are included on the program media. All distribution versions include identical source code for the command line version of CLIPS 6.0. This source code should compile on any platform with an ANSI C compiler. 
Each distribution version of CLIPS 6.0, except that for the Macintosh platform, includes an executable for the command line version. For the UNIX version of CLIPS 6.0, the command line interface has been successfully implemented on a Sun4 running SunOS, a DECstation running DEC RISC ULTRIX, an SGI Indigo Elan running IRIX, a DEC Alpha AXP running OSF/1, and an IBM RS/6000 running AIX. Command line interface executables are included for Sun4 computers running SunOS 4.1.1 or later and for the DEC RISC ULTRIX platform. The makefiles may have to be modified slightly to be used on other UNIX platforms. The UNIX, Macintosh, and IBM PC versions of CLIPS 6.0 each have a platform specific interface. Source code, a makefile, and an executable for the Windows 3.1 interface version of CLIPS 6.0 are provided only on the IBM PC distribution diskettes. Source code, a makefile, and an executable for the Macintosh interface version of CLIPS 6.0 are provided only on the Macintosh distribution diskettes. Likewise, for the UNIX version of CLIPS 6.0, only source code and a makefile for an X-Windows interface are provided. The X-Windows interface requires MIT's X Window System, Version 11, Release 4 (X11R4), the Athena Widget Set, and the Xmu library. The source code for the Athena Widget Set is provided on the distribution medium. The X-Windows interface has been successfully implemented on a Sun4 running SunOS 4.1.2 with the MIT distribution of X11R4 (not OpenWindows), an SGI Indigo Elan running IRIX 4.0.5, and a DEC Alpha AXP running OSF/1 1.2. The VAX version of CLIPS 6.0 comes only with the generic command line interface. ASCII makefiles for the command line version of CLIPS are provided on all the distribution media for UNIX, VMS, and DOS. 
Four executables are provided with the IBM PC version: a windowed interface executable for Windows 3.1 built using Borland C++ v3.1, an editor for use with the windowed interface, a command line version of CLIPS for Windows 3.1, and a 386 command line executable for DOS built using Zortech C++ v3.1. All four executables are capable of utilizing extended memory and require an 80386 CPU or better. Users needing an 8086/8088 or 80286 executable must recompile the CLIPS source code themselves. Users who wish to recompile the DOS executable using Borland C++ or Microsoft C must use a DOS extender program to produce an executable capable of using extended memory. The version of CLIPS 6.0 for IBM PC compatibles requires DOS v3.3 or later and/or Windows 3.1 or later. It is distributed on a set of three 1.4 MB 3.5-inch diskettes. A hard disk is required. The Macintosh version is distributed in compressed form on two 3.5-inch 1.4 MB Macintosh-format diskettes, and requires System 6.0.5 or higher and 1 MB of RAM. The version for DEC VAX/VMS is available in VAX BACKUP format on a 1600 BPI 9-track magnetic tape (standard distribution medium) or a TK50 tape cartridge. The UNIX version is distributed in UNIX tar format on a 0.25-inch streaming magnetic tape cartridge (Sun QIC-24). For the UNIX version, alternate distribution media and formats are available upon request. The CLIPS 6.0 documentation includes a User's Guide and a three-volume Reference Manual consisting of Basic and Advanced Programming Guides and an Interfaces Guide. An electronic version of the documentation is provided on the distribution medium for each version: in Microsoft Word format for the Macintosh and PC versions of CLIPS, and in both PostScript and Microsoft Word for Macintosh formats for the UNIX and DEC VAX versions of CLIPS. CLIPS was developed in 1986 and Version 6.0 was released in 1993.

  5. Towards a new modality-independent interface for a robotic wheelchair.

    PubMed

    Bastos-Filho, Teodiano Freire; Cheein, Fernando Auat; Müller, Sandra Mara Torres; Celeste, Wanderley Cardoso; de la Cruz, Celso; Cavalieri, Daniel Cruz; Sarcinelli-Filho, Mário; Amaral, Paulo Faria Santos; Perez, Elisa; Soria, Carlos Miguel; Carelli, Ricardo

    2014-05-01

This work presents the development of a robotic wheelchair that can be commanded by users in a supervised way or by a fully automatic, unsupervised navigation system. It provides the flexibility to choose among different modalities to command the wheelchair, making it suitable for people with different levels of disability. Users can command the wheelchair with eye blinks, eye movements, head movements, sip-and-puff, or brain signals. The wheelchair can also operate like an auto-guided vehicle, following metallic tapes, or navigate autonomously. The system is provided with an easy-to-use, flexible graphical user interface onboard a personal digital assistant, which allows users to choose the commands to be sent to the robotic wheelchair. Several experiments were carried out with people with disabilities, and the results validate the developed system as an assistive tool for people with distinct levels of disability.

  6. UTM Technical Capabilities Level 2 (TLC2) Test at Reno-Stead Airport.

    NASA Image and Video Library

    2016-10-06

Test of Unmanned Aircraft Systems Traffic Management (UTM) technical capability Level 2 (TCL2) at Reno-Stead Airport, Nevada. During the test, five drones simultaneously crossed paths, separated by altitude. Two drones flew beyond visual line-of-sight and three flew within line-of-sight of their operators. Here, a Precision Hawk pilot launches the Lancaster Mark 3 UAS, one of 11 vehicles in the UTM TCL2 demonstration that flew beyond line of sight of the pilot in command during the Nevada test.

  7. Sharing Space Situational Awareness Data

    NASA Astrophysics Data System (ADS)

    Bird, D.

    2010-09-01

The Commander, United States Strategic Command (CDRUSSTRATCOM) accepted responsibility for sharing space situational awareness (SSA) information/services with commercial and foreign entities from the US Air Force on 22 Dec 09 (formerly the Commercial & Foreign Entities Pilot Program). The requirement to share SSA services with non-US Government (USG) entities is derived from Title 10, United States Code, Section 2274 (2010) and is consistent with the new National Space Policy. US Strategic Command's (USSTRATCOM's) sharing of SSA services consists of basic services (Two-Line Elements, decay data and satellite catalog details) available on www.space-track.org and advanced services (conjunction assessment, launch support, etc.) available with a signed agreement. USSTRATCOM has requested USG permission to enter into international agreements to enable SSA data exchange with our foreign partners. USSTRATCOM recently authorized Joint Functional Component Command for Space (JFCC SPACE) to share Conjunction Summary Messages (CSMs) with satellite owner/operators whose satellites have been identified as closely approaching another space object. CSMs contain vector and covariance data computed using Special Perturbations theory. To facilitate the utility of the CSMs, USSTRATCOM has hosted, and continues to host, CSM Workshops to ensure satellite operators fully understand the data contained in the CSM and can provide an informed recommendation to their leadership. As JFCC SPACE matures its ability to accept ephemeris data from a satellite operator, it will be necessary to automatically transfer that data from one security level to another. USSTRATCOM and Air Force Space Command are coordinating the integration of a cross domain solution that will allow JFCC SPACE to do just that. Finally, USSTRATCOM is also working with commercial and governmental organizations to develop an internationally-accepted conjunction assessment message.
The United States Government (USG), specifically the Department of Defense, has been integrating data from diverse sources for decades. In recent years, more and more commercial entities have been integrating our data into their operations, whether to use General Perturbation (GP) Two-Line Elements (TLEs) to perform a rudimentary form of conjunction assessment (CA) or to provide a new app for the iPhone. For the longest time, the USG was one of the few organizations able to fund and conduct space surveillance using optical telescopes and various types of radar. Unfortunately, despite decades of experience tracking objects in space, we had not matured either our equipment or our processes to the point that we were able to prevent the Iridium-Cosmos collision about 18 months ago. As a result, there are two belts of debris orbiting our planet, today and for years to come. As space has become more congested and budgets have shrunk, the need to integrate data has increased. Fortunately, the number of organizations who have developed or are developing space situational awareness (SSA) capabilities, including analytical tools, has also increased.

  8. Decrease in medical command errors with use of a "standing orders" protocol system.

    PubMed

    Holliman, C J; Wuerz, R C; Meador, S A

    1994-05-01

The purpose of this study was to determine the physician medical command error rates and paramedic error rates after implementation of a "standing orders" protocol system for medical command. These patient-care error rates were compared with the previously reported rates for a "required call-in" medical command system (Ann Emerg Med 1992; 21(4):347-350). A secondary aim of the study was to determine if the on-scene time interval was increased by the standing orders system. A prospective audit of prehospital advanced life support (ALS) trip sheets was conducted at an urban ALS paramedic service with on-line physician medical command from three local hospitals. All ALS run sheets from the start of the standing orders system (April 1, 1991) for a 1-year period ending on March 30, 1992 were reviewed as part of an ongoing quality assurance program. Cases were identified in which care nonjustifiably deviated from regional emergency medical services (EMS) protocols, as judged by agreement of three physician reviewers (the same methodology as a previously reported command error study in the same ALS system). Medical command and paramedic errors were identified from the prehospital ALS run sheets and categorized. Two thousand one ALS runs were reviewed; 24 physician errors (1.2% of the 1,928 "command" runs) and eight paramedic errors (0.4% of all runs) were identified. The physician error rate decreased from the 2.6% rate in the previous study (P < .0001 by chi-square analysis). The on-scene time interval did not increase with the "standing orders" system. (ABSTRACT TRUNCATED AT 250 WORDS)
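The reported percentages follow directly from the raw counts given in the abstract; a quick arithmetic check:

```python
# Recomputing the reported error rates from the abstract's raw counts.
total_runs = 2001
command_runs = 1928          # runs involving physician medical command
physician_errors = 24
paramedic_errors = 8

physician_rate = 100 * physician_errors / command_runs   # % of "command" runs
paramedic_rate = 100 * paramedic_errors / total_runs     # % of all runs

print(round(physician_rate, 1))  # 1.2
print(round(paramedic_rate, 1))  # 0.4
```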

  9. Earth System Grid and EGI interoperability

    NASA Astrophysics Data System (ADS)

    Raciazek, J.; Petitdidier, M.; Gemuend, A.; Schwichtenberg, H.

    2012-04-01

The Earth Science data centers have developed a data grid called the Earth System Grid Federation (ESGF) to give the scientific community worldwide access to CMIP5 (Coupled Model Inter-comparison Project 5) climate data. The CMIP5 data will permit evaluation of the impact of climate change in various environmental and societal areas, such as regional climate, extreme events, agriculture, insurance… The ESGF grid provides services like searching, browsing and downloading of datasets. At the security level, ESGF data access is protected by an authentication mechanism. An ESGF-trusted X509 Short-Lived EEC certificate with the correct roles/attributes is required to get access to the data in a non-interactive way (e.g. from a worker node). To access ESGF from EGI (i.e. from earth science applications running on the EGI infrastructure), the security incompatibility between the two grids is the challenge: the EGI proxy certificate is neither trusted by ESGF nor does it contain the correct roles/attributes. To solve this problem, we decided to use a Credential Translation Service (CTS) to translate the EGI X509 proxy certificate into the ESGF Short-Lived EEC certificate (the CTS will issue ESGF certificates based on EGI certificate authentication). From the end user perspective, the main steps to use the CTS are: the user binds his two identities (EGI and ESGF) together in the CTS using the CTS web interface (this step has to be done only once) and then requests an ESGF Short-Lived EEC certificate whenever one is needed, using a command-line tool. The implementation of the CTS is ongoing. It is based on the open source MyProxy software stack, which is used in many grid infrastructures. On the client side, the "myproxy-logon" command-line tool is used to request the certificate translation. A new option has been added to "myproxy-logon" to select the original certificate (in our case, the EGI one).
On the server side, MyProxy server operates in Certificate Authority mode, with a new module to store and manage identity pairs. Many European teams are working on the impact of climate change and face the problem of a lack of compute resources in connection with large data sets. This work between the ES VRC in EGI-Inspire and ESGF will be important to facilitate the exploitation of the CMIP5 data on EGI.

  10. Military Command Team Effectiveness: Model and Instrument for Assessment and Improvement (L’efficacite des Equipes de Commandement Militaires: un Modele et un Instrument Pour L’evaluation et L’amelioration)

    DTIC Science & Technology

    2005-04-01

influence team performance (e.g., gender, race, age) (Morgan and Lassiter, 1992). • Leadership – the deliberate attempt to influence team outcomes...members of an organisation to be subject to an implicit socialisation process that brings members’ belief structures, values and goals into line with...Bowers, 1999). • Mix of Demographic Characteristics (e.g., age, gender, ethnicity, and culture). Whether more homogeneity or more heterogeneity is

  11. Operational Knowledge Management: Signaleers Share Front Line Experiences

    DTIC Science & Technology

    2011-01-01

commander recommended for you to be the division KMO. You’ve heard “knowledge management” thrown around at NTC or JRTC, and usually from your...technology? You check the division’s MTOE and find out the authorizations are for an 02A, Branch Immaterial O-5/LTC as the KMO and a FA 57A Battle...Command Systems Operator O-4/MAJ as the deputy KMO. Then you start to research the field of KM and find FM 6-01.1 Knowledge Wading through mounds of

  12. Periodic, On-Demand, and User-Specified Information Reconciliation

    NASA Technical Reports Server (NTRS)

    Kolano, Paul

    2007-01-01

    Automated sequence generation (autogen) signifies both a process and software used to automatically generate sequences of commands to operate various spacecraft. Autogen requires fewer workers than are needed for older manual sequence-generation processes and reduces sequence-generation times from weeks to minutes. The autogen software comprises the autogen script plus the Activity Plan Generator (APGEN) program. APGEN can be used for planning missions and command sequences. APGEN includes a graphical user interface that facilitates scheduling of activities on a time line and affords a capability to automatically expand, decompose, and schedule activities.

  13. ObspyDMT: a Python toolbox for retrieving and processing large seismological data sets

    NASA Astrophysics Data System (ADS)

    Hosseini, Kasra; Sigloch, Karin

    2017-10-01

    We present obspyDMT, a free, open-source software toolbox for the query, retrieval, processing and management of seismological data sets, including very large, heterogeneous and/or dynamically growing ones. ObspyDMT simplifies and speeds up user interaction with data centers, in more versatile ways than existing tools. The user is shielded from the complexities of interacting with different data centers and data exchange protocols and is provided with powerful diagnostic and plotting tools to check the retrieved data and metadata. While primarily a productivity tool for research seismologists and observatories, easy-to-use syntax and plotting functionality also make obspyDMT an effective teaching aid. Written in the Python programming language, it can be used as a stand-alone command-line tool (requiring no knowledge of Python) or can be integrated as a module with other Python codes. It facilitates data archiving, preprocessing, instrument correction and quality control - routine but nontrivial tasks that can consume much user time. We describe obspyDMT's functionality, design and technical implementation, accompanied by an overview of its use cases. As an example of a typical problem encountered in seismogram preprocessing, we show how to check for inconsistencies in response files of two example stations. We also demonstrate the fully automated request, remote computation and retrieval of synthetic seismograms from the Synthetics Engine (Syngine) web service of the Data Management Center (DMC) at the Incorporated Research Institutions for Seismology (IRIS).

  14. Company Command: The Bottom Line

    DTIC Science & Technology

    1990-11-20

who engage in original research on national security issues. NDU Press publishes the best of this research. In addition, the Press publishes other...79 Standard Installation/Division Personnel Systems (SIDPERS)-SIDPERS Reports-Noncommissioned Officer Evaluation Reports (NCO-ERs)-Officer...Non-combatant Evacuation Operation (NEO) and Preparation for Overseas Movement (POM)-Finance The Bottom Line for Personnel and Administration

  15. Smart BIT/TSMD Integration

    DTIC Science & Technology

    1991-12-01

user using the ':knn' option in the do-scenario command line). An instance of the K-Nearest Neighbor object is first created and initialized before...Navigation Computer HF High Frequency ILS Instrument Landing System KNN K-Nearest Neighbor LRU Line Replaceable Unit MC Mission Computer MTCA...approaches have been investigated here, K-Nearest Neighbors (KNN) and neural networks (NN). Both approaches require that previously classified examples of

  16. IPeak: An open source tool to combine results from multiple MS/MS search engines.

    PubMed

    Wen, Bo; Du, Chaoqin; Li, Guilin; Ghali, Fawaz; Jones, Andrew R; Käll, Lukas; Xu, Shaohang; Zhou, Ruo; Ren, Zhe; Feng, Qiang; Xu, Xun; Wang, Jun

    2015-09-01

Liquid chromatography coupled with tandem mass spectrometry (LC-MS/MS) is an important technique for detecting peptides in proteomics studies. Here, we present an open source software tool, termed IPeak, a peptide identification pipeline that is designed to combine the Percolator post-processing algorithm and a multi-search strategy to enhance the sensitivity of peptide identifications without compromising accuracy. IPeak provides a graphical user interface (GUI) as well as a command-line interface, which is implemented in JAVA and can work on all three major operating system platforms: Windows, Linux/Unix and OS X. IPeak has been designed to work with the mzIdentML standard from the Proteomics Standards Initiative (PSI) as both input and output, and has also been fully integrated into the associated mzidLibrary project, providing access to the overall pipeline, as well as modules for calling Percolator on individual search engine result files. The integration thus enables IPeak (and Percolator) to be used in conjunction with any software packages implementing the mzIdentML data standard. IPeak is freely available and can be downloaded under an Apache 2.0 license at https://code.google.com/p/mzidentml-lib/. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. LEMON - LHC Era Monitoring for Large-Scale Infrastructures

    NASA Astrophysics Data System (ADS)

    Marian, Babik; Ivan, Fedorko; Nicholas, Hook; Hector, Lansdale Thomas; Daniel, Lenkes; Miroslav, Siket; Denis, Waldron

    2011-12-01

At the present time computer centres are facing a massive rise in virtualization and cloud computing, as these solutions bring advantages to service providers and consolidate computer centre resources. As a result, however, monitoring complexity is increasing. Computer centre management requires monitoring not only servers, network equipment and associated software, but also collecting additional environment and facilities data (e.g. temperature, power consumption, cooling efficiency, etc.) to maintain a good overview of infrastructure performance. The LHC Era Monitoring (Lemon) system addresses these requirements for a very large scale infrastructure. The Lemon agent, which collects data on every client and forwards the samples to the central measurement repository, provides a flexible interface that allows rapid development of new sensors. The system can also report on behalf of remote devices such as switches and power supplies. Online and historical data can be visualized via a web-based interface or retrieved via command-line tools. The Lemon Alarm System component can be used for notifying the operator about error situations. In this article, an overview of Lemon monitoring is provided together with a description of the CERN LEMON production instance. No direct comparison is made with other monitoring tools.

  18. iBIOMES Lite: Summarizing Biomolecular Simulation Data in Limited Settings

    PubMed Central

    2015-01-01

    As the amount of data generated by biomolecular simulations dramatically increases, new tools need to be developed to help manage this data at the individual investigator or small research group level. In this paper, we introduce iBIOMES Lite, a lightweight tool for biomolecular simulation data indexing and summarization. The main goal of iBIOMES Lite is to provide a simple interface to summarize computational experiments in a setting where the user might have limited privileges and limited access to IT resources. A command-line interface allows the user to summarize, publish, and search local simulation data sets. Published data sets are accessible via static hypertext markup language (HTML) pages that summarize the simulation protocols and also display data analysis graphically. The publication process is customized via extensible markup language (XML) descriptors while the HTML summary template is customized through extensible stylesheet language (XSL). iBIOMES Lite was tested on different platforms and at several national computing centers using various data sets generated through classical and quantum molecular dynamics, quantum chemistry, and QM/MM. The associated parsers currently support AMBER, GROMACS, Gaussian, and NWChem data set publication. The code is available at https://github.com/jcvthibault/ibiomes. PMID:24830957
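The descriptor-driven publication step described above (an XML summary rendered to a static HTML page) can be sketched with Python's standard library. Element and field names here are hypothetical, not the actual iBIOMES schema:

```python
# Sketch of the XML-descriptor -> static HTML publication idea
# (element and field names are illustrative, not the real iBIOMES descriptors).
import xml.etree.ElementTree as ET

descriptor = """
<experiment>
  <name>DHFR solvated MD</name>
  <package>AMBER</package>
  <steps>500000</steps>
</experiment>
"""

root = ET.fromstring(descriptor)
# One table row per descriptor field.
rows = "".join(
    f"<tr><td>{child.tag}</td><td>{child.text}</td></tr>" for child in root
)
html = f"<html><body><h1>Simulation summary</h1><table>{rows}</table></body></html>"
print(html)
```

In iBIOMES Lite the rendering is customized via XSL templates rather than string formatting; the sketch only shows the descriptor-to-page data flow.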

  19. Know Your Enemy: Successful Bioinformatic Approaches to Predict Functional RNA Structures in Viral RNAs.

    PubMed

    Lim, Chun Shen; Brown, Chris M

    2017-01-01

    Structured RNA elements may control virus replication, transcription and translation, and their distinct features are being exploited by novel antiviral strategies. Viral RNA elements continue to be discovered using combinations of experimental and computational analyses. However, the wealth of sequence data, notably from deep viral RNA sequencing, viromes, and metagenomes, necessitates computational approaches being used as an essential discovery tool. In this review, we describe practical approaches being used to discover functional RNA elements in viral genomes. In addition to success stories in new and emerging viruses, these approaches have revealed some surprising new features of well-studied viruses e.g., human immunodeficiency virus, hepatitis C virus, influenza, and dengue viruses. Some notable discoveries were facilitated by new comparative analyses of diverse viral genome alignments. Importantly, comparative approaches for finding RNA elements embedded in coding and non-coding regions differ. With the exponential growth of computer power we have progressed from stem-loop prediction on single sequences to cutting edge 3D prediction, and from command line to user friendly web interfaces. Despite these advances, many powerful, user friendly prediction tools and resources are underutilized by the virology community.

  20. GEMINI: Integrative Exploration of Genetic Variation and Genome Annotations

    PubMed Central

    Paila, Umadevi; Chapman, Brad A.; Kirchner, Rory; Quinlan, Aaron R.

    2013-01-01

Modern DNA sequencing technologies enable geneticists to rapidly identify genetic variation among many human genomes. However, isolating the minority of variants underlying disease remains an important, yet formidable challenge for medical genetics. We have developed GEMINI (GEnome MINIng), a flexible software package for exploring all forms of human genetic variation. Unlike existing tools, GEMINI integrates genetic variation with a diverse and adaptable set of genome annotations (e.g., dbSNP, ENCODE, UCSC, ClinVar, KEGG) into a unified database to facilitate interpretation and data exploration. Whereas other methods provide an inflexible set of variant filters or prioritization methods, GEMINI allows researchers to compose complex queries based on sample genotypes, inheritance patterns, and both pre-installed and custom genome annotations. GEMINI also provides methods for ad hoc queries and data exploration, a simple programming interface for custom analyses that leverage the underlying database, and both command line and graphical tools for common analyses. We demonstrate GEMINI's utility for exploring variation in personal genomes and family-based genetic studies, and illustrate its ability to scale to studies involving thousands of human samples. GEMINI is designed for reproducibility and flexibility and our goal is to provide researchers with a standard framework for medical genomics. PMID:23874191
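The central idea, loading variants and their annotations into one SQL database so that filters become ordinary queries, can be sketched with Python's built-in sqlite3. The schema and values below are invented for illustration and are not GEMINI's actual tables:

```python
# Sketch of the unified-database idea: variants plus annotations in one SQL
# store, so a complex filter is just a query (schema is illustrative only).
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE variants (
    chrom TEXT, pos INTEGER, ref TEXT, alt TEXT,
    in_dbsnp INTEGER, impact TEXT)""")
db.executemany(
    "INSERT INTO variants VALUES (?,?,?,?,?,?)",
    [("chr1", 1001, "A", "G", 1, "LOW"),
     ("chr2", 2002, "C", "T", 0, "HIGH"),
     ("chr2", 2050, "G", "A", 0, "HIGH")])

# e.g. novel (absent from dbSNP), high-impact candidate variants:
hits = db.execute(
    "SELECT chrom, pos FROM variants WHERE in_dbsnp = 0 AND impact = 'HIGH'"
).fetchall()
print(hits)  # [('chr2', 2002), ('chr2', 2050)]
```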

  1. Genexpi: a toolset for identifying regulons and validating gene regulatory networks using time-course expression data.

    PubMed

    Modrák, Martin; Vohradský, Jiří

    2018-04-13

    Identifying regulons of sigma factors is a vital subtask of gene network inference. Integrating multiple sources of data is essential for correct identification of regulons and complete gene regulatory networks. Time series of expression data measured with microarrays or RNA-seq combined with static binding experiments (e.g., ChIP-seq) or literature mining may be used for inference of sigma factor regulatory networks. We introduce Genexpi: a tool to identify sigma factors by combining candidates obtained from ChIP experiments or literature mining with time-course gene expression data. While Genexpi can be used to infer other types of regulatory interactions, it was designed and validated on real biological data from bacterial regulons. In this paper, we put primary focus on CyGenexpi: a plugin integrating Genexpi with the Cytoscape software for ease of use. As a part of this effort, a plugin for handling time series data in Cytoscape called CyDataseries has been developed and made available. Genexpi is also available as a standalone command line tool and an R package. Genexpi is a useful part of gene network inference toolbox. It provides meaningful information about the composition of regulons and delivers biologically interpretable results.

  2. Use of CellNetAnalyzer in biotechnology and metabolic engineering.

    PubMed

    von Kamp, Axel; Thiele, Sven; Hädicke, Oliver; Klamt, Steffen

    2017-11-10

Mathematical models of cellular metabolism have become an essential tool for the optimization of biotechnological processes. They help to obtain a systemic understanding of the metabolic processes in the microorganisms used and to find suitable genetic modifications maximizing production performance. In particular, methods of stoichiometric and constraint-based modeling are frequently used in the context of metabolic and bioprocess engineering. Since metabolic networks can be complex and comprise hundreds or even thousands of metabolites and reactions, dedicated software tools are required for an efficient analysis. One such software suite is CellNetAnalyzer, a MATLAB package providing, among others, various methods for analyzing stoichiometric and constraint-based metabolic models. CellNetAnalyzer can be used via command-line-based operations or via a graphical user interface with embedded network visualizations. Herein we present key functionalities of CellNetAnalyzer for applications in biotechnology and metabolic engineering and thereby review constraint-based modeling techniques such as metabolic flux analysis, flux balance analysis, flux variability analysis, metabolic pathway analysis (elementary flux modes) and methods for computational strain design. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.
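The steady-state constraint at the core of the constraint-based methods listed above can be illustrated with a toy stoichiometric matrix (the network and numbers are invented for illustration, not taken from CellNetAnalyzer):

```python
# Tiny stoichiometric steady-state check, the core constraint behind flux
# balance analysis: internal metabolites must have zero net production, S @ v = 0.
# Illustrative network:  R1: -> A,   R2: A -> B,   R3: B ->
S = [
    [1, -1,  0],   # metabolite A: produced by R1, consumed by R2
    [0,  1, -1],   # metabolite B: produced by R2, consumed by R3
]

def is_steady_state(v):
    """True if the flux vector v balances every internal metabolite."""
    return all(sum(row[j] * v[j] for j in range(len(v))) == 0 for row in S)

print(is_steady_state([1, 1, 1]))  # True: flux flows straight through the chain
print(is_steady_state([1, 2, 1]))  # False: A is drained and B accumulates
```

Flux balance analysis then optimizes an objective (e.g. product yield) over all flux vectors satisfying this balance plus capacity bounds; the sketch shows only the balance itself.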

  3. The CARMEN software as a service infrastructure.

    PubMed

    Weeks, Michael; Jessop, Mark; Fletcher, Martyn; Hodge, Victoria; Jackson, Tom; Austin, Jim

    2013-01-28

    The CARMEN platform allows neuroscientists to share data, metadata, services and workflows, and to execute these services and workflows remotely via a Web portal. This paper describes how we implemented a service-based infrastructure into the CARMEN Virtual Laboratory. A Software as a Service framework was developed to allow generic new and legacy code to be deployed as services on a heterogeneous execution framework. Users can submit analysis code typically written in Matlab, Python, C/C++ and R as non-interactive standalone command-line applications and wrap them as services in a form suitable for deployment on the platform. The CARMEN Service Builder tool enables neuroscientists to quickly wrap their analysis software for deployment to the CARMEN platform, as a service without knowledge of the service framework or the CARMEN system. A metadata schema describes each service in terms of both system and user requirements. The search functionality allows services to be quickly discovered from the many services available. Within the platform, services may be combined into more complicated analyses using the workflow tool. CARMEN and the service infrastructure are targeted towards the neuroscience community; however, it is a generic platform, and can be targeted towards any discipline.

  4. Know Your Enemy: Successful Bioinformatic Approaches to Predict Functional RNA Structures in Viral RNAs

    PubMed Central

    Lim, Chun Shen; Brown, Chris M.

    2018-01-01

    Structured RNA elements may control virus replication, transcription and translation, and their distinct features are being exploited by novel antiviral strategies. Viral RNA elements continue to be discovered using combinations of experimental and computational analyses. However, the wealth of sequence data, notably from deep viral RNA sequencing, viromes, and metagenomes, necessitates computational approaches being used as an essential discovery tool. In this review, we describe practical approaches being used to discover functional RNA elements in viral genomes. In addition to success stories in new and emerging viruses, these approaches have revealed some surprising new features of well-studied viruses e.g., human immunodeficiency virus, hepatitis C virus, influenza, and dengue viruses. Some notable discoveries were facilitated by new comparative analyses of diverse viral genome alignments. Importantly, comparative approaches for finding RNA elements embedded in coding and non-coding regions differ. With the exponential growth of computer power we have progressed from stem-loop prediction on single sequences to cutting edge 3D prediction, and from command line to user friendly web interfaces. Despite these advances, many powerful, user friendly prediction tools and resources are underutilized by the virology community. PMID:29354101

  5. ISPATOM: A Generic Real-Time Data Processing Tool Without Programming

    NASA Technical Reports Server (NTRS)

    Dershowitz, Adam

    2007-01-01

Information Sharing Protocol Advanced Tool of Math (ISPATOM) is an application program allowing for the streamlined generation of comps, which subscribe to streams of incoming telemetry data, perform any necessary computations on the data, then send the data to other programs for display and/or further processing in NASA mission control centers. Heretofore, the development of comps was difficult, expensive, and time-consuming: each comp was custom-written manually, in a low-level computing language, by a programmer attempting to follow the requirements of flight controllers. ISPATOM enables a flight controller who is not a programmer to write a comp by simply typing in one or more equations at a command line or retrieving the equation(s) from a text file. ISPATOM then subscribes to the necessary input data, performs all of the necessary computations, and sends out the results. It sends out new results whenever the input data change. Using equations in ISPATOM is no more difficult than entering equations in a spreadsheet. The time involved in developing a comp is thus limited to the time taken to decide on the necessary equations. Thus, ISPATOM is a real-time dynamic calculator.
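The equation-driven recomputation described above can be sketched in a few lines of Python (all names are illustrative; this is not ISPATOM code):

```python
# Sketch of an equation-driven "comp": the equation is plain text, the inputs
# are telemetry symbols, and the output is recomputed whenever an input changes.

class Comp:
    def __init__(self, equation):
        self.equation = equation   # e.g. "power = volts * amps"
        self.inputs = {}

    def update(self, symbol, value):
        """A new telemetry sample arrives: store it and recompute the output."""
        self.inputs[symbol] = value
        target, expr = self.equation.split("=", 1)
        try:
            # Evaluate the right-hand side against the current telemetry values.
            result = eval(expr, {"__builtins__": {}}, dict(self.inputs))
        except NameError:          # still waiting on at least one input
            return None
        return target.strip(), result

comp = Comp("power = volts * amps")
comp.update("volts", 28.0)       # amps not yet known -> no output
print(comp.update("amps", 2.5))  # ('power', 70.0)
```

The spreadsheet-like property is in `update`: every new sample triggers a recomputation, and no output is emitted until all referenced inputs have arrived.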

  6. PLIP: fully automated protein-ligand interaction profiler.

    PubMed

    Salentin, Sebastian; Schreiber, Sven; Haupt, V Joachim; Adasme, Melissa F; Schroeder, Michael

    2015-07-01

    The characterization of interactions in protein-ligand complexes is essential for research in structural bioinformatics, drug discovery and biology. However, comprehensive tools are not freely available to the research community. Here, we present the protein-ligand interaction profiler (PLIP), a novel web service for fully automated detection and visualization of relevant non-covalent protein-ligand contacts in 3D structures, freely available at projects.biotec.tu-dresden.de/plip-web. The input is either a Protein Data Bank structure, a protein or ligand name, or a custom protein-ligand complex (e.g. from docking). In contrast to other tools, the rule-based PLIP algorithm does not require any structure preparation. It returns a list of detected interactions on single atom level, covering seven interaction types (hydrogen bonds, hydrophobic contacts, pi-stacking, pi-cation interactions, salt bridges, water bridges and halogen bonds). PLIP stands out by offering publication-ready images, PyMOL session files to generate custom images and parsable result files to facilitate successive data processing. The full python source code is available for download on the website. PLIP's command-line mode allows for high-throughput interaction profiling. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.

  7. Exploration and Evaluation of Nanometer Low-power Multi-core VLSI Computer Architectures

    DTIC Science & Technology

    2015-03-01

    ICC, the Milkyway database was created using the command: milkyway –galaxy –nogui –tcl –log memory.log one.tcl As stated previously, it is...EDA tools. Typically, Synopsys® tools use Milkyway databases, whereas Cadence Design Systems® tools use Library Exchange Format (LEF) files. To help

  8. STS-133 crew members Lindsey, Boe and Drew during Tool/Repair Kits training with instructor

    NASA Image and Video Library

    2010-01-26

    JSC2010-E-014267 (26 Jan. 2010) --- NASA astronauts Steve Lindsey (center), STS-133 commander; Eric Boe (left), pilot; and Alvin Drew, mission specialist, participate in an ISS tools and repair kits training session in the Space Vehicle Mock-up Facility at NASA's Johnson Space Center.

  9. Integrating Computer-Assisted Translation Tools into Language Learning

    ERIC Educational Resources Information Center

    Fernández-Parra, María

    2016-01-01

    Although Computer-Assisted Translation (CAT) tools play an important role in the curriculum in many university translator training programmes, they are seldom used in the context of learning a language, as a good command of a language is needed before starting to translate. Since many institutions often have translator-training programmes as well…

  10. Language-Agnostic Reproducible Data Analysis Using Literate Programming.

    PubMed

    Vassilev, Boris; Louhimo, Riku; Ikonen, Elina; Hautaniemi, Sampsa

    2016-01-01

    A modern biomedical research project can easily contain hundreds of analysis steps and lack of reproducibility of the analyses has been recognized as a severe issue. While thorough documentation enables reproducibility, the number of analysis programs used can be so large that in reality reproducibility cannot be easily achieved. Literate programming is an approach to present computer programs to human readers. The code is rearranged to follow the logic of the program, and to explain that logic in a natural language. The code executed by the computer is extracted from the literate source code. As such, literate programming is an ideal formalism for systematizing analysis steps in biomedical research. We have developed the reproducible computing tool Lir (literate, reproducible computing) that allows a tool-agnostic approach to biomedical data analysis. We demonstrate the utility of Lir by applying it to a case study. Our aim was to investigate the role of endosomal trafficking regulators to the progression of breast cancer. In this analysis, a variety of tools were combined to interpret the available data: a relational database, standard command-line tools, and a statistical computing environment. The analysis revealed that the lipid transport related genes LAPTM4B and NDRG1 are coamplified in breast cancer patients, and identified genes potentially cooperating with LAPTM4B in breast cancer progression. Our case study demonstrates that with Lir, an array of tools can be combined in the same data analysis to improve efficiency, reproducibility, and ease of understanding. Lir is an open-source software available at github.com/borisvassilev/lir.
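    The core mechanism named above, extracting the executable code from a literate source, is the classic "tangle" step. The sketch below uses noweb-style chunk delimiters ("<<...>>=" to open a chunk, "@" to close it) as an illustrative assumption; Lir's actual chunk syntax may differ.

```python
# Sketch of the "tangle" step in literate programming: pull code chunks
# out of a prose document so the computer can execute exactly what the
# human-readable document describes.

def tangle(literate_source):
    """Return the concatenated bodies of all code chunks."""
    code, in_chunk = [], False
    for line in literate_source.splitlines(keepends=True):
        if line.startswith("<<") and line.rstrip().endswith(">>="):
            in_chunk = True               # chunk header opens a code region
        elif line.strip() == "@":
            in_chunk = False              # "@" closes the code region
        elif in_chunk:
            code.append(line)
    return "".join(code)

doc = (
    "Count the samples.\n"
    "<<count>>=\n"
    "n_samples = 2 + 3\n"
    "@\n"
    "Then scale the result.\n"
    "<<report>>=\n"
    "result = n_samples * 10\n"
    "@\n"
)
env = {}
exec(tangle(doc), env)
print(env["result"])  # 50
```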

  11. Language-Agnostic Reproducible Data Analysis Using Literate Programming

    PubMed Central

    Vassilev, Boris; Louhimo, Riku; Ikonen, Elina; Hautaniemi, Sampsa

    2016-01-01

    A modern biomedical research project can easily contain hundreds of analysis steps and lack of reproducibility of the analyses has been recognized as a severe issue. While thorough documentation enables reproducibility, the number of analysis programs used can be so large that in reality reproducibility cannot be easily achieved. Literate programming is an approach to present computer programs to human readers. The code is rearranged to follow the logic of the program, and to explain that logic in a natural language. The code executed by the computer is extracted from the literate source code. As such, literate programming is an ideal formalism for systematizing analysis steps in biomedical research. We have developed the reproducible computing tool Lir (literate, reproducible computing) that allows a tool-agnostic approach to biomedical data analysis. We demonstrate the utility of Lir by applying it to a case study. Our aim was to investigate the role of endosomal trafficking regulators to the progression of breast cancer. In this analysis, a variety of tools were combined to interpret the available data: a relational database, standard command-line tools, and a statistical computing environment. The analysis revealed that the lipid transport related genes LAPTM4B and NDRG1 are coamplified in breast cancer patients, and identified genes potentially cooperating with LAPTM4B in breast cancer progression. Our case study demonstrates that with Lir, an array of tools can be combined in the same data analysis to improve efficiency, reproducibility, and ease of understanding. Lir is an open-source software available at github.com/borisvassilev/lir. PMID:27711123

  12. Community Intercomparison Suite (CIS) v1.4.0: a tool for intercomparing models and observations

    NASA Astrophysics Data System (ADS)

    Watson-Parris, Duncan; Schutgens, Nick; Cook, Nicholas; Kipling, Zak; Kershaw, Philip; Gryspeerdt, Edward; Lawrence, Bryan; Stier, Philip

    2016-09-01

    The Community Intercomparison Suite (CIS) is an easy-to-use command-line tool which has been developed to allow the straightforward intercomparison of remote sensing, in situ and model data. While there are a number of tools available for working with climate model data, the large diversity of sources (and formats) of remote sensing and in situ measurements necessitated a novel software solution. Developed by a professional software company, CIS supports a large number of gridded and ungridded data sources "out-of-the-box", including climate model output in NetCDF or the UK Met Office pp file format, CloudSat, CALIOP (Cloud-Aerosol Lidar with Orthogonal Polarization), MODIS (MODerate resolution Imaging Spectroradiometer), Cloud and Aerosol CCI (Climate Change Initiative) level 2 satellite data and a number of in situ aircraft and ground station data sets. The open-source architecture also supports user-defined plugins to allow many other sources to be easily added. Many of the key operations required when comparing heterogeneous data sets are provided by CIS, including subsetting, aggregating, collocating and plotting the data. Output data are written to CF-compliant NetCDF files to ensure interoperability with other tools and systems. The latest documentation, including a user manual and installation instructions, can be found on our website (http://cistools.net). Here, we describe the need which this tool fulfils, followed by descriptions of its main functionality (as at version 1.4.0) and plugin architecture which make it unique in the field.
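    Collocation, one of the key operations listed above, can be illustrated with a minimal nearest-neighbour matching in one dimension. Real CIS offers several collocation kernels and handles latitude/longitude geometry and gridded/ungridded combinations properly; this sketch only conveys the idea, and all coordinates are made up.

```python
# Minimal sketch of the collocation idea: map each ungridded observation
# onto the index of the nearest model grid point (nearest-neighbour kernel).

def collocate_nearest(obs_points, grid_points):
    """Return, for each observation coordinate, the index of the nearest grid point."""
    indices = []
    for x in obs_points:
        nearest = min(range(len(grid_points)), key=lambda i: abs(grid_points[i] - x))
        indices.append(nearest)
    return indices

grid = [0.0, 10.0, 20.0, 30.0]       # model grid coordinates (illustrative)
obs = [2.4, 11.0, 29.1]              # aircraft observation coordinates
print(collocate_nearest(obs, grid))  # [0, 1, 3]
```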

  13. Estimating Prediction Uncertainty from Geographical Information System Raster Processing: A User's Manual for the Raster Error Propagation Tool (REPTool)

    USGS Publications Warehouse

    Gurdak, Jason J.; Qi, Sharon L.; Geisler, Michael L.

    2009-01-01

    The U.S. Geological Survey Raster Error Propagation Tool (REPTool) is a custom tool for use with the Environmental System Research Institute (ESRI) ArcGIS Desktop application to estimate error propagation and prediction uncertainty in raster processing operations and geospatial modeling. REPTool is designed to introduce concepts of error and uncertainty in geospatial data and modeling and provide users of ArcGIS Desktop a geoprocessing tool and methodology to consider how error affects geospatial model output. Similar to other geoprocessing tools available in ArcGIS Desktop, REPTool can be run from a dialog window, from the ArcMap command line, or from a Python script. REPTool consists of public-domain, Python-based packages that implement Latin Hypercube Sampling within a probabilistic framework to track error propagation in geospatial models and quantitatively estimate the uncertainty of the model output. Users may specify error for each input raster or model coefficient represented in the geospatial model. The error for the input rasters may be specified as either spatially invariant or spatially variable across the spatial domain. Users may specify model output as a distribution of uncertainty for each raster cell. REPTool uses the Relative Variance Contribution method to quantify the relative error contribution from the two primary components in the geospatial model - errors in the model input data and coefficients of the model variables. REPTool is appropriate for many types of geospatial processing operations, modeling applications, and related research questions, including applications that consider spatially invariant or spatially variable error in geospatial data.
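    The Latin Hypercube Sampling error-propagation idea described above can be sketched for a single raster cell: draw one value from each equal-probability stratum of the input's error distribution, run the model on every draw, and summarize the spread of the output. The cell value, its error, and the linear model below are illustrative, not REPTool's actual interfaces.

```python
import random
from statistics import NormalDist, mean, stdev

def lhs_normal(n, mu, sigma, rng):
    """n LHS draws from Normal(mu, sigma): one uniform quantile per stratum."""
    nd = NormalDist(mu, sigma)
    quantiles = [(i + rng.random()) / n for i in range(n)]  # one per stratum
    rng.shuffle(quantiles)
    return [nd.inv_cdf(q) for q in quantiles]

def model(elevation):
    # Illustrative raster-cell model: predicted runoff from elevation.
    return 2.0 * elevation + 5.0

rng = random.Random(42)
samples = lhs_normal(100, mu=120.0, sigma=3.0, rng=rng)  # cell value 120 +/- 3
outputs = [model(x) for x in samples]
print(round(mean(outputs)), round(stdev(outputs)))  # near 245 and 6
```

    Because the draws are stratified, the output statistics converge faster than with plain Monte Carlo sampling for the same number of model runs.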

  14. Use of cloud computing technology in natural hazard assessment and emergency management

    NASA Astrophysics Data System (ADS)

    Webley, P. W.; Dehn, J.

    2015-12-01

    During a natural hazard event, the most up-to-date data needs to be in the hands of those on the front line. Decision support system tools can be developed to provide access to pre-made outputs to quickly assess the hazard and potential risk. However, with the ever-growing availability of new satellite data as well as ground and airborne data generated in real-time, there is a need to analyze the large volumes of data in an easy-to-access and effective environment. With the growth in the use of cloud computing, where the analysis and visualization system can grow with the needs of the user, these facilities can be used to provide this real-time analysis. Think of a central command center uploading the data to the cloud compute system, and those researchers in the field connecting to a web-based tool to view the newly acquired data. New data can be added by any user and then viewed instantly by anyone else in the organization through the cloud computing interface. This provides the ideal tool for collaborative data analysis, hazard assessment and decision making. We present the rationale for developing a cloud computing system and illustrate how this tool can be developed for use in real-time environments. Users would have access to an interactive online image analysis tool without the need for specific remote sensing software on their local system, thereby increasing their understanding of the ongoing hazard and helping to mitigate its impact on the surrounding region.

  15. Method for hierarchical modeling of the command of flexible manufacturing systems

    NASA Astrophysics Data System (ADS)

    Ausfelder, Christian; Castelain, Emmanuel; Gentina, Jean-Claude

    1994-04-01

    The present paper focuses on the modeling of the command and proposes a hierarchical and modular approach which is oriented towards the physical structure of FMS. The requirements arising from the monitoring of FMS are discussed and integrated in the proposed model. Its modularity keeps the approach open for extensions concerning the production resources as well as the products. As a modeling tool, we have chosen Object Petri nets. The first part of the paper describes desirable features of an FMS command such as safety, robustness, and adaptability. As is shown, these features result from the flexibility of the installation. The modeling method presented in the second part of the paper begins with a structural analysis of FMS and defines a natural command hierarchy, where the coordination of the production process, the synchronization of production resources on products, and the internal coordination are treated separately. The method is rigorous and leads to a structured and modular Petri net model which can be used for FMS simulation or translated into the final command code.
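    The Petri net primitive underlying the modeling approach above is the transition fire rule: consume tokens from the pre-condition places, produce tokens in the post-condition places. The record uses Object Petri nets; the plain place/transition sketch below, with a made-up "machine loads part" transition, is a simplification for illustration.

```python
# A minimal Petri-net fire rule (places, transitions, and marking are toy data).

def enabled(marking, pre):
    """A transition is enabled if every pre-place holds enough tokens."""
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, pre, post):
    """Consume pre-condition tokens, produce post-condition tokens."""
    if not enabled(marking, pre):
        raise ValueError("transition not enabled")
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

# Illustrative "load part" transition: needs a free machine and a waiting part.
marking = {"machine_free": 1, "part_waiting": 2, "part_loaded": 0}
pre = {"machine_free": 1, "part_waiting": 1}
post = {"part_loaded": 1}
print(fire(marking, pre, post))
# {'machine_free': 0, 'part_waiting': 1, 'part_loaded': 1}
```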

  16. ULSGEN (Uplink Summary Generator)

    NASA Technical Reports Server (NTRS)

    Wang, Y.-F.; Schrock, M.; Reeve, T.; Nguyen, K.; Smith, B.

    2014-01-01

    Uplink is an important part of spacecraft operations. Ensuring the accuracy of uplink content is essential to mission success. Before commands are radiated to the spacecraft, the command and sequence must be reviewed and verified by various teams. In most cases, this process requires collecting the command data, reviewing the data during a command conference meeting, and providing physical signatures by designated members of various teams to signify approval of the data. If commands or sequences are disapproved for some reason, the whole process must be restarted. Recording data and decision history is important for traceability reasons. Given that many steps and people are involved in this process, an easily accessible software tool for managing the process is vital to reducing human error, which could result in uplinking incorrect data to the spacecraft. An uplink summary generator called ULSGEN was developed to assist this uplink content approval process. ULSGEN generates a web-based summary of uplink file content and provides an online review process. Spacecraft operations personnel view this summary as a final check before actual radiation of the uplink data.

  17. Microfluidic Pneumatic Logic Circuits and Digital Pneumatic Microprocessors for Integrated Microfluidic Systems

    PubMed Central

    Rhee, Minsoung

    2010-01-01

    We have developed pneumatic logic circuits and microprocessors built with microfluidic channels and valves in polydimethylsiloxane (PDMS). The pneumatic logic circuits perform various combinational and sequential logic calculations with binary pneumatic signals (atmosphere and vacuum), producing cascadable outputs based on Boolean operations. A complex microprocessor is constructed from combinations of various logic circuits and receives pneumatically encoded serial commands at a single input line. The device then decodes the temporal command sequence by spatial parallelization, computes necessary logic calculations between parallelized command bits, stores command information for signal transportation and maintenance, and finally executes the command for the target devices. Thus, such pneumatic microprocessors will function as a universal on-chip control platform to perform complex parallel operations for large-scale integrated microfluidic devices. To demonstrate the working principles, we have built 2-bit, 3-bit, 4-bit, and 8-bit microprocessors to control various target devices for applications such as four-color dye mixing and multiplexed channel fluidic control. By significantly reducing the need for external controllers, the digital pneumatic microprocessor can be used as a universal on-chip platform to autonomously manipulate microfluids in a high throughput manner. PMID:19823730
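    The decode-and-execute flow described above, a serial command stream parallelized into command words and then interpreted by Boolean logic, can be sketched in software. The 4-bit command format and the valve-select logic below are illustrative assumptions, not the actual pneumatic encoding.

```python
# Sketch of serial-to-parallel command decoding with a Boolean decode stage.

def deserialize(serial_bits, width=4):
    """Spatial parallelization: collect the serial stream into parallel words."""
    return [tuple(serial_bits[i:i + width]) for i in range(0, len(serial_bits), width)]

def execute(command):
    """Boolean decode: bit 0 enables, bits 1-3 select valves (illustrative)."""
    enable, v0, v1, v2 = command
    return {"valve0": enable and v0, "valve1": enable and v1, "valve2": enable and v2}

stream = [1, 0, 1, 0,   # command 1: enabled, valve1 selected
          0, 1, 0, 0]   # command 2: disabled (enable bit clear)
for word in deserialize(stream):
    print(execute(word))
```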

  18. AXAF user interfaces for heterogeneous analysis environments

    NASA Technical Reports Server (NTRS)

    Mandel, Eric; Roll, John; Ackerman, Mark S.

    1992-01-01

    The AXAF Science Center (ASC) will develop software to support all facets of data center activities and user research for the AXAF X-ray Observatory, scheduled for launch in 1999. The goal is to provide astronomers with the ability to utilize heterogeneous data analysis packages, that is, to allow astronomers to pick the best packages for doing their scientific analysis. For example, ASC software will be based on IRAF, but non-IRAF programs will be incorporated into the data system where appropriate. Additionally, it is desired to allow AXAF users to mix ASC software with their own local software. The need to support heterogeneous analysis environments is not special to the AXAF project, and therefore finding mechanisms for coordinating heterogeneous programs is an important problem for astronomical software today. The approach to solving this problem has been to develop two interfaces that allow the scientific user to run heterogeneous programs together. The first is an IRAF-compatible parameter interface that provides non-IRAF programs with IRAF's parameter handling capabilities. Included in the interface is an application programming interface to manipulate parameters from within programs, and also a set of host programs to manipulate parameters at the command line or from within scripts. The parameter interface has been implemented to support parameter storage formats other than IRAF parameter files, allowing one, for example, to access parameters that are stored in data bases. An X Windows graphical user interface called 'agcl' has been developed, layered on top of the IRAF-compatible parameter interface, that provides a standard graphical mechanism for interacting with IRAF and non-IRAF programs. Users can edit parameters and run programs for both non-IRAF programs and IRAF tasks. The agcl interface allows one to communicate with any command line environment in a transparent manner and without any changes to the original environment. 
For example, the authors routinely layer the GUI on top of IRAF, ksh, SMongo, and IDL. The agcl, based on the facilities of a system called Answer Garden, also has sophisticated support for examining documentation and help files, asking questions of experts, and developing a knowledge base of frequently required information. Thus, the GUI becomes a total environment for running programs, accessing information, examining documents, and finding human assistance. Because the agcl can communicate with any command-line environment, most projects can make use of it easily. New applications are continually being found for these interfaces. It is the authors' intention to evolve the GUI and its underlying parameter interface in response to these needs - from users as well as developers - throughout the astronomy community. This presentation describes the capabilities and technology of the above user interface mechanisms and tools. It also discusses the design philosophies guiding the work, as well as hopes for the future.
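    The parameter interface described above gives non-IRAF programs IRAF-style parameter handling. The sketch below reads a file in the IRAF convention of comma-separated "name,type,mode,value,...,prompt" records; the parser and the parameter file contents are simplifications for illustration, not the ASC implementation.

```python
import csv
import io

# Sketch of reading an IRAF-style parameter file.

def read_params(text):
    """Return {parameter name: value} from IRAF-style .par text."""
    params = {}
    for row in csv.reader(io.StringIO(text)):
        if not row or row[0].startswith("#"):
            continue  # skip blanks and comments
        name, _ptype, _mode, value = row[0], row[1], row[2], row[3]
        params[name] = value
    return params

par_file = '''# example.par (illustrative)
infile,s,a,"events.fits",,,"Input event file"
binsize,i,h,16,,,"Spatial bin size"
'''
p = read_params(par_file)
print(p["infile"], p["binsize"])  # events.fits 16
```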

  19. jCompoundMapper: An open source Java library and command-line tool for chemical fingerprints

    PubMed Central

    2011-01-01

    Background The decomposition of a chemical graph is a convenient approach to encode information of the corresponding organic compound. While several commercial toolkits exist to encode molecules as so-called fingerprints, only a few open source implementations are available. The aim of this work is to introduce a library for exactly defined molecular decompositions, with a strong focus on the application of these features in machine learning and data mining. It provides several options such as search depth, distance cut-offs, atom- and pharmacophore typing. Furthermore, it provides the functionality to combine, to compare, or to export the fingerprints into several formats. Results We provide a Java 1.6 library for the decomposition of chemical graphs based on the open source Chemistry Development Kit toolkit. We reimplemented popular fingerprinting algorithms such as depth-first search fingerprints, extended connectivity fingerprints, autocorrelation fingerprints (e.g. CATS2D), radial fingerprints (e.g. Molprint2D), geometrical Molprint, atom pairs, and pharmacophore fingerprints. We also implemented custom fingerprints such as the all-shortest path fingerprint that only includes the subset of shortest paths from the full set of paths of the depth-first search fingerprint. As an application of jCompoundMapper, we provide a command-line executable binary. We measured the conversion speed and number of features for each encoding and described the composition of the features in detail. The quality of the encodings was tested using the default parametrizations in combination with a support vector machine on the Sutherland QSAR data sets. Additionally, we benchmarked the fingerprint encodings on the large-scale Ames toxicity benchmark using a large-scale linear support vector machine. The results were promising and could often compete with literature results. 
On the large Ames benchmark, for example, we obtained an AUC ROC performance of 0.87 with a reimplementation of the extended connectivity fingerprint. This result is comparable to the performance achieved by a non-linear support vector machine using state-of-the-art descriptors. On the Sutherland QSAR data set, the best fingerprint encodings showed a comparable or better performance on 5 of the 8 benchmarks when compared against the results of the best descriptors published in the paper by Sutherland et al. Conclusions jCompoundMapper is a library for chemical graph fingerprints with several tweaking possibilities and exporting options for open source data mining toolkits. The quality of the data mining results, the conversion speed, the LGPL software license, the command-line interface, and the exporters should be useful for many applications in cheminformatics like benchmarks against literature methods, comparison of data mining algorithms, similarity searching, and similarity-based data mining. PMID:21219648
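    The depth-first search fingerprint named above can be sketched as path enumeration followed by hashing into a fixed-size bit vector. The molecule encoding and the use of Python's built-in hash are simplifications for illustration, not jCompoundMapper's actual implementation.

```python
# Sketch of a depth-first search (path) fingerprint.

def dfs_paths(atoms, bonds, max_depth):
    """All label paths of up to max_depth atoms; bonds is an adjacency dict."""
    paths = set()

    def walk(node, visited, labels):
        paths.add("-".join(labels))
        if len(labels) == max_depth:
            return
        for nxt in bonds[node]:
            if nxt not in visited:
                walk(nxt, visited | {nxt}, labels + [atoms[nxt]])

    for start in range(len(atoms)):
        walk(start, {start}, [atoms[start]])
    return paths

def fingerprint(paths, n_bits=64):
    """Hash each path into a bit position (hash() is salted per interpreter run)."""
    bits = [0] * n_bits
    for p in paths:
        bits[hash(p) % n_bits] = 1
    return bits

# Ethanol heavy atoms: C-C-O
atoms = ["C", "C", "O"]
bonds = {0: [1], 1: [0, 2], 2: [1]}
paths = dfs_paths(atoms, bonds, max_depth=3)
print(sorted(paths))  # ['C', 'C-C', 'C-C-O', 'C-O', 'O', 'O-C', 'O-C-C']
fp = fingerprint(paths)
```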

  20. Network Visualization Project (NVP)

    DTIC Science & Technology

    2016-07-01

    network visualization, network traffic analysis, network forensics ...shell, is a command-line framework used for network forensic analysis. Dshell processes existing pcap files and filters output information based on

  1. 50 CFR 218.13 - Mitigation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... activities. (ii) The Navy shall follow internal chain of command reporting procedures as promulgated through...) South and East of Block Island (37 km (20 NM) seaward of line between 41-4.49° N. lat. 071-51.15° W...

  2. 50 CFR 218.13 - Mitigation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... activities. (ii) The Navy shall follow internal chain of command reporting procedures as promulgated through...) South and East of Block Island (37 km (20 NM) seaward of line between 41-4.49° N. lat. 071-51.15° W...

  3. Sig2BioPAX: Java tool for converting flat files to BioPAX Level 3 format.

    PubMed

    Webb, Ryan L; Ma'ayan, Avi

    2011-03-21

    The World Wide Web plays a critical role in enabling molecular, cell, systems and computational biologists to exchange, search, visualize, integrate, and analyze experimental data. Such efforts can be further enhanced through the development of semantic web concepts. The semantic web idea is to enable machines to understand data through the development of protocol free data exchange formats such as Resource Description Framework (RDF) and the Web Ontology Language (OWL). These standards provide formal descriptors of objects, object properties and their relationships within a specific knowledge domain. However, the overhead of converting datasets typically stored in data tables such as Excel, text or PDF into RDF or OWL formats is not trivial for non-specialists and as such produces a barrier to seamless data exchange between researchers, databases and analysis tools. This problem is particularly of importance in the field of network systems biology where biochemical interactions between genes and their protein products are abstracted to networks. For the purpose of converting biochemical interactions into the BioPAX format, which is the leading standard developed by the computational systems biology community, we developed an open-source command line tool that takes as input tabular data describing different types of molecular biochemical interactions. The tool converts such interactions into the BioPAX level 3 OWL format. We used the tool to convert several existing and new mammalian networks of protein interactions, signalling pathways, and transcriptional regulatory networks into BioPAX. Some of these networks were deposited into PathwayCommons, a repository for consolidating and organizing biochemical networks. The software tool Sig2BioPAX is a resource that enables experimental and computational systems biologists to contribute their identified networks and pathways of molecular interactions for integration and reuse with the rest of the research community.
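    The conversion step described above, tabular interaction data into formal statements, can be sketched as a table-to-triples transformation. Real BioPAX Level 3 OWL output is far richer than subject-predicate-object triples, and the row data below is made up; this only conveys the flat-file parsing idea.

```python
import csv
import io

# Sketch of converting tab-separated interaction rows into triples.

def rows_to_triples(tsv_text):
    """Parse (source, interaction type, target) rows into tuples."""
    triples = []
    for row in csv.reader(io.StringIO(tsv_text), delimiter="\t"):
        if len(row) == 3:                 # skip malformed rows
            triples.append(tuple(row))
    return triples

table = "EGFR\tphosphorylates\tERK1\nTP53\tactivates\tCDKN1A\n"
for s, p, o in rows_to_triples(table):
    print(f"{s} --{p}--> {o}")
# EGFR --phosphorylates--> ERK1
# TP53 --activates--> CDKN1A
```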

  4. Extended Testability Analysis Tool

    NASA Technical Reports Server (NTRS)

    Melcher, Kevin; Maul, William A.; Fulton, Christopher

    2012-01-01

    The Extended Testability Analysis (ETA) Tool is a software application that supports fault management (FM) by performing testability analyses on the fault propagation model of a given system. Fault management includes the prevention of faults through robust design margins and quality assurance methods, or the mitigation of system failures. Fault management requires an understanding of the system design and operation, potential failure mechanisms within the system, and the propagation of those potential failures through the system. The purpose of the ETA Tool software is to process the testability analysis results from a commercial software program called TEAMS Designer in order to provide a detailed set of diagnostic assessment reports. The ETA Tool is a command-line process with several user-selectable report output options. The ETA Tool also extends the COTS testability analysis and enables variation studies with sensor sensitivity impacts on system diagnostics and component isolation using a single testability output. The ETA Tool can also provide extended analyses from a single set of testability output files. The following analysis reports are available to the user: (1) the Detectability Report provides a breakdown of how each tested failure mode was detected, (2) the Test Utilization Report identifies all the failure modes that each test detects, (3) the Failure Mode Isolation Report demonstrates the system's ability to discriminate between failure modes, (4) the Component Isolation Report demonstrates the system's ability to discriminate between failure modes relative to the components containing the failure modes, (5) the Sensor Sensitivity Analysis Report shows the diagnostic impact due to loss of sensor information, and (6) the Effect Mapping Report identifies failure modes that result in specified system-level effects.
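    A command-line process with user-selectable report options, as described above, can be sketched with a standard argument parser. The report names follow the list in the record, but the flag spellings and file argument are assumptions; this is not the ETA Tool's actual interface.

```python
import argparse

# Sketch of a report-selection CLI in the style described for the ETA Tool.
REPORTS = {
    "detectability": "Detectability Report",
    "test-utilization": "Test Utilization Report",
    "failure-mode-isolation": "Failure Mode Isolation Report",
    "component-isolation": "Component Isolation Report",
    "sensor-sensitivity": "Sensor Sensitivity Analysis Report",
    "effect-mapping": "Effect Mapping Report",
}

def build_parser():
    parser = argparse.ArgumentParser(description="Generate testability reports")
    parser.add_argument("testability_output", help="TEAMS Designer output file")
    parser.add_argument("--report", action="append", choices=sorted(REPORTS),
                        help="report to generate (repeatable)")
    return parser

args = build_parser().parse_args(
    ["teams_output.txt", "--report", "detectability", "--report", "effect-mapping"])
print([REPORTS[r] for r in args.report])
# ['Detectability Report', 'Effect Mapping Report']
```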

  5. Design strategies and functionality of the Visual Interface for Virtual Interaction Development (VIVID) tool

    NASA Technical Reports Server (NTRS)

    Nguyen, Lac; Kenney, Patrick J.

    1993-01-01

    Development of interactive virtual environments (VE) has typically consisted of three primary activities: model (object) development, model relationship tree development, and environment behavior definition and coding. The model and relationship tree development activities are accomplished with a variety of well-established graphic library (GL) based programs - most utilizing graphical user interfaces (GUI) with point-and-click interactions. Because of this GUI format, little programming expertise on the part of the developer is necessary to create the 3D graphical models or to establish interrelationships between the models. However, the third VE development activity, environment behavior definition and coding, has generally required the greatest amount of time and programmer expertise. Behaviors, characteristics, and interactions between objects and the user within a VE must be defined via command line C coding prior to rendering the environment scenes. In an effort to simplify this environment behavior definition phase for non-programmers, and to provide easy access to model and tree tools, a graphical interface and development tool has been created. The principal thrust of this research is to effect rapid development and prototyping of virtual environments. This presentation will discuss the 'Visual Interface for Virtual Interaction Development' (VIVID) tool: an X-Windows based system employing drop-down menus for user selection of program access, models and trees, behavior editing, and code generation. Examples of these selections will be highlighted in this presentation, as will the currently available program interfaces. The functionality of this tool allows non-programming users access to all facets of VE development while providing experienced programmers with a collection of pre-coded behaviors. In conjunction with its existing interfaces and predefined suite of behaviors, future development plans for VIVID will be described. 
These include incorporation of dual user virtual environment enhancements, tool expansion, and additional behaviors.

  6. STS-47 crew during fire fighting exercises at JSC's Fire Training Pit

    NASA Technical Reports Server (NTRS)

    1992-01-01

    STS-47 Endeavour, Orbiter Vehicle (OV) 105, crewmembers line up along water hoses to extinguish a blaze in JSC's Fire Training Pit during fire fighting exercises. Manning the hose in the foreground are Payload Specialist Mamoru Mohri, holding the hose nozzle, backup Payload Specialist Takao Doi, Mission Specialist (MS) Jerome Apt, and Commander Robert L. Gibson, at rear. Lined up on the second hose are Pilot Curtis L. Brown, Jr, holding the hose nozzle, followed by MS N. Jan Davis, MS and Payload Commander (PLC) Mark C. Lee, and backup Payload Specialist Stan Koszelak. A veteran firefighter monitors the effort from a position between the two hoses. In the background, backup Payload Specialist Chiaki Naito-Mukai, donning gloves, and MS Mae C. Jemison look on. The Fire Training Pit is located across from the Gilruth Center Bldg 207. Mohri, Doi, and Mukai all represent Japan's National Space Development Agency (NASDA).

  7. Ada (Trade Name) Foundation Technology. Volume 4. Software Requirements for WIS (WWMCCS (World Wide Military Command and Control System) Information System) Text Processing Prototypes

    DTIC Science & Technology

    1986-12-01

    graphics: The package allows a character set which can be defined by users, giving the picture for a character by designating its pixels. Such characters...type fonts and gsei-oriented "help" messages tailored to the operations being performed and user expertise. In general, critical design issues...other volumes include command language, software design, description and analysis tools, database management system, operating systems; planning and

  8. JOVIAL J73 Automated Verification System - Study Phase

    DTIC Science & Technology

    1980-08-01

    capabilities for the tool, and the high-level design of the tool are also described. Future capabilities for the tool are identified. ...Both JOVIAL languages are primarily designed for command and control system programming. They are especially well suited to large systems requiring

  9. Study of Tools for Command and Telemetry Dictionaries

    NASA Technical Reports Server (NTRS)

    Pires, Craig; Knudson, Matthew D.

    2017-01-01

    The Command and Telemetry Dictionary is at the heart of space missions. The C&T Dictionary represents all of the information that is exchanged between the various systems both in space and on the ground. Large amounts of ever-changing information have to be disseminated to all of the various systems and sub-systems throughout all phases of the mission. The typical approach of having each sub-system manage its own information flow results in a patchwork of methods within a mission. This leads to significant duplication of effort and potential errors. More centralized methods have been developed to manage this data flow. This presentation will compare two tools developed for this purpose, CCDD and SCIMI, which were designed to work with the Core Flight System (cFS).

  10. RSAT matrix-clustering: dynamic exploration and redundancy reduction of transcription factor binding motif collections

    PubMed Central

    Jaeger, Sébastien; Thieffry, Denis

    2017-01-01

    Abstract Transcription factor (TF) databases contain multitudes of binding motifs (TFBMs) from various sources, from which non-redundant collections are derived by manual curation. The advent of high-throughput methods stimulated the production of novel collections with increasing numbers of motifs. Meta-databases, built by merging these collections, contain redundant versions, because available tools are not suited to automatically identify and explore biologically relevant clusters among thousands of motifs. Motif discovery from genome-scale data sets (e.g. ChIP-seq) also produces redundant motifs, hampering the interpretation of results. We present matrix-clustering, a versatile tool that clusters similar TFBMs into multiple trees, and automatically creates non-redundant TFBM collections. A feature unique to matrix-clustering is its dynamic visualisation of aligned TFBMs, and its capability to simultaneously treat multiple collections from various sources. We demonstrate that matrix-clustering considerably simplifies the interpretation of combined results from multiple motif discovery tools, and highlights biologically relevant variations of similar motifs. We also ran a large-scale application to cluster ∼11 000 motifs from 24 entire databases, showing that matrix-clustering correctly groups motifs belonging to the same TF families and drastically reduces motif redundancy. matrix-clustering is integrated within the RSAT suite (http://rsat.eu/), accessible through a user-friendly web interface or command-line for its integration in pipelines. PMID:28591841
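
    A sketch of the core operation such clustering rests on: scoring the similarity of two aligned position frequency matrices. matrix-clustering supports several similarity metrics; the column-wise Pearson correlation below is one common choice, and the motif data and function names are illustrative assumptions only.

```python
# Hypothetical sketch: mean column-wise Pearson correlation between two
# aligned, equal-width position frequency matrices (PFMs). Clustering tools
# build a distance matrix from scores like this before tree construction.

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x) ** 0.5
    vy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (vx * vy)

def motif_similarity(pfm_a, pfm_b):
    """Mean per-column correlation; each column is (A, C, G, T) counts."""
    scores = [pearson(a, b) for a, b in zip(pfm_a, pfm_b)]
    return sum(scores) / len(scores)

# Two similar 3-bp motifs (counts over A, C, G, T at each position)
m1 = [(8, 0, 1, 1), (0, 9, 1, 0), (1, 0, 8, 1)]
m2 = [(7, 1, 1, 1), (1, 8, 0, 1), (0, 1, 9, 0)]
print(motif_similarity(m1, m2))  # near-identical motifs score close to 1
```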

  11. Digital Interface Board to Control Phase and Amplitude of Four Channels

    NASA Technical Reports Server (NTRS)

    Smith, Amy E.; Cook, Brian M.; Khan, Abdur R.; Lux, James P.

    2011-01-01

    An increasing number of parts are designed with digital control interfaces, including phase shifters and variable attenuators. When designing an antenna array in which each antenna has independent amplitude and phase control, the number of digital control lines that must be set simultaneously can grow very large. Use of a parallel interface would require separate line drivers, more parts, and thus additional failure points. A convenient form of control where single-phase shifters or attenuators could be set or the whole set could be programmed with an update rate of 100 Hz is needed to solve this problem. A digital interface board with a field-programmable gate array (FPGA) can simultaneously control an essentially arbitrary number of digital control lines with a serial command interface requiring only three wires. A small set of short, high-level commands provides a simple programming interface for an external controller. Parity bits are used to validate the control commands. Output timing is controlled within the FPGA to allow for rapid update rates of the phase shifters and attenuators. This technology has been used to set and monitor eight 5-bit control signals via a serial UART (universal asynchronous receiver/transmitter) interface. The digital interface board controls the phase and amplitude of the signals for each element in the array. A host computer running Agilent VEE sends commands via serial UART connection to a Xilinx VirtexII FPGA. The commands are decoded, and either outputs are set or telemetry data is sent back to the host computer describing the status and the current phase and amplitude settings. This technology is an integral part of a closed-loop system in which the angle of arrival of an X-band uplink signal is detected and the appropriate phase shifts are applied to the Ka-band downlink signal to electronically steer the array back in the direction of the uplink signal. 
It will also be used in the non-beam-steering case to compensate for phase-shift variations through power amplifiers. The digital interface board can be used to set four 5-bit phase shifters and four 5-bit attenuators and monitor their current settings. Additionally, it is useful outside of the closed-loop system for beam-steering alone. When the VEE program is started, it prompts the user to initialize variables (to zero) or skip initialization. After that, the program enters a continuous loop, waiting for the telemetry period to elapse or a button to be pushed. A telemetry request is sent when the telemetry period elapses (every five seconds). Pushing one of the set or reset buttons sends the appropriate command. When a command is sent, the interface status is returned, and the user is notified by a pop-up window if any error has occurred. The program runs until the End Program button is pressed.
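
    The parity protection described above can be sketched in a few lines. The 16-bit frame layout below is hypothetical (the abstract does not specify the board's actual framing); it only illustrates how a single even-parity bit lets the FPGA validate a control command.

```python
# Minimal sketch of even-parity protection for a serial command word.
# The 15-bit payload / 1-bit parity split here is an assumption.

def even_parity(word: int) -> int:
    """Return 0 if the word has an even number of 1 bits, else 1."""
    p = 0
    while word:
        p ^= word & 1
        word >>= 1
    return p

def frame_command(payload: int) -> int:
    """Append one even-parity bit to a 15-bit payload (bit 0 = parity)."""
    return (payload << 1) | even_parity(payload)

def check_frame(frame: int) -> bool:
    """A valid frame has an even total number of 1 bits."""
    return even_parity(frame) == 0

cmd = frame_command(0b10110_01010_10011)   # e.g. three 5-bit control fields
assert check_frame(cmd)
assert not check_frame(cmd ^ 0b100)        # a single bit flip is detected
```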

  12. maxdLoad2 and maxdBrowse: standards-compliant tools for microarray experimental annotation, data management and dissemination.

    PubMed

    Hancock, David; Wilson, Michael; Velarde, Giles; Morrison, Norman; Hayes, Andrew; Hulme, Helen; Wood, A Joseph; Nashar, Karim; Kell, Douglas B; Brass, Andy

    2005-11-03

    maxdLoad2 is a relational database schema and Java application for microarray experimental annotation and storage. It is compliant with all standards for microarray meta-data capture, including the specification of what data should be recorded, extensive use of standard ontologies and support for data exchange formats. The output from maxdLoad2 is of a form acceptable for submission to the ArrayExpress microarray repository at the European Bioinformatics Institute. maxdBrowse is a PHP web-application that makes the contents of maxdLoad2 databases accessible via web-browser, the command-line and web-service environments. It thus acts as both a dissemination and data-mining tool. maxdLoad2 presents an easy-to-use interface to an underlying relational database and provides a full complement of facilities for browsing, searching and editing. There is a tree-based visualization of data connectivity and the ability to explore the links between any pair of data elements, irrespective of how many intermediate links lie between them. Its principal novel features are: the flexibility of the meta-data that can be captured, the tools provided for importing data from spreadsheets and other tabular representations, the tools provided for the automatic creation of structured documents, and the ability to browse and access the data via web and web-services interfaces. Within maxdLoad2 it is very straightforward to customise the meta-data that is being captured or change the definitions of the meta-data. These meta-data definitions are stored within the database itself, allowing client software to connect properly to a modified database without having to be specially configured. The meta-data definitions (configuration file) can also be centralized, allowing changes made in response to revisions of standards or terminologies to be propagated to clients without user intervention. maxdBrowse is hosted on a web-server and presents multiple interfaces to the contents of maxd databases. 
maxdBrowse emulates many of the browse and search features available in the maxdLoad2 application via a web-browser. This allows users who are not familiar with maxdLoad2 to browse and export microarray data from the database for their own analysis. The same browse and search features are also available via command-line and SOAP server interfaces. This both enables scripting of data export for use embedded in data repositories and analysis environments, and allows access to the maxd databases via web-service architectures. maxdLoad2 http://www.bioinf.man.ac.uk/microarray/maxd/ and maxdBrowse http://dbk.ch.umist.ac.uk/maxdBrowse are portable and compatible with all common operating systems and major database servers. They provide a powerful, flexible package for annotation of microarray experiments and a convenient dissemination environment. They are available for download and open sourced under the Artistic License.

  13. The Time is Now: Legislation for the Interagency

    DTIC Science & Technology

    2012-05-17

    Studies, 2005), 6. 7 Sunil B. Desai, "Solving the Interagency Puzzle," Policy Review No. 129, Hoover Institution, Stanford University, [1 February...Management," which broke the problem of pacification down into three simultaneous lines of effort.23 The three lines of effort focused on security...Westmoreland's command. This decision, with its short time constraint, was bankrupt from the beginning, because the civilian and military chains remained

  14. Pattern Recognition Control Design

    NASA Technical Reports Server (NTRS)

    Gambone, Elisabeth

    2016-01-01

    Spacecraft control algorithms must know the expected spacecraft response to any command to the available control effectors, such as reaction thrusters or torque devices. Spacecraft control system design approaches have traditionally relied on the estimated vehicle mass properties to determine the desired force and moment, as well as knowledge of the effector performance to efficiently control the spacecraft. A pattern recognition approach can be used to investigate the relationship between the control effector commands and the spacecraft responses. Instead of supplying the approximated vehicle properties and the effector performance characteristics, a database of information relating the effector commands and the desired vehicle response can be used for closed-loop control. A Monte Carlo simulation data set of the spacecraft dynamic response to effector commands can be analyzed to establish the influence a command has on the behavior of the spacecraft. A tool developed at NASA Johnson Space Center (Ref. 1) to analyze flight dynamics Monte Carlo data sets through pattern recognition methods can be used to perform this analysis. Once a comprehensive data set relating spacecraft responses with commands is established, it can be used in place of traditional control laws and gain sets. This pattern recognition approach can be compared with traditional control algorithms to determine the potential benefits and uses.
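
    The database-driven control idea described above can be sketched as a nearest-neighbor query against the Monte Carlo table. The names, toy data, and Euclidean metric below are illustrative assumptions, not the JSC tool's actual implementation.

```python
# Hypothetical sketch: given (effector command, observed response) pairs from
# Monte Carlo runs, pick the command whose recorded response is closest to
# the desired response.

def nearest_command(table, desired):
    """table: list of (command, response) pairs; responses are equal-length tuples."""
    def dist(response):
        return sum((a - b) ** 2 for a, b in zip(response, desired))
    cmd, _ = min(table, key=lambda pair: dist(pair[1]))
    return cmd

# Toy data: thruster firings vs. resulting (pitch rate, yaw rate) in deg/s
table = [
    ("fire_A_100ms", (0.10, 0.00)),
    ("fire_B_100ms", (0.00, 0.12)),
    ("fire_AB_50ms", (0.05, 0.05)),
]
print(nearest_command(table, (0.06, 0.04)))  # → fire_AB_50ms
```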

  15. Flexible Environmental Modeling with Python and Open-GIS

    NASA Astrophysics Data System (ADS)

    Pryet, Alexandre; Atteia, Olivier; Delottier, Hugo; Cousquer, Yohann

    2015-04-01

    Numerical modeling now represents a prominent task of environmental studies. During the last decades, numerous commercial programs have been made available to environmental modelers. These software applications offer user-friendly graphical user interfaces that allow efficient management of many case studies. However, they suffer from a lack of flexibility, and closed-source policies impede source code reviewing and enhancement for original studies. Advanced modeling studies require flexible tools capable of managing thousands of model runs for parameter optimization, uncertainty and sensitivity analysis. In addition, there is a growing need for the coupling of various numerical models, associating, for instance, groundwater flow modeling with multi-species geochemical reactions. Researchers have produced hundreds of powerful open-source command line programs. However, there is a need for a flexible graphical user interface allowing efficient processing of the geospatial data that accompanies any environmental study. Here, we present the advantages of using the free and open-source QGIS platform and the Python scripting language for conducting environmental modeling studies. The interactive graphical user interface is first used for the visualization and pre-processing of input geospatial datasets. The Python scripting language is then employed for further input data processing, calls to one or several models, and post-processing of model outputs. Model results are eventually sent back to the GIS program, processed and visualized. This approach combines the advantages of interactive graphical interfaces and the flexibility of the Python scripting language for data processing and model calls. The numerous Python modules available facilitate geospatial data processing and numerical analysis of model outputs. Once input data has been prepared with the graphical user interface, models may be run thousands of times from the command line with sequential or parallel calls. 
We illustrate this approach with several case studies in groundwater hydrology and geochemistry and provide links to several Python libraries that facilitate pre- and post-processing operations.
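
    The batch-run pattern described above can be sketched as follows. The executable name `groundwater_model` and its flags are placeholders for whatever command-line model is in use; the `dry_run` switch is added so the sweep can be inspected without an actual model binary.

```python
# Sketch: once inputs are prepared in the GIS, call a command-line model
# many times with varied parameters (sequentially here; a parallel variant
# could dispatch the same command lists via multiprocessing.Pool).
import subprocess
from itertools import product

def model_command(recharge, conductivity):
    """Build the argument list for one model run (names are placeholders)."""
    return ["groundwater_model",
            "--recharge", str(recharge),
            "--conductivity", str(conductivity)]

def run_sweep(recharges, conductivities, dry_run=False):
    """Run the full parameter sweep; with dry_run=True just return the commands."""
    cmds = [model_command(r, k) for r, k in product(recharges, conductivities)]
    if not dry_run:
        for cmd in cmds:
            subprocess.run(cmd, check=True)
    return cmds

cmds = run_sweep((0.1, 0.2), (1e-4, 1e-5), dry_run=True)
assert len(cmds) == 4  # one command per parameter combination
```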

  16. The Chandra Source Catalog: User Interface

    NASA Astrophysics Data System (ADS)

    Bonaventura, Nina; Evans, I. N.; Harbo, P. N.; Rots, A. H.; Tibbetts, M. S.; Van Stone, D. W.; Zografou, P.; Anderson, C. S.; Chen, J. C.; Davis, J. E.; Doe, S. M.; Evans, J. D.; Fabbiano, G.; Galle, E.; Gibbs, D. G.; Glotfelty, K. J.; Grier, J. D.; Hain, R.; Hall, D. M.; He, X.; Houck, J. C.; Karovska, M.; Lauer, J.; McCollough, M. L.; McDowell, J. C.; Miller, J. B.; Mitschang, A. W.; Morgan, D. L.; Nichols, J. S.; Nowak, M. A.; Plummer, D. A.; Primini, F. A.; Refsdal, B. L.; Siemiginowska, A. L.; Sundheim, B. A.; Winkelman, S. L.

    2009-01-01

    The Chandra Source Catalog (CSC) is the definitive catalog of all X-ray sources detected by Chandra. The CSC is presented to the user in two tables: the Master Chandra Source Table and the Table of Individual Source Observations. Each distinct X-ray source identified in the CSC is represented by a single master source entry and one or more individual source entries. If a source is unaffected by confusion and pile-up in multiple observations, the individual source observations are merged to produce a master source. In each table, a row represents a source, and each column a quantity that is officially part of the catalog. The CSC contains positions and multi-band fluxes for the sources, as well as derived spatial, spectral, and temporal source properties. The CSC also includes associated source region and full-field data products for each source, including images, photon event lists, light curves, and spectra. The master source properties represent the best estimates of the properties of a source, and are presented in the following categories: Position and Position Errors, Source Flags, Source Extent and Errors, Source Fluxes, Source Significance, Spectral Properties, and Source Variability. The CSC Data Access GUI provides direct access to the source properties and data products contained in the catalog. The user may query the catalog database via a web-style search or an SQL command-line query. Each query returns a table of source properties, along with the option to browse and download associated data products. The GUI is designed to run in a web browser with Java version 1.5 or higher, and may be accessed via a link on the CSC website homepage (http://cxc.harvard.edu/csc/). As an alternative to the GUI, the contents of the CSC may be accessed directly through a URL, using the command-line tool, cURL. Support: NASA contract NAS8-03060 (CXC).

  17. UNIX as an environment for producing numerical software

    NASA Technical Reports Server (NTRS)

    Schryer, N. L.

    1978-01-01

    The UNIX operating system supports a number of software tools: a mathematical equation-setting language, a phototypesetting language, a FORTRAN preprocessor language, a text editor, and a command interpreter. The design, implementation, documentation, and maintenance of a portable FORTRAN test of the floating-point arithmetic unit of a computer is used to illustrate these tools at work.

  18. STS-133 crew members Lindsey, Boe and Drew during Tool/Repair Kits training with instructor

    NASA Image and Video Library

    2010-01-26

    JSC2010-E-014262 (26 Jan. 2010) --- NASA astronauts Eric Boe (left), STS-133 pilot; Steve Lindsey, commander; and Alvin Drew, mission specialist, participate in an ISS tools and repair kits training session in the Space Vehicle Mock-up Facility at NASA's Johnson Space Center. Instructor Ivy Apostolakopoulos assisted the crew members.

  19. The 2009 DOD Cost Research Workshop: Acquisition Reform

    DTIC Science & Technology

    2010-02-01

    2 ACEIT Enhancement, Help-Desk/Training, Consulting DASA-CE–3 Command, Control, Communications, Computers, Intelligence, Surveillance, and...Management Information System (OSMIS) online interactive relational database DASA-CE–2 Title: ACEIT Enhancement, Help-Desk/Training, Consulting Summary...support and training for the Automated Cost Estimating Integrated Tools (ACEIT) software suite. ACEIT is the Army standard suite of analytical tools for

  20. Xyce parallel electronic simulator : reference guide.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mei, Ting; Rankin, Eric Lamont; Thornquist, Heidi K.

    2011-05-01

    This document is a reference guide to the Xyce Parallel Electronic Simulator and a companion to the Xyce Users Guide. The focus of this document is to list, as exhaustively as possible, device parameters, solver options, parser options, and other usage details of Xyce. This document is not intended to be a tutorial; users who are new to circuit simulation are better served by the Xyce Users Guide. The Xyce Parallel Electronic Simulator has been written to support, in a rigorous manner, the simulation needs of the Sandia National Laboratories electrical designers. It is targeted specifically to run on large-scale parallel computing platforms but also runs well on a variety of architectures, including single-processor workstations. It also aims to support a variety of devices and models specific to Sandia needs. This document is intended to complement the Xyce Users Guide. It contains comprehensive, detailed information about a number of topics pertinent to the usage of Xyce. Included in this document is a netlist reference for the input-file commands and elements supported within Xyce; a command line reference, which describes the available command line arguments for Xyce; and quick references for users of other circuit codes, such as Orcad's PSpice and Sandia's ChileSPICE.

  1. Astronaut John Young reaches for tools in Lunar Roving Vehicle during EVA

    NASA Technical Reports Server (NTRS)

    1972-01-01

    Astronaut John W. Young, commander of the Apollo 16 lunar landing mission, reaches for tools in the Apollo lunar hand tool carrier at the aft end of the Lunar Roving Vehicle during the second Apollo 16 extravehicular activity (EVA-2) at the Descartes landing site. This photograph was taken by Astronaut Charles M. Duke Jr., lunar module pilot. This view is looking south from the base of Stone Mountain.

  2. Effects of a Network-Centric Multi-Modal Communication Tool on a Communication Monitoring Task

    DTIC Science & Technology

    2012-03-01

    replaced (Nelson, Bolia, Vidulich, & Langhorne, 2004). Communication will continue to be the central tool for Command and Control (C2) operators. However...Nelson, Bolia, Vidulich, & Langhorne, 2004). The two highest ratings for most potential technologies were data capture/replay tools and chat...analysis of variance (ANOVA). A significant main effect was found for Difficulty, F (1, 13) = 21.11, p < .05; the overall level of detections was

  3. Assessing hospital disaster preparedness: a comparison of an on-site survey, directly observed drill performance, and video analysis of teamwork.

    PubMed

    Kaji, Amy H; Langford, Vinette; Lewis, Roger J

    2008-09-01

    There is currently no validated method for assessing hospital disaster preparedness. We determine the degree of correlation between the results of 3 methods for assessing hospital disaster preparedness: administration of an on-site survey, drill observation using a structured evaluation tool, and video analysis of team performance in the hospital incident command center. This was a prospective, observational study conducted during a regional disaster drill, comparing the results from an on-site survey, a structured disaster drill evaluation tool, and a video analysis of teamwork, performed at six 9-1-1-receiving hospitals in Los Angeles County, CA. The on-site survey was conducted separately from the drill and assessed hospital disaster plan structure, vendor agreements, modes of communication, medical and surgical supplies, involvement of law enforcement, mutual aid agreements with other facilities, drills and training, surge capacity, decontamination capability, and pharmaceutical stockpiles. The drill evaluation tool, developed by Johns Hopkins University under contract from the Agency for Healthcare Research and Quality, was used to assess various aspects of drill performance, such as the availability of the hospital disaster plan, the geographic configuration of the incident command center, whether drill participants were identifiable, whether the noise level interfered with effective communication, and how often key information (e.g., number of available staffed floor, intensive care, and isolation beds; number of arriving victims; expected triage level of victims; number of potential discharges) was received by the incident command center. Teamwork behaviors in the incident command center were quantitatively assessed, using the MedTeams analysis of the video recordings obtained during the disaster drill. Spearman rank correlations of the results between pair-wise groupings of the 3 assessment methods were calculated. 
The 3 evaluation methods demonstrated qualitatively different results with respect to each hospital's level of disaster preparedness. The Spearman rank correlation coefficient between the results of the on-site survey and the video analysis of teamwork was -0.34; between the results of the on-site survey and the structured drill evaluation tool, 0.15; and between the results of the video analysis and the drill evaluation tool, 0.82. The disparate results obtained from the 3 methods suggest that each measures distinct aspects of disaster preparedness, and perhaps no single method adequately characterizes overall hospital preparedness.
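
    For illustration, the Spearman rank correlation used to compare the assessment methods can be computed in a few lines of pure Python. This version uses the difference-of-ranks formula and assumes no ties; the hospital scores below are made-up example data.

```python
# Spearman's rho = 1 - 6 * sum(d^2) / (n * (n^2 - 1)),
# where d is the per-item difference between the two rankings.

def ranks(xs):
    """Rank of each value (1 = smallest); assumes no ties."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0] * len(xs)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def spearman(x, y):
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(ranks(x), ranks(y)))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Six hospitals scored by two hypothetical assessment methods
survey = [3, 1, 4, 2, 6, 5]
drill  = [2, 1, 5, 3, 6, 4]
print(spearman(survey, drill))  # strong positive rank agreement
```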

  4. QQACCT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jacobsen, Douglas

    2015-01-01

    batchacct provides convenient library and command-line access to batch system accounting data for GridEngine and SLURM schedulers. It can be used to perform queries useful for data analysis of the accounting data alone or for integrative analysis in the context of a larger query.

  5. Thematic Mapper. Volume 1: Calibration report flight model, LANDSAT 5

    NASA Technical Reports Server (NTRS)

    Cooley, R. C.; Lansing, J. C.

    1984-01-01

    The calibration of the Flight 1 Model Thematic Mapper is discussed. Spectral response, scan profile, coherent noise, line spread profiles and white light leaks, square wave response, radiometric calibration, and commands and telemetry are specifically addressed.

  6. Revitalization of Nuclear Powered Flight

    DTIC Science & Technology

    2016-05-01

    1 AU/ACSC/2016 AIR COMMAND AND STAFF COLLEGE AIR UNIVERSITY Revitalization of Nuclear Powered Flight by Todd C...Aviation History On-Line Museum. August 6, 2007. Accessed February 16, 2016. http://www.aviation-history.com/articles/nuke-american.htm. Courtland

  7. 50 CFR 218.23 - Mitigation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... training activities. (ii) The Navy shall follow internal chain of command reporting procedures as... Block Island (37 km (20 NM) seaward of line between 41-4.49° N. lat. 071-51.15° W. long. and 41-18.58° N...

  8. 50 CFR 218.23 - Mitigation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... training activities. (ii) The Navy shall follow internal chain of command reporting procedures as... Block Island (37 km (20 NM) seaward of line between 41-4.49° N. lat. 071-51.15° W. long. and 41-18.58° N...

  9. KSC-2009-3103

    NASA Image and Video Library

    2009-05-11

    CAPE CANAVERAL, Fla. – The mini-convoy is lined up on the Shuttle Landing Facility runway at NASA's Kennedy Space Center in Florida awaiting space shuttle Atlantis' launch on the STS-125 mission to service NASA's Hubble Space Telescope. The convoy is prepared to act should the shuttle need to return to the launch site in the event of an emergency. At left is the Convoy Command Vehicle which is the command post for the convoy commander. Atlantis launched successfully on time at 2:01 p.m. EDT. Atlantis' 11-day flight will include five spacewalks to refurbish and upgrade the telescope with state-of-the-art science instruments that will expand Hubble's capabilities and extend its operational lifespan through at least 2014. The payload includes a Wide Field Camera 3, Fine Guidance Sensor and the Cosmic Origins Spectrograph. Photo credit: NASA/Jack Pfaller

  10. Interaction of memory systems during acquisition of tool knowledge and skills in Parkinson's disease.

    PubMed

    Roy, Shumita; Park, Norman W; Roy, Eric A; Almeida, Quincy J

    2015-01-01

    Previous research suggests that different aspects of tool knowledge are mediated by different memory systems. It is believed that tool attributes (e.g., function, color) are represented as declarative memory while skill learning is supported by procedural memory. It has been proposed that other aspects (e.g., skilled tool use) may rely on an interaction of both declarative and procedural memory. However, the specific form of procedural memory underlying skilled tool use and the nature of interaction between declarative and procedural memory systems remain unclear. In the current study, individuals with Parkinson's disease (PD) and healthy controls were trained over 2 sessions, 3 weeks apart, to use a set of novel complex tools. They were also tested on their ability to recall tool attributes as well as their ability to demonstrate grasp and use of the tools to command. Results showed that, compared to controls, participants with PD showed intact motor skill acquisition and tool use to command within sessions, but failed to retain performance across sessions. In contrast, people with PD showed equivalent recall of tool attributes and tool grasping relative to controls, both within and across sessions. Current findings demonstrate that the frontal-striatal network, compromised in PD, mediates long-term retention of motor skills. Intact initial skill learning raises the possibility of compensation from declarative memory for frontal-striatal dysfunction. Lastly, skilled tool use appears to rely on both memory systems which may reflect a cooperative interaction between the two systems. Current findings regarding memory representations of tool knowledge and skill learning may have important implications for delivery of rehabilitation programs for individuals with PD. Copyright © 2014 Elsevier Ltd. All rights reserved.

  11. UTM Technical Capabilities Level 2 (TCL2) Test at Reno-Stead Airport.

    NASA Image and Video Library

    2016-10-06

    Test of Unmanned Aircraft Systems Traffic Management (UTM) technical capability Level 2 (TCL2) at Reno-Stead Airport, Nevada. During the test, five drones simultaneously crossed paths, separated by altitude. Two drones flew beyond visual line-of-sight and three flew within line-of-sight of their operators. Drone Co-habitation Services operates a Phantom 3 commercial multi-rotor unmanned aircraft, one of 11 vehicles in the UTM TCL2 demonstration that will fly beyond line of sight of the pilot in command in Nevada test.

  12. UTM Technical Capabilities Level 2 (TCL2) Test at Reno-Stead Airport.

    NASA Image and Video Library

    2016-10-06

    Test of Unmanned Aircraft Systems Traffic Management (UTM) technical capability Level 2 (TCL2) at Reno-Stead Airport, Nevada. During the test, five drones simultaneously crossed paths, separated by altitude. Two drones flew beyond visual line-of-sight and three flew within line-of-sight of their operators. Karen Bollinger pilot and Nick Atkins of Alaska Center for Unmanned Aircraft Systems Integration program fly Ptarmigan quadcopter, one of 11 vehicles in the UTM TCL2 demonstration that will fly beyond line of sight of the pilot in command in Nevada test.

  13. Guidelines for Line-Oriented Flight Training, Volume 1

    NASA Technical Reports Server (NTRS)

    Lauber, J. K.; Foushee, H. C.

    1981-01-01

    Line-Oriented Flight Training (LOFT) is a developing training technology which synthesizes high-fidelity aircraft simulation and high-fidelity line-operations simulation to provide realistic, dynamic pilot training in a simulated line environment. LOFT is an augmentation of existing pilot training which concentrates upon command, leadership, and resource management skills. This report, based on a NASA/Industry workshop held in January 1981, is designed to serve as a handbook for LOFT users. In addition to providing background information, guidelines are presented for designing LOFT scenarios, conducting real-time LOFT operations, pilot debriefing, and instructor qualification and training. The final chapter addresses other uses of LOFT and line-operations (or full-mission) simulation.

  14. Assured communications and combat resiliency: the relationship between effective national communications and combat efficiency

    NASA Astrophysics Data System (ADS)

    Allgood, Glenn O.; Kuruganti, Phani Teja; Nutaro, James; Saffold, Jay

    2009-05-01

    Combat resiliency is the ability of a commander to prosecute, control, and consolidate his or her sphere of influence in adverse and changing conditions. To support this, an infrastructure must exist that allows the commander to view the world in varying degrees of granularity, with sufficient levels of detail to permit confidence estimates to be levied against decisions and courses of action. An infrastructure such as this will include the ability to effectively communicate context and relevance within and across the battle space. Achieving this will require careful thought, planning, and understanding of a network and its capacity limitations in post-event command and control. Relevance and impact on any existing infrastructure must be fully understood prior to deployment to exploit the system's full capacity and capabilities. In this view, the combat communication network is considered an integral part of our national communication network and infrastructure. This paper will describe an analytical tool set developed at ORNL and RNI, incorporating complexity theory, advanced communications modeling, simulation, and visualization technologies, that could be used as a pre-planning tool or post-event reasoning application to support response and containment.

  15. GAP: yet another image processing system for solar observations.

    NASA Astrophysics Data System (ADS)

    Keller, C. U.

    GAP is a versatile, interactive image processing system for analyzing solar observations, in particular extended time sequences, and for preparing publication quality figures. It consists of an interpreter that is based on a language with a control flow similar to PASCAL and C. The interpreter may be accessed from a command line editor and from user-supplied functions, procedures, and command scripts. GAP is easily expandable via external FORTRAN programs that are linked to the GAP interface routines. The current version of GAP runs on VAX, DECstation, Sun, and Apollo computers. Versions for MS-DOS and OS/2 are in preparation.

  16. Pycellerator: an arrow-based reaction-like modelling language for biological simulations.

    PubMed

    Shapiro, Bruce E; Mjolsness, Eric

    2016-02-15

    We introduce Pycellerator, a Python library for reading Cellerator arrow notation from standard text files, conversion to differential equations, generating stand-alone Python solvers, and optionally running and plotting the solutions. All of the original Cellerator arrows, which represent reactions ranging from mass action, Michaelis-Menten-Henri (MMH) and Gene-Regulation (GRN) to Monod-Wyman-Changeux (MWC), user-defined reactions and enzymatic expansions (KMech), were previously represented with the Mathematica extended character set. These are now typed as reaction-like commands in ASCII text files that are read by Pycellerator, which includes a Python command line interface (CLI), a Python application programming interface (API) and an IPython notebook interface. Cellerator reaction arrows are now input in text files. The arrows are parsed by Pycellerator and translated into differential equations in Python, and Python code is automatically generated to solve the system. Time courses are produced by executing the auto-generated Python code. Users have full freedom to modify the solver and utilize the complete set of standard Python tools. The new libraries are completely independent of the old Cellerator software and do not require Mathematica. All software is available (GPL) from the github repository at https://github.com/biomathman/pycellerator/releases. Details, including installation instructions and a glossary of acronyms and terms, are given in the Supplementary information. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
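
    To illustrate what a mass-action arrow such as `A + B -> C, k` expands to, here is a hand-written sketch of the resulting ODE system with a forward-Euler solve. This is not Pycellerator's generated code, only the idea behind the translation; all names and values are illustrative.

```python
# The arrow A + B -> C with rate constant k translates to:
#   dA/dt = dB/dt = -k*A*B,   dC/dt = +k*A*B

def simulate(k=1.0, a0=1.0, b0=1.0, c0=0.0, dt=1e-3, steps=5000):
    """Forward-Euler integration of the mass-action system to t = steps*dt."""
    a, b, c = a0, b0, c0
    for _ in range(steps):
        rate = k * a * b        # mass-action rate of A + B -> C
        a -= rate * dt
        b -= rate * dt
        c += rate * dt
    return a, b, c

a, b, c = simulate()
assert abs((a + c) - 1.0) < 1e-6   # conservation: A + C stays constant
assert c > 0.5                     # most of A converted after 5 time units
```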

  17. MSLICE Sequencing

    NASA Technical Reports Server (NTRS)

    Crockett, Thomas M.; Joswig, Joseph C.; Shams, Khawaja S.; Norris, Jeffrey S.; Morris, John R.

    2011-01-01

    MSLICE Sequencing is a graphical tool for writing sequences and integrating them into RML files, as well as for producing SCMF files for uplink. When operated in a testbed environment, it also supports uplinking these SCMF files to the testbed via Chill. The software offers a free-form textual sequence editor with syntax coloring and automatic content assistance, including command and argument completion proposals, complete with types, value ranges, units, and descriptions from the command dictionary that appear as they are typed. The sequence editor also has a "field mode" that allows tabbing between arguments and displays the type/range/units/description for each argument as it is edited. Color-coded error and warning annotations on problematic tokens are included, as well as indications of problems that are not visible in the current scroll range. "Quick fix" suggestions are made for resolving problems, and all the features afforded by modern source editors are also included, such as copy/cut/paste, undo/redo, and a sophisticated find-and-replace system optionally using regular expressions. The software offers a full XML editor for RML files, which features syntax coloring, content assistance, and problem annotations as above. There is a form-based "detail view" that allows structured editing of command arguments and sequence parameters when preferred. The "project view" shows the user's "workspace" as a tree of "resources" (projects, folders, and files) that can subsequently be opened in editors by double-clicking. Files can be added, deleted, and dragged/dropped or copied/pasted between folders or projects, and these operations are undoable and redoable. A "problems view" contains a tabular list of all problems in the current workspace. Double-clicking on any row in the table opens an editor for the appropriate sequence, scrolling to the specific line with the problem and highlighting the problematic characters. From there, one can invoke "quick fix" as described above to resolve the issue. Once resolved, saving the file causes the problem to be removed from the problems view.

  18. Connecting to HPC Systems | High-Performance Computing | NREL

    Science.gov Websites

    If you just need access to a command line on an HPC system, use one of the following methods, which use multi-factor authentication. First, you will need to set up ...

  19. Method and systems for a radiation tolerant bus interface circuit

    NASA Technical Reports Server (NTRS)

    Kinstler, Gary A. (Inventor)

    2007-01-01

    A bus management tool that maintains communication between a group of nodes operatively connected on two busses in the presence of radiation. The tool periodically transmits a first message from one node to another on the first bus, determines whether the first message was received by the other node, and, when it is determined that the message was not received, transmits a recovery command to the other node on the second bus. Methods, systems, and articles of manufacture consistent with the present invention also provide for a bus recovery tool on the other node that re-initializes the bus interface circuit operatively connecting that node to the first bus in response to the recovery command.
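    The heartbeat-and-failover scheme described above can be sketched as follows. This is an illustrative sketch only; the class names, message strings, and bus interface (`send`/`ack_received`) are hypothetical and not taken from the patent:

    ```python
    class StubBus:
        """Minimal stand-in for a bus interface, used to exercise the
        manager below. A real implementation would talk to hardware."""
        def __init__(self, healthy=True):
            self.healthy = healthy   # whether the peer acknowledges
            self.sent = []           # log of (node_id, message) pairs
        def send(self, node_id, message):
            self.sent.append((node_id, message))
        def ack_received(self, node_id, timeout):
            return self.healthy

    class DualBusManager:
        """Periodically send a heartbeat on the primary bus; if the peer
        does not acknowledge within the timeout, send a recovery command
        over the secondary bus, telling the peer to re-initialize its
        primary-bus interface circuit."""
        def __init__(self, primary, secondary, timeout=1.0):
            self.primary = primary
            self.secondary = secondary
            self.timeout = timeout

        def heartbeat_cycle(self, node_id):
            self.primary.send(node_id, "HEARTBEAT")
            if not self.primary.ack_received(node_id, self.timeout):
                # Primary bus (or the peer's interface) presumed upset,
                # e.g. by radiation: command a reset via the backup bus.
                self.secondary.send(node_id, "RECOVER")
                return False
            return True
    ```

    The key design point is that the recovery command travels on the *second* bus, so a single-bus upset cannot also block the repair message.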

  20. Acoustic/Seismic Ground Sensors for Detection, Localization and Classification on the Battlefield

    DTIC Science & Technology

    2006-10-01

    controlled so that collisions are avoided. Figure 1 presents BACH system components. 3 BACH Sensor Posts (1 to 8) Command Post BACH MMI PC VHF...2.2.4 Processing scheme Processing inside SP is dedicated to stationary spectral line extraction and derives from ASW algorithms. Special attention...is similar to that used for helicopters (see figure 4), with adaptations to cope with vehicle signatures (fuzzy unstable spectral lines, abrupt

Top