Science.gov

Sample records for files aggregation toolkit

  1. The ALFA (Activity Log Files Aggregation) toolkit: a method for precise observation of the consultation.

    PubMed

    de Lusignan, Simon; Kumarapeli, Pushpa; Chan, Tom; Pflug, Bernhard; van Vlymen, Jeremy; Jones, Beryl; Freeman, George K

    2008-09-08

    There is a lack of tools to evaluate and compare Electronic patient record (EPR) systems to inform a rational choice or development agenda. To develop a tool kit to measure the impact of different EPR system features on the consultation. We first developed a specification to overcome the limitations of existing methods. We divided this into work packages: (1) developing a method to display multichannel video of the consultation; (2) code and measure activities, including computer use and verbal interactions; (3) automate the capture of nonverbal interactions; (4) aggregate multiple observations into a single navigable output; and (5) produce an output interpretable by software developers. We piloted this method by filming live consultations (n = 22) by 4 general practitioners (GPs) using different EPR systems. We compared the time taken and variations during coded data entry, prescribing, and blood pressure (BP) recording. We used nonparametric tests to make statistical comparisons. We contrasted methods of BP recording using Unified Modeling Language (UML) sequence diagrams. We found that 4 channels of video were optimal. We identified an existing application for manual coding of video output. We developed in-house tools for capturing use of keyboard and mouse and to time stamp speech. The transcript is then typed within this time stamp. Although we managed to capture body language using pattern recognition software, we were unable to use this data quantitatively. We loaded these observational outputs into our aggregation tool, which allows simultaneous navigation and viewing of multiple files. This also creates a single exportable file in XML format, which we used to develop UML sequence diagrams. In our pilot, the GP using the EMIS LV (Egton Medical Information Systems Limited, Leeds, UK) system took the longest time to code data (mean 11.5 s, 95% CI 8.7-14.2). 
Nonparametric comparison of EMIS LV with the other systems showed a significant difference, with EMIS
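    The aggregation step described above (merging multiple observation channels into a single time-ordered, exportable XML file) can be sketched with the Python standard library. The channel names and XML element names below are hypothetical, since the abstract does not specify ALFA's actual schema:

```python
import xml.etree.ElementTree as ET

def aggregate_channels(channels):
    """Merge several (timestamp, event) logs into one time-ordered XML tree.

    `channels` maps a channel name (e.g. "keyboard", "speech") to a list
    of (seconds_from_start, description) tuples.
    """
    events = []
    for name, log in channels.items():
        for t, description in log:
            events.append((t, name, description))
    events.sort()  # chronological order across all channels

    root = ET.Element("consultation")
    for t, name, description in events:
        ev = ET.SubElement(root, "event", channel=name, t=f"{t:.2f}")
        ev.text = description
    return root

# Hypothetical observation fragments from one consultation:
channels = {
    "keyboard": [(12.4, "BP value entered"), (3.1, "patient record opened")],
    "speech": [(5.0, "GP asks about symptoms")],
}
root = aggregate_channels(channels)
print(ET.tostring(root, encoding="unicode"))
```

    A single merged file like this is what makes downstream products such as UML sequence diagrams straightforward to generate.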

  2. The ALFA (Activity Log Files Aggregation) Toolkit: A Method for Precise Observation of the Consultation

    PubMed Central

    2008-01-01

    Background There is a lack of tools to evaluate and compare Electronic patient record (EPR) systems to inform a rational choice or development agenda. Objective To develop a tool kit to measure the impact of different EPR system features on the consultation. Methods We first developed a specification to overcome the limitations of existing methods. We divided this into work packages: (1) developing a method to display multichannel video of the consultation; (2) code and measure activities, including computer use and verbal interactions; (3) automate the capture of nonverbal interactions; (4) aggregate multiple observations into a single navigable output; and (5) produce an output interpretable by software developers. We piloted this method by filming live consultations (n = 22) by 4 general practitioners (GPs) using different EPR systems. We compared the time taken and variations during coded data entry, prescribing, and blood pressure (BP) recording. We used nonparametric tests to make statistical comparisons. We contrasted methods of BP recording using Unified Modeling Language (UML) sequence diagrams. Results We found that 4 channels of video were optimal. We identified an existing application for manual coding of video output. We developed in-house tools for capturing use of keyboard and mouse and to time stamp speech. The transcript is then typed within this time stamp. Although we managed to capture body language using pattern recognition software, we were unable to use this data quantitatively. We loaded these observational outputs into our aggregation tool, which allows simultaneous navigation and viewing of multiple files. This also creates a single exportable file in XML format, which we used to develop UML sequence diagrams. In our pilot, the GP using the EMIS LV (Egton Medical Information Systems Limited, Leeds, UK) system took the longest time to code data (mean 11.5 s, 95% CI 8.7-14.2). 
Nonparametric comparison of EMIS LV with the other systems showed

  3. Small file aggregation in a parallel computing system

    SciTech Connect

    Faibish, Sorin; Bent, John M.; Tzelnic, Percy; Grider, Gary; Zhang, Jingwang

    2014-09-02

    Techniques are provided for small file aggregation in a parallel computing system. An exemplary method for storing a plurality of files generated by a plurality of processes in a parallel computing system comprises aggregating the plurality of files into a single aggregated file; and generating metadata for the single aggregated file. The metadata comprises an offset and a length of each of the plurality of files in the single aggregated file. The metadata can be used to unpack one or more of the files from the single aggregated file.
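    The scheme can be illustrated with a minimal in-memory sketch: pack each small file into one blob while recording its offset and length, then use that metadata to unpack any single file. The file names and the dict-based metadata layout are illustrative, not the patent's actual on-disk format:

```python
import io

def aggregate(files):
    """Pack {name: bytes} into one blob plus {name: (offset, length)} metadata."""
    blob = io.BytesIO()
    metadata = {}
    for name, data in files.items():
        metadata[name] = (blob.tell(), len(data))  # where this file lands
        blob.write(data)
    return blob.getvalue(), metadata

def unpack(blob, metadata, name):
    """Recover one original file from the aggregated blob using its metadata."""
    offset, length = metadata[name]
    return blob[offset:offset + length]

# Files produced by two hypothetical processes in a parallel job:
files = {"rank0.out": b"alpha", "rank1.out": b"beta"}
blob, meta = aggregate(files)
assert unpack(blob, meta, "rank1.out") == b"beta"
```

    The point of the aggregation is that the parallel file system sees one large file instead of many tiny ones, while the metadata preserves random access to each original file.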

  4. The Internet Toolkit: File Compression and Archive Utilities.

    ERIC Educational Resources Information Center

    Delfino, Erik

    1993-01-01

    Describes utility programs available for post-transfer processing of files that have been downloaded from the Internet. Highlights include file compression; viewing graphics files; converting binary files to ASCII files; and how to find utility programs. (LRW)
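    The binary-to-ASCII conversion mentioned here (historically done with tools like uuencode) can be illustrated with Python's standard `base64` module, which serves the same purpose of making arbitrary bytes safe for text-only channels:

```python
import base64

payload = bytes(range(8))           # arbitrary binary data
ascii_form = base64.b64encode(payload).decode("ascii")
restored = base64.b64decode(ascii_form)

print(ascii_form)   # text-safe representation of the binary payload
assert restored == payload          # the round trip is lossless
```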

  5. SeqKit: A Cross-Platform and Ultrafast Toolkit for FASTA/Q File Manipulation.

    PubMed

    Shen, Wei; Le, Shuai; Li, Yan; Hu, Fuquan

    2016-01-01

    FASTA and FASTQ are basic and ubiquitous formats for storing nucleotide and protein sequences. Common manipulations of FASTA/Q files include converting, searching, filtering, deduplication, splitting, shuffling, and sampling. Existing tools implement only some of these manipulations, not always efficiently, and some are available only for certain operating systems. Furthermore, the complicated installation of required packages and running environments can make these programs less user friendly. This paper describes a cross-platform, ultrafast, comprehensive toolkit for FASTA/Q processing. SeqKit provides executable binary files for all major operating systems, including Windows, Linux, and Mac OS X, and can be used directly without any dependencies or pre-configuration. SeqKit demonstrates competitive performance in execution time and memory usage compared to similar tools. The efficiency and usability of SeqKit enable researchers to rapidly accomplish common FASTA/Q file manipulations. SeqKit is open source and available on GitHub at https://github.com/shenwei356/seqkit.
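    SeqKit itself is a compiled command-line tool; as a language-neutral illustration of one manipulation the abstract lists (deduplication by sequence), a few lines of Python over the FASTA format suffice:

```python
def parse_fasta(text):
    """Yield (header, sequence) records from a FASTA-formatted string."""
    header, seq = None, []
    for line in text.splitlines():
        if line.startswith(">"):
            if header is not None:
                yield header, "".join(seq)
            header, seq = line[1:], []
        else:
            seq.append(line.strip())
    if header is not None:
        yield header, "".join(seq)

def deduplicate(records):
    """Keep the first record for each distinct sequence."""
    seen = set()
    for header, seq in records:
        if seq not in seen:
            seen.add(seq)
            yield header, seq

fasta = ">a\nACGT\n>b\nACGT\n>c\nGGTT\n"
print(list(deduplicate(parse_fasta(fasta))))  # records a and c survive
```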

  6. A next-generation open-source toolkit for FITS file image viewing

    NASA Astrophysics Data System (ADS)

    Jeschke, Eric; Inagaki, Takeshi; Kackley, Russell

    2012-09-01

    The astronomical community has a long tradition of sharing and collaborating on FITS file tools, including viewers. Several excellent viewers such as DS9 and Skycat have been successfully reused again and again. Yet this "first generation" of viewers predates the emergence of a new class of powerful object-oriented scripting languages such as Python, which has quickly become a very popular language for astronomical (and general scientific) use. Integration and extension of these viewers from Python is cumbersome. Furthermore, these viewers are built on older widget toolkits such as Tcl/Tk, which are becoming increasingly difficult to support and extend as time passes. Subaru Telescope's second-generation observation control system (Gen2) is built on a foundation of Python-based technologies and leverages several important astronomically useful packages such as numpy and pyfits. We have written a new flexible core widget for viewing FITS files, available in versions for both the modern Gtk- and Qt-based desktops. The widget offers seamless integration with pyfits and numpy arrays of FITS data. A full-featured viewer based on this widget has been developed; it supports a plug-in architecture in which new features can be added by scripting simple Python modules. In this paper we describe and demonstrate the capabilities of the new widget and viewer and discuss the architecture of the software, which allows new features and widgets to be easily developed by subclassing a powerful abstract base class. The software will be released as open source.
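    The plug-in mechanism described (new viewer features added by subclassing an abstract base class) can be sketched generically in Python. The class and method names below are hypothetical, not the toolkit's actual API:

```python
from abc import ABC, abstractmethod

class ViewerPlugin(ABC):
    """Hypothetical base class; the viewer discovers subclasses automatically."""
    registry = []

    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        ViewerPlugin.registry.append(cls)   # each plug-in registers itself

    @abstractmethod
    def run(self, image):
        """Operate on a 2-D image (list of rows) and return a result."""

class MinMax(ViewerPlugin):
    """Example plug-in: report the pixel range of the current frame."""
    def run(self, image):
        flat = [v for row in image for v in row]
        return min(flat), max(flat)

image = [[1, 5], [3, 2]]
results = {cls.__name__: cls().run(image) for cls in ViewerPlugin.registry}
print(results)  # every registered plug-in is applied to the frame
```

    A real viewer would add discovery of plug-in modules on disk, but the subclass-registration pattern is the core of the idea.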

  7. Basic Internet Software Toolkit.

    ERIC Educational Resources Information Center

    Buchanan, Larry

    1998-01-01

    Once schools are connected to the Internet, the next step is getting network workstations configured for Internet access. This article describes a basic toolkit comprising software currently available on the Internet for free or modest cost. Lists URLs for Web browser, Telnet, FTP, file decompression, portable document format (PDF) reader,…

  9. MISR Toolkit

    Atmospheric Science Data Center

    2014-05-07

    ...   The MISR Toolkit (MTK) is a simplified programming interface to access MISR Level 1B2, Level 2, and ancillary data ... of MISR Toolkit was tested on Linux32 Fedora Core 19, Linux 64 Fedora Core 19, Mac OS X 10.6.8 (Intel), Windows XP (x86), and Windows ...

  10. Molecular techniques in ecohealth research toolkit: facilitating estimation of aggregate gastroenteritis burden in an irrigated periurban landscape.

    PubMed

    Tserendorj, Ariuntuya; Anceno, Alfredo J; Houpt, Eric R; Icenhour, Crystal R; Sethabutr, Orntipa; Mason, Carl S; Shipin, Oleg V

    2011-09-01

    Assessment of microbial hazards associated with certain environmental matrices, livelihood strategies, and food handling practices is constrained by time-consuming conventional microbiological techniques, which lead to health risk assessments of narrow geographic or time scope, often targeting very few pathogens. Health risk assessment based on one or a few indicator organisms underestimates the true disease burden due to the number of coexisting causative pathogens. Here, we employed molecular techniques in a survey of Cryptosporidium parvum, Giardia lamblia, Campylobacter jejuni, Escherichia coli O157:H7, Listeria monocytogenes, Salmonella spp., Shigella spp., Vibrio cholerae, and Rotavirus A densities in canal water with respect to seasonality and spatial distribution of point and nonpoint pollution sources. Three irrigation canals stretching across a nearly 150-km² periurban landscape, traditionally used for agricultural irrigation but functioning in recent years as a vital part of municipal wastewater stabilization, were investigated. Compiled stochastic data (pathogen concentrations, susceptible populations) and literature-obtained deterministic data (pathogen dose-response model parameter values) were used to estimate the waterborne gastroenteritis burden. Exposure scenarios included swimming or fishing, consuming canal-water-irrigated vegetables, and ingesting or inhaling water aerosols while working in canal-water-irrigated fields. The estimated annual gastroenteritis burden due to individual pathogens among the sampling points was -10.6 log10 to -2.2 log10 DALYs. The aggregated annual gastroenteritis burden due to all the target pathogens per sampling point was -3.1 log10 to -1.9 log10 DALYs, far exceeding the WHO acceptable limit of -6.0 log10 DALYs. The present approach will facilitate the comprehensive collection of surface water microbiological baseline data and the setting of benchmarks for interventions aimed at reducing microbial hazards in similar landscapes worldwide.

  11. Literacy Toolkit

    ERIC Educational Resources Information Center

    Center for Best Practices in Early Childhood Education, 2005

    2005-01-01

    The toolkit contains print and electronic resources, including (1) "eMERGing Literacy and Technology: Working Together", a 492-page curriculum guide; (2) "LitTECH Interactive Presents: The Beginning of Literacy", a DVD that provides an overview linking technology to the concepts of emerging literacy; (3) "Your Preschool Classroom Computer Center:…

  12. Tracker Toolkit

    NASA Technical Reports Server (NTRS)

    Lewis, Steven J.; Palacios, David M.

    2013-01-01

    This software can track multiple moving objects within a video stream simultaneously, use visual features to aid in the tracking, and initiate tracks based on object detection in a subregion. A simple programmatic interface allows plugging into larger image chain modeling suites. It extracts unique visual features for aid in tracking and later analysis, and includes sub-functionality for extracting visual features about an object identified within an image frame. Tracker Toolkit utilizes a feature extraction algorithm to tag each object with metadata features about its size, shape, color, and movement. Its functionality is independent of the scale of objects within a scene. The only assumption made about the tracked objects is that they move. There are no constraints on size within the scene, shape, or type of movement. The Tracker Toolkit is also capable of following an arbitrary number of objects in the same scene, identifying and propagating the track of each object from frame to frame. Target objects may be specified for tracking beforehand, or may be dynamically discovered within a tripwire region. Initialization of the Tracker Toolkit algorithm includes two steps: initializing the data structures for tracked target objects, including targets preselected for tracking; and initializing the tripwire region. If no tripwire region is desired, this step is skipped. The tripwire region is an area within the frames that is always checked for new objects, and all new objects discovered within the region will be tracked until lost (by leaving the frame, stopping, or blending into the background).
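    A greatly simplified sketch of the tripwire idea: new detections inside a watched region start tracks, and existing tracks are propagated by nearest-centroid matching. The real toolkit's algorithms are not disclosed in this abstract; the function names and the `max_jump` threshold below are invented for illustration:

```python
def in_region(point, region):
    """region = (xmin, ymin, xmax, ymax)."""
    x, y = point
    xmin, ymin, xmax, ymax = region
    return xmin <= x <= xmax and ymin <= y <= ymax

def update_tracks(tracks, detections, tripwire, max_jump=5.0):
    """Extend each track with its nearest detection; start tracks in the tripwire."""
    unmatched = list(detections)
    for track in tracks:
        last = track[-1]
        best = min(unmatched, default=None,
                   key=lambda d: (d[0] - last[0]) ** 2 + (d[1] - last[1]) ** 2)
        if best is not None and \
                (best[0] - last[0]) ** 2 + (best[1] - last[1]) ** 2 <= max_jump ** 2:
            track.append(best)        # propagate the track to this frame
            unmatched.remove(best)
    for d in unmatched:
        if in_region(d, tripwire):
            tracks.append([d])        # new object discovered in tripwire region
    return tracks

tracks = update_tracks([], [(1, 1)], tripwire=(0, 0, 2, 2))   # starts a track
tracks = update_tracks(tracks, [(2, 1)], tripwire=(0, 0, 2, 2))  # propagated
print(tracks)
```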

  13. Tribal Green Building Toolkit

    EPA Pesticide Factsheets

    This Tribal Green Building Toolkit (Toolkit) is designed to help tribal officials, community members, planners, developers, and architects develop and adopt building codes to support green building practices. Anyone can use this toolkit!

  14. Green Infrastructure Modeling Toolkit

    EPA Pesticide Factsheets

    EPA's Green Infrastructure Modeling Toolkit is a toolkit of 5 EPA green infrastructure models and tools, along with communication materials, that can be used as a teaching tool and a quick reference resource when making GI implementation decisions.

  15. Scheduling and routing algorithm for aggregating large data files from distributed databases to super-computers on lambda grid

    NASA Astrophysics Data System (ADS)

    Sun, Shen; Guo, Wei; Sun, Weiqiang; Jin, Yaohui; Hu, Weisheng

    2008-11-01

    While the traditional Internet cannot meet the requirements of data-intensive communication in large-scale e-science grid applications, optical networks, also referred to as lambda grids, provide a simple means of achieving guaranteed high bandwidth, guaranteed latency, and deterministic connections. Many e-science applications, such as e-VLBI and GTL, frequently require aggregating several-hundred-gigabyte data files from distributed databases to supercomputers in real time, so minimizing the aggregation time can improve overall system performance. We consider the problem of aggregating large data files from distributed databases to distributed computational resources on a lambda grid. We modify the previously proposed Time-Path Scheduling Problem (TPSP) model and propose a new N-destination TPSP (NDTPSP) model. We present a proof of NDTPSP's NP-completeness. We also propose a list scheduling algorithm and a modified list scheduling algorithm for our problem. The performance of the different algorithms is compared and analyzed by simulation.
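    A minimal sketch of list scheduling for this kind of transfer-aggregation problem: sort transfers longest-first and greedily assign each to the channel that frees up earliest. This is illustrative only; the NDTPSP model in the paper also accounts for network paths and wavelengths:

```python
import heapq

def list_schedule(transfer_times, n_channels):
    """Greedy list scheduling: longest transfers first, each onto the
    channel that frees up earliest. Returns the overall makespan."""
    channels = [0.0] * n_channels          # time each channel becomes free
    heapq.heapify(channels)
    for t in sorted(transfer_times, reverse=True):
        free_at = heapq.heappop(channels)
        heapq.heappush(channels, free_at + t)
    return max(channels)

# Aggregating five large file transfers over two lightpaths:
print(list_schedule([300, 200, 200, 100, 100], n_channels=2))  # → 500.0
```

    For this instance the greedy result (500) is in fact optimal, since no subset of the transfer times sums to the ideal balanced load of 450.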

  16. Liaison Officer Toolkit

    DTIC Science & Technology

    2010-01-01

    DSCA Handbook: Liaison Officer Toolkit (GTA 90-01-021). Approved for public release; distribution is unlimited. Chapter 5 provides DSCA planning factors for response to all-hazard events; Chapter 6 reviews safety and operational/composite risk management. The toolkit's high-demand task listings cover personnel such as pharmacists, epidemiologists, safety officers, and logisticians.

  17. The Hybrid Toolkit

    SciTech Connect

    Davis IV, Warren L; Dunlavy, Danny; Nebergall, Christopher; Wylie, Brian N; Ingram, Joey Burton; Letter, Matthew; Fabian, Nathan D; Choudhury, Roni; Baumes, Jeff

    2016-09-11

    The Hybrid Toolkit facilitates moving research algorithms into a production environment by creating useful abstractions that separate analytics developers from the intricacies of the production data formats, data flows, and result representations. The toolkit also assists developers with activities such as creating analysis feature vectors, converting between data structures, and creating data pipelines.
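    The abstractions described (separating analytics code from production data formats via conversion stages and data pipelines) might be sketched like this; all stage names and record fields below are hypothetical, not the Hybrid Toolkit's actual API:

```python
def pipeline(*stages):
    """Compose stages left-to-right into a single callable."""
    def run(record):
        for stage in stages:
            record = stage(record)
        return record
    return run

# Hypothetical stages: parse a production record, then build a feature vector.
def parse(raw):
    fields = raw.split(",")
    return {"user": fields[0], "bytes": int(fields[1])}

def featurize(rec):
    return [len(rec["user"]), rec["bytes"]]

analyze = pipeline(parse, featurize)
print(analyze("alice,4096"))  # → [5, 4096]
```

    The analytics developer writes only `featurize`; the format-specific `parse` stage can be swapped without touching the analysis code, which is the separation the abstract describes.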

  18. Teacher Quality Toolkit

    ERIC Educational Resources Information Center

    Lauer, Patricia A.; Dean, Ceri B.

    2004-01-01

    This Teacher Quality Toolkit aims to support the continuum of teacher learning by providing tools that institutions of higher education, districts, and schools can use to improve both preservice and inservice teacher education. The toolkit incorporates McREL's accumulated knowledge and experience related to teacher quality and standards-based…

  19. Student Success Center Toolkit

    ERIC Educational Resources Information Center

    Jobs For the Future, 2014

    2014-01-01

    "Student Success Center Toolkit" is a compilation of materials organized to assist Student Success Center directors as they staff, launch, operate, and sustain Centers. The toolkit features materials created and used by existing Centers, such as staffing and budgeting templates, launch materials, sample meeting agendas, and fundraising…

  20. TOOLKIT, Version 2.0

    SciTech Connect

    Schroeder, E.; Bagot, B.; McNeill, R.L.

    1990-05-09

    The purpose of this User's Guide is to show by example many of the features of Toolkit II. Some examples will be copies of screens as they appear while running the Toolkit. Other examples will show what the user should enter in various situations; in these instances, what the computer asserts will be in boldface and what the user responds will be in regular type. The User's Guide is divided into four sections. The first section, "FOCUS Databases", will give a broad overview of the Focus administrative databases that are available on the VAX; easy-to-use reports are available for most of them in the Toolkit. The second section, "Getting Started", will cover the steps necessary to log onto the Computer Center VAX cluster and how to start Focus and the Toolkit. The third section, "Using the Toolkit", will discuss some of the features in the Toolkit -- the available reports and how to access them, as well as some utilities. The fourth section, "Helpful Hints", will cover some useful facts about the VAX and Focus as well as some of the more common problems that can occur. The Toolkit is not set in concrete but is continually being revised and improved. If you have any opinions as to changes that you would like to see made to the Toolkit or new features that you would like included, please let us know. Since we do try to respond to the needs of the user and make periodic improvements to the Toolkit, this User's Guide may not correspond exactly to what is available in the computer. In general, changes are made to provide new options or features; rarely is an existing feature deleted.

  1. Pybel: a Python wrapper for the OpenBabel cheminformatics toolkit.

    PubMed

    O'Boyle, Noel M; Morley, Chris; Hutchison, Geoffrey R

    2008-03-09

    Scripting languages such as Python are ideally suited to common programming tasks in cheminformatics such as data analysis and parsing information from files. However, for reasons of efficiency, cheminformatics toolkits such as the OpenBabel toolkit are often implemented in compiled languages such as C++. We describe Pybel, a Python module that provides access to the OpenBabel toolkit. Pybel wraps the direct toolkit bindings to simplify common tasks such as reading and writing molecular files and calculating fingerprints. Extensive use is made of Python iterators to simplify loops such as that over all the molecules in a file. A Pybel Molecule can be easily interconverted to an OpenBabel OBMol to access those methods or attributes not wrapped by Pybel. Pybel allows cheminformaticians to rapidly develop Python scripts that manipulate chemical information. It is open source, available cross-platform, and offers the power of the OpenBabel toolkit to Python programmers.

  2. Pybel: a Python wrapper for the OpenBabel cheminformatics toolkit

    PubMed Central

    O'Boyle, Noel M; Morley, Chris; Hutchison, Geoffrey R

    2008-01-01

    Background Scripting languages such as Python are ideally suited to common programming tasks in cheminformatics such as data analysis and parsing information from files. However, for reasons of efficiency, cheminformatics toolkits such as the OpenBabel toolkit are often implemented in compiled languages such as C++. We describe Pybel, a Python module that provides access to the OpenBabel toolkit. Results Pybel wraps the direct toolkit bindings to simplify common tasks such as reading and writing molecular files and calculating fingerprints. Extensive use is made of Python iterators to simplify loops such as that over all the molecules in a file. A Pybel Molecule can be easily interconverted to an OpenBabel OBMol to access those methods or attributes not wrapped by Pybel. Conclusion Pybel allows cheminformaticians to rapidly develop Python scripts that manipulate chemical information. It is open source, available cross-platform, and offers the power of the OpenBabel toolkit to Python programmers. PMID:18328109

  3. Hydropower RAPID Toolkit

    SciTech Connect

    2016-12-01

    This fact sheet provides a brief overview of the U.S. Department of Energy (DOE) Hydropower Regulatory and Permitting Information Desktop (RAPID) Toolkit including its capabilities, features, and benefits.

  4. Emergency care toolkits.

    PubMed

    Black, Steven

    2004-06-01

    Emergency care services are the focus of a series of toolkits developed by the NHS National electronic Library for Health to provide resources for emergency care leads and others involved in modernising emergency care, writes Steven Black.

  5. System Design Toolkit for Integrated Modular Avionics for Space

    NASA Astrophysics Data System (ADS)

    Hann, Mark; Balbastre Betoret, Patricia; Simo Ten, Jose Enrique; De Ferluc, Regis; Ramachandran, Jinesh

    2015-09-01

    The IMA-SP development process identified that tools were needed to perform the activities of (i) partitioning and resource allocation and (ii) system feasibility assessment. This paper describes the definition, design, implementation, and testing of the tool support required to perform the IMA-SP development process activities. This includes the definition of a data model, with associated files and file formats, describing the complete setup of a partitioned system and allowing system feasibility assessment; the development of a prototype tool set, called the IMA-SP System Design Toolkit (SDT); and the demonstration of the toolkit on a case study.

  6. JAVA Stereo Display Toolkit

    NASA Technical Reports Server (NTRS)

    Edmonds, Karina

    2008-01-01

    This toolkit provides a common interface for displaying graphical user interface (GUI) components in stereo using either specialized stereo display hardware (e.g., liquid crystal shutter or polarized glasses) or anaglyph display (red/blue glasses) on standard workstation displays. An application using this toolkit will work without modification in either environment, allowing stereo software to reach a wider audience without sacrificing high-quality display on dedicated hardware. The toolkit is written in Java for use with the Swing GUI toolkit and has cross-platform compatibility. It hooks into the graphics system, allowing any standard Swing component to be displayed in stereo. It uses the OpenGL graphics library to control the stereo hardware and to perform the rendering. It supports anaglyph and special stereo hardware through the same API (application-program interface), and can simulate color stereo in anaglyph mode by combining the red band of the left image with the green/blue bands of the right image. This is a low-level toolkit that handles only the display of components (including the JadeDisplay image display component). It does not include higher-level functions such as disparity adjustment, a 3D cursor, or overlays, all of which can be built using this toolkit.
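    The anaglyph trick described (the red band from the left-eye image combined with the green/blue bands of the right-eye image) is easy to express outside Java as well; a pure-Python sketch over (R, G, B) pixel tuples:

```python
def anaglyph(left, right):
    """Combine stereo frames: red from the left eye, green/blue from the right.

    `left` and `right` are same-shaped 2-D grids of (r, g, b) tuples.
    """
    return [
        [(l[0], r[1], r[2]) for l, r in zip(lrow, rrow)]
        for lrow, rrow in zip(left, right)
    ]

left = [[(255, 0, 0), (10, 20, 30)]]
right = [[(0, 255, 255), (40, 50, 60)]]
print(anaglyph(left, right))  # → [[(255, 255, 255), (10, 50, 60)]]
```

    Viewed through red/blue glasses, each eye then sees only the bands taken from its own image, which is what produces the depth effect on an ordinary display.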

  7. LAIT: a local ancestry inference toolkit.

    PubMed

    Hui, Daniel; Fang, Zhou; Lin, Jerome; Duan, Qing; Li, Yun; Hu, Ming; Chen, Wei

    2017-09-06

    Inferring local ancestry in individuals of mixed ancestry has many applications, most notably in identifying disease-susceptible loci that vary among different ethnic groups. Many software packages are available for inferring local ancestry in admixed individuals. However, most of these existing packages require specifically formatted input files and generate output files in various formats, which is inconvenient in practice. We developed a tool set, the Local Ancestry Inference Toolkit (LAIT), which can convert standardized files into software-specific input file formats, as well as standardize and summarize inference results, for four popular local ancestry inference packages: HAPMIX, LAMP, LAMP-LD, and ELAI. We tested LAIT using both simulated and real data sets and demonstrated that it makes running multiple local ancestry inference packages convenient. In addition, we evaluated the performance of the supported packages, focusing mainly on inference accuracy and the computational resources used. We provide a toolkit to facilitate the use of local ancestry inference software, especially for users with limited bioinformatics background.
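    The conversion layer such a toolkit provides can be sketched as a dispatch table mapping one standardized record to per-tool input formats. The record fields and the two output layouts below are invented for illustration; they are not the actual HAPMIX/LAMP file formats:

```python
# One standardized record: (chromosome, position, genotype)
record = ("chr1", 12345, "AG")

def to_tool_a(rec):
    chrom, pos, geno = rec
    return f"{chrom}\t{pos}\t{geno}"          # hypothetical tab-separated layout

def to_tool_b(rec):
    chrom, pos, geno = rec
    return f"{chrom}:{pos} {'/'.join(geno)}"  # hypothetical colon-separated layout

CONVERTERS = {"tool_a": to_tool_a, "tool_b": to_tool_b}

def convert(rec, tool):
    """Write the same standardized record in a tool-specific format."""
    return CONVERTERS[tool](rec)

print(convert(record, "tool_b"))  # → chr1:12345 A/G
```

    Adding support for a new inference tool then means adding one converter function rather than reformatting the data by hand.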

  8. The Einstein Toolkit

    NASA Astrophysics Data System (ADS)

    Löffler, Frank

    2012-03-01

    The Einstein Toolkit Consortium is developing and supporting open software for relativistic astrophysics. Its aim is to provide the core computational tools that can enable new science, broaden our community, facilitate interdisciplinary research, and take advantage of petascale computers and advanced cyberinfrastructure. The Einstein Toolkit currently consists of an open set of over 100 modules for the Cactus framework, primarily for computational relativity, along with associated tools for simulation management and visualization. The toolkit includes solvers for vacuum spacetimes as well as relativistic magneto-hydrodynamics, along with modules for initial data, analysis, and computational infrastructure. These modules have been developed and improved over many years by many different researchers. The Einstein Toolkit is supported by a distributed model, combining core support of software, tools, and documentation in its own repositories and through partnerships with other developers who contribute open software and coordinate together on development. As of January 2012 it has 68 registered members from 30 research groups world-wide. This talk will present the current capabilities of the Einstein Toolkit and point to information on how to leverage it for future research.

  9. Knowledge information management toolkit and method

    DOEpatents

    Hempstead, Antoinette R.; Brown, Kenneth L.

    2006-08-15

    A system is provided for managing user entry and/or modification of knowledge information into a knowledge base file having an integrator support component and a data source access support component. The system includes processing circuitry, memory, a user interface, and a knowledge base toolkit. The memory communicates with the processing circuitry and is configured to store at least one knowledge base. The user interface communicates with the processing circuitry and is configured for user entry and/or modification of knowledge pieces within a knowledge base. The knowledge base toolkit is configured for converting knowledge in at least one knowledge base from a first knowledge base form into a second knowledge base form. A method is also provided.

  10. THE EPANET PROGRAMMER'S TOOLKIT FOR ANALYSIS OF WATER DISTRIBUTION SYSTEMS

    EPA Science Inventory

    The EPANET Programmer's Toolkit is a collection of functions that helps simplify computer programming of water distribution network analyses. The functions can be used to read in a pipe network description file, modify selected component properties, run multiple hydraulic and wa...

  12. Water Security Toolkit

    SciTech Connect

    2012-09-11

    The Water Security Toolkit (WST) provides software for modeling and analyzing water distribution systems to minimize the potential impact of contamination incidents. WST wraps capabilities for contaminant transport, impact assessment, and sensor network design with response action plans, including source identification, rerouting, and decontamination, to provide a range of water security planning and real-time applications.

  13. Graphics database creation and manipulation: HyperCard Graphics Database Toolkit and Apple Graphics Source

    NASA Astrophysics Data System (ADS)

    Herman, Jeffrey; Fry, David

    1990-08-01

    Because graphic files can be stored in a number of different file formats, it has traditionally been difficult to create a graphics database from which users can open, copy, and print graphic files, where each file in the database may be in one of several different formats. HyperCard Graphics Database Toolkit has been designed and written by Apple Computer to enable software developers to facilitate the creation of customized graphics databases. Using a database developed with the toolkit, users can open, copy, or print a graphic transparently, without having to know or understand the complexities of file formats. In addition, the toolkit includes a graphic user interface, graphic design, on-line help, and search algorithms that enable users to locate specific graphics quickly. Currently, the toolkit handles graphics in the MPNT, PICT, and EPSF formats, and is open to supporting other formats as well. Developers can use the toolkit to alter the code, the interface, and the graphic design in order to customize their database for the needs of their users. This paper discusses the structure of the toolkit and one implementation, Apple Graphics Source (AGS). AGS contains over 2,000 graphics used in Apple's books and manuals. AGS enables users to find existing graphics of Apple products and use them for presentations, new publications, papers, and software projects.

  14. Livermore Big Artificial Neural Network Toolkit

    SciTech Connect

    Essen, Brian Van; Jacobs, Sam; Kim, Hyojin; Dryden, Nikoli; Moon, Tim

    2016-07-01

    LBANN is a toolkit designed to train artificial neural networks efficiently on high-performance computing architectures. It is optimized to take advantage of key high-performance computing features to accelerate neural network training: specifically, low-latency, high-bandwidth interconnects, node-local NVRAM, node-local GPU accelerators, and high-bandwidth parallel file systems. It is built on top of the open-source Elemental distributed-memory dense and sparse-direct linear algebra and optimization library, which is released under the BSD license. The algorithms contained within LBANN are drawn from the academic literature and implemented to work within a distributed-memory framework.
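
    The reason low-latency interconnects matter for distributed training is that data-parallel nodes must exchange and average gradients every step. A toy single-process sketch of that core operation (illustrative only, not LBANN's API):

```python
def average_gradients(per_node_grads):
    """Average a list of gradient vectors, one per compute node.

    In a real system this is an allreduce over the interconnect; here the
    'nodes' are just lists in one process.
    """
    n = len(per_node_grads)
    dim = len(per_node_grads[0])
    return [sum(g[i] for g in per_node_grads) / n for i in range(dim)]

def sgd_step(weights, grads, lr=0.1):
    """Apply one plain SGD update with the averaged gradient."""
    return [w - lr * g for w, g in zip(weights, grads)]
```

    Because every node must wait for this exchange before its next step, communication latency sits directly on the critical path of training.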

  15. Parallel Climate Analysis Toolkit (ParCAT)

    SciTech Connect

    Smith, Brian Edward

    2013-06-30

    The Parallel Climate Analysis Toolkit (ParCAT) provides parallel statistical processing of large climate model simulation datasets. ParCAT provides parallel point-wise average calculations, frequency distributions, sums/differences of two datasets, and difference-of-average and average-of-difference for two datasets over arbitrary subsets of simulation time. ParCAT is a command-line utility that can be easily integrated into scripts or embedded in other applications. ParCAT supports both CMIP5 and non-CMIP5 post-processed datasets, and reads and writes standard netCDF files.
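
    "Point-wise" statistics reduce over time while preserving the spatial grid. A minimal serial sketch of the idea, with plain nested lists standing in for the netCDF variables ParCAT actually reads:

```python
def pointwise_mean(timesteps):
    """Average a sequence of 2-D grids point by point over time."""
    n = len(timesteps)
    rows, cols = len(timesteps[0]), len(timesteps[0][0])
    return [[sum(t[r][c] for t in timesteps) / n for c in range(cols)]
            for r in range(rows)]

def pointwise_diff(grid_a, grid_b):
    """Point-wise difference of two grids (e.g. two model runs)."""
    return [[x - y for x, y in zip(ra, rb)] for ra, rb in zip(grid_a, grid_b)]
```

    ParCAT's contribution is doing exactly this kind of reduction in parallel across files far too large for one node's memory.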

  16. ALMA Data Mining Toolkit

    NASA Astrophysics Data System (ADS)

    Friedel, Douglas; Looney, Leslie; Teuben, Peter J.; Pound, Marc W.; Rauch, Kevin P.; Mundy, Lee; Harris, Robert J.; Xu, Lisa

    2016-06-01

    ADMIT (ALMA Data Mining Toolkit) is a Python-based pipeline toolkit for the creation and analysis of new science products from ALMA data. ADMIT quickly provides users with a detailed overview of their science products, for example: line identifications, line 'cutout' cubes, moment maps, and emission type analysis (e.g., feature detection). Users can download the small ADMIT pipeline product (< 20 MB), analyze the results, then fine-tune and re-run the ADMIT pipeline (or any part thereof) on their own machines and interactively inspect the results. ADMIT has both a web browser and a command-line interface available for this purpose. By analyzing multiple data cubes simultaneously, data mining between many astronomical sources and line transitions is possible. Users can also enhance the capabilities of ADMIT by creating customized ADMIT tasks to satisfy special processing needs. We will present some of the salient features of ADMIT and example use cases.

  17. Network algorithms for information analysis using the Titan Toolkit.

    SciTech Connect

    McLendon, William Clarence, III; Baumes, Jeffrey; Wilson, Andrew T.; Wylie, Brian Neil; Shead, Timothy M.

    2010-07-01

    The analysis of networked activities is dramatically more challenging than many traditional kinds of analysis. A network is defined by a set of entities (people, organizations, banks, computers, etc.) linked by various types of relationships. These entities and relationships are often uninteresting alone, and only become significant in aggregate. The analysis and visualization of these networks is one of the driving factors behind the creation of the Titan Toolkit. Given the broad set of problem domains and the wide ranging databases in use by the information analysis community, the Titan Toolkit's flexible, component based pipeline provides an excellent platform for constructing specific combinations of network algorithms and visualizations.
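
    The point that relationships "only become significant in aggregate" can be made concrete with a few lines of counting. A sketch under invented data (the entities, relationship types, and thresholds here are illustrative, not Titan's API):

```python
from collections import Counter

def aggregate_edges(relationships):
    """Count how often each (entity, entity) pair is linked, ignoring type.

    relationships: iterable of (source, relationship_type, target) triples.
    """
    return Counter((a, b) for a, _rel_type, b in relationships)

def hubs(relationships, min_degree):
    """Entities that participate in at least min_degree relationships."""
    degree = Counter()
    for a, _rel_type, b in relationships:
        degree[a] += 1
        degree[b] += 1
    return {e for e, d in degree.items() if d >= min_degree}
```

    A single payment or phone call is unremarkable; counts and degrees computed over the whole network are where the analytic signal lives, and Titan's pipeline components compose such aggregations with visualization.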

  18. The Weather and Climate Toolkit

    NASA Astrophysics Data System (ADS)

    Ansari, S.; Del Greco, S.; Hankins, B.

    2010-12-01

    The Weather and Climate Toolkit (WCT) is free, platform independent software distributed from NOAA’s National Climatic Data Center (NCDC). The WCT allows the visualization and data export of weather and climate data, including Radar, Satellite and Model data. By leveraging the NetCDF for Java library and Common Data Model, the WCT is extremely scalable and capable of supporting many new datasets in the future. Gridded NetCDF files (regular and irregularly spaced, using Climate-Forecast (CF) conventions) are supported, along with many other formats including GRIB. The WCT provides tools for custom data overlays, Web Map Service (WMS) background maps, animations and basic filtering. The export of images and movies is provided in multiple formats. The WCT Data Export Wizard allows for data export in both vector polygon/point (Shapefile, Well-Known Text) and raster (GeoTIFF, ESRI Grid, VTK, Gridded NetCDF) formats. These data export features promote the interoperability of weather and climate information with various scientific communities and common software packages such as ArcGIS, Google Earth, MatLAB, GrADS and R. The WCT also supports an embedded, integrated Google Earth instance. The Google Earth Browser Plugin allows seamless visualization of data on a native 3-D Google Earth instance linked to the standard 2-D map. (Figure: Level-II NEXRAD data for Hurricane Katrina and GPCP (Global Precipitation Product) data, visualized in 2-D and in the internal Google Earth view.)

  19. Practical computational toolkits for dendrimers and dendrons structure design.

    PubMed

    Martinho, Nuno; Silva, Liana C; Florindo, Helena F; Brocchini, Steve; Barata, Teresa; Zloh, Mire

    2017-09-15

    Dendrimers and dendrons offer an excellent platform for developing novel drug delivery systems and medicines. The rational design and further development of these repetitively branched systems are restricted by difficulties in scalable synthesis and structural determination, which can be overcome by judicious use of molecular modelling and molecular simulations. A major difficulty to utilise in silico studies to design dendrimers lies in the laborious generation of their structures. Current modelling tools utilise automated assembly of simpler dendrimers or the inefficient manual assembly of monomer precursors to generate more complicated dendrimer structures. Herein we describe two novel graphical user interface toolkits written in Python that provide an improved degree of automation for rapid assembly of dendrimers and generation of their 2D and 3D structures. Our first toolkit uses the RDkit library, SMILES nomenclature of monomers and SMARTS reaction nomenclature to generate SMILES and mol files of dendrimers without 3D coordinates. These files are used for simple graphical representations and storing their structures in databases. The second toolkit assembles complex topology dendrimers from monomers to construct 3D dendrimer structures to be used as starting points for simulation using existing and widely available software and force fields. Both tools were validated for ease-of-use to prototype dendrimer structure and the second toolkit was especially relevant for dendrimers of high complexity and size.

  20. Flow Analysis Software Toolkit

    NASA Technical Reports Server (NTRS)

    Watson, Velvin; Castagnera, Karen; Plessel, Todd; Merritt, Fergus; Kelaita, Paul; West, John; Sandstrom, Tim; Clucas, Jean; Globus, AL; Bancroft, Gordon; hide

    1993-01-01

    Flow Analysis Software Toolkit (FAST) computer program provides software environment facilitating visualization of data. Collection of separate programs (modules) running simultaneously helps user examine results of numerical and experimental simulations. Intended for graphical depiction of computed flows; also assists in analysis of other types of data. Combines capabilities of such programs as PLOT3D, RIP, SURF, and GAS into one software environment with modules sharing data. All modules have consistent, highly interactive graphical user interface. Modular construction makes it flexible and extensible. Environment custom-configured, and new modules developed and added as needed. Written in ANSI-compliant FORTRAN 77 and C language.

  1. Multiphysics Application Coupling Toolkit

    SciTech Connect

    Campbell, Michael T.

    2013-12-02

    This particular consortium implementation of the software integration infrastructure will, in large part, refactor portions of the Rocstar multiphysics infrastructure. Development of this infrastructure originated at the University of Illinois DOE ASCI Center for Simulation of Advanced Rockets (CSAR) to support the center's massively parallel multiphysics simulation application, Rocstar, and has continued at IllinoisRocstar, a small company formed near the end of the University-based program. IllinoisRocstar is now licensing these new developments as free, open-source software, in the hope of improving its own and others' access to infrastructure that can be readily utilized in developing coupled or composite software systems, with particular attention to more rapid production and utilization of multiphysics applications in the HPC environment. There are two major pieces to the consortium implementation: the Application Component Toolkit (ACT) and the Multiphysics Application Coupling Toolkit (MPACT). The current development focus is the ACT, which is (and will be) the substrate for MPACT. The ACT itself is built up from the components described in the technical approach. In particular, the ACT has the following major components: 1. The Component Object Manager (COM): the COM package provides encapsulation of user applications and their data, and also provides the inter-component function call mechanism. 2. The System Integration Manager (SIM): the SIM package provides constructs and mechanisms for orchestrating composite systems of multiply integrated pieces.

  2. The MIS Pipeline Toolkit

    NASA Astrophysics Data System (ADS)

    Teuben, Peter J.; Pound, M. W.; Storm, S.; Mundy, L. G.; Salter, D. M.; Lee, K.; Kwon, W.; Fernandez Lopez, M.; Plunkett, A.

    2013-01-01

    A pipeline toolkit was developed to help organize, reduce, and analyze a large number of near-identical datasets. This is a very general problem for which many different solutions have been implemented. In this poster we present one such solution that lends itself to users of the Unix command line, built on the Unix "make" utility, and adapts easily to observational as well as theoretical projects. Two examples are given, one from the CARMA CLASSy survey and another from a simulated kinematic survey of early galaxy-forming disks. The CLASSy survey (discussed in more detail in three accompanying posters) consists of 5 different star-forming regions observed with CARMA, each containing roughly 10-20 datasets in continuum and 3 different molecular lines that need to be combined into final data cubes and maps. The strength of such a pipeline toolkit shows itself as new data are accumulated: the data reduction steps are improved and easily re-applied to previously taken data. For this we employed a master script that was run nightly, and collaborators submitted improved scripts and/or pipeline parameters that control these scripts. MIS is freely available for download.

  3. NAIF Toolkit - Extended

    NASA Technical Reports Server (NTRS)

    Acton, Charles H., Jr.; Bachman, Nathaniel J.; Semenov, Boris V.; Wright, Edward D.

    2010-01-01

    The Navigation and Ancillary Information Facility (NAIF) at JPL, acting under the direction of NASA's Office of Space Science, has built a data system named SPICE (Spacecraft, Planet, Instrument, C-matrix, Events) to assist scientists in planning and interpreting scientific observations (see figure). SPICE provides geometric and some other ancillary information needed to recover the full value of science instrument data, including correlation of individual instrument data sets with data from other instruments on the same or other spacecraft. This data system is used to produce space mission observation geometry data sets known as SPICE kernels. It is also used to read SPICE kernels and to compute derived quantities such as positions, orientations, lighting angles, etc. The SPICE toolkit consists of a subroutine/function library, executable programs (both large applications and simple utilities that focus on kernel management), and simple examples of using SPICE toolkit subroutines. This software is very accurate, thoroughly tested, and portable to all computers. It is extremely stable and reusable on all missions. Since the previous version, three significant capabilities have been added: an Interactive Data Language (IDL) interface, a MATLAB interface, and a geometric event finder subsystem.

  4. Mission Simulation Toolkit

    NASA Technical Reports Server (NTRS)

    Pisaich, Gregory; Flueckiger, Lorenzo; Neukom, Christian; Wagner, Mike; Buchanan, Eric; Plice, Laura

    2007-01-01

    The Mission Simulation Toolkit (MST) is a flexible software system for autonomy research. It was developed as part of the Mission Simulation Facility (MSF) project that was started in 2001 to facilitate the development of autonomous planetary robotic missions. Autonomy is a key enabling factor for robotic exploration. There has been a large gap between autonomy software (at the research level), and software that is ready for insertion into near-term space missions. The MST bridges this gap by providing a simulation framework and a suite of tools for supporting research and maturation of autonomy. MST uses a distributed framework based on the High Level Architecture (HLA) standard. A key feature of the MST framework is the ability to plug in new models to replace existing ones with the same services. This enables significant simulation flexibility, particularly the mixing and control of fidelity level. In addition, the MST provides automatic code generation from robot interfaces defined with the Unified Modeling Language (UML), methods for maintaining synchronization across distributed simulation systems, XML-based robot description, and an environment server. Finally, the MSF supports a number of third-party products including dynamic models and terrain databases. Although the communication objects and some of the simulation components that are provided with this toolkit are specifically designed for terrestrial surface rovers, the MST can be applied to any other domain, such as aerial, aquatic, or space.

  6. Einstein Toolkit for Relativistic Astrophysics

    NASA Astrophysics Data System (ADS)

    Collaborative Effort

    2011-02-01

    The Einstein Toolkit is a collection of software components and tools for simulating and analyzing general relativistic astrophysical systems. Such systems include gravitational wave space-times, collisions of compact objects such as black holes or neutron stars, accretion onto compact objects, core collapse supernovae and Gamma-Ray Bursts. The Einstein Toolkit builds on numerous software efforts in the numerical relativity community including CactusEinstein, Whisky, and Carpet. The Einstein Toolkit currently uses the Cactus Framework as the underlying computational infrastructure that provides large-scale parallelization, general computational components, and a model for collaborative, portable code development.

  7. A Prototype Search Toolkit

    NASA Astrophysics Data System (ADS)

    Knepper, Margaret M.; Fox, Kevin L.; Frieder, Ophir

    Information overload is now a reality. We no longer worry about obtaining a sufficient volume of data; we now are concerned with sifting and understanding the massive volumes of data available to us. To do so, we developed an integrated information processing toolkit that provides the user with a variety of ways to view their information. The views include keyword search results, a domain specific ranking system that allows for adaptively capturing topic vocabularies to customize and focus the search results, navigation pages for browsing, and a geospatial and temporal component to visualize results in time and space, and provide “what if” scenario playing. Integrating the information from different tools and sources gives the user additional information and another way to analyze the data. An example of the integration is illustrated on reports of the avian influenza (bird flu).
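
    The keyword-search view described above reduces, at its simplest, to scoring documents by query-term frequency and sorting. A toy sketch of that ranking step (purely illustrative; the real toolkit adapts topic vocabularies per domain and integrates several other views):

```python
def score(document, query_terms):
    """Count case-insensitive occurrences of the query terms in a document."""
    words = document.lower().split()
    return sum(words.count(term.lower()) for term in query_terms)

def rank(documents, query_terms):
    """Return documents sorted by descending keyword score (stable order)."""
    return sorted(documents, key=lambda d: score(d, query_terms), reverse=True)
```

    The integration point the authors emphasize is that this ranked list is only one view; the same result set also feeds the geospatial/temporal visualization, as in their avian influenza example.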

  8. ParCAT: Parallel Climate Analysis Toolkit

    SciTech Connect

    Smith, Brian E.; Steed, Chad A.; Shipman, Galen M.; Ricciuto, Daniel M.; Thornton, Peter E.; Wehner, Michael; Williams, Dean N.

    2013-01-01

    Climate science is employing increasingly complex models and simulations to analyze the past and predict the future of Earth's climate. This growth in complexity is creating a widening gap between the data being produced and the ability to analyze the datasets. Parallel computing tools are necessary to analyze, compare, and interpret the simulation data. The Parallel Climate Analysis Toolkit (ParCAT) provides basic tools to efficiently use parallel computing techniques to make analysis of these datasets manageable. The toolkit provides the ability to compute spatio-temporal means, differences between runs or differences between averages of runs, and histograms of the values in a data set. ParCAT is implemented as a command-line utility written in C. This allows for easy integration in other tools and allows for use in scripts. This also makes it possible to run ParCAT on many platforms from laptops to supercomputers. ParCAT outputs NetCDF files so it is compatible with existing utilities such as Panoply and UV-CDAT. This paper describes ParCAT and presents results from some example runs on the Titan system at ORNL.
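
    The two comparison statistics mentioned above, differences between averages of runs and averages of differences, coincide for the mean because averaging is linear, which makes a handy sanity check when comparing two model runs. A toy sketch (plain lists standing in for ParCAT's netCDF inputs):

```python
def mean(values):
    return sum(values) / len(values)

def diff_of_averages(run_a, run_b):
    """mean(run_a) - mean(run_b)."""
    return mean(run_a) - mean(run_b)

def average_of_diffs(run_a, run_b):
    """mean of the point-wise differences between two aligned runs."""
    return mean([a - b for a, b in zip(run_a, run_b)])
```

    For nonlinear statistics (variances, histograms) the two orderings genuinely differ, which is why the toolkit exposes both forms.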

  9. Third Party TMDL Development Toolkit

    EPA Pesticide Factsheets

    Water Environment Federation's toolkit provides basic steps in which an organization or group other than the lead water quality agency takes responsibility for developing the TMDL document and supporting analysis.

  10. Lean and Information Technology Toolkit

    EPA Pesticide Factsheets

    The Lean and Information Technology Toolkit is a how-to guide which provides resources to environmental agencies to help them use Lean Startup, Lean process improvement, and Agile tools to streamline and automate processes.

  11. The Lean and Environment Toolkit

    EPA Pesticide Factsheets

    This Lean and Environment Toolkit assembles practical experience collected by the U.S. Environmental Protection Agency (EPA) and partner companies and organizations that have experience with coordinating Lean implementation and environmental management.

  12. Introducing the Ginga FITS Viewer and Toolkit

    NASA Astrophysics Data System (ADS)

    Jeschke, E.; Inagaki, T.; Kackley, R.

    2013-10-01

    We introduce Ginga, a new open-source FITS viewer and toolkit based on Python astronomical packages such as pyfits, numpy, scipy, matplotlib, and pywcs. For developers, we present a set of Python classes for viewing FITS files under the modern Gtk and Qt widget sets and a more full-featured viewer that has a plugin architecture. We further describe how plugins can be written to extend the viewer with many different capabilities. The software may be of interest to software developers who are looking for a solution for integrating FITS visualization into their Python programs and end users interested in a new and different FITS viewer that is not based on Tcl/Tk widget technology. The software has been released under a BSD license.

  13. Simplifying operations with an uplink/downlink integration toolkit

    NASA Technical Reports Server (NTRS)

    Murphy, Susan C.; Miller, Kevin J.; Guerrero, Ana Maria; Joe, Chester; Louie, John J.; Aguilera, Christine

    1994-01-01

    The Operations Engineering Lab (OEL) at JPL has developed a simple, generic toolkit to integrate the uplink/downlink processes (often called "closing the loop") in JPL's Multimission Ground Data System. This toolkit provides capabilities for integrating telemetry verification points with predicted spacecraft commands and ground events in the Mission Sequence Of Events (SOE) document. In the JPL ground data system, the uplink processing functions and the downlink processing functions are separate subsystems that are not well integrated because of the nature of planetary missions with large one-way light times for spacecraft-to-ground communication. Our new closed-loop monitoring tool allows an analyst or mission controller to view and save uplink commands and ground events with their corresponding downlinked telemetry values regardless of the delay in downlink telemetry and without requiring real-time intervention by the user. An SOE document is a time-ordered list of all the planned ground and spacecraft events, including all commands, sequence loads, ground events, significant mission activities, spacecraft status, and resource allocations. The SOE document is generated by expansion and integration of spacecraft sequence files, ground station allocations, navigation files, and other ground event files. This SOE generation process has been automated within the OEL and includes a graphical, object-oriented SOE editor and real-time viewing tool running under X/Motif. The SOE toolkit was used as the framework for the integrated implementation. The SOE is used by flight engineers to coordinate their operations tasks, serving as a predict data set in ground operations and mission control. The closed-loop SOE toolkit allows simple, automated integration of predicted uplink events with correlated telemetry points in a single SOE document for on-screen viewing and archiving. It automatically interfaces with existing real-time or non-real-time sources of information, to

  14. ParCAT: A Parallel Climate Analysis Toolkit

    NASA Astrophysics Data System (ADS)

    Haugen, B.; Smith, B.; Steed, C.; Ricciuto, D. M.; Thornton, P. E.; Shipman, G.

    2012-12-01

    Climate science has employed increasingly complex models and simulations to analyze the past and predict the future of our climate. The size and dimensionality of climate simulation data has been growing with the complexity of the models. This growth in data is creating a widening gap between the data being produced and the tools necessary to analyze large, high-dimensional data sets. With single-run data sets increasing into 10s, 100s, and even 1000s of gigabytes, parallel computing tools are becoming a necessity in order to analyze and compare climate simulation data. The Parallel Climate Analysis Toolkit (ParCAT) provides basic tools that efficiently use parallel computing techniques to narrow the gap between data set size and analysis tools. ParCAT was created as a collaborative effort between climate scientists and computer scientists in order to provide efficient parallel implementations of the computing tools that are of use to climate scientists. Some of the basic functionalities included in the toolkit are the ability to compute spatio-temporal means and variances, differences between two runs, and histograms of the values in a data set. ParCAT is designed to facilitate the "heavy lifting" that is required for large, multidimensional data sets. The toolkit does not focus on performing the final visualizations and presentation of results but rather on reducing large data sets to smaller, more manageable summaries. The output from ParCAT is provided in commonly used file formats (NetCDF, CSV, ASCII) to allow for simple integration with other tools. The toolkit is currently implemented as a command-line utility, but will likely also provide a C library for developers interested in tighter software integration. Elements of the toolkit are already being incorporated into projects such as UV-CDAT and CMDX. There is also an effort underway to implement portions of the CCSM Land Model Diagnostics package using ParCAT in conjunction with Python and gnuplot.

  15. BIT: Biosignal Igniter Toolkit.

    PubMed

    da Silva, Hugo Plácido; Lourenço, André; Fred, Ana; Martins, Raúl

    2014-06-01

    The study of biosignals has had a transforming role in multiple aspects of our society, which go well beyond the health sciences domains with which they have traditionally been associated. While biomedical engineering is a classical discipline where the topic is amply covered, today biosignals are a matter of interest for students, researchers and hobbyists in areas including computer science, informatics, and electrical engineering, among others. Regardless of the context, the use of biosignals in experimental activities and practical projects is heavily constrained by cost and by limited access to adequate support materials. In this paper we present an accessible yet versatile toolkit, composed of low-cost hardware and software, which was created to reinforce the engagement of different people in the field of biosignals. The hardware consists of a modular wireless biosignal acquisition system that can be used to support classroom activities, interface with other devices, or perform rapid prototyping of end-user applications. The software comprises a set of programming APIs, a biosignal processing toolbox, and a framework for real-time data acquisition and postprocessing.
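
    A typical first step in such a biosignal processing toolbox is smoothing a noisy sampled signal. A minimal causal moving-average filter in that spirit (illustrative only, not BIT's actual API):

```python
def moving_average(signal, window):
    """Smooth a sampled signal causally.

    Each output sample is the mean of the last `window` input samples
    (fewer at the start of the signal, where the window is still filling).
    """
    out = []
    for i in range(len(signal)):
        lo = max(0, i - window + 1)
        chunk = signal[lo:i + 1]
        out.append(sum(chunk) / len(chunk))
    return out
```

    Causality (using only past samples) matters here because the framework targets real-time acquisition, where future samples are not yet available.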

  16. Pizza.py Toolkit

    SciTech Connect

    Plimpton, Steve; Jones, Matt; Crozier, Paul

    2006-01-01

    Pizza.py is a loosely integrated collection of tools, many of which provide support for the LAMMPS molecular dynamics and ChemCell cell modeling packages. There are tools to create input files, convert between file formats, process log and dump files, create plots, and visualize and animate simulation snapshots. Software packages that are wrapped by Pizza.py, so they can be invoked from within Python, include GnuPlot, MatLab, Raster3D, and RasMol. Pizza.py is written in Python and runs on any platform that supports Python. Pizza.py enhances the standard Python interpreter in a few simple ways. Its tools are Python modules which can be invoked interactively, from scripts, or from GUIs when appropriate. Some of the tools require additional Python packages to be installed as part of the user's Python. Others are wrappers on software packages (as listed above) which must be available on the user's system. It is easy to modify or extend Pizza.py with new functionality or new tools, which need not have anything to do with LAMMPS or ChemCell.
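
    A "tool as a Python module" in this style is often just a small class that parses one file format and exposes its data for interactive or scripted use. A hypothetical sketch of a log-processing tool of that shape (the format and class below are invented for illustration, not Pizza.py's actual log tool):

```python
class Log:
    """Parse whitespace-separated numeric log text with a header row.

    Usable interactively (log.get("Temp")), from scripts, or from a GUI,
    which is the usage pattern Pizza.py tools follow.
    """

    def __init__(self, text):
        lines = [line for line in text.splitlines() if line.strip()]
        self.names = lines[0].split()            # column names from header
        rows = [[float(v) for v in line.split()] for line in lines[1:]]
        self.columns = {name: [row[i] for row in rows]
                        for i, name in enumerate(self.names)}

    def get(self, name):
        """Return one named column as a list of floats."""
        return self.columns[name]
```

    Because the tool is an ordinary module, extending the collection means adding another such class; nothing ties the mechanism to LAMMPS or ChemCell.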

  17. A Scalable Analysis Toolkit

    NASA Technical Reports Server (NTRS)

    Aiken, Alexander

    2001-01-01

    The Scalable Analysis Toolkit (SAT) project aimed to demonstrate that it is feasible and useful to statically detect software bugs in very large systems. The technical focus of the project was on a relatively new class of constraint-based techniques for software analysis, where the desired facts about programs (e.g., the presence of a particular bug) are phrased as constraint problems to be solved. At the beginning of this project, the most successful forms of formal software analysis were limited forms of automatic theorem proving (as exemplified by the analyses used in language type systems and optimizing compilers), semi-automatic theorem proving for full verification, and model checking. With a few notable exceptions these approaches had not been demonstrated to scale to software systems of even 50,000 lines of code. Realistic approaches to large-scale software analysis cannot hope to make every conceivable formal method scale. Thus, the SAT approach is to mix different methods in one application by using coarse and fast but still adequate methods at the largest scales, and reserving the use of more precise but also more expensive methods at smaller scales for critical aspects (that is, aspects critical to the analysis problem under consideration) of a software system. The principled method proposed for combining a heterogeneous collection of formal systems with different scalability characteristics is mixed constraints. This idea had been used previously in small-scale applications with encouraging results: using mostly coarse methods and narrowly targeted precise methods, useful information (meaning the discovery of bugs in real programs) was obtained with excellent scalability.
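
    A classic example of a "coarse and fast but still adequate" constraint method is solving equality constraints with union-find, as in near-linear-time alias analyses. The sketch below is a generic illustration of that technique, not SAT's mixed-constraints implementation:

```python
class UnionFind:
    """Solve equality constraints: union(a, b) asserts a = b."""

    def __init__(self):
        self.parent = {}

    def find(self, x):
        """Representative of x's equivalence class, with path halving."""
        self.parent.setdefault(x, x)
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]
            x = self.parent[x]
        return x

    def union(self, a, b):
        self.parent[self.find(a)] = self.find(b)

    def same(self, a, b):
        """Do the accumulated constraints force a = b?"""
        return self.find(a) == self.find(b)
```

    For example, asserting p = q and q = r lets the analysis conclude cheaply that p and r may refer to the same storage; the precision lost by collapsing whole equivalence classes is what the more expensive methods recover at smaller scales.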

  18. ProtoMD: A prototyping toolkit for multiscale molecular dynamics

    NASA Astrophysics Data System (ADS)

    Somogyi, Endre; Mansour, Andrew Abi; Ortoleva, Peter J.

    2016-05-01

    ProtoMD is a toolkit that facilitates the development of algorithms for multiscale molecular dynamics (MD) simulations. It is designed for multiscale methods which capture the dynamic transfer of information across multiple spatial scales, such as the atomic to the mesoscopic scale, via coevolving microscopic and coarse-grained (CG) variables. ProtoMD can also be used to calibrate parameters needed in traditional CG-MD methods. The toolkit integrates 'GROMACS wrapper' to initiate MD simulations, and 'MDAnalysis' to analyze and manipulate trajectory files. It facilitates experimentation with a spectrum of coarse-grained variables, prototyping rare events (such as chemical reactions), or simulating nanocharacterization experiments such as terahertz spectroscopy, AFM, nanopore, and time-of-flight mass spectroscopy. ProtoMD is written in Python and is freely available under the GNU General Public License from github.com/CTCNano/proto_md.
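
    The microscopic-to-coarse-grained mapping at the heart of such multiscale methods can be as simple as collapsing groups of atoms onto center-of-mass beads. A minimal sketch of that step (illustrative only; ProtoMD delegates real trajectory handling to MDAnalysis, and the data layout here is invented):

```python
def center_of_mass(positions, masses):
    """Mass-weighted average of a list of 3-D positions."""
    total = sum(masses)
    return tuple(sum(m * p[k] for p, m in zip(positions, masses)) / total
                 for k in range(3))

def coarse_grain(atoms, groups):
    """Map atoms to coarse-grained beads.

    atoms:  {name: (position_tuple, mass)}
    groups: list of lists of atom names; each group becomes one bead.
    """
    beads = []
    for group in groups:
        positions = [atoms[name][0] for name in group]
        masses = [atoms[name][1] for name in group]
        beads.append(center_of_mass(positions, masses))
    return beads
```

    The "coevolving" part of the method is then integrating these CG variables forward in time alongside the atomistic ones, which is what the toolkit lets algorithm developers prototype.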

  19. CRAVAT: cancer-related analysis of variants toolkit

    PubMed Central

    Douville, Christopher; Carter, Hannah; Kim, Rick; Niknafs, Noushin; Diekhans, Mark; Stenson, Peter D.; Cooper, David N.; Ryan, Michael; Karchin, Rachel

    2013-01-01

    Summary: Advances in sequencing technology have greatly reduced the costs incurred in collecting raw sequencing data. Academic laboratories and researchers therefore now have access to very large datasets of genomic alterations but limited time and computational resources to analyse their potential biological importance. Here, we provide a web-based application, Cancer-Related Analysis of Variants Toolkit, designed with an easy-to-use interface to facilitate the high-throughput assessment and prioritization of genes and missense alterations important for cancer tumorigenesis. Cancer-Related Analysis of Variants Toolkit provides predictive scores for germline variants, somatic mutations and relative gene importance, as well as annotations from published literature and databases. Results are emailed to users as MS Excel spreadsheets and/or tab-separated text files. Availability: http://www.cravat.us/ Contact: karchin@jhu.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:23325621

  20. Sierra Toolkit computational mesh conceptual model.

    SciTech Connect

    Baur, David G.; Edwards, Harold Carter; Cochran, William K.; Williams, Alan B.; Sjaardema, Gregory D.

    2010-03-01

    The Sierra Toolkit computational mesh is a software library intended to support massively parallel multi-physics computations on dynamically changing unstructured meshes. This domain of intended use is inherently complex due to distributed memory parallelism, parallel scalability, heterogeneity of physics, heterogeneous discretization of an unstructured mesh, and runtime adaptation of the mesh. Management of this inherent complexity begins with a conceptual analysis and modeling of this domain of intended use; i.e., development of a domain model. The Sierra Toolkit computational mesh software library is designed and implemented based upon this domain model. Software developers using, maintaining, or extending the Sierra Toolkit computational mesh library must be familiar with the concepts/domain model presented in this report.

  1. WIND Toolkit Offshore Summary Dataset

    DOE Data Explorer

    Draxl, Caroline; Musial, Walt; Scott, George; Phillips, Caleb

    2017-08-18

    This dataset contains summary statistics for offshore wind resources for the continental United States derived from the Wind Integration National Dataset (WIND) Toolkit. These data are available in two formats: GDB - compressed geodatabases containing statistical summaries aligned with lease blocks (aliquots), stored in a GIS format and partitioned into Pacific, Atlantic, and Gulf resource regions; HDF5 - statistical summaries of all points in the Pacific, Atlantic, and Gulf offshore regions, located on the original WIND Toolkit grid and not reassigned or downsampled to lease blocks. These data were developed under contract by NREL for the Bureau of Ocean Energy Management (BOEM).

  2. The MicroWeb Toolkit: Bringing the WWW to the Classroom.

    ERIC Educational Resources Information Center

    Thomson, Judi R.; Cooke, John E.; Greer, Jim E.

    Computer applications that facilitate the use of the World Wide Web (WWW) within elementary and secondary education must provide support for educators to locate materials quickly and easily; they must minimize the amount of time spent waiting for files to traverse the network; and they must deal sensitively with censorship. The MicroWeb Toolkit is…

  3. A Toolkit for Teacher Engagement

    ERIC Educational Resources Information Center

    Grantmakers for Education, 2014

    2014-01-01

    Teachers are critical to the success of education grantmaking strategies, yet in talking with them we discovered that the world of philanthropy is often a mystery. GFE's Toolkit for Teacher Engagement aims to assist funders in authentically and effectively involving teachers in the education reform and innovation process. Built directly from the…

  4. Cinfony – combining Open Source cheminformatics toolkits behind a common interface

    PubMed Central

    O'Boyle, Noel M; Hutchison, Geoffrey R

    2008-01-01

    Background Open Source cheminformatics toolkits such as OpenBabel, the CDK and the RDKit share the same core functionality but support different sets of file formats and forcefields, and calculate different fingerprints and descriptors. Despite their complementary features, using these toolkits in the same program is difficult as they are implemented in different languages (C++ versus Java), have different underlying chemical models and have different application programming interfaces (APIs). Results We describe Cinfony, a Python module that presents a common interface to all three of these toolkits, allowing the user to easily combine methods and results from any of the toolkits. In general, the run time of the Cinfony modules is almost as fast as accessing the underlying toolkits directly from C++ or Java, but Cinfony makes it much easier to carry out common tasks in cheminformatics such as reading file formats and calculating descriptors. Conclusion By providing a simplified interface and improving interoperability, Cinfony makes it easy to combine complementary features of OpenBabel, the CDK and the RDKit. PMID:19055766
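    The interoperability idea behind Cinfony can be sketched as a thin adapter layer: every backend toolkit is wrapped behind one small interface, so calling code never changes when the backend does. The classes below are hypothetical stand-ins, not the real Cinfony, OpenBabel, CDK, or RDKit APIs.

    ```python
    class Molecule:
        """Toolkit-neutral molecule: delegates every operation to a backend."""
        def __init__(self, smiles, backend):
            self.smiles = smiles
            self.backend = backend

        def molwt(self):
            return self.backend.molwt(self.smiles)

    class FakeOpenBabel:
        """Stand-in backend; a real wrapper would call the toolkit's own API."""
        def molwt(self, smiles):
            return 12.0 * len(smiles)

    class FakeRDKit:
        """A second stand-in backend exposing the same interface."""
        def molwt(self, smiles):
            return 12.0 * len(smiles)

    # The caller's code is identical whichever backend is plugged in.
    weights = [Molecule("CCO", b).molwt() for b in (FakeOpenBabel(), FakeRDKit())]
    ```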

  5. pypet: A Python Toolkit for Data Management of Parameter Explorations.

    PubMed

    Meyer, Robert; Obermayer, Klaus

    2016-01-01

    pypet (Python parameter exploration toolkit) is a new multi-platform Python toolkit for managing numerical simulations. Sampling the space of model parameters is a key aspect of simulations and numerical experiments. pypet is designed to allow easy and arbitrary sampling of trajectories through a parameter space beyond simple grid searches. pypet collects and stores both simulation parameters and results in a single HDF5 file. This collective storage allows fast and convenient loading of data for further analyses. pypet provides various additional features such as multiprocessing and parallelization of simulations, dynamic loading of data, integration of git version control, and supervision of experiments via the electronic lab notebook Sumatra. pypet supports a rich set of data formats, including native Python types, Numpy and Scipy data, Pandas DataFrames, and BRIAN(2) quantities. Besides these formats, users can easily extend the toolkit to allow customized data types. pypet is a flexible tool suited for both short Python scripts and large scale projects. pypet's various features, especially the tight link between parameters and results, promote reproducible research in computational neuroscience and simulation-based disciplines.
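    The core idea, keeping every explored parameter combination tied to its result, can be sketched in plain Python. This mimics the concept only; pypet's real API, trajectory objects, and HDF5 storage differ.

    ```python
    # Conceptual sketch of parameter exploration: enumerate points in a
    # parameter space and store parameters and results together, as pypet
    # keeps them together in one file. Not pypet's actual API.
    import itertools

    def explore(param_space, simulate):
        """Run `simulate` at every point of the cartesian product of the
        given parameter ranges, keeping parameters next to each result."""
        names = list(param_space)
        runs = []
        for values in itertools.product(*(param_space[n] for n in names)):
            params = dict(zip(names, values))
            runs.append({"parameters": params, "result": simulate(**params)})
        return runs

    runs = explore({"x": [1, 2], "y": [10, 20]}, lambda x, y: x * y)
    ```

    A non-grid trajectory would simply replace `itertools.product` with an explicit list of parameter dictionaries.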

  6. pypet: A Python Toolkit for Data Management of Parameter Explorations

    PubMed Central

    Meyer, Robert; Obermayer, Klaus

    2016-01-01

    pypet (Python parameter exploration toolkit) is a new multi-platform Python toolkit for managing numerical simulations. Sampling the space of model parameters is a key aspect of simulations and numerical experiments. pypet is designed to allow easy and arbitrary sampling of trajectories through a parameter space beyond simple grid searches. pypet collects and stores both simulation parameters and results in a single HDF5 file. This collective storage allows fast and convenient loading of data for further analyses. pypet provides various additional features such as multiprocessing and parallelization of simulations, dynamic loading of data, integration of git version control, and supervision of experiments via the electronic lab notebook Sumatra. pypet supports a rich set of data formats, including native Python types, Numpy and Scipy data, Pandas DataFrames, and BRIAN(2) quantities. Besides these formats, users can easily extend the toolkit to allow customized data types. pypet is a flexible tool suited for both short Python scripts and large scale projects. pypet's various features, especially the tight link between parameters and results, promote reproducible research in computational neuroscience and simulation-based disciplines. PMID:27610080

  7. User's guide for SDDS toolkit Version 1.4

    SciTech Connect

    Borland, M.

    1995-07-06

    The Self Describing Data Sets (SDDS) file protocol is the basis for a powerful and expanding toolkit of over 40 generic programs. These programs are used for simulation postprocessing, graphics, data preparation, program interfacing, and experimental data analysis. This document describes Version 1.4 of the SDDS command-line toolkit. Those wishing to write programs using SDDS should consult the Application Programmer's Guide for SDDS Version 1.4. The first section of the present document is shared with this reference. This document does not describe SDDS-compliant EPICS applications, of which there are presently 25.
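    The self-describing idea can be shown in a toy form: the file's header declares its own column layout, so a single generic reader handles any compliant file. This illustrates the spirit of SDDS only; the real protocol and its header syntax are considerably richer.

    ```python
    # Toy "self-describing" reader: the header names the columns and their
    # types, so no per-format parsing code is needed. Not the SDDS protocol.
    import io

    def read_self_describing(stream):
        header = stream.readline().strip()      # e.g. "columns: s float, x float"
        columns = []
        for spec in header.split(":", 1)[1].split(","):
            name, typ = spec.split()
            columns.append((name, float if typ == "float" else str))
        rows = []
        for line in stream:
            values = line.split()
            rows.append({name: conv(v) for (name, conv), v in zip(columns, values)})
        return rows

    data = read_self_describing(io.StringIO("columns: s float, x float\n0.0 1.5\n1.0 2.5\n"))
    ```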

  8. The Connectome Viewer Toolkit: An Open Source Framework to Manage, Analyze, and Visualize Connectomes

    PubMed Central

    Gerhard, Stephan; Daducci, Alessandro; Lemkaddem, Alia; Meuli, Reto; Thiran, Jean-Philippe; Hagmann, Patric

    2011-01-01

    Advanced neuroinformatics tools are required for methods of connectome mapping, analysis, and visualization. The inherent multi-modality of connectome datasets poses new challenges for data organization, integration, and sharing. We have designed and implemented the Connectome Viewer Toolkit – a set of free and extensible open source neuroimaging tools written in Python. The key components of the toolkit are as follows: (1) The Connectome File Format is an XML-based container format to standardize multi-modal data integration and structured metadata annotation. (2) The Connectome File Format Library enables management and sharing of connectome files. (3) The Connectome Viewer is an integrated research and development environment for visualization and analysis of multi-modal connectome data. The Connectome Viewer's plugin architecture supports extensions with network analysis packages and an interactive scripting shell, to enable easy development and community contributions. Integration with tools from the scientific Python community allows the leveraging of numerous existing libraries for powerful connectome data mining, exploration, and comparison. We demonstrate the applicability of the Connectome Viewer Toolkit using Diffusion MRI datasets processed by the Connectome Mapper. The Connectome Viewer Toolkit is available from http://www.cmtk.org/ PMID:21713110
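    An XML-based container with structured metadata, in the spirit of the Connectome File Format, can be sketched with the standard library; the element and attribute names below are invented for illustration, not the actual CFF schema.

    ```python
    # Hypothetical XML container: metadata plus references to data files.
    import xml.etree.ElementTree as ET

    doc = """<container version="1.0">
      <metadata><subject>sub-01</subject></metadata>
      <network src="dwi.trk" name="structural"/>
      <volume src="t1.nii.gz" name="anatomy"/>
    </container>"""

    root = ET.fromstring(doc)
    # Structured metadata and the catalogue of contained data are both
    # recoverable by any standard XML parser.
    subject = root.findtext("metadata/subject")
    entries = [(child.tag, child.get("name")) for child in root
               if child.tag in ("network", "volume")]
    ```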

  9. Land surface Verification Toolkit (LVT)

    NASA Technical Reports Server (NTRS)

    Kumar, Sujay V.

    2017-01-01

    LVT is a framework developed to provide an automated, consolidated environment for systematic land surface model evaluation. It includes support for a range of in-situ, remote-sensing, and other model and reanalysis products, and supports the analysis of outputs from various LIS subsystems, including LIS-DA, LIS-OPT, and LIS-UE. Note: The Land Information System Verification Toolkit (LVT) is a NASA software tool designed to enable the evaluation, analysis and comparison of outputs generated by the Land Information System (LIS). The LVT software is released under the terms and conditions of the NASA Open Source Agreement (NOSA) Version 1.1 or later.

  10. Geant4 - A Simulation Toolkit

    SciTech Connect

    Wright, Dennis H

    2002-08-09

    GEANT4 is a toolkit for simulating the passage of particles through matter. It includes a complete range of functionality including tracking, geometry, physics models and hits. The physics processes offered cover a comprehensive range, including electromagnetic, hadronic and optical processes, a large set of long-lived particles, materials and elements, over a wide energy range starting, in some cases, from 250 eV and extending in others to the TeV energy range. It has been designed and constructed to expose the physics models utilized, to handle complex geometries, and to enable its easy adaptation for optimal use in different sets of applications. The toolkit is the result of a worldwide collaboration of physicists and software engineers. It has been created exploiting software engineering and object-oriented technology and implemented in the C++ programming language. It has been used in applications in particle physics, nuclear physics, accelerator design, space engineering and medical physics.

  11. Light-Field Imaging Toolkit

    NASA Astrophysics Data System (ADS)

    Bolan, Jeffrey; Hall, Elise; Clifford, Chris; Thurow, Brian

    The Light-Field Imaging Toolkit (LFIT) is a collection of MATLAB functions designed to facilitate the rapid processing of raw light field images captured by a plenoptic camera. An included graphical user interface streamlines the necessary post-processing steps associated with plenoptic images. The generation of perspective shifted views and computationally refocused images is supported, in both single image and animated formats. LFIT performs necessary calibration, interpolation, and structuring steps to enable future applications of this technology.

  12. Parallel Power Grid Simulation Toolkit

    SciTech Connect

    Smith, Steve; Kelley, Brian; Banks, Lawrence; Top, Philip; Woodward, Carol

    2015-09-14

    ParGrid is a 'wrapper' that integrates a coupled power grid simulation toolkit, consisting of a library to manage the synchronization and communication of independent simulations. The included library code in ParGrid, named FSKIT, is intended to support the coupling of multiple continuous and discrete-event parallel simulations. The code is designed using modern object-oriented C++ methods, utilizing C++11 and current Boost libraries to ensure compatibility with multiple operating systems and environments.

  13. The Bio-Community Perl toolkit for microbial ecology

    PubMed Central

    Angly, Florent E.; Fields, Christopher J.; Tyson, Gene W.

    2014-01-01

    Summary: The development of bioinformatic solutions for microbial ecology in Perl is limited by the lack of modules to represent and manipulate microbial community profiles from amplicon and meta-omics studies. Here we introduce Bio-Community, an open-source, collaborative toolkit that extends BioPerl. Bio-Community interfaces with commonly used programs using various file formats, including BIOM, and provides operations such as rarefaction and taxonomic summaries. Bio-Community will help bioinformaticians to quickly piece together custom analysis pipelines and develop novel software. Availability and implementation: Bio-Community is cross-platform Perl code available from http://search.cpan.org/dist/Bio-Community under the Perl license. A readme file describes software installation and how to contribute. Contact: f.angly@uq.edu.au Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24618462
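    Rarefaction, one of the operations mentioned, can be sketched in a few lines: each community profile is subsampled without replacement to a common depth so that richness estimates are comparable across samples. This is a Python illustration of the concept, not Bio-Community's Perl API.

    ```python
    # Conceptual rarefaction: subsample a {taxon: count} profile to a fixed
    # depth without replacement.
    import random

    def rarefy(counts, depth, seed=0):
        """Draw `depth` reads without replacement from a community profile."""
        pool = [taxon for taxon, n in sorted(counts.items()) for _ in range(n)]
        rng = random.Random(seed)   # seeded for reproducibility
        out = {}
        for taxon in rng.sample(pool, depth):
            out[taxon] = out.get(taxon, 0) + 1
        return out

    rarefied = rarefy({"A": 50, "B": 30, "C": 20}, depth=10)
    ```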

  14. The Bio-Community Perl toolkit for microbial ecology.

    PubMed

    Angly, Florent E; Fields, Christopher J; Tyson, Gene W

    2014-07-01

    The development of bioinformatic solutions for microbial ecology in Perl is limited by the lack of modules to represent and manipulate microbial community profiles from amplicon and meta-omics studies. Here we introduce Bio-Community, an open-source, collaborative toolkit that extends BioPerl. Bio-Community interfaces with commonly used programs using various file formats, including BIOM, and provides operations such as rarefaction and taxonomic summaries. Bio-Community will help bioinformaticians to quickly piece together custom analysis pipelines and develop novel software. Availability and implementation: Bio-Community is cross-platform Perl code available from http://search.cpan.org/dist/Bio-Community under the Perl license. A readme file describes software installation and how to contribute. © The Author 2014. Published by Oxford University Press.

  15. The REACH Youth Program Learning Toolkit

    ERIC Educational Resources Information Center

    Sierra Health Foundation, 2011

    2011-01-01

    Believing in the value of using video documentaries and data as learning tools, members of the REACH technical assistance team collaborated to develop this toolkit. The learning toolkit was designed using and/or incorporating components of the "Engaging Youth in Community Change: Outcomes and Lessons Learned from Sierra Health Foundation's…

  16. Python-ARM Radar Toolkit

    SciTech Connect

    Jonathan Helmus, Scott Collis

    2013-03-17

    The Python-ARM Radar Toolkit (Py-ART) is a collection of radar quality control and retrieval codes which all work on two unifying Python objects: the PyRadar and PyGrid objects. By building ingests for several popular radar formats and then abstracting the interface, Py-ART greatly simplifies data processing compared with several other available utilities. In addition, Py-ART uses Numpy arrays as its primary storage mechanism, enabling the use of existing and extensive community software tools.

  17. Design Optimization Toolkit: Users' Manual

    SciTech Connect

    Aguilo Valentin, Miguel Alejandro

    2014-07-01

    The Design Optimization Toolkit (DOTk) is a stand-alone C++ software package intended to solve complex design optimization problems. The DOTk software package provides a range of solution methods suited for gradient/nongradient-based optimization, large-scale constrained optimization, and topology optimization. DOTk was designed to have a flexible user interface to allow easy access to DOTk solution methods from external engineering software packages. This inherent flexibility makes DOTk minimally intrusive to other engineering software packages. As part of this flexibility, the DOTk software package provides an easy-to-use MATLAB interface that enables users to call DOTk solution methods directly from the MATLAB command window.

  19. An Introduction to the Einstein Toolkit

    NASA Astrophysics Data System (ADS)

    Zilhão, Miguel; Löffler, Frank

    2013-09-01

    We give an introduction to the Einstein Toolkit, a mature, open-source computational infrastructure for numerical relativity based on the Cactus Framework, for the target group of new users. This toolkit is composed of several different modules, is developed by researchers from different institutions throughout the world, and is in active, continuous development. Documentation for the toolkit and its several modules is often scattered across different locations, a difficulty new users may struggle with. Scientific papers exist describing the toolkit and its methods in detail, but they might be overwhelming at first. With these lecture notes we hope to provide an initial overview for new users. We cover how to obtain, compile and run the toolkit, and give an overview of some of the tools and modules provided with it.

  20. The SCRAM tool-kit

    NASA Technical Reports Server (NTRS)

    Tamir, David; Flanigan, Lee A.; Weeks, Jack L.; Siewert, Thomas A.; Kimbrough, Andrew G.; Mcclure, Sidney R.

    1994-01-01

    This paper proposes a new series of on-orbit capabilities to support the near-term Hubble Space Telescope, Extended Duration Orbiter, Long Duration Orbiter, Space Station Freedom, other orbital platforms, and even the future manned Lunar/Mars missions. These proposed capabilities form a toolkit termed Space Construction, Repair, and Maintenance (SCRAM). SCRAM addresses both Intra-Vehicular Activity (IVA) and Extra-Vehicular Activity (EVA) needs. SCRAM provides a variety of tools which enable welding, brazing, cutting, coating, heating, and cleaning, as well as corresponding nondestructive examination. Near-term IVA-SCRAM applications include repair and modification to fluid lines, structure, and laboratory equipment inside a shirt-sleeve environment (i.e., inside Spacelab or Space Station). Near-term EVA-SCRAM applications include construction of fluid lines and structural members, repair of punctures by orbital debris, and refurbishment of surfaces eroded by contaminants. The SCRAM tool-kit also promises future EVA applications involving mass-production tasks automated by robotics and artificial intelligence, for construction of large truss, aerobrake, and nuclear reactor shadow shield structures. The leading candidate tool processes for SCRAM, currently undergoing research and development, include Electron Beam, Gas Tungsten Arc, Plasma Arc, and Laser Beam. A series of strategic space flight experiments would make SCRAM available to help conquer the space frontier.

  1. ADMIT: ALMA Data Mining Toolkit

    NASA Astrophysics Data System (ADS)

    Friedel, Douglas N.; Xu, Lisa; Looney, Leslie; Teuben, Peter J.; Pound, Marc W.; Rauch, Kevin P.; Mundy, Lee G.; Kern, Jeffrey S.

    2015-01-01

    ADMIT (ALMA Data Mining Toolkit) is a toolkit for the creation and analysis of new science products from ALMA data. ADMIT is an ALMA Development Project written purely in Python. While specifically targeted for ALMA science and production use after the ALMA pipeline, it is designed to be generally applicable to radio-astronomical data. ADMIT quickly provides users with a detailed overview of their science products: line identifications, line 'cutout' cubes, moment maps, emission type analysis (e.g., feature detection), etc. Users can download the small ADMIT pipeline product (< 20MB), analyze the results, then fine-tune and re-run the ADMIT pipeline (or any part thereof) on their own machines and interactively inspect the results. ADMIT will have both a GUI and command line interface available for this purpose. By analyzing multiple data cubes simultaneously, data mining between many astronomical sources and line transitions will be possible. Users will also be able to enhance the capabilities of ADMIT by creating customized ADMIT tasks satisfying any special processing needs. Future implementations of ADMIT may include EVLA and other instruments.

  2. Admit: Alma Data Mining Toolkit

    NASA Astrophysics Data System (ADS)

    Friedel, Douglas; Looney, Leslie; Xu, Lisa; Pound, Marc W.; Teuben, Peter J.; Rauch, Kevin P.; Mundy, Lee; Kern, Jeffrey S.

    2015-06-01

    ADMIT (ALMA Data Mining Toolkit) is a toolkit for the creation and analysis of new science products from ALMA data. ADMIT is an ALMA Development Project written purely in Python. While specifically targeted for ALMA science and production use after the ALMA pipeline, it is designed to be generally applicable to radio-astronomical data. ADMIT quickly provides users with a detailed overview of their science products: line identifications, line 'cutout' cubes, moment maps, emission type analysis (e.g., feature detection), etc. Users can download the small ADMIT pipeline product (<20MB), analyze the results, then fine-tune and re-run the ADMIT pipeline (or any part thereof) on their own machines and interactively inspect the results. ADMIT will have both a GUI and command line interface available for this purpose. By analyzing multiple data cubes simultaneously, data mining between many astronomical sources and line transitions will be possible. Users will also be able to enhance the capabilities of ADMIT by creating customized ADMIT tasks satisfying any special processing needs. Future implementations of ADMIT may include EVLA and other instruments.

  3. [A biomedical signal processing toolkit programmed by Java].

    PubMed

    Xie, Haiyuan

    2012-09-01

    According to the characteristics of biomedical signals, a new biomedical signal processing toolkit has been developed. The toolkit is programmed in Java and is used for basic digital signal processing, random signal processing, and related tasks. All the methods in the toolkit have been tested and the program is robust. The features of the toolkit are explained in detail; it is easy to use and practical.

  4. Agent Toolkit Satisfaction and Use in Higher Education.

    ERIC Educational Resources Information Center

    Serenko, Alexander; Detlor, Brian

    2003-01-01

    Examined instructors' satisfaction with and use of intelligent agent toolkits in the classroom. Found that no single uniform toolkit satisfied the needs of instructors. Moreover, satisfaction levels were influenced primarily by user interactions with the toolkit, followed to a lesser extent by toolkit performance and functionality. (EV)

  5. Pydpiper: a flexible toolkit for constructing novel registration pipelines.

    PubMed

    Friedel, Miriam; van Eede, Matthijs C; Pipitone, Jon; Chakravarty, M Mallar; Lerch, Jason P

    2014-01-01

    Using neuroimaging technologies to elucidate the relationship between genotype and phenotype and brain and behavior will be a key contribution to biomedical research in the twenty-first century. Among the many methods for analyzing neuroimaging data, image registration deserves particular attention due to its wide range of applications. Finding strategies to register together many images and analyze the differences between them can be a challenge, particularly given that different experimental designs require different registration strategies. Moreover, writing software that can handle different types of image registration pipelines in a flexible, reusable and extensible way can be challenging. In response to this challenge, we have created Pydpiper, a neuroimaging registration toolkit written in Python. Pydpiper is an open-source, freely available software package that provides multiple modules for various image registration applications. Pydpiper offers five key innovations. Specifically: (1) a robust file handling class that allows access to outputs from all stages of registration at any point in the pipeline; (2) the ability of the framework to eliminate duplicate stages; (3) reusable, easy to subclass modules; (4) a development toolkit written for non-developers; (5) four complete applications that run complex image registration pipelines "out-of-the-box." In this paper, we will discuss both the general Pydpiper framework and the various ways in which component modules can be pieced together to easily create new registration pipelines. This will include a discussion of the core principles motivating code development and a comparison of Pydpiper with other available toolkits. We also provide a comprehensive, line-by-line example to orient users with limited programming knowledge and highlight some of the most useful features of Pydpiper. In addition, we will present the four current applications of the code.
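    Innovation (2), eliminating duplicate stages, can be sketched as a pipeline that keys each stage on its command and inputs, so re-adding an identical stage is a no-op. The class and method names below are illustrative, not Pydpiper's real API.

    ```python
    # Hypothetical duplicate-stage elimination: stages are deduplicated by
    # (command, inputs), so identical work is scheduled only once.

    class Pipeline:
        def __init__(self):
            self.stages = {}

        def add_stage(self, command, inputs, output):
            key = (command, tuple(inputs))
            if key in self.stages:          # duplicate: reuse the prior stage
                return self.stages[key]
            self.stages[key] = output
            return output

    p = Pipeline()
    a = p.add_stage("register", ["img1.mnc", "img2.mnc"], "xfm1")
    b = p.add_stage("register", ["img1.mnc", "img2.mnc"], "xfm2")  # duplicate
    ```

    Because the second `add_stage` call matches an existing key, both callers receive the same output and only one stage is ever scheduled.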

  6. Pydpiper: a flexible toolkit for constructing novel registration pipelines

    PubMed Central

    Friedel, Miriam; van Eede, Matthijs C.; Pipitone, Jon; Chakravarty, M. Mallar; Lerch, Jason P.

    2014-01-01

    Using neuroimaging technologies to elucidate the relationship between genotype and phenotype and brain and behavior will be a key contribution to biomedical research in the twenty-first century. Among the many methods for analyzing neuroimaging data, image registration deserves particular attention due to its wide range of applications. Finding strategies to register together many images and analyze the differences between them can be a challenge, particularly given that different experimental designs require different registration strategies. Moreover, writing software that can handle different types of image registration pipelines in a flexible, reusable and extensible way can be challenging. In response to this challenge, we have created Pydpiper, a neuroimaging registration toolkit written in Python. Pydpiper is an open-source, freely available software package that provides multiple modules for various image registration applications. Pydpiper offers five key innovations. Specifically: (1) a robust file handling class that allows access to outputs from all stages of registration at any point in the pipeline; (2) the ability of the framework to eliminate duplicate stages; (3) reusable, easy to subclass modules; (4) a development toolkit written for non-developers; (5) four complete applications that run complex image registration pipelines “out-of-the-box.” In this paper, we will discuss both the general Pydpiper framework and the various ways in which component modules can be pieced together to easily create new registration pipelines. This will include a discussion of the core principles motivating code development and a comparison of Pydpiper with other available toolkits. We also provide a comprehensive, line-by-line example to orient users with limited programming knowledge and highlight some of the most useful features of Pydpiper. In addition, we will present the four current applications of the code. PMID:25126069

  7. The DLESE Evaluation Toolkit Project

    NASA Astrophysics Data System (ADS)

    Buhr, S. M.; Barker, L. J.; Marlino, M.

    2002-12-01

    The Evaluation Toolkit and Community project is a new Digital Library for Earth System Education (DLESE) collection designed to raise awareness of project evaluation within the geoscience education community, and to enable principal investigators, teachers, and evaluators to implement project evaluation more readily. This new resource is grounded in the needs of geoscience educators, and will provide a virtual home for a geoscience education evaluation community. The goals of the project are to 1) provide a robust collection of evaluation resources useful for Earth systems educators, 2) establish a forum and community for evaluation dialogue within DLESE, and 3) disseminate the resources through the DLESE infrastructure and through professional society workshops and proceedings. Collaboration and expertise in education, geoscience and evaluation are necessary if we are to conduct the best possible geoscience education. The Toolkit allows users to engage in evaluation at whichever level best suits their needs, get more evaluation professional development if desired, and access the expertise of other segments of the community. To date, a test web site has been built and populated, initial community feedback from the DLESE and broader community is being garnered, and we have begun to heighten awareness of geoscience education evaluation within our community. The web site contains features that allow users to access professional development about evaluation, search and find evaluation resources, submit resources, find or offer evaluation services, sign up for upcoming workshops, take the user survey, and submit calendar items. The evaluation resource matrix currently contains resources that have met our initial review. The resources are currently organized by type; they will become searchable on multiple dimensions of project type, audience, objectives and evaluation resource type as efforts to develop a collection-specific search engine mature. The peer review

  8. An introduction to the Lagan alignment toolkit.

    PubMed

    Brudno, Michael

    2007-01-01

    The Lagan Toolkit is a software package for comparison of genomic sequences. It includes the CHAOS local alignment program, the LAGAN global alignment program for two or more sequences, and Shuffle-LAGAN, a "glocal" alignment method that handles genomic rearrangements in a global alignment framework. The alignment programs included in the Lagan Toolkit have been widely used to compare genomes of many organisms, from bacteria to large mammalian genomes. This chapter provides an overview of the algorithms used by the LAGAN programs to construct genomic alignments, explains how to build alignments using either the standalone program or the web server, and discusses some of the common pitfalls users encounter when using the toolkit.
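    The global alignment problem LAGAN addresses can be illustrated with the textbook Needleman-Wunsch recurrence. Note that this sketch fills the full dynamic-programming table; LAGAN itself scales to genomes by restricting the search to regions around CHAOS local anchors.

    ```python
    # Textbook global alignment score (Needleman-Wunsch), for illustration
    # only; this is not LAGAN's anchored algorithm.

    def global_align_score(a, b, match=1, mismatch=-1, gap=-1):
        """Score of the best end-to-end alignment of strings a and b."""
        rows, cols = len(a) + 1, len(b) + 1
        score = [[0] * cols for _ in range(rows)]
        for i in range(1, rows):
            score[i][0] = i * gap           # leading gaps in b
        for j in range(1, cols):
            score[0][j] = j * gap           # leading gaps in a
        for i in range(1, rows):
            for j in range(1, cols):
                diag = score[i-1][j-1] + (match if a[i-1] == b[j-1] else mismatch)
                score[i][j] = max(diag, score[i-1][j] + gap, score[i][j-1] + gap)
        return score[-1][-1]

    # 5 matches and 2 gaps, e.g. GATTACA aligned against GAT--CA.
    s = global_align_score("GATTACA", "GATCA")
    ```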

  9. Flightspeed Integral Image Analysis Toolkit

    NASA Technical Reports Server (NTRS)

    Thompson, David R.

    2009-01-01

    The Flightspeed Integral Image Analysis Toolkit (FIIAT) is a C library that provides image analysis functions in a single, portable package. It provides basic low-level filtering, texture analysis, and subwindow descriptors for applications dealing with image interpretation and object recognition. Designed with spaceflight in mind, it addresses: ease of integration (minimal external dependencies); fast, real-time operation using integer arithmetic where possible (useful for platforms lacking a dedicated floating-point processor); implementation entirely in C (easily modified); mostly static memory allocation; and 8-bit image data. The basic goal of the FIIAT library is to compute meaningful numerical descriptors for images or rectangular image regions. These n-vectors can then be used directly for novelty detection or pattern recognition, or as a feature space for higher-level pattern recognition tasks. The library provides routines for leveraging training data to derive descriptors that are most useful for a specific data set. Its runtime algorithms exploit a structure known as the "integral image." This is a caching method that permits fast summation of values within rectangular regions of an image. This integral frame facilitates a wide range of fast image-processing functions. This toolkit has applicability to a wide range of autonomous image analysis tasks in the space-flight domain, including novelty detection, object and scene classification, target detection for autonomous instrument placement, and science analysis of geomorphology. It makes real-time texture and pattern recognition possible for platforms with severe computational constraints. The software provides an order of magnitude speed increase over alternative software libraries currently in use by the research community. FIIAT can commercially support intelligent video cameras used in intelligent surveillance. It is also useful for object recognition by robots or other autonomous vehicles.
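    The integral-image structure described above can be sketched briefly (in Python for readability; the library itself is C): each table entry holds the cumulative sum of all pixels above and to the left, so the sum over any rectangle costs exactly four lookups.

    ```python
    # Standard integral image: ii has one extra row/column of zeros so that
    # rectangle sums need no boundary checks.

    def integral_image(img):
        h, w = len(img), len(img[0])
        ii = [[0] * (w + 1) for _ in range(h + 1)]
        for y in range(h):
            for x in range(w):
                ii[y+1][x+1] = img[y][x] + ii[y][x+1] + ii[y+1][x] - ii[y][x]
        return ii

    def rect_sum(ii, top, left, bottom, right):
        """Sum of img[top:bottom][left:right] via four table lookups."""
        return ii[bottom][right] - ii[top][right] - ii[bottom][left] + ii[top][left]

    img = [[1, 2, 3],
           [4, 5, 6],
           [7, 8, 9]]
    ii = integral_image(img)
    total = rect_sum(ii, 0, 0, 3, 3)   # whole image
    center = rect_sum(ii, 1, 1, 2, 2)  # the single pixel with value 5
    ```

    Once the table is built, any window sum, and hence box filters and many texture features, runs in constant time per query.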

  10. A universal postprocessing toolkit for accelerator simulation and data analysis.

    SciTech Connect

    Borland, M.

    1998-12-16

    The Self-Describing Data Sets (SDDS) toolkit comprises about 70 generally-applicable programs sharing a common data protocol. At the Advanced Photon Source (APS), SDDS performs the vast majority of operational data collection and processing, most data display functions, and many control functions. In addition, a number of accelerator simulation codes use SDDS for all post-processing and data display. This has three principal advantages: first, simulation codes need not provide customized post-processing tools, thus simplifying development and maintenance. Second, users can enhance code capabilities without changing the code itself, by adding SDDS-based pre- and post-processing. Third, multiple codes can be used together more easily, by employing SDDS for data transfer and adaptation. Given its broad applicability, the SDDS file protocol is surprisingly simple, making it quite easy for simulations to generate SDDS-compliant data. This paper discusses the philosophy behind SDDS, contrasting it with some recent trends, and outlines the capabilities of the toolkit. The paper also gives examples of using SDDS for accelerator simulation.

  11. Water Quality Trading Toolkit for Permit Writers

    EPA Pesticide Factsheets

    The Water Quality Trading Toolkit for Permit Writers is EPA’s first “how-to” manual on designing and implementing water quality trading programs. It helps NPDES permitting authorities incorporate trading provisions into permits.

  12. Usability testing of a fall prevention toolkit.

    PubMed

    Keuter, Kayla R; Berg, Gina M; Hervey, Ashley M; Rogers, Nicole

    2015-05-01

    This study sought to evaluate a fall prevention toolkit, determine its ease of use and user satisfaction, and determine the preferred venue of distribution. Three forms of assessment were used: focus groups, usability testing, and surveys. Focus group participants were recruited from four locations: two rural health clinics and two urban centers. Usability testing participants were recruited from two rural health clinics. Survey questions included self-reported prior falls, current fall prevention habits, reaction to the toolkit, and demographics. Participants reported the toolkit was attractive, well-organized, and easy to use, but may contain too much information. Most participants admitted they would not actively use the toolkit on their own, but prefer having it introduced by a healthcare provider or in a social setting. Healthcare focuses on customer satisfaction; therefore, providers benefit from knowing patient preferred methods of learning fall prevention strategies.

  13. Development of an Integrated Human Factors Toolkit

    NASA Technical Reports Server (NTRS)

    Resnick, Marc L.

    2003-01-01

    An effective integration of human abilities and limitations is crucial to the success of all NASA missions. The Integrated Human Factors Toolkit facilitates this integration by assisting system designers and analysts to select the human factors tools that are most appropriate for the needs of each project. The HF Toolkit contains information about a broad variety of human factors tools addressing human requirements in the physical, information processing and human reliability domains. Analysis of each tool includes consideration of the most appropriate design stage, the amount of expertise in human factors that is required, the amount of experience with the tool and the target job tasks that are needed, and other factors that are critical for successful use of the tool. The benefits of the Toolkit include improved safety, reliability and effectiveness of NASA systems throughout the agency. This report outlines the initial stages of development for the Integrated Human Factors Toolkit.

  14. Performance of the ISIS Distributed Computing Toolkit

    DTIC Science & Technology

    1994-06-22

    Performance of the ISIS Distributed Computing Toolkit. Kenneth P. Birman... isis.com. Please cite as Technical Report TR-94-1432, Dept. of Computer Science, Cornell University. Keywords: distributed computing, performance, process groups, atomic broadcast, causal and total message ordering, cbcast, abcast, multiple process groups.

  15. Network Visualization Design Using Prefuse Visualization Toolkit

    DTIC Science & Technology

    2008-03-01

    Lipinski. "JULIUS - An Extendable Software Framework for Surgical Planning". Caesar, Berlin, Germany, 2001. URL http://www.caesar.de/fileadmin/user upload... Julius framework and Ball modeler... PNode class hierarchy showing monolithic Piccolo toolkit design [3]... JULIUS [24] (used for medical imaging). However, all frameworks studied, except one, selected OpenGL as their graphical visualization toolkit.

  16. Jefferson Lab Plotting Toolkit for accelerator controls

    SciTech Connect

    Chen, J; Keesee, M; Larrieu, C; Lei, G

    1999-03-01

    Experimental physics generates numerous data sets that scientists analyze using plots, graphs, etc. The Jefferson Lab Plotting Toolkit, JPT, a graphical user interface toolkit, was developed at Jefferson Lab to do data plotting. JPT provides data structures for sets of data, analyzes the range of the data, calculates the reasonable maximum, minimum, and scale of axes, sets line styles and marker styles, plots curves and fills areas.
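
    The axis-range behavior described (calculating a reasonable maximum, minimum, and scale for the axes) is commonly implemented with a "nice numbers" heuristic. The sketch below illustrates the general technique only; it is not JPT's actual code, and all names are hypothetical:

```python
import math

def nice_num(x, round_down):
    """Round x to a 'nice' value: 1, 2, or 5 times a power of ten."""
    exp = math.floor(math.log10(x))
    frac = x / 10 ** exp
    if round_down:
        nice = 1 if frac < 1.5 else 2 if frac < 3 else 5 if frac < 7 else 10
    else:
        nice = 1 if frac <= 1 else 2 if frac <= 2 else 5 if frac <= 5 else 10
    return nice * 10 ** exp

def nice_axis(lo, hi, max_ticks=6):
    """Return (axis_min, axis_max, tick_step) covering the data range [lo, hi]."""
    step = nice_num(nice_num(hi - lo, False) / (max_ticks - 1), True)
    axis_min = math.floor(lo / step) * step
    axis_max = math.ceil(hi / step) * step
    return axis_min, axis_max, step

print(nice_axis(2.3, 97.1))  # (0, 100, 20)
```

    Given raw data spanning 2.3 to 97.1, the heuristic picks round axis limits (0 to 100) and a round tick spacing (20) rather than awkward exact values.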

  17. ISO/IEEE 11073 PHD message generation toolkit to standardize healthcare device.

    PubMed

    Lim, Joon-Ho; Park, Chanyong; Park, Soo-Jun; Lee, Kyu-Chul

    2011-01-01

    As the senior population increases, various healthcare devices and services are being developed, such as fall detection devices and home hypertension management services. However, to vitalize the market for healthcare devices and services, standardization for interoperability between devices and services must come first. Toward that goal, the IEEE 11073 Personal Health Device (PHD) group has standardized many healthcare devices, but until now few devices are compatible with the PHD standard. One of the main reasons is that it is not easy for device manufacturers to implement a standard communication module by analyzing standard documents of over 600 pages. In this paper, we propose a standard message generation toolkit to easily standardize existing non-standard healthcare devices. The proposed toolkit generates standard PHD messages from input device information, and the generated messages are adapted to the device with the standard state machine file. For the experiments, we developed a reference H/W and tested the proposed toolkit with three healthcare devices: a blood pressure monitor, a weighing scale, and a glucose meter. The proposed toolkit has the advantage that even a user who does not know the standard in detail can easily standardize non-standard healthcare devices.

  18. Start/Pat; A parallel-programming toolkit

    SciTech Connect

    Appelbe, B.; Smith, K. ); McDowell, C. )

    1989-07-01

    How can you make Fortran code parallel without isolating the programmer from learning to understand and exploit parallelism effectively? With an interactive toolkit that automates parallelization as it educates. This paper discusses the Start/Pat toolkit.

  19. The Topology ToolKit.

    PubMed

    Tierny, Julien; Favelier, Guillaume; Levine, Joshua A; Gueunet, Charles; Michaux, Michael

    2017-08-29

    This system paper presents the Topology ToolKit (TTK), a software platform designed for the topological analysis of scalar data in scientific visualization. While topological data analysis has gained in popularity over the last two decades, it has not yet been widely adopted as a standard data analysis tool for end users or developers. TTK aims at addressing this problem by providing a unified, generic, efficient, and robust implementation of key algorithms for the topological analysis of scalar data, including: critical points, integral lines, persistence diagrams, persistence curves, merge trees, contour trees, Morse-Smale complexes, fiber surfaces, continuous scatterplots, Jacobi sets, Reeb spaces, and more. TTK is easily accessible to end users due to a tight integration with ParaView. It is also easily accessible to developers through a variety of bindings (Python, VTK/C++) for fast prototyping or through direct, dependency-free C++, to ease integration into pre-existing complex systems. While developing TTK, we faced several algorithmic and software engineering challenges, which we document in this paper. In particular, we present an algorithm for the construction of a discrete gradient that complies with the critical points extracted in the piecewise-linear setting. This algorithm guarantees a combinatorial consistency across the topological abstractions supported by TTK, and importantly, a unified implementation of topological data simplification for multi-scale exploration and analysis. We also present a cached triangulation data structure that supports time-efficient and generic traversals, which self-adjusts its memory usage on demand for input simplicial meshes and which implicitly emulates a triangulation for regular grids with no memory overhead. Finally, we describe an original software architecture, which guarantees memory-efficient and direct access to TTK features, while still allowing researchers powerful and easy bindings and extensions.
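
    To make "critical points" concrete: on a regular grid, a vertex whose value is below (or above) all of its neighbors is a local minimum (or maximum). The toy sketch below classifies interior vertices of a 2-D scalar field this way; it only conveys the idea, whereas TTK's actual algorithms operate on triangulations and carry combinatorial guarantees that this illustration does not:

```python
def critical_points(field):
    """Return (minima, maxima) as lists of (y, x) for a 2-D scalar grid,
    comparing each interior vertex against its 4-neighborhood."""
    minima, maxima = [], []
    h, w = len(field), len(field[0])
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            v = field[y][x]
            nbrs = [field[y - 1][x], field[y + 1][x],
                    field[y][x - 1], field[y][x + 1]]
            if all(v < n for n in nbrs):
                minima.append((y, x))
            elif all(v > n for n in nbrs):
                maxima.append((y, x))
    return minima, maxima

field = [[0, 0, 0, 0, 0],
         [0, 9, 5, 2, 0],
         [0, 5, 1, 5, 0],
         [0, 2, 5, 8, 0],
         [0, 0, 0, 0, 0]]
print(critical_points(field))  # ([(2, 2)], [(1, 1), (3, 3)])
```

    Persistence-based simplification, as in TTK, would then rank these critical points by how prominent they are and cancel the insignificant ones.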

  20. Integrated Systems Health Management (ISHM) Toolkit

    NASA Technical Reports Server (NTRS)

    Venkatesh, Meera; Kapadia, Ravi; Walker, Mark; Wilkins, Kim

    2013-01-01

    A framework of software components has been implemented to facilitate the development of ISHM systems according to a methodology based on Reliability Centered Maintenance (RCM). This framework is collectively referred to as the Toolkit and was developed using General Atomics' Health MAP (TM) technology. The toolkit is intended to provide assistance to software developers of mission-critical system health monitoring applications in the specification, implementation, configuration, and deployment of such applications. In addition to software tools designed to facilitate these objectives, the toolkit also provides direction to software developers in accordance with an ISHM specification and development methodology. The development tools are based on an RCM approach for the development of ISHM systems. This approach focuses on defining, detecting, and predicting the likelihood of system functional failures and their undesirable consequences.

  1. Desensitized Optimal Filtering and Sensor Fusion Toolkit

    NASA Technical Reports Server (NTRS)

    Karlgaard, Christopher D.

    2015-01-01

    Analytical Mechanics Associates, Inc., has developed a software toolkit that filters and processes navigational data from multiple sensor sources. A key component of the toolkit is a trajectory optimization technique that reduces the sensitivity of Kalman filters with respect to model parameter uncertainties. The sensor fusion toolkit also integrates recent advances in adaptive Kalman and sigma-point filters for problems with non-Gaussian error statistics. This Phase II effort provides new filtering and sensor fusion techniques in a convenient package that can be used as a stand-alone application for ground support and/or onboard use. Its modular architecture enables ready integration with existing tools. A suite of sensor models and noise distributions as well as Monte Carlo analysis capability are included to enable statistical performance evaluations.
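
    As background for the filtering the toolkit builds on, one scalar Kalman predict/update cycle looks like the following sketch (illustrative only; the toolkit's desensitized and sigma-point variants extend this basic cycle, and the names here are not its API):

```python
def kalman_step(x, p, z, q, r):
    """One scalar predict/update cycle for a random-walk state.
    x, p: prior estimate and variance; z: measurement;
    q: process noise variance; r: measurement noise variance."""
    p = p + q                  # predict: uncertainty grows by process noise
    k = p / (p + r)            # Kalman gain: how much to trust the measurement
    x = x + k * (z - x)        # update: blend prediction and measurement
    p = (1 - k) * p            # posterior variance shrinks after the update
    return x, p

x, p = 0.0, 1.0                # vague prior
for z in [1.2, 0.9, 1.1, 1.0]:
    x, p = kalman_step(x, p, z, q=0.01, r=0.25)
print(round(x, 2), round(p, 3))
```

    Desensitized optimal filtering, as described above, additionally penalizes the sensitivity of the estimate to uncertain model parameters (here, q and r) when tuning the gain.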

  2. "Handy Manny" and the Emergent Literacy Technology Toolkit

    ERIC Educational Resources Information Center

    Hourcade, Jack J.; Parette, Howard P., Jr.; Boeckmann, Nichole; Blum, Craig

    2010-01-01

    This paper outlines the use of a technology toolkit to support emergent literacy curriculum and instruction in early childhood education settings. Components of the toolkit include hardware and software that can facilitate key emergent literacy skills. Implementation of the comprehensive technology toolkit enhances the development of these…

  4. The Ames MER microscopic imager toolkit

    USGS Publications Warehouse

    Sargent, R.; Deans, Matthew; Kunz, C.; Sims, M.; Herkenhoff, K.

    2005-01-01

    The Mars Exploration Rovers, Spirit and Opportunity, have spent several successful months on Mars, returning gigabytes of images and spectral data to scientists on Earth. One of the instruments on the MER rovers, the Athena Microscopic Imager (MI), is a fixed focus, megapixel camera providing a ±3 mm depth of field and a 31×31 mm field of view at a working distance of 63 mm from the lens to the object being imaged. In order to maximize the science return from this instrument, we developed the Ames MI Toolkit and supported its use during the primary mission. The MI Toolkit is a set of programs that operate on collections of MI images, with the goal of making the data more understandable to the scientists on the ground. Because of the limited depth of field of the camera, and the often highly variable topography of the terrain being imaged, MI images of a given rock are often taken as a stack, with the Instrument Deployment Device (IDD) moving along a computed normal vector, pausing every few millimeters for the MI to acquire an image. The MI Toolkit provides image registration and focal section merging, which combine these images to form a single, maximally in-focus image, while compensating for changes in lighting as well as parallax due to the motion of the camera. The MI Toolkit also provides a 3-D reconstruction of the surface being imaged using stereo and can embed 2-D MI images as texture maps into 3-D meshes produced by other imagers on board the rover to provide context. The 2-D images and 3-D meshes output from the Toolkit are easily viewed by scientists using other mission tools, such as Viz or the MI Browser. This paper describes the MI Toolkit in detail, as well as our experience using it with scientists at JPL during the primary MER mission. © 2005 IEEE.
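
    The focal section merging step can be illustrated with a toy focus-stacking routine: for each pixel, keep the value from whichever image in the stack is locally sharpest, here measured by the absolute Laplacian. This is only a sketch of the idea; the MI Toolkit's actual pipeline also registers the images and compensates for lighting and parallax:

```python
def sharpness(img, y, x):
    """Absolute 4-neighbor Laplacian: large where local contrast is high."""
    return abs(4 * img[y][x] - img[y - 1][x] - img[y + 1][x]
               - img[y][x - 1] - img[y][x + 1])

def merge_stack(stack):
    """Per-pixel, copy from whichever stack image is sharpest there."""
    h, w = len(stack[0]), len(stack[0][0])
    out = [row[:] for row in stack[0]]          # borders default to image 0
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            best = max(stack, key=lambda img: sharpness(img, y, x))
            out[y][x] = best[y][x]
    return out

blurry = [[5, 5, 5], [5, 5, 5], [5, 5, 5]]      # uniformly defocused slice
sharp = [[0, 0, 0], [0, 9, 0], [0, 0, 0]]       # slice in focus at the center
merged = merge_stack([blurry, sharp])
print(merged[1][1])  # 9: center pixel taken from the sharper slice
```

    With the IDD stepping the camera every few millimeters, each surface patch is in focus in at least one slice, so a per-pixel sharpest-slice selection recovers an everywhere-in-focus composite.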

  5. The 2016 ACCP Pharmacotherapy Didactic Curriculum Toolkit.

    PubMed

    Schwinghammer, Terry L; Crannage, Andrew J; Boyce, Eric G; Bradley, Bridget; Christensen, Alyssa; Dunnenberger, Henry M; Fravel, Michelle; Gurgle, Holly; Hammond, Drayton A; Kwon, Jennifer; Slain, Douglas; Wargo, Kurt A

    2016-11-01

    The 2016 American College of Clinical Pharmacy (ACCP) Educational Affairs Committee was charged with updating and contemporizing ACCP's 2009 Pharmacotherapy Didactic Curriculum Toolkit. The toolkit has been designed to guide schools and colleges of pharmacy in developing, maintaining, and modifying their curricula. The 2016 committee reviewed the recent medical literature and other documents to identify disease states that are responsive to drug therapy. Diseases and content topics were organized by organ system, when feasible, and grouped into tiers as defined by practice competency. Tier 1 topics should be taught in a manner that prepares all students to provide collaborative, patient-centered care upon graduation and licensure. Tier 2 topics are generally taught in the professional curriculum, but students may require additional knowledge or skills after graduation (e.g., residency training) to achieve competency in providing direct patient care. Tier 3 topics may not be taught in the professional curriculum; thus, graduates will be required to obtain the necessary knowledge and skills on their own to provide direct patient care, if required in their practice. The 2016 toolkit contains 276 diseases and content topics, of which 87 (32%) are categorized as tier 1, 133 (48%) as tier 2, and 56 (20%) as tier 3. The large number of tier 1 topics will require schools and colleges to use creative pedagogical strategies to achieve the necessary practice competencies. Almost half of the topics (48%) are tier 2, highlighting the importance of postgraduate residency training or equivalent practice experience to competently care for patients with these disorders. The Pharmacotherapy Didactic Curriculum Toolkit will continue to be updated to provide guidance to faculty at schools and colleges of pharmacy as these academic pharmacy institutions regularly evaluate and modify their curricula to keep abreast of scientific advances and associated practice changes. Access the

  6. The Ames MER Microscopic Imager Toolkit

    NASA Technical Reports Server (NTRS)

    Sargent, Randy; Deans, Matthew; Kunz, Clayton; Sims, Michael; Herkenhoff, Ken

    2005-01-01

    The Mars Exploration Rovers, Spirit and Opportunity, have spent several successful months on Mars, returning gigabytes of images and spectral data to scientists on Earth. One of the instruments on the MER rovers, the Athena Microscopic Imager (MI), is a fixed focus, megapixel camera providing a ±3 mm depth of field and a 31×31 mm field of view at a working distance of 63 mm from the lens to the object being imaged. In order to maximize the science return from this instrument, we developed the Ames MI Toolkit and supported its use during the primary mission. The MI Toolkit is a set of programs that operate on collections of MI images, with the goal of making the data more understandable to the scientists on the ground. Because of the limited depth of field of the camera, and the often highly variable topography of the terrain being imaged, MI images of a given rock are often taken as a stack, with the Instrument Deployment Device (IDD) moving along a computed normal vector, pausing every few millimeters for the MI to acquire an image. The MI Toolkit provides image registration and focal section merging, which combine these images to form a single, maximally in-focus image, while compensating for changes in lighting as well as parallax due to the motion of the camera. The MI Toolkit also provides a 3-D reconstruction of the surface being imaged using stereo and can embed 2-D MI images as texture maps into 3-D meshes produced by other imagers on board the rover to provide context. The 2-D images and 3-D meshes output from the Toolkit are easily viewed by scientists using other mission tools, such as Viz or the MI Browser. This paper describes the MI Toolkit in detail, as well as our experience using it with scientists at JPL during the primary MER mission.

  7. TRSkit: A Simple Digital Library Toolkit

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.; Esler, Sandra L.

    1997-01-01

    This paper introduces TRSkit, a simple and effective toolkit for building digital libraries on the World Wide Web. The toolkit was developed for the creation of the Langley Technical Report Server and the NASA Technical Report Server, but is applicable to most simple distribution paradigms. TRSkit contains a handful of freely available software components designed to be run under the UNIX operating system and served via the World Wide Web. The intended customer is the person who must continuously and synchronously distribute anywhere from 100 to 100,000s of information units and does not have extensive resources to devote to the problem.

  8. Anchor Toolkit - a secure mobile agent system

    SciTech Connect

    Mudumbai, Srilekha S.; Johnston, William; Essiari, Abdelilah

    1999-05-19

    Mobile agent technology facilitates intelligent operation in software systems with less human interaction. Major challenges to deployment of mobile agents include secure transmission of agents and preventing unauthorized access to resources between interacting systems, as either hosts, or agents, or both can act maliciously. The Anchor toolkit, designed by LBNL, handles the transmission and secure management of mobile agents in a heterogeneous distributed computing environment. It provides users with the option of incorporating their own security managers. This paper concentrates on the architecture, features, access control and deployment of the Anchor toolkit. Application of this toolkit in a secure distributed CVS environment is discussed as a case study.

  9. WIND Toolkit Power Data Site Index

    SciTech Connect

    Draxl, Caroline; Mathias-Hodge, Bri

    2016-10-19

    This spreadsheet contains per-site metadata for the WIND Toolkit sites and serves as an index for the raw data hosted on Globus connect (nrel#globus:/globusro/met_data). Aside from the metadata, per site average power and capacity factor are given. This data was prepared by 3TIER under contract by NREL and is public domain. Authoritative documentation on the creation of the underlying dataset is at: Final Report on the Creation of the Wind Integration National Dataset (WIND) Toolkit and API: http://www.nrel.gov/docs/fy16osti/66189.pdf
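
    The per-site average power and capacity factor given in the spreadsheet can be derived from a site's power time series as follows (illustrative only; the series and rated capacity here are hypothetical, and column names in the actual spreadsheet may differ):

```python
def capacity_factor(power_mw, rated_mw):
    """Mean generated power as a fraction of rated capacity."""
    return sum(power_mw) / (len(power_mw) * rated_mw)

site_power = [1.2, 0.0, 1.8, 2.0, 0.6]   # hypothetical 5-interval series, MW
print(round(capacity_factor(site_power, rated_mw=2.0), 2))  # 0.56
```

    A capacity factor near 0.4-0.5 over a year is typical of a good wind site; a single short window like this toy series can of course sit anywhere between 0 and 1.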

  10. Marine Debris and Plastic Source Reduction Toolkit

    EPA Pesticide Factsheets

    Many plastic food service ware items originate on college and university campuses—in cafeterias, snack rooms, cafés, and eateries with take-out dining options. This Campus Toolkit is a detailed “how to” guide for reducing plastic waste on college campuses.

  11. Healthy People 2010: Oral Health Toolkit

    ERIC Educational Resources Information Center

    Isman, Beverly

    2007-01-01

    The purpose of this Toolkit is to provide guidance, technical tools, and resources to help states, territories, tribes and communities develop and implement successful oral health components of Healthy People 2010 plans as well as other oral health plans. These plans are useful for: (1) promoting, implementing and tracking oral health objectives;…

  12. Toolkit Design for Interactive Structured Graphics

    DTIC Science & Technology

    2003-01-01

    GUI toolkits provide higher-level support for creating custom application widgets, or provide support for structured graphics. Amulet [21] is a... (1997). "The Amulet Environment: New Models for Effective User Interface Software Development". IEEE Transactions on Software Engineering, 23(6), pp. 347

  13. Virginia Adult Education Health Literacy Toolkit.

    ERIC Educational Resources Information Center

    Singleton, Kate, Comp.

    This toolkit is a resource to help adult education instructors and administrators better understand the problem of health literacy as it affects their learners. It is designed to support creative approaches to helping learners increase their health literacy as they engage in sound, productive adult literacy instruction. Information resources are…

  14. The Two-Way Immersion Toolkit

    ERIC Educational Resources Information Center

    Howard, Elizabeth; Sugarman, Julie; Perdomo, Marleny; Adger, Carolyn Temple

    2005-01-01

    This Toolkit is meant to be a resource for teachers, parents, and administrators involved with two-way immersion (TWI) programs, particularly those at the elementary level. Two-way immersion is a form of dual language instruction that brings together students from two native language groups for language, literacy, and academic content instruction…

  15. Ready, Set, Respect! GLSEN's Elementary School Toolkit

    ERIC Educational Resources Information Center

    Gay, Lesbian and Straight Education Network (GLSEN), 2012

    2012-01-01

    "Ready, Set, Respect!" provides a set of tools to help elementary school educators ensure that all students feel safe and respected and develop respectful attitudes and behaviors. It is not a program to be followed but instead is designed to help educators prepare themselves for teaching about and modeling respect. The toolkit responds to…

  16. Toolkit of Available EPA Green Infrastructure Modeling ...

    EPA Pesticide Factsheets

    This webinar will present a toolkit consisting of five EPA green infrastructure models and tools, along with communication material. This toolkit can be used as a teaching and quick reference resource for use by planners and developers when making green infrastructure implementation decisions. It can also be used for low impact development design competitions. Models and tools included: Green Infrastructure Wizard (GIWiz), Watershed Management Optimization Support Tool (WMOST), Visualizing Ecosystem Land Management Assessments (VELMA) Model, Storm Water Management Model (SWMM), and the National Stormwater Calculator (SWC).

  17. Teacher Quality Toolkit. 2nd Edition

    ERIC Educational Resources Information Center

    Lauer, Patricia A.; Dean, Ceri B.; Martin-Glenn, Mya L.; Asensio, Margaret L.

    2005-01-01

    The Teacher Quality Toolkit addresses the continuum of teacher learning by providing tools that can be used to improve both preservice, and inservice teacher education. Each chapter provides self assessment tools that can guide progress toward improved teacher quality and describes resources for designing exemplary programs and practices. Chapters…

  18. A Toolkit for Stimulating Productive Thinking

    ERIC Educational Resources Information Center

    Janssen, Fred; de Hullu, Els

    2008-01-01

    Students need tools, thinking skills, to help them think actively and in depth about biological phenomena. They need to know what kind of questions to ask and how to find answers to those questions. In this article we present a toolkit with 12 "thinking tools" for asking and answering questions about biological phenomena from different…

  19. Plus 50: Business Community Outreach Toolkit

    ERIC Educational Resources Information Center

    American Association of Community Colleges (NJ1), 2009

    2009-01-01

    This toolkit is designed to support you in building partnerships with the business community. It includes a series of fact sheets you can distribute to employers that discuss the value in hiring plus 50 workers. Individual sections contain footnotes. (Contains 5 web resources.)

  20. Integrated System Health Management Development Toolkit

    NASA Technical Reports Server (NTRS)

    Figueroa, Jorge; Smith, Harvey; Morris, Jon

    2009-01-01

    This software toolkit is designed to model complex systems for the implementation of embedded Integrated System Health Management (ISHM) capability, which focuses on determining the condition (health) of every element in a complex system (detect anomalies, diagnose causes, and predict future anomalies), and to provide data, information, and knowledge (DIaK) to control systems for safe and effective operation.

  1. Sandia multispectral analyst remote sensing toolkit (SMART).

    SciTech Connect

    Post, Brian Nelson; Smith, Jody Lynn; Geib, Peter L.; Nandy, Prabal; Wang, Nancy Nairong

    2003-03-01

    This remote sensing science and exploitation work focused on exploitation algorithms and methods targeted at the analyst. SMART is a 'plug-in' to commercial remote sensing software that provides algorithms to enhance the utility of the Multispectral Thermal Imager (MTI) and other multispectral satellite data. This toolkit has been licensed to 22 government organizations.

  2. Karma: Visualisation Test-Bed Toolkit

    NASA Astrophysics Data System (ADS)

    Gooch, Richard

    2011-02-01

    Karma is a toolkit for interprocess communications, authentication, encryption, graphics display, user interface and manipulating the Karma network data structure. It contains KarmaLib (the structured libraries and API) and a large number of modules (applications) to perform many standard tasks. A suite of visualisation tools is distributed with the library.

  3. Media Toolkit for Anti-Drug Action.

    ERIC Educational Resources Information Center

    Office of National Drug Control Policy, Washington, DC.

    This toolkit provides proven methods, models, and templates for tying anti-drug efforts to the National Youth Anti-Drug Media Campaign. It helps organizations deliver the Campaign's messages to the media and to other groups and individuals who care about keeping the nation's youth drug free. Eight sections focus on: (1) "Campaign…

  4. A Toolkit for the Effective Teaching Assistant

    ERIC Educational Resources Information Center

    Tyrer, Richard; Gunn, Stuart; Lee, Chris; Parker, Maureen; Pittman, Mary; Townsend, Mark

    2004-01-01

    This book offers the notion of a "toolkit" to allow Teaching Assistants (TAs) and colleagues to review and revise their thinking and practice about real issues and challenges in managing individuals, groups, colleagues and themselves in school. In a rapidly changing educational environment the book focuses on combining the underpinning knowledge…

  5. Services development toolkit for Open Research Data (Promarket)

    NASA Astrophysics Data System (ADS)

    Som de Cerff, W.; Schwichtenberg, H.; Gemünd, A.; Claus, S.; Reichwald, J.; Denvil, S.; Mazetti, P.; Nativi, S.

    2012-04-01

    According to the declaration of the Organisation for Economic Co-operation and Development (OECD) on Open Access, "OECD Principles and Guidelines for Access to Research Data from Public Funding", research data should be available for everyone, and Europe follows these directions (Digital Agenda, N. Kroes). Data being 'open' does not mean it is directly applicable: research data are often complex to use and difficult to interpret by non-experts. Also, if extra services are needed, e.g. certain delivery guarantees, SLAs need to be negotiated. Comparable to OSS development models, where software is open and services and support are paid for, there is a large potential for commercial activities and services around this open and free research data. E.g., climate, weather or instrument data can be used to generate business value when offered as easy and reliable services for App integration. The project will design a toolkit for developers in research data centres. The tools will allow them to develop services that provide research data and to map business processes, e.g. automatic service level agreements, onto their services, making open research data attractive for commercial and academic use by the centre and others. It will enable them to build and develop open, reliable and scalable services and end products, accessible, for example, from end-user devices such as smart phones. Researchers, interested citizens or company developers will be enabled to access open data as an "easy-to-use" service and aggregate it with other services. The project will address a broad range of developers and give them a toolkit in well-known settings: portable, scalable, open and usable in public development environments and tools. This topic will be addressed technically by utilizing service-oriented approaches based on open standards and protocols, combined with new programming models and techniques.

  6. Construction aggregates

    USGS Publications Warehouse

    Tepordei, V.V.

    1995-01-01

    Part of the 1994 Industrial Minerals Review. The production, consumption, and applications of construction aggregates are reviewed. In 1994, the production of construction aggregates, which includes crushed stone and construction sand and gravel combined, increased 7.7 percent to 2.14 Gt compared with the previous year. These record production levels are mostly a result of funding for highway construction work provided by the Intermodal Surface Transportation Efficiency Act of 1991. Demand is expected to increase for construction aggregates in 1995.

  7. Global Arrays Parallel Programming Toolkit

    SciTech Connect

    Nieplocha, Jaroslaw; Krishnan, Manoj Kumar; Palmer, Bruce J.; Tipparaju, Vinod; Harrison, Robert J.; Chavarría-Miranda, Daniel

    2011-01-01

    The two predominant classes of programming models for parallel computing are distributed memory and shared memory. Both shared memory and distributed memory models have advantages and shortcomings. The shared memory model is much easier to use, but it ignores data locality/placement. Given the hierarchical nature of the memory subsystems in modern computers, this characteristic can have a negative impact on performance and scalability. Careful code restructuring to increase data reuse and replacing fine-grain load/stores with block access to shared data can address the problem and yield performance for shared memory that is competitive with message passing. However, this performance comes at the cost of compromising the ease of use that the shared memory model advertises. Distributed memory models, such as message passing or one-sided communication, offer performance and scalability but they are difficult to program. The Global Arrays toolkit attempts to offer the best features of both models. It implements a shared-memory programming model in which data locality is managed by the programmer. This management is achieved by calls to functions that transfer data between a global address space (a distributed array) and local storage. In this respect, the GA model has similarities to the distributed shared-memory models that provide an explicit acquire/release protocol. However, the GA model acknowledges that remote data is slower to access than local data and allows data locality to be specified by the programmer and hence managed. GA is related to global address space languages such as UPC, Titanium, and, to a lesser extent, Co-Array Fortran. In addition, by providing a set of data-parallel operations, GA is also related to data-parallel languages such as HPF, ZPL, and Data Parallel C. 
However, the Global Array programming model is implemented as a library that works with most languages used for technical computing and does not rely on compiler technology for achieving
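    The get/put discipline described above, in which the programmer explicitly moves data between a distributed global address space and local storage, can be sketched in a few lines. The following is a toy Python illustration of the model only; it is not the real Global Arrays C/Fortran API, and the class and names are invented for this example.

```python
# Toy illustration of the GA programming model (not the actual GA API):
# a "global array" block-distributed over simulated process ranks, with
# explicit get/put calls that move data between the global index space
# and a local buffer.

class ToyGlobalArray:
    def __init__(self, size, nprocs):
        self.size = size
        self.block = (size + nprocs - 1) // nprocs
        # Each simulated "process" owns one contiguous block.
        self.blocks = [[0.0] * self.block for _ in range(nprocs)]

    def _locate(self, i):
        # Map a global index to (owner rank, local offset).
        return divmod(i, self.block)

    def put(self, lo, hi, local_buf):
        # Copy a local buffer into global indices [lo, hi).
        for k, i in enumerate(range(lo, hi)):
            owner, off = self._locate(i)
            self.blocks[owner][off] = local_buf[k]

    def get(self, lo, hi):
        # Copy global indices [lo, hi) into a fresh local buffer.
        out = []
        for i in range(lo, hi):
            owner, off = self._locate(i)
            out.append(self.blocks[owner][off])
        return out

ga = ToyGlobalArray(size=8, nprocs=4)  # 2 elements per owner
ga.put(3, 6, [1.0, 2.0, 3.0])          # this range spans two owners
print(ga.get(3, 6))                    # [1.0, 2.0, 3.0]
```

The point of the sketch is the one the abstract makes: locality is visible to the programmer, because every access that crosses an ownership boundary goes through an explicit transfer call.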

  8. The Reconstruction Toolkit (RTK), an open-source cone-beam CT reconstruction toolkit based on the Insight Toolkit (ITK)

    NASA Astrophysics Data System (ADS)

    Rit, S.; Vila Oliva, M.; Brousmiche, S.; Labarbe, R.; Sarrut, D.; Sharp, G. C.

    2014-03-01

    We propose the Reconstruction Toolkit (RTK, http://www.openrtk.org), an open-source toolkit for fast cone-beam CT reconstruction, based on the Insight Toolkit (ITK) and using GPU code extracted from Plastimatch. RTK is developed by an open consortium (see affiliations) under the non-contaminating Apache 2.0 license. The quality of the platform is daily checked with regression tests in partnership with Kitware, the company supporting ITK. Several features are already available: Elekta, Varian and IBA inputs, multi-threaded Feldkamp-Davis-Kress reconstruction on CPU and GPU, Parker short scan weighting, multi-threaded CPU and GPU forward projectors, etc. Each feature is either accessible through command line tools or C++ classes that can be included in independent software. A MIDAS community has been opened to share CatPhan datasets of several vendors (Elekta, Varian and IBA). RTK will be used in the upcoming cone-beam CT scanner developed by IBA for proton therapy rooms. Many features are under development: new input format support, iterative reconstruction, hybrid Monte Carlo / deterministic CBCT simulation, etc. RTK has been built to freely share tomographic reconstruction developments between researchers and is open for new contributions.

  9. Autism Speaks Toolkits: Resources for Busy Physicians.

    PubMed

    Bellando, Jayne; Fussell, Jill J; Lopez, Maya

    2016-02-01

    Given the increased prevalence of autism spectrum disorders (ASD), it is likely that busy primary care providers (PCPs) are providing care to individuals with ASD in their practice. Autism Speaks provides a wealth of educational, medical, and treatment/intervention information resources for PCPs and families, including at least 32 toolkits. This article serves to familiarize PCPs and families with the different toolkits that are available on the Autism Speaks website. It is intended to increase physicians' knowledge of the issues that families of children with ASD frequently encounter and to increase their ability to share evidence-based information to guide treatment and care for affected families in their practice. © The Author(s) 2015.

  10. A toolkit for detecting technical surprise.

    SciTech Connect

    Trahan, Michael Wayne; Foehse, Mark C.

    2010-10-01

    The detection of a scientific or technological surprise within a secretive country or institute is very difficult. The ability to detect such surprises would allow analysts to identify the capabilities that could be a military or economic threat to national security. Sandia's current approach utilizing ThreatView has been successful in revealing potential technological surprises. However, as data sets become larger, it becomes critical to use algorithms as filters along with the visualization environments. Our two-year LDRD had two primary goals. First, we developed a tool, a Self-Organizing Map (SOM), to extend ThreatView and improve our understanding of the issues involved in working with textual data sets. Second, we developed a toolkit for detecting indicators of technical surprise in textual data sets. Our toolkit has been successfully used to perform technology assessments for the Science & Technology Intelligence (S&TI) program.

  11. ECCE Toolkit: Prototyping Sensor-Based Interaction

    PubMed Central

    Bellucci, Andrea; Aedo, Ignacio; Díaz, Paloma

    2017-01-01

    Building and exploring physical user interfaces requires high technical skills and hours of specialized work. The behavior of multiple devices with heterogeneous input/output channels and connectivity has to be programmed in a context where not only the software interface matters, but also the hardware components are critical (e.g., sensors and actuators). Prototyping physical interaction is hindered by the challenges of: (1) programming interactions among physical sensors/actuators and digital interfaces; (2) implementing functionality for different platforms in different programming languages; and (3) building custom electronic-incorporated objects. We present ECCE (Entities, Components, Couplings and Ecosystems), a toolkit for non-programmers that copes with these issues by abstracting from low-level implementations, thus lowering the complexity of prototyping small-scale, sensor-based physical interfaces to support the design process. A user evaluation provides insights and use cases of the kind of applications that can be developed with the toolkit. PMID:28241502

  12. ECCE Toolkit: Prototyping Sensor-Based Interaction.

    PubMed

    Bellucci, Andrea; Aedo, Ignacio; Díaz, Paloma

    2017-02-23

    Building and exploring physical user interfaces requires high technical skills and hours of specialized work. The behavior of multiple devices with heterogeneous input/output channels and connectivity has to be programmed in a context where not only the software interface matters, but also the hardware components are critical (e.g., sensors and actuators). Prototyping physical interaction is hindered by the challenges of: (1) programming interactions among physical sensors/actuators and digital interfaces; (2) implementing functionality for different platforms in different programming languages; and (3) building custom electronic-incorporated objects. We present ECCE (Entities, Components, Couplings and Ecosystems), a toolkit for non-programmers that copes with these issues by abstracting from low-level implementations, thus lowering the complexity of prototyping small-scale, sensor-based physical interfaces to support the design process. A user evaluation provides insights and use cases of the kind of applications that can be developed with the toolkit.

  13. INTELMOD - An Intelligent Satellite Modelling Toolkit

    NASA Astrophysics Data System (ADS)

    Aynsley, M.; Hiden, H.

    This paper describes the development of an intelligent, generic spacecraft modelling toolkit, INTELMOD (INTELligent MODeller). The system has been designed to provide an environment which can efficiently capture spacecraft engineering and operational expertise, coupled with mission or phase-related knowledge. This knowledge can then be applied to support human flight controllers at ESA (European Space Agency) in performing a number of generic monitoring, analytical and diagnostic tasks. INTELMOD has been developed using a RAD (Rapid Application Development) approach, based on the Dynamic Systems Development Methodology (DSDM) and has made extensive use of Commercial Off-The-Shelf (COTS) software products. INTELMOD also incorporates UNiT (Universal Intelligent Toolkit), to provide automatic execution of recovery procedures following fault detection and isolation. Users of INTELMOD require no formal programming experience, as models can be constructed with user-friendly editors that employ a “drag and drop” approach using pre-defined palettes of key components.

  14. Texas Team: Academic Progression and IOM Toolkit.

    PubMed

    Reid, Helen; Tart, Kathryn; Tietze, Mari; Joseph, Nitha Mathew; Easley, Carson

    The Institute of Medicine (IOM) Future of Nursing report identified eight recommendations for nursing to improve health care for all Americans. The Texas Team for Advancing Health Through Nursing embraced the challenge of implementing the recommendations through two diverse projects. One group conducted a broad, online survey of leadership, practice, and academia, focusing on the IOM recommendations. The other focused specifically on academic progression through the use of CABNET (Consortium for Advancing Baccalaureate Nursing Education in Texas) articulation agreements. The survey revealed a lack of knowledge and understanding of the IOM recommendations, prompting development of an online IOM toolkit. The articulation agreements provide a clear pathway for students to the RN-to-BSN degree. The toolkit and articulation agreements provide rich resources for implementation of the IOM recommendations.

  15. Texas Team: Academic Progression and IOM Toolkit.

    PubMed

    Reid, Helen; Tart, Kathryn; Tietze, Mari; Joseph, Nitha Mathew; Easley, Carson

    2017-08-04

    The Institute of Medicine (IOM) Future of Nursing report identified eight recommendations for nursing to improve health care for all Americans. The Texas Team for Advancing Health Through Nursing embraced the challenge of implementing the recommendations through two diverse projects. One group conducted a broad, online survey of leadership, practice, and academia, focusing on the IOM recommendations. The other focused specifically on academic progression through the use of CABNET (Consortium for Advancing Baccalaureate Nursing Education in Texas) articulation agreements. The survey revealed a lack of knowledge and understanding of the IOM recommendations, prompting development of an online IOM toolkit. The articulation agreements provide a clear pathway for students to the RN-to-BSN degree. The toolkit and articulation agreements provide rich resources for implementation of the IOM recommendations.

  16. The Interactive Learning Toolkit: supporting interactive classrooms

    NASA Astrophysics Data System (ADS)

    Dutta, S.; McCauley, V.; Mazur, E.

    2004-05-01

    Research-based interactive learning techniques have dramatically improved student understanding. We have created the 'Interactive Learning Toolkit' (ILT), a web-based learning management system, to help implement two such pedagogies: Just in Time Teaching and Peer Instruction. Our main goal in developing this toolkit is to save the instructor time and effort and to use technology to facilitate the interaction between the students and the instructor (and between students themselves). After a brief review of both pedagogies, we will demonstrate the many exciting new features of the ILT. We will show how technology can not only implement, but also supplement and improve these pedagogies. We would like to acknowledge grants from the NSF and DEAS, Harvard University.

  17. Application experiences with the Globus toolkit.

    SciTech Connect

    Brunett, S.

    1998-06-09

    The Globus grid toolkit is a collection of software components designed to support the development of applications for high-performance distributed computing environments, or ''computational grids'' [14]. The Globus toolkit is an implementation of a ''bag of services'' architecture, which provides application and tool developers not with a monolithic system but rather with a set of stand-alone services. Each Globus component provides a basic service, such as authentication, resource allocation, information, communication, fault detection, and remote data access. Different applications and tools can combine these services in different ways to construct ''grid-enabled'' systems. The Globus toolkit has been used to construct the Globus Ubiquitous Supercomputing Testbed, or GUSTO: a large-scale testbed spanning 20 sites and including over 4000 compute nodes for a total compute power of over 2 TFLOPS. Over the past six months, we and others have used this testbed to conduct a variety of application experiments, including multi-user collaborative environments (tele-immersion), computational steering, distributed supercomputing, and high throughput computing. The goal of this paper is to review what has been learned from these experiments regarding the effectiveness of the toolkit approach. To this end, we describe two of the application experiments in detail, noting what worked well and what worked less well. The two applications are a distributed supercomputing application, SF-Express, in which multiple supercomputers are harnessed to perform large distributed interactive simulations; and a tele-immersion application, CAVERNsoft, in which the focus is on connecting multiple people to a distributed simulated world.

  18. chemf: A purely functional chemistry toolkit.

    PubMed

    Höck, Stefan; Riedl, Rainer

    2012-12-20

    Although programming in a type-safe and referentially transparent style offers several advantages over working with mutable data structures and side effects, this style of programming has not seen much use in chemistry-related software. Since functional programming languages were designed with referential transparency in mind, these languages offer a lot of support when writing immutable data structures and side-effect-free code. We therefore started implementing our own toolkit based on the above programming paradigms in a modern, versatile programming language. We present our initial results with functional programming in chemistry by first describing an immutable data structure for molecular graphs together with a couple of simple algorithms to calculate basic molecular properties before writing a complete SMILES parser in accordance with the OpenSMILES specification. Along the way we show how to deal with input validation, error handling, bulk operations, and parallelization in a purely functional way. At the end we also analyze and improve our algorithms and data structures in terms of performance and compare them with existing toolkits, both object-oriented and purely functional. All code was written in Scala, a modern multi-paradigm programming language with strong support for functional programming and a highly sophisticated type system. We have successfully made the first important steps towards a purely functional chemistry toolkit. The data structures and algorithms presented in this article perform well while at the same time they can be safely used in parallelized applications, such as computer-aided drug design experiments, without further adjustments. This stands in contrast to existing object-oriented toolkits where thread safety of data structures and algorithms is a deliberate design decision that can be hard to implement. 
Finally, the level of type-safety achieved by Scala highly increased the reliability of our code as well as the productivity of
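    chemf itself is written in Scala; as a language-neutral illustration of the immutable molecular-graph idea the abstract describes, here is a short Python sketch. The class and methods are invented for this example and are not chemf's API; the point is only that every "modification" returns a new graph while the original stays intact, which is what makes such structures safe to share across threads without locks.

```python
# Illustrative immutable molecular graph (NOT chemf's API): atoms are a
# tuple of element symbols, bonds a frozenset of (i, j, order) triples.
# frozen=True forbids in-place mutation, so updates return new objects.

from dataclasses import dataclass

@dataclass(frozen=True)
class Molecule:
    atoms: tuple       # e.g. ("C", "C", "O")
    bonds: frozenset   # frozenset of (i, j, order) triples

    def add_atom(self, symbol):
        # Build a NEW molecule; `self` is untouched.
        return Molecule(self.atoms + (symbol,), self.bonds)

    def add_bond(self, i, j, order=1):
        # Store bonds with sorted endpoints so (i, j) == (j, i).
        return Molecule(self.atoms,
                        self.bonds | {(min(i, j), max(i, j), order)})

empty = Molecule((), frozenset())
fragment = (empty.add_atom("C").add_atom("C").add_atom("O")
                 .add_bond(0, 1).add_bond(1, 2))
print(fragment.atoms)  # ('C', 'C', 'O')
print(empty.atoms)     # () -- the original molecule is unchanged
```

The thread-safety claim in the abstract follows directly from this style: since no operation mutates shared state, parallel workers can read the same molecule freely.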

  19. chemf: A purely functional chemistry toolkit

    PubMed Central

    2012-01-01

    Background Although programming in a type-safe and referentially transparent style offers several advantages over working with mutable data structures and side effects, this style of programming has not seen much use in chemistry-related software. Since functional programming languages were designed with referential transparency in mind, these languages offer a lot of support when writing immutable data structures and side-effect-free code. We therefore started implementing our own toolkit based on the above programming paradigms in a modern, versatile programming language. Results We present our initial results with functional programming in chemistry by first describing an immutable data structure for molecular graphs together with a couple of simple algorithms to calculate basic molecular properties before writing a complete SMILES parser in accordance with the OpenSMILES specification. Along the way we show how to deal with input validation, error handling, bulk operations, and parallelization in a purely functional way. At the end we also analyze and improve our algorithms and data structures in terms of performance and compare them with existing toolkits, both object-oriented and purely functional. All code was written in Scala, a modern multi-paradigm programming language with strong support for functional programming and a highly sophisticated type system. Conclusions We have successfully made the first important steps towards a purely functional chemistry toolkit. The data structures and algorithms presented in this article perform well while at the same time they can be safely used in parallelized applications, such as computer-aided drug design experiments, without further adjustments. This stands in contrast to existing object-oriented toolkits where thread safety of data structures and algorithms is a deliberate design decision that can be hard to implement. 
Finally, the level of type-safety achieved by Scala highly increased the reliability of our code

  20. Mission Operations and Navigation Toolkit Environment

    NASA Technical Reports Server (NTRS)

    Sunseri, Richard F.; Wu, Hsi-Cheng; Hanna, Robert A.; Mossey, Michael P.; Duncan, Courtney B.; Evans, Scott E.; Evans, James R.; Drain, Theodore R.; Guevara, Michelle M.; Martin Mur, Tomas J.; Attiyah, Ahlam A.

    2009-01-01

    MONTE (Mission Operations and Navigation Toolkit Environment) Release 7.3 is an extensible software system designed to support trajectory and navigation analysis/design for space missions. MONTE is intended to replace the current navigation and trajectory analysis software systems, which, at the time of this reporting, are used by JPL's Navigation and Mission Design section. The software provides an integrated, simplified, and flexible system that can be easily maintained to serve the needs of future missions in need of navigation services.

  1. The Bio* toolkits--a brief overview.

    PubMed

    Mangalam, Harry

    2002-09-01

    Bioinformatics research is often difficult to do with commercial software. The Open Source BioPerl, BioPython and Biojava projects provide toolkits with multiple functionality that make it easier to create customised pipelines or analysis. This review briefly compares the quirks of the underlying languages and the functionality, documentation, utility and relative advantages of the Bio counterparts, particularly from the point of view of the beginning biologist programmer.

  2. SimScape Terrain Modeling Toolkit

    NASA Technical Reports Server (NTRS)

    Jain, Abhinandan; Cameron, Jonathan; Lim, Christopher; Guineau, John

    2006-01-01

    This paper describes the SIMSCAPE middleware toolkit that has been developed recently to provide a common infrastructure for importing terrain model data from multiple data sources and making them available to simulation applications. The SIMSCAPE infrastructure simplifies the overall simulation design by eliminating the traditional need for custom terrain model interfaces to terrain data sources and simulation users. SIMSCAPE provides a collection of libraries and tools to use and manage terrain environment models within a wide range of simulation applications.

  3. HVAC Fault Detection and Diagnosis Toolkit

    SciTech Connect

    Haves, Philip; Xu, Peng; Kim, Moosung

    2004-12-31

    This toolkit supports component-level model-based fault detection methods in commercial building HVAC systems. The toolbox consists of five basic modules: a parameter estimator for model calibration, a preprocessor, an AHU model simulator, a steady-state detector, and a comparator. Each of these modules and the fuzzy logic rules for fault diagnosis are described in detail. The toolbox is written in C++ and also invokes the SPARK simulation program.
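    Of the five modules listed, the steady-state detector is the easiest to show in miniature. The sketch below is a generic moving-window detector in Python (the toolkit itself is written in C++, and its actual algorithm and thresholds are not given in the abstract; the window size and tolerance here are invented): a sample is flagged as steady once the standard deviation of the trailing window drops below a tolerance.

```python
# Generic moving-window steady-state detector (illustrative only, not the
# toolkit's algorithm): flag a sample once the trailing `window` samples
# have a population standard deviation below `tol`.

from statistics import pstdev

def steady_state(signal, window=5, tol=0.5):
    flags = []
    for i in range(len(signal)):
        if i + 1 < window:
            flags.append(False)  # not enough history yet
        else:
            chunk = signal[i + 1 - window : i + 1]
            flags.append(pstdev(chunk) < tol)
    return flags

# A supply-air-temperature-like signal: a ramp that settles near 20.0
sig = [10, 13, 16, 18, 19, 19.8, 20.0, 20.1, 19.9, 20.0]
print(steady_state(sig))
```

In a model-based FDD scheme like the one described, such a flag gates the comparator: residuals between measured and simulated AHU behavior are only trusted as fault evidence while the plant is steady.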

  4. SIERRA Toolkit v. 2.0

    SciTech Connect

    Coffey, Todd; Williams, Alan; Bhardwaj, Manoj; Galze, David; Okusanya, Tolulope; Roehrig, Nathaniel; Wilson, Christopher; Crane, Nathan; Xavier, Patrick

    2016-09-14

    The SIERRA Toolkit is a collection of libraries to facilitate the development of parallel engineering analysis applications. These libraries supply basic core services that an engineering application may need such as a parallel distributed and dynamic mesh database (for unstructured meshes), mechanics algorithm support (parallel infrastructure only), interfaces to parallel solvers, parallel mesh and data I/O, and various utilities (timers, diagnostic tools, etc.)

  5. A Racial Equity Toolkit for Midwifery Organizations.

    PubMed

    Gordon, Wendy M

    2016-11-01

    Midwifery associations are increasing awareness and commitment to racial equity in the profession and in the communities we serve. Moving these commitments from words into action may be facilitated by a racial equity toolkit to help guide midwifery organizations to consider all policies, initiatives, and actions with a racial equity lens. Racial equity impact analyses have been used in recent years by various governmental agencies in the United States and abroad with positive results, and emerging literature indicates that nonprofit organizations are having similarly positive results. This article proposes a framework for midwifery organizations to incorporate a racial equity toolkit, starting with explicit intentions of the organization with regard to racial equity in the profession. Indicators of success are elucidated as the next step, followed by the use of a racial equity impact analysis worksheet. This worksheet is applied by teams or committees when considering new policies or initiatives to examine those actions through a racial equity lens. An organizational change team and equity advisory groups are essential in assisting organizational leadership to forecast potential negative and positive impacts. Examples of the components of a midwifery-specific racial equity toolkit are included.

  6. An analytical toolkit for polyploid willow discrimination

    PubMed Central

    Guo, Wei; Hou, Jing; Yin, Tongming; Chen, Yingnan

    2016-01-01

    Polyploid breeding is an important means for creating elite willow cultivars, and therefore creates an active demand for discriminating the ploidy levels of natural willow stands. In this study, we established an analytical toolkit for polyploid willow identification by combining molecular markers and flow cytometry (FCM). A total of 10 single-copy fully informative SSRs were chosen for marker-aided selection based on a segregation test with a full-sib willow pedigree and a mutability test with a collection of natural willow stands. Aided by these molecular markers, we performed polyploid selection in two tree species and two shrub species of the genus Salix. The ploidy levels of the investigated samples were further examined using a flow cytometer. Results from marker-aided selection were consistent with those from the FCM measurements. Based on ploidy level assessment in different willow species, it was found that tree willows were dominantly tetraploid, whereas shrub willows were most frequently diploid. With this analytical toolkit, polyploids can be rapidly screened from a large number of natural stands; thereafter, the exact ploidy levels of the polyploid candidates can be efficiently confirmed by FCM. This analytical toolkit will greatly enhance polyploid breeding programs for willows. PMID:27934953

  7. A Python Interface for the Dakota Iterative Systems Analysis Toolkit

    NASA Astrophysics Data System (ADS)

    Piper, M.; Hutton, E.; Syvitski, J. P.

    2016-12-01

    Uncertainty quantification is required to improve the accuracy, reliability, and accountability of Earth science models. Dakota is a software toolkit, developed at Sandia National Laboratories, that provides an interface between models and a library of analysis methods, including support for sensitivity analysis, uncertainty quantification, optimization, and calibration techniques. Dakota is a powerful tool, but its learning curve is steep: the user not only must understand the structure and syntax of the Dakota input file, but also must develop intermediate code, called an analysis driver, that allows Dakota to run a model. The CSDMS Dakota interface (CDI) is a Python package that wraps and extends Dakota's user interface. It simplifies the process of configuring and running a Dakota experiment. A user can program to the CDI, allowing a Dakota experiment to be scripted. The CDI creates Dakota input files and provides a generic analysis driver. Any model written in Python that exposes a Basic Model Interface (BMI), as well as any model componentized in the CSDMS modeling framework, automatically works with the CDI. The CDI has a plugin architecture, so models written in other languages, or those that don't expose a BMI, can be accessed by the CDI by programmatically extending a template; an example is provided in the CDI distribution. Currently, six analysis methods from the much larger Dakota library have been implemented as examples. To demonstrate the CDI, we performed an uncertainty quantification experiment with the HydroTrend hydrological water balance and transport model. In the experiment, we evaluated the response of long-term suspended sediment load at the river mouth (Qs) to uncertainty in two input parameters, annual mean temperature (T) and precipitation (P), over a series of 100-year runs, using the polynomial chaos method. Through Dakota, we calculated moments, local and global (Sobol') sensitivity indices, and probability density and
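    The kind of question the HydroTrend experiment answers, how much each uncertain input contributes to output variance, can be illustrated without Dakota. The sketch below uses a hypothetical toy response Qs(T, P) and a crude one-at-a-time Monte Carlo comparison rather than Dakota's polynomial chaos method; the response function and input ranges are invented for the example.

```python
# Crude one-at-a-time (OAT) sensitivity sketch: vary one input while
# holding the other at its mean, and compare the resulting output
# variances. This is NOT Dakota's polynomial chaos method, just the
# simplest way to see variance attribution in action.

import random
from statistics import pvariance

random.seed(42)

def qs(t, p):
    # Hypothetical stand-in response: sediment load rises with both
    # temperature and precipitation.
    return 2.0 * t + 5.0 * p

T_MEAN, T_SPREAD = 10.0, 2.0   # assumed uniform uncertainty in T
P_MEAN, P_SPREAD = 1.5, 0.5    # assumed uniform uncertainty in P

def oat_variance(vary, n=20000):
    """Output variance when only one input is varied."""
    out = []
    for _ in range(n):
        t = (random.uniform(T_MEAN - T_SPREAD, T_MEAN + T_SPREAD)
             if vary == "T" else T_MEAN)
        p = (random.uniform(P_MEAN - P_SPREAD, P_MEAN + P_SPREAD)
             if vary == "P" else P_MEAN)
        out.append(qs(t, p))
    return pvariance(out)

v_t, v_p = oat_variance("T"), oat_variance("P")
print(f"variance from T alone: {v_t:.2f}, from P alone: {v_p:.2f}")
```

For this linear toy model the answers are known analytically (16/3 from T, 25/12 from P), so the Monte Carlo estimates can be checked; Dakota's polynomial chaos method reaches comparable conclusions with far fewer model runs, which matters when each run is a 100-year simulation.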

  8. Water Security Toolkit User Manual Version 1.2.

    SciTech Connect

    Klise, Katherine A.; Siirola, John Daniel; Hart, David; Hart, William Eugene; Phillips, Cynthia Ann; Haxton, Terranna; Murray, Regan; Janke, Robert; Taxon, Thomas; Laird, Carl; Seth, Arpan; Hackebeil, Gabriel; McGee, Shawn; Mann, Angelica

    2014-08-01

    The Water Security Toolkit (WST) is a suite of open source software tools that can be used by water utilities to create response strategies to reduce the impact of contamination in a water distribution network. WST includes hydraulic and water quality modeling software, optimization methodologies, and visualization tools to identify: (1) sensor locations to detect contamination, (2) locations in the network in which the contamination was introduced, (3) hydrants to remove contaminated water from the distribution system, (4) locations in the network to inject decontamination agents to inactivate, remove, or destroy contaminants, (5) locations in the network to take grab samples to help identify the source of contamination, and (6) valves to close in order to isolate contaminated areas of the network. This user manual describes the different components of WST, along with examples and case studies. License Notice: The Water Security Toolkit (WST) v.1.2 Copyright (c) 2012 Sandia Corporation. Under the terms of Contract DE-AC04-94AL85000, there is a non-exclusive license for use of this work by or on behalf of the U.S. government. This software is distributed under the Revised BSD License (see below). 
    In addition, WST leverages a variety of third-party software packages, which have separate licensing policies:
    Acro: Revised BSD License
    argparse: Python Software Foundation License
    Boost: Boost Software License
    Coopr: Revised BSD License
    Coverage: BSD License
    Distribute: Python Software Foundation License / Zope Public License
    EPANET: Public Domain
    EPANET-ERD: Revised BSD License
    EPANET-MSX: GNU Lesser General Public License (LGPL) v.3
    gcovr: Revised BSD License
    GRASP: AT&T Commercial License for noncommercial use; includes randomsample and sideconstraints executable files
    LZMA SDK: Public Domain
    nose: GNU Lesser General Public License (LGPL) v.2.1
    ordereddict: MIT License
    pip: MIT License
    PLY: BSD License
    PyEPANET: Revised BSD License
    Pyro: MIT License
    PyUtilib: Revised BSD License
    Py
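    As a hedged illustration of the sensor-placement task in item (1), and not WST's actual optimization method, here is the classic greedy set-cover heuristic: place one sensor at a time at the node that detects the most not-yet-covered contamination scenarios. The network and detection sets below are invented for the example.

```python
# Greedy sensor placement (illustrative heuristic, not WST's algorithm):
# detects[node] is the set of contamination-scenario ids that a sensor
# placed at `node` would detect.

def greedy_placement(detects, budget):
    chosen, covered = [], set()
    for _ in range(budget):
        # Pick the node covering the most scenarios not yet covered.
        best = max(detects, key=lambda n: len(detects[n] - covered))
        if not detects[best] - covered:
            break  # no remaining node adds coverage
        chosen.append(best)
        covered |= detects[best]
    return chosen, covered

# Hypothetical tiny network: candidate junctions and the scenarios each catches
detects = {
    "J1": {1, 2, 3},
    "J2": {3, 4},
    "J3": {5},
    "J4": {1, 2},
}
nodes, covered = greedy_placement(detects, budget=2)
print(nodes, sorted(covered))  # ['J1', 'J2'] [1, 2, 3, 4]
```

Greedy set cover has a well-known approximation guarantee, which is why heuristics of this shape are a common baseline for the sensor-placement problems the manual describes; production tools add hydraulic simulation and impact weighting on top.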

  9. Construction aggregates

    USGS Publications Warehouse

    Tepordei, V.V.

    1994-01-01

    Part of a special section on industrial minerals in 1993. The 1993 production of construction aggregates increased 6.3 percent over the 1992 figure, to reach 2.01 Gt. This represents the highest estimated annual production of combined crushed stone and construction sand and gravel ever recorded in the U.S. The outlook for construction aggregates and the issues facing the industry are discussed.

  10. Capturing Petascale Application Characteristics with the Sequoia Toolkit

    SciTech Connect

    Vetter, Jeffrey S; Bhatia, Nikhil; Grobelny, Eric M; Roth, Philip C

    2005-09-01

    Characterization of the computation, communication, memory, and I/O demands of current scientific applications is crucial for identifying which technologies will enable petascale scientific computing. In this paper, we present the Sequoia Toolkit for characterizing HPC applications. The Sequoia Toolkit consists of the Sequoia trace capture library and the Sequoia Event Analysis Library, or SEAL, that facilitates the development of tools for analyzing Sequoia event traces. Using the Sequoia Toolkit, we have characterized the behavior of application runs with up to 2048 application processes. To illustrate the use of the Sequoia Toolkit, we present a preliminary characterization of LAMMPS, a molecular dynamics application of great interest to the computational biology community.

  11. CGtag: complete genomics toolkit and annotation in a cloud-based Galaxy.

    PubMed

    Hiltemann, Saskia; Mei, Hailiang; de Hollander, Mattias; Palli, Ivo; van der Spek, Peter; Jenster, Guido; Stubbs, Andrew

    2014-01-24

    Complete Genomics provides an open-source suite of command-line tools for the analysis of their CG-formatted mapped sequencing files. Determination of, for example, the functional impact of detected variants requires annotation with various databases that often require command-line and/or programming experience, thus limiting their use to the average research scientist. We have therefore implemented this CG toolkit, together with a number of annotation, visualisation and file manipulation tools, in Galaxy, called CGtag (Complete Genomics Toolkit and Annotation in a Cloud-based Galaxy). In order to provide research scientists with web-based, simple and accurate analytical and visualisation applications for the selection of candidate mutations from Complete Genomics data, we have implemented the open-source Complete Genomics tool set, CGATools, in Galaxy. In addition we implemented some of the most popular command-line annotation and visualisation tools to allow research scientists to select candidate pathological mutations (SNVs and indels). Furthermore, we have developed a cloud-based public Galaxy instance to host the CGtag toolkit and other associated modules. CGtag provides a user-friendly interface to all research scientists wishing to select candidate variants from CG or other next-generation sequencing platforms' data. By using a cloud-based infrastructure, we can also assure sufficient and on-demand computation and storage resources to handle the analysis tasks. The tools are freely available for use from an NBIC/CTMM-TraIT (The Netherlands Bioinformatics Center/Center for Translational Molecular Medicine) cloud-based Galaxy instance, or can be installed to a local (production) Galaxy via the NBIC Galaxy tool shed.

  12. An evaluation capacity building toolkit for principal investigators of undergraduate research experiences: A demonstration of transforming theory into practice.

    PubMed

    Rorrer, Audrey S

    2016-04-01

    This paper describes the approach and process undertaken to develop evaluation capacity among the leaders of a federally funded undergraduate research program. An evaluation toolkit was developed for Computer and Information Sciences and Engineering(1) Research Experiences for Undergraduates(2) (CISE REU) programs to address the ongoing need for evaluation capacity among principal investigators who manage program evaluation. The toolkit was the result of collaboration within the CISE REU community with the purpose being to provide targeted instructional resources and tools for quality program evaluation. Challenges were to balance the desire for standardized assessment with the responsibility to account for individual program contexts. Toolkit contents included instructional materials about evaluation practice, a standardized applicant management tool, and a modulated outcomes measure. Resulting benefits from toolkit deployment were having cost effective, sustainable evaluation tools, a community evaluation forum, and aggregate measurement of key program outcomes for the national program. Lessons learned included the imperative of understanding the evaluation context, engaging stakeholders, and building stakeholder trust. Results from project measures are presented along with a discussion of guidelines for facilitating evaluation capacity building that will serve a variety of contexts.

  13. The ALMA Data Mining Toolkit I: Archive Setup and User Usage

    NASA Astrophysics Data System (ADS)

    Teuben, P.; Pound, M.; Mundy, L.; Looney, L.; Friedel, D. N.

    2014-05-01

    We report on an ALMA development study and project where we employ a novel approach to add data and data descriptors to ALMA archive data, allowing further flexible data mining on retrieved data. We call our toolkit ADMIT (the ALMA Data Mining Toolkit); it works within the Python-based CASA environment. What is described here is a design study, with some existing toy code to prove the concept. After ingestion of science-ready datacubes, ADMIT will compute a number of basic and advanced data products, and their descriptors. Examples of such data products are cube statistics, line identification tables, line cubes, moment maps, an integrated spectrum, overlap integrals and feature extraction tables. Together with a descriptive XML file, a small number of visual aids are added to a ZIP file that is deposited into the archive. Large datasets (such as line cubes) will have to be rederived by the user once they have also downloaded the actual ALMA Data Products, or via VO services if available. ADMIT enables the user to rederive all its products with different methods and parameters, and compare archive products with their own.
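    As an illustration of the moment-map products mentioned above, here is a minimal NumPy sketch; `moment_maps` is a hypothetical helper written for this example, not part of ADMIT or CASA:

```python
import numpy as np

def moment_maps(cube, vel):
    """Compute moment-0 (integrated intensity) and moment-1
    (intensity-weighted mean velocity) maps from a spectral-line cube.

    cube : (nchan, ny, nx) array of brightness values
    vel  : (nchan,) array of channel velocities
    """
    dv = np.abs(np.mean(np.diff(vel)))       # channel width
    mom0 = cube.sum(axis=0) * dv             # integrated intensity
    weights = cube.clip(min=0.0)             # avoid negative weights
    total = weights.sum(axis=0)
    with np.errstate(invalid="ignore", divide="ignore"):
        mom1 = (weights * vel[:, None, None]).sum(axis=0) / total
    return mom0, mom1
```

    Pixels with no emission yield NaN in the moment-1 map, which a real pipeline would mask or blank.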

  14. NGS QC Toolkit: a toolkit for quality control of next generation sequencing data.

    PubMed

    Patel, Ravi K; Jain, Mukesh

    2012-01-01

    Next generation sequencing (NGS) technologies provide a high-throughput means to generate large amount of sequence data. However, quality control (QC) of sequence data generated from these technologies is extremely important for meaningful downstream analysis. Further, highly efficient and fast processing tools are required to handle the large volume of datasets. Here, we have developed an application, NGS QC Toolkit, for quality check and filtering of high-quality data. This toolkit is a standalone and open source application freely available at http://www.nipgr.res.in/ngsqctoolkit.html. All the tools in the application have been implemented in Perl programming language. The toolkit is comprised of user-friendly tools for QC of sequencing data generated using Roche 454 and Illumina platforms, and additional tools to aid QC (sequence format converter and trimming tools) and analysis (statistics tools). A variety of options have been provided to facilitate the QC at user-defined parameters. The toolkit is expected to be very useful for the QC of NGS data to facilitate better downstream analysis.
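    The quality-filtering idea can be illustrated with a short sketch (in Python rather than the toolkit's Perl); the function names and default cutoffs below are illustrative assumptions, not NGS QC Toolkit's actual parameters:

```python
def mean_phred(qual_line, offset=33):
    """Mean Phred score of a FASTQ quality string (Sanger, offset 33)."""
    return sum(ord(c) - offset for c in qual_line) / len(qual_line)

def passes_qc(qual_line, cutoff=20, min_fraction=0.7, offset=33):
    """Keep a read if at least `min_fraction` of its bases reach `cutoff`."""
    good = sum(1 for c in qual_line if ord(c) - offset >= cutoff)
    return good / len(qual_line) >= min_fraction
```

    A filtering pass over a FASTQ file would apply `passes_qc` to each record's quality line and write surviving reads to the high-quality output.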

  15. NGS QC Toolkit: A Toolkit for Quality Control of Next Generation Sequencing Data

    PubMed Central

    Patel, Ravi K.; Jain, Mukesh

    2012-01-01

    Next generation sequencing (NGS) technologies provide a high-throughput means to generate large amount of sequence data. However, quality control (QC) of sequence data generated from these technologies is extremely important for meaningful downstream analysis. Further, highly efficient and fast processing tools are required to handle the large volume of datasets. Here, we have developed an application, NGS QC Toolkit, for quality check and filtering of high-quality data. This toolkit is a standalone and open source application freely available at http://www.nipgr.res.in/ngsqctoolkit.html. All the tools in the application have been implemented in Perl programming language. The toolkit is comprised of user-friendly tools for QC of sequencing data generated using Roche 454 and Illumina platforms, and additional tools to aid QC (sequence format converter and trimming tools) and analysis (statistics tools). A variety of options have been provided to facilitate the QC at user-defined parameters. The toolkit is expected to be very useful for the QC of NGS data to facilitate better downstream analysis. PMID:22312429

  16. Quality Assurance Toolkit for Distance Higher Education Institutions and Programmes

    ERIC Educational Resources Information Center

    Rama, Kondapalli, Ed.; Hope, Andrea, Ed.

    2009-01-01

    The Commonwealth of Learning is proud to partner with the Sri Lankan Ministry of Higher Education and UNESCO to produce this "Quality Assurance Toolkit for Distance Higher Education Institutions and Programmes". The Toolkit has been prepared with three features. First, it is a generic document on quality assurance, complete with a…

  17. Designing and Delivering Intensive Interventions: A Teacher's Toolkit

    ERIC Educational Resources Information Center

    Murray, Christy S.; Coleman, Meghan A.; Vaughn, Sharon; Wanzek, Jeanne; Roberts, Greg

    2012-01-01

    This toolkit provides activities and resources to assist practitioners in designing and delivering intensive interventions in reading and mathematics for K-12 students with significant learning difficulties and disabilities. Grounded in research, this toolkit is based on the Center on Instruction's "Intensive Interventions for Students Struggling…

  18. Veterinary Immunology Committee Toolkit Workshop 2010: Progress and plans

    USDA-ARS?s Scientific Manuscript database

    The Third Veterinary Immunology Committee (VIC) Toolkit Workshop took place at the Ninth International Veterinary Immunology Symposium (IVIS) in Tokyo, Japan on August 18, 2010. The Workshop built on previous Toolkit Workshops and covered various aspects of reagent development, commercialisation an...

  19. Quality Assurance Toolkit for Distance Higher Education Institutions and Programmes

    ERIC Educational Resources Information Center

    Rama, Kondapalli, Ed.; Hope, Andrea, Ed.

    2009-01-01

    The Commonwealth of Learning is proud to partner with the Sri Lankan Ministry of Higher Education and UNESCO to produce this "Quality Assurance Toolkit for Distance Higher Education Institutions and Programmes". The Toolkit has been prepared with three features. First, it is a generic document on quality assurance, complete with a…

  20. 77 FR 73022 - U.S. Environmental Solutions Toolkit

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-07

    ... environmental problems and highlight participating U.S. vendors of relevant U.S. technologies. The Toolkit will... users in foreign markets to U.S. approaches to solving environmental problems and to U.S. companies that... International Trade Administration U.S. Environmental Solutions Toolkit AGENCY: International Trade...

  1. Graph algorithms in the titan toolkit.

    SciTech Connect

    McLendon, William Clarence, III; Wylie, Brian Neil

    2009-10-01

    Graph algorithms are a key component in a wide variety of intelligence analysis activities. The Graph-Based Informatics for Non-Proliferation and Counter-Terrorism project addresses the critical need of making these graph algorithms accessible to Sandia analysts in a manner that is both intuitive and effective. Specifically we describe the design and implementation of an open source toolkit for doing graph analysis, informatics, and visualization that provides Sandia with novel analysis capability for non-proliferation and counter-terrorism.

  2. An Incident Management Preparedness and Coordination Toolkit

    SciTech Connect

    Koch, Daniel B; Payne, Patricia W

    2012-01-01

    Although the use of Geographic Information Systems (GIS) by centrally-located operations staff is well established in the area of emergency response, utilization by first responders in the field is uneven. Cost, complexity, and connectivity are often the deciding factors preventing wider adoption. For the past several years, Oak Ridge National Laboratory (ORNL) has been developing a mobile GIS solution using free and open-source software targeting the needs of front-line personnel. Termed IMPACT, for Incident Management Preparedness and Coordination Toolkit, this ORNL application can complement existing GIS infrastructure and extend its power and capabilities to responders first on the scene of a natural or man-made disaster.

  3. A flexible genetic toolkit for arthropod neurogenesis

    PubMed Central

    Stollewerk, Angelika

    2016-01-01

    Arthropods show considerable variations in early neurogenesis. This includes the pattern of specification, division and movement of neural precursors and progenitors. In all metazoans with nervous systems, including arthropods, conserved genes regulate neurogenesis, which raises the question of how the various morphological mechanisms have emerged and how the same genetic toolkit might generate different morphological outcomes. Here I address this question by comparing neurogenesis across arthropods and show how variations in the regulation and function of the neural genes might explain this phenomenon and how they might have facilitated the evolution of the diverse morphological mechanisms of neurogenesis. PMID:26598727

  4. A digital toolkit to implement and manage a multisite study.

    PubMed

    Lasater, Kathie; Johnson, Elizabeth; Hodson-Carlton, Kay; Siktberg, Linda; Sideras, Stephanie

    2012-03-01

    Calls for multisite studies are increasing in nursing education. However, the challenge of implementing consistent protocols and maintaining rigorous standards across sites can be daunting. One purpose of a recent multisite, collaborative, simulation study was to evaluate a digital toolkit's effectiveness for managing a multisite study. We describe the digital toolkit composed of Web-based technologies used to manage a study involving five sites including one United Kingdom site. The digital toolkit included a wiki, a project Web site to coordinate the protocols and study materials, software to organize study materials, and a secure location for sharing data. Most of these are familiar tools; however, combined as a toolkit, they became a useful management system. Web-based communication strategies and coordinated technical support served as key adjuncts to foster collaboration. This article also offers practical implications and recommendations for using a digital toolkit in other multisite studies. Copyright 2012, SLACK Incorporated.

  5. UTGB toolkit for personalized genome browsers

    PubMed Central

    Saito, Taro L.; Yoshimura, Jun; Sasaki, Shin; Ahsan, Budrul; Sasaki, Atsushi; Kuroshu, Reginaldo; Morishita, Shinichi

    2009-01-01

    The advent of high-throughput DNA sequencers has increased the pace of collecting enormous amounts of genomic information, yielding billions of nucleotides on a weekly basis. This advance represents an improvement of two orders of magnitude over traditional Sanger sequencers in terms of the number of nucleotides per unit time, allowing even small groups of researchers to obtain huge volumes of genomic data over a fairly short period. Consequently, a pressing need exists for the development of personalized genome browsers for analyzing these immense amounts of locally stored data. The UTGB (University of Tokyo Genome Browser) Toolkit is designed to meet three major requirements for personalization of genome browsers: easy installation of the system with minimum effort, browsing locally stored data and rapid interactive design of web interfaces tailored to individual needs. The UTGB Toolkit is licensed under an open source license. Availability: The software is freely available at http://utgenome.org/. Contact: moris@cb.k.u-tokyo.ac.jp PMID:19497937

  6. MIS: A Miriad Interferometry Singledish Toolkit

    NASA Astrophysics Data System (ADS)

    Pound, Marc; Teuben, Peter

    2011-10-01

    MIS is a pipeline toolkit using the package MIRIAD to combine Interferometric and Single Dish data. This was prompted by our observations made with the Combined Array For Research in Millimeter-wave Astronomy (CARMA) interferometer of the star-forming region NGC 1333, a large survey highlighting the new 23-element and singledish observing modes. The project consists of 20 CARMA datasets each containing interferometric as well as simultaneously obtained single dish data, for 3 molecular spectral lines and continuum, in 527 different pointings, covering an area of about 8 by 11 arcminutes. A small group of collaborators then shared this toolkit and their parameters via CVS, and scripts were developed to ensure uniform data reduction across the group. The pipeline was run end-to-end each night that new observations were obtained, producing maps that contained all the data to date. This approach could serve as a model for repeated calibration and mapping of large mixed-mode correlation datasets from ALMA.

  7. ADMIT: The ALMA Data Mining Toolkit

    NASA Astrophysics Data System (ADS)

    Teuben, P.; Pound, M.; Mundy, L.; Rauch, K.; Friedel, D.; Looney, L.; Xu, L.; Kern, J.

    2015-09-01

    ADMIT (ALMA Data Mining ToolkiT), a toolkit for the creation of new science products from ALMA data, is being developed as an ALMA Development Project. It is written in Python and, while specifically targeted for a uniform analysis of the ALMA science products that come out of the ALMA pipeline, it is designed to be generally applicable to (radio) astronomical data. It first provides users with a detailed view of their science products created by ADMIT inside the ALMA pipeline: line identifications, line 'cutout' cubes, moment maps, emission type analysis (e.g., feature detection). Using descriptor vectors the ALMA data archive is enriched with useful information to make archive data mining possible. Users can also opt to download the (small) ADMIT pipeline product, then fine-tune and re-run the pipeline and inspect their hopefully improved data. By running many projects in a parallel fashion, data mining between many astronomical sources and line transitions will also be possible. Future implementations of ADMIT may include EVLA and other instruments.

  8. MIS: A MIRIAD Interferometry Singledish Toolkit

    NASA Astrophysics Data System (ADS)

    Pound, M. W.; Teuben, P.

    2012-09-01

    Building on the “drPACS” contribution at ADASS XX of a simple Unix pipeline infrastructure, we implemented a pipeline toolkit using the package MIRIAD to combine Interferometric and Single Dish data (MIS). This was prompted by our observations made with the Combined Array For Research in Millimeter-wave Astronomy (CARMA) interferometer of the star-forming region NGC 1333, a large survey highlighting the new 23-element and singledish observing modes. The project consists of 20 CARMA datasets each containing interferometric as well as simultaneously obtained single dish data, for 3 molecular spectral lines and continuum, in 527 different pointings, covering an area of about 8 by 11 arcminutes. A small group of collaborators then shared this toolkit and their parameters via CVS, and scripts were developed to ensure uniform data reduction across the group. The pipeline was run end-to-end each night as new observations were obtained, producing maps that contained all the data to date. We will show examples of the scripts and data products. This approach could serve as a model for repeated calibration and mapping of large mixed-mode correlation datasets from ALMA.

  9. The Virtual Physiological Human ToolKit.

    PubMed

    Cooper, Jonathan; Cervenansky, Frederic; De Fabritiis, Gianni; Fenner, John; Friboulet, Denis; Giorgino, Toni; Manos, Steven; Martelli, Yves; Villà-Freixa, Jordi; Zasada, Stefan; Lloyd, Sharon; McCormack, Keith; Coveney, Peter V

    2010-08-28

    The Virtual Physiological Human (VPH) is a major European e-Science initiative intended to support the development of patient-specific computer models and their application in personalized and predictive healthcare. The VPH Network of Excellence (VPH-NoE) project is tasked with facilitating interaction between the various VPH projects and addressing issues of common concern. A key deliverable is the 'VPH ToolKit'--a collection of tools, methodologies and services to support and enable VPH research, integrating and extending existing work across Europe towards greater interoperability and sustainability. Owing to the diverse nature of the field, a single monolithic 'toolkit' is incapable of addressing the needs of the VPH. Rather, the VPH ToolKit should be considered more as a 'toolbox' of relevant technologies, interacting around a common set of standards. The latter apply to the information used by tools, including any data and the VPH models themselves, and also to the naming and categorizing of entities and concepts involved. Furthermore, the technologies and methodologies available need to be widely disseminated, and relevant tools and services easily found by researchers. The VPH-NoE has thus created an online resource for the VPH community to meet this need. It consists of a database of tools, methods and services for VPH research, with a Web front-end. This has facilities for searching the database, for adding or updating entries, and for providing user feedback on entries. Anyone is welcome to contribute.

  10. The Best Ever Alarm System Toolkit

    SciTech Connect

    Kasemir, Kay; Chen, Xihui; Danilova, Katia

    2009-01-01

    Learning from our experience with the standard Experimental Physics and Industrial Control System (EPICS) alarm handler (ALH) as well as a similar intermediate approach based on script-generated operator screens, we developed the Best Ever Alarm System Toolkit (BEAST). It is based on Java and Eclipse on the Control System Studio (CSS) platform, using a relational database (RDB) to store the configuration and log actions. It employs a Java Message Service (JMS) for communication between the modular pieces of the toolkit, which include an Alarm Server to maintain the current alarm state, an arbitrary number of Alarm Client user interfaces (GUI), and tools to annunciate alarms or log alarm related actions. Web reports allow us to monitor the alarm system performance and spot deficiencies in the alarm configuration. The Alarm Client GUI not only gives the end users various ways to view alarms in tree and table, but also makes it easy to access the guidance information, the related operator displays and other CSS tools. It also allows online configuration to be simply modified from the GUI. Coupled with a good "alarm philosophy" on how to provide useful alarms, we can finally improve the configuration to achieve an effective alarm system.

  11. Construction aggregates

    USGS Publications Warehouse

    Tepordei, V.V.

    1996-01-01

    Part of the Annual Commodities Review 1995. Production of construction aggregates such as crushed stone and construction sand and gravel showed a marginal increase in 1995. Most of the 1995 increases were due to funding for highway construction work. The major areas of concern to the industry included issues relating to wetlands classification and the classification of crystalline silica as a probable human carcinogen. Despite this, an increase in demand is anticipated for 1996.

  12. Construction aggregates

    USGS Publications Warehouse

    Nelson, T.I.; Bolen, W.P.

    2007-01-01

    Construction aggregates, primarily stone, sand and gravel, are recovered from widespread naturally occurring mineral deposits and processed for use primarily in the construction industry. They are mined, crushed, sorted by size and sold loose or combined with portland cement or asphaltic cement to make concrete products to build roads, houses, buildings, and other structures. Much smaller quantities are used in agriculture, cement manufacture, chemical and metallurgical processes, glass production and many other products.

  13. Construction aggregates

    USGS Publications Warehouse

    Tepordei, V.V.

    1993-01-01

    Part of a special section on the market performance of industrial minerals in 1992. Production of construction aggregates increased by 4.6 percent in 1992. This increase was due, in part, to the increased funding for transportation and infrastructure projects. The U.S. produced about 1.05 Gt of crushed stone and an estimated 734 Mt of construction sand and gravel in 1992. Demand is expected to increase by about 5 percent in 1993.

  14. Development and evaluation of an RN/RPN utilization toolkit.

    PubMed

    Blastorah, Margaret; Alvarado, Kim; Duhn, Lenora; Flint, Frances; McGrath, Petrina; Vandevelde-Coke, Susan

    2010-05-01

    To develop and evaluate a toolkit for Registered Nurse/Registered Practical Nurse (RN/RPN) staff mix decision-making based on the College of Nurses of Ontario's practice standard for utilization of RNs and RPNs. Descriptive exploratory. The toolkit was tested in a sample of 2,069 inpatients on 36 medical/surgical units in five academic and two community acute care hospitals in southern Ontario. Survey and focus group data were used to evaluate the toolkit's psychometric properties, feasibility of use and utility. Results support the validity and reliability of the Patient Care Needs Assessment (PCNA) tool and the consensus-based process for conducting patient care reviews. Review participants valued the consensus approach. There was limited evidence for the validity and utility of the Unit Environmental Profile (UEP) tool. Nursing unit leaders reported confidence in planning unit staff mix ratios based on information generated through application of the toolkit, specifically the PCNA, although they were less clear about how to incorporate environmental data into staff mix decisions. Results confirm that the toolkit consistently measured the constructs that it was intended to measure and was useful in informing RN/RPN staff mix decision-making. Further refinement and testing of the UEP is required. Future research is needed to evaluate the quality of decisions resulting from the application of the toolkit, illuminate processes for integrating data into decisions and adapt the toolkit for application in other sectors.

  15. Demonstration of the Health Literacy Universal Precautions Toolkit

    PubMed Central

    Mabachi, Natabhona M.; Cifuentes, Maribel; Barnard, Juliana; Brega, Angela G.; Albright, Karen; Weiss, Barry D.; Brach, Cindy; West, David

    2016-01-01

    The Agency for Healthcare Research and Quality Health Literacy Universal Precautions Toolkit was developed to help primary care practices assess and make changes to improve communication with and support for patients. Twelve diverse primary care practices implemented assigned tools over a 6-month period. Qualitative results revealed challenges practices experienced during implementation, including competing demands, bureaucratic hurdles, technological challenges, limited quality improvement experience, and limited leadership support. Practices used the Toolkit flexibly and recognized the efficiencies of implementing tools in tandem and in coordination with other quality improvement initiatives. Practices recommended reducing Toolkit density and making specific refinements. PMID:27232681

  16. X Window Application Extension With the Andrew Toolkit

    DTIC Science & Technology

    1992-09-01

    application generally communicates with the X network protocol through one or more levels of toolkits. X toolkits are pre-packaged libraries of C...Toolkits exist at three distinct levels. The most widely used low-level interface to X is the standard C language library known as Xlib. Xlib defines a set...and class procedures (Palay, 1988, p. 7). Class is a C language-based system consisting of a small run-time library and preprocessor. An Andrew Class is

  17. The Galley Parallel File System

    NASA Technical Reports Server (NTRS)

    Nieuwejaar, Nils; Kotz, David

    1996-01-01

    Most current multiprocessor file systems are designed to use multiple disks in parallel, using the high aggregate bandwidth to meet the growing I/O requirements of parallel scientific applications. Many multiprocessor file systems provide applications with a conventional Unix-like interface, allowing the application to access multiple disks transparently. This interface conceals the parallelism within the file system, increasing the ease of programmability, but making it difficult or impossible for sophisticated programmers and libraries to use knowledge about their I/O needs to exploit that parallelism. In addition to providing an insufficient interface, most current multiprocessor file systems are optimized for a different workload than they are being asked to support. We introduce Galley, a new parallel file system that is intended to efficiently support realistic scientific multiprocessor workloads. We discuss Galley's file structure and application interface, as well as the performance advantages offered by that interface.
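    The striping that underlies the aggregate-bandwidth argument above can be sketched as a round-robin distribution of fixed-size units across disks; this generic scheme is an illustration only, not Galley's actual data layout or API:

```python
def stripe(data: bytes, ndisks: int, stripe_size: int):
    """Distribute a byte string across `ndisks` buffers in round-robin
    stripe units, as a parallel file system might across disks."""
    disks = [bytearray() for _ in range(ndisks)]
    for i in range(0, len(data), stripe_size):
        disks[(i // stripe_size) % ndisks] += data[i:i + stripe_size]
    return disks

def unstripe(disks, stripe_size: int) -> bytes:
    """Reassemble the original byte order from the striped buffers."""
    chunks = []
    d, off = 0, 0
    while off < len(disks[d]):
        chunks.append(disks[d][off:off + stripe_size])
        d += 1
        if d == len(disks):          # wrapped past the last disk:
            d, off = 0, off + stripe_size  # advance to the next stripe row
    return bytes(b"".join(chunks))
```

    In a real system each buffer lives on a separate disk, so the stripe units of one request can be read or written concurrently, multiplying effective bandwidth.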

  18. Monitoring Extreme-scale Lustre Toolkit

    SciTech Connect

    Brim, Michael J; Lothian, Josh

    2015-01-01

    We discuss the design and ongoing development of the Monitoring Extreme-scale Lustre Toolkit (MELT), a unified Lustre performance monitoring and analysis infrastructure that provides continuous, low-overhead summary information on the health and performance of Lustre, as well as on-demand, in-depth problem diagnosis and root-cause analysis. The MELT infrastructure leverages a distributed overlay network to enable monitoring of center-wide Lustre filesystems where clients are located across many network domains. We preview interactive command-line utilities that help administrators and users to observe Lustre performance at various levels of resolution, from individual servers or clients to whole filesystems, including job-level reporting. Finally, we discuss our future plans for automating the root-cause analysis of common Lustre performance problems.

  19. NBII-SAIN Data Management Toolkit

    USGS Publications Warehouse

    Burley, Thomas E.; Peine, John D.

    2009-01-01

    percent of the cost of a spatial information system is associated with spatial data collection and management (U.S. General Accounting Office, 2003). These figures indicate that the resources (time, personnel, money) of many agencies and organizations could be used more efficiently and effectively. Dedicated and conscientious data management coordination and documentation is critical for reducing such redundancy. Substantial cost savings and increased efficiency are direct results of a pro-active data management approach. In addition, details of projects as well as data and information are frequently lost as a result of real-world occurrences such as the passing of time, job turnover, and equipment changes and failure. A standardized, well documented database allows resource managers to identify issues, analyze options, and ultimately make better decisions in the context of adaptive management (National Land and Water Resources Audit and the Australia New Zealand Land Information Council on behalf of the Australian National Government, 2003). Many environmentally focused, scientific, or natural resource management organizations collect and create both spatial and non-spatial data in some form. Data management appropriate for those data will be contingent upon the project goal(s) and objectives and thus will vary on a case-by-case basis. This project and the resulting Data Management Toolkit, hereafter referred to as the Toolkit, is therefore not intended to be comprehensive in terms of addressing all of the data management needs of all projects that contain biological, geospatial, and other types of data. The Toolkit emphasizes the idea of connecting a project's data and the related management needs to the defined project goals and objectives from the outset. In that context, the Toolkit presents and describes the fundamental components of sound data and information management that are common to projects involving biological, geospatial, and other related data

  20. WIST: toolkit for rapid, customized LIMS development

    PubMed Central

    Huang, Y. Wayne; Arkin, Adam P.; Chandonia, John-Marc

    2011-01-01

    Summary: Workflow Information Storage Toolkit (WIST) is a set of application programming interfaces and web applications that allow for the rapid development of customized laboratory information management systems (LIMS). WIST provides common LIMS input components, and allows them to be arranged and configured using a flexible language that specifies each component's visual and semantic characteristics. WIST includes a complete set of web applications for adding, editing and viewing data, as well as a powerful setup tool that can build new LIMS modules by analyzing existing database schema. Availability and implementation: WIST is implemented in Perl and may be obtained from http://vimss.sf.net under the BSD license. Contact: jmchandonia@lbl.gov Supplementary information: Supplementary data are available at Bioinformatics online. PMID:21258060

  1. TEVA-SPOT Toolkit 1.2

    SciTech Connect

    Berry, Jonathan; Riesen, Lee Ann; Hart, William

    2007-07-26

    The TEVA-SPOT Toolkit (SPOT) supports the design of contaminant warning systems (CWSs) that use real-time sensors to detect contaminants in municipal water distribution networks. Specifically, SPOT provides the capability to select the locations for installing sensors in order to maximize the utility and effectiveness of the CWS. SPOT models the sensor placement process as an optimization problem, and the user can specify a wide range of performance objectives for contaminant warning system design, including population health effects, time to detection, extent of contamination, volume consumed and number of failed detections. For example, a SPOT user can integrate expert knowledge during the design process by specifying required sensor placements or designating network locations as forbidden. Further, cost considerations can be integrated by limiting the design with user-specified installation costs at each location.

  2. PRALINE: a versatile multiple sequence alignment toolkit.

    PubMed

    Bawono, Punto; Heringa, Jaap

    2014-01-01

    Profile ALIgNmEnt (PRALINE) is a versatile multiple sequence alignment toolkit. In its main alignment protocol, PRALINE follows the global progressive alignment algorithm. It provides various alignment optimization strategies to address the different situations that call for protein multiple sequence alignment: global profile preprocessing, homology-extended alignment, secondary structure-guided alignment, and transmembrane aware alignment. A number of combinations of these strategies are enabled as well. PRALINE is accessible via the online server http://www.ibi.vu.nl/programs/PRALINEwww/. The server facilitates extensive visualization possibilities aiding the interpretation of alignments generated, which can be written out in pdf format for publication purposes. PRALINE also allows the sequences in the alignment to be represented in a dendrogram to show their mutual relationships according to the alignment. The chapter ends with a discussion of various issues occurring in multiple sequence alignment.
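    The pairwise core of the global progressive alignment protocol named above is Needleman-Wunsch scoring; the sketch below is a generic illustration with assumed match/mismatch/gap values, not PRALINE's actual scoring scheme or profile machinery:

```python
def global_align_score(a, b, match=1, mismatch=-1, gap=-2):
    """Needleman-Wunsch global alignment score between two sequences,
    computed with a rolling one-row dynamic programming table."""
    n, m = len(a), len(b)
    prev = [j * gap for j in range(m + 1)]   # row for empty prefix of a
    for i in range(1, n + 1):
        cur = [i * gap] + [0] * m
        for j in range(1, m + 1):
            s = match if a[i - 1] == b[j - 1] else mismatch
            cur[j] = max(prev[j - 1] + s,    # align a[i-1] with b[j-1]
                         prev[j] + gap,      # gap in b
                         cur[j - 1] + gap)   # gap in a
        prev = cur
    return prev[m]
```

    A progressive aligner applies this pairwise step along a guide tree, aligning sequences (or profiles of already-aligned groups) in order of similarity.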

  3. TEVA-SPOT Toolkit 1.2

    SciTech Connect

    Berry, Jonathan; Riesen, Lee Ann; Hart, William

    2007-07-26

    The TEVA-SPOT Toolkit (SPOT) supports the design of contaminant warning systems (CWSs) that use real-time sensors to detect contaminants in municipal water distribution networks. Specifically, SPOT provides the capability to select the locations for installing sensors in order to maximize the utility and effectiveness of the CWS. SPOT models the sensor placement process as an optimization problem, and the user can specify a wide range of performance objectives for contaminant warning system design, including population health effects, time to detection, extent of contamination, volume consumed, and number of failed detections. For example, a SPOT user can integrate expert knowledge during the design process by specifying required sensor placements or designating network locations as forbidden. Further, cost considerations can be integrated by limiting the design with user-specified installation costs at each location.
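
To illustrate the kind of optimization SPOT formulates (not SPOT's actual algorithms, objectives, or file formats; the greedy heuristic and scenario data below are hypothetical), a toy placement that minimizes mean time to detection over a set of contamination scenarios:

```python
def greedy_placement(detect_time, candidates, budget, undetected=1e6):
    """Greedily pick up to `budget` sensor locations minimizing the mean
    detection time over scenarios.

    detect_time is a list of dicts, one per contamination scenario,
    mapping location -> time at which a sensor there would first detect
    the contaminant (a missing key means that sensor never detects it,
    which is penalized with the large `undetected` time).
    """
    chosen = []

    def mean_time(sensors):
        total = 0.0
        for times in detect_time:
            total += min([times[s] for s in sensors if s in times] or [undetected])
        return total / len(detect_time)

    for _ in range(budget):
        # Pick the candidate that most improves the objective.
        best = min((loc for loc in candidates if loc not in chosen),
                   key=lambda loc: mean_time(chosen + [loc]))
        chosen.append(best)
    return chosen, mean_time(chosen)
```

Forbidden locations would simply be dropped from `candidates`, and required placements seeded into `chosen` before the loop; per-location installation costs would turn the budget into a weighted constraint.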

  4. Climate Change Toolkit-Case study: Switzerland

    NASA Astrophysics Data System (ADS)

    Ashraf Vaghefi, Saeid

    2017-04-01

    This paper describes the development of a Climate Change Toolkit (CCT) to rapidly perform the tasks needed in a climate change study. CCT consists of five modules: data extraction, global climate data management, bias correction, spatial interpolation, and a critical consecutive day analyzer for calculating extreme events. CCT is linked to a large archive of daily global data: historical records (CRU, 1970-2005) and GCM projections (1960-2099) from 5 models and 4 carbon scenarios. Application of CCT in Switzerland using ensemble results of scenario RCP8.5 showed an increase in maximum temperature and wide variation in precipitation. The frequency of dry periods will likely increase, and the frequency of wet periods suggests a higher risk of flooding in the country.
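
The critical consecutive day analyzer mentioned above can be sketched in a few lines (the function name and dry-day threshold here are illustrative, not CCT's implementation):

```python
def longest_dry_spell(daily_precip_mm, threshold=1.0):
    """Length of the longest run of consecutive days whose precipitation
    falls below `threshold` (mm/day)."""
    longest = current = 0
    for p in daily_precip_mm:
        if p < threshold:
            current += 1                     # extend the current dry run
            longest = max(longest, current)
        else:
            current = 0                      # a wet day breaks the run
    return longest
```

Applied to a bias-corrected daily series per scenario, this yields the extreme-event statistics (dry and, with the comparison inverted, wet spells) the toolkit reports.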

  5. Data Exploration Toolkit for serial diffraction experiments

    DOE PAGES

    Zeldin, Oliver B.; Brewster, Aaron S.; Hattne, Johan; ...

    2015-01-23

    Ultrafast diffraction at X-ray free-electron lasers (XFELs) has the potential to yield new insights into important biological systems that produce radiation-sensitive crystals. An unavoidable feature of the 'diffraction before destruction' nature of these experiments is that images are obtained from many distinct crystals and/or different regions of the same crystal. Combined with other sources of XFEL shot-to-shot variation, this introduces significant heterogeneity into the diffraction data, complicating processing and interpretation. To enable researchers to get the most from their collected data, a toolkit is presented that provides insights into the quality of, and the variation present in, serial crystallography data sets. These tools operate on the unmerged, partial intensity integration results from many individual crystals, and can be used on two levels: firstly to guide the experimental strategy during data collection, and secondly to help users make informed choices during data processing.

  6. Toolkit for evaluating impacts of public participation in scientific research

    NASA Astrophysics Data System (ADS)

    Bonney, R.; Phillips, T.

    2011-12-01

    The Toolkit for Evaluating Impacts of Public Participation in Scientific Research is being developed to meet a major need in the field of visitor studies: to provide project developers and other professionals, especially those with limited knowledge of evaluation techniques, with a systematic method for assessing project impact that facilitates longitudinal and cross-project comparisons. The need for the toolkit was first identified at the Citizen Science workshop held at the Cornell Lab of Ornithology in 2007 (McEver et al. 2007) and reaffirmed by a CAISE inquiry group that produced the recent report "Public Participation in Scientific Research: Defining the Field and Assessing its Potential for Informal Science Education" (Bonney et al. 2009). This presentation will introduce the Toolkit, show how it is intended to be used, and describe ways that project directors can combine their programmatic goals with toolkit materials to outline a plan for evaluating the impacts of their projects.

  7. Food: Too Good to Waste Implementation Guide and Toolkit

    EPA Pesticide Factsheets

    The Food: Too Good to Waste (FTGTW) Implementation Guide and Toolkit is designed for community organizations, local governments, households and others interested in reducing wasteful household food management practices.

  8. Resource Toolkit for Working with Education Service Providers

    ERIC Educational Resources Information Center

    National Association of Charter School Authorizers (NJ1), 2005

    2005-01-01

    This resource toolkit for working with education service providers contains four sections. Section 1, "Roles, Responsibilities, and Relationships," contains: (1) "Purchasing Services from an Educational Management Organization," excerpted from "The Charter School Administrative and Governance Guide" (Massachusetts Dept. of…

  9. The Radar Software Toolkit: Analysis software for the ITM community

    NASA Astrophysics Data System (ADS)

    Barnes, R. J.; Greenwald, R.

    2005-05-01

    The Radar Software Toolkit is a collection of data analysis, modelling and visualization tools originally developed for the SuperDARN project. It has evolved over the years into a robust, multi-platform software toolkit for working with a variety of ITM data sets including data from the Polar, TIMED and ACE spacecraft, ground based magnetometers, Incoherent Scatter Radars, and SuperDARN. The toolkit includes implementations of the Altitude Adjusted Corrected Geomagnetic coordinate system (AACGM), the International Geomagnetic Reference Field (IGRF), SGP4 and a set of coordinate transform functions. It also includes a sophisticated XML based data visualization system. The toolkit is written using a combination of ANSI C, Java and the Interactive Data Language (IDL) and has been tested on a variety of platforms.

  10. General relativistic magneto-hydrodynamics with the Einstein Toolkit

    NASA Astrophysics Data System (ADS)

    Moesta, Philipp; Mundim, Bruno; Faber, Joshua; Noble, Scott; Bode, Tanja; Haas, Roland; Loeffler, Frank; Ott, Christian; Reisswig, Christian; Schnetter, Erik

    2013-04-01

    The Einstein Toolkit Consortium is developing and supporting open software for relativistic astrophysics. Its aim is to provide the core computational tools that can enable new science, broaden our community, facilitate interdisciplinary research and take advantage of petascale computers and advanced cyberinfrastructure. The Einstein Toolkit currently consists of an open set of over 100 modules for the Cactus framework, primarily for computational relativity, along with associated tools for simulation management and visualization. The toolkit includes solvers for vacuum spacetimes as well as relativistic magneto-hydrodynamics. This talk will present the current capabilities of the Einstein Toolkit, with a particular focus on recent improvements to the general relativistic magneto-hydrodynamics modeling, and will point to information on how to leverage it for future research.

  11. Charon Message-Passing Toolkit for Scientific Computations

    NASA Technical Reports Server (NTRS)

    VanderWijngaart, Rob F.; Saini, Subhash (Technical Monitor)

    1998-01-01

    The Charon toolkit for piecemeal development of high-efficiency parallel programs for scientific computing is described. The portable toolkit, callable from C and Fortran, provides flexible domain decompositions and high-level distributed constructs for easy translation of serial legacy code or design to distributed environments. Gradual tuning can subsequently be applied to obtain high performance, possibly by using explicit message passing. Charon also features general structured communications that support stencil-based computations with complex recurrences. Through the separation of partitioning and distribution, the toolkit can also be used for blocking of uni-processor code, and for debugging of parallel algorithms on serial machines. An elaborate review of recent parallelization aids is presented to highlight the need for a toolkit like Charon. Some performance results of parallelizing the NAS Parallel Benchmark SP program using Charon are given, showing good scalability.
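
The separation of partitioning and distribution that Charon emphasizes starts from a domain decomposition. A minimal sketch of contiguous 1D block partitioning (illustrative only, not Charon's API; the real toolkit handles multi-dimensional decompositions and communication):

```python
def block_decompose(n_cells, n_ranks):
    """Split n_cells into n_ranks contiguous blocks whose sizes differ by
    at most one; returns a list of half-open (start, stop) ranges."""
    base, extra = divmod(n_cells, n_ranks)
    blocks, start = [], 0
    for rank in range(n_ranks):
        # The first `extra` ranks absorb one leftover cell each.
        size = base + (1 if rank < extra else 0)
        blocks.append((start, start + size))
        start += size
    return blocks
```

The same partitioning can drive either distribution across processes or cache blocking of uni-processor code, which is the separation the abstract refers to.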

  13. CHASM and SNVBox: toolkit for detecting biologically important single nucleotide mutations in cancer.

    PubMed

    Wong, Wing Chung; Kim, Dewey; Carter, Hannah; Diekhans, Mark; Ryan, Michael C; Karchin, Rachel

    2011-08-01

    Thousands of cancer exomes are currently being sequenced, yielding millions of non-synonymous single nucleotide variants (SNVs) of possible relevance to disease etiology. Here, we provide a software toolkit to prioritize SNVs based on their predicted contribution to tumorigenesis. It includes a database of precomputed, predictive features covering all positions in the annotated human exome and can be used either stand-alone or as part of a larger variant discovery pipeline. The MySQL database, source code, and binaries are freely available for academic/government use at http://wiki.chasmsoftware.org; the source is in Python and C++. Requirements: a 32- or 64-bit Linux system (tested on Fedora Core 8, 10, and 11 and Ubuntu 10), Python ≥2.5 and <3.0, MySQL server >5.0, 60 GB of available hard disk space (50 MB for software and data files, 40 GB for the MySQL database dump when uncompressed), and 2 GB of RAM.

  14. Quick Way to Port Existing C/C++ Chemoinformatics Toolkits to the Web Using Emscripten.

    PubMed

    Jiang, Chen; Jin, Xi

    2017-09-20

    Emscripten is a special open source compiler that compiles C and C++ code into JavaScript. By utilizing this compiler, some typical C/C++ chemoinformatics toolkits and libraries are quickly ported to the web. The compiled JavaScript files have sizes similar to the native programs, and in a series of constructed benchmarks, the performance of the compiled JavaScript code is close to that of the native code and better than that of handwritten JavaScript. Therefore, we believe that Emscripten is a feasible and practical tool for reusing existing C/C++ code on the web, and many other chemoinformatics or molecular calculation software tools can also be easily ported with Emscripten.

  15. The Trick Simulation Toolkit: A NASA/Opensource Framework for Running Time Based Physics Models

    NASA Technical Reports Server (NTRS)

    Penn, John M.

    2016-01-01

    The Trick Simulation Toolkit is a simulation development environment used to create high fidelity training and engineering simulations at the NASA Johnson Space Center and many other NASA facilities. Its purpose is to generate a simulation executable from a collection of user-supplied models and a simulation definition file. For each Trick-based simulation, Trick automatically provides job scheduling, numerical integration, the ability to write and restore human readable checkpoints, data recording, interactive variable manipulation, a run-time interpreter, and many other commonly needed capabilities. This allows simulation developers to concentrate on their domain expertise and the algorithms and equations of their models. Also included in Trick are tools for plotting recorded data and various other supporting utilities and libraries. Trick is written in C/C++ and Java and supports both Linux and MacOSX computer operating systems. This paper describes Trick's design and use at NASA Johnson Space Center.
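
Trick's combination of job scheduling and numerical integration can be caricatured in a few lines (a toy fixed-step scheduler with hypothetical job and state names, not Trick's actual design or API):

```python
def run_simulation(jobs, dt, t_end):
    """Toy fixed-step scheduler: call each (period, fn) job at its own
    rate while advancing simulation time in steps of dt."""
    steps = round(t_end / dt)
    for n in range(steps):
        t = n * dt
        for period, fn in jobs:
            if n % round(period / dt) == 0:  # job is due this step
                fn(t)
    return steps * dt

# A falling body integrated with explicit Euler as a scheduled job.
state = {"v": 0.0, "h": 100.0}

def integrate(t, g=9.81, dt=0.1):
    state["v"] -= g * dt            # dv/dt = -g
    state["h"] += state["v"] * dt   # dh/dt = v

run_simulation([(0.1, integrate)], dt=0.1, t_end=1.0)  # 10 integration steps
```

In Trick, the model author supplies only the `integrate`-like jobs and a simulation definition file; scheduling, integration infrastructure, checkpointing, and data recording come from the framework.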

  16. Multi-Physics Demonstration Problem with the SHARP Reactor Simulation Toolkit

    SciTech Connect

    Merzari, E.; Shemon, E. R.; Yu, Y. Q.; Thomas, J. W.; Obabko, A.; Jain, Rajeev; Mahadevan, Vijay; Tautges, Timothy; Solberg, Jerome; Ferencz, Robert Mark; Whitesides, R.

    2015-12-21

    This report describes the use of SHARP to perform a first-of-a-kind analysis of the core radial expansion phenomenon in an SFR. This effort required significant advances in the framework used to drive the coupled simulations, manipulate the mesh in response to the deformation of the geometry, and generate the necessary modified mesh files. Furthermore, the model geometry is fairly complex, and consistent mesh generation for the three physics modules required significant effort. Fully-integrated simulations of a 7-assembly mini-core test problem have been performed, and the results are presented here. Physics models of a full-core model of the Advanced Burner Test Reactor (ABTR) have also been developed for each of the three physics modules. Standalone results of each of the three physics modules for the ABTR are presented here, which demonstrates the feasibility of the fully-integrated simulation.

  17. Bio.Phylo: A unified toolkit for processing, analyzing and visualizing phylogenetic trees in Biopython

    PubMed Central

    2012-01-01

    Background Ongoing innovation in phylogenetics and evolutionary biology has been accompanied by a proliferation of software tools, data formats, analytical techniques and web servers. This brings with it the challenge of integrating phylogenetic and other related biological data found in a wide variety of formats, and underlines the need for reusable software that can read, manipulate and transform this information into the various forms required to build computational pipelines. Results We built a Python software library for working with phylogenetic data that is tightly integrated with Biopython, a broad-ranging toolkit for computational biology. Our library, Bio.Phylo, is highly interoperable with existing libraries, tools and standards, and is capable of parsing common file formats for phylogenetic trees, performing basic transformations and manipulations, attaching rich annotations, and visualizing trees. We unified the modules for working with the standard file formats Newick, NEXUS and phyloXML behind a consistent and simple API, providing a common set of functionality independent of the data source. Conclusions Bio.Phylo meets a growing need in bioinformatics for working with heterogeneous types of phylogenetic data. By supporting interoperability with multiple file formats and leveraging existing Biopython features, this library simplifies the construction of phylogenetic workflows. We also provide examples of the benefits of building a community around a shared open-source project. Bio.Phylo is included with Biopython, available through the Biopython website, http://biopython.org. PMID:22909249
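
To show the kind of data Bio.Phylo parses, here is a toy recursive-descent parser for a heavily simplified Newick dialect (labels only, no branch lengths or quoting; for real work one would use Bio.Phylo itself rather than this sketch):

```python
def parse_newick(text):
    """Parse a simplified Newick string into nested (children, name) tuples."""
    pos = 0

    def node():
        nonlocal pos
        children = []
        if text[pos] == '(':
            pos += 1                      # consume '('
            children.append(node())
            while text[pos] == ',':
                pos += 1                  # consume ',' between siblings
                children.append(node())
            pos += 1                      # consume ')'
        # Read the optional node label that follows.
        start = pos
        while pos < len(text) and text[pos] not in '(),;':
            pos += 1
        return (children, text[start:pos])

    return node()
```

Bio.Phylo exposes the same information through a proper object model (clades with names, branch lengths, and annotations) and reads NEXUS and phyloXML through the same API.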

  18. The MOLGENIS toolkit: rapid prototyping of biosoftware at the push of a button.

    PubMed

    Swertz, Morris A; Dijkstra, Martijn; Adamusiak, Tomasz; van der Velde, Joeri K; Kanterakis, Alexandros; Roos, Erik T; Lops, Joris; Thorisson, Gudmundur A; Arends, Danny; Byelas, George; Muilu, Juha; Brookes, Anthony J; de Brock, Engbert O; Jansen, Ritsert C; Parkinson, Helen

    2010-12-21

    There is a huge demand on bioinformaticians to provide their biologists with user friendly and scalable software infrastructures to capture, exchange, and exploit the unprecedented amounts of new *omics data. We here present MOLGENIS, a generic, open source, software toolkit to quickly produce the bespoke MOLecular GENetics Information Systems needed. The MOLGENIS toolkit provides bioinformaticians with a simple language to model biological data structures and user interfaces. At the push of a button, MOLGENIS' generator suite automatically translates these models into a feature-rich, ready-to-use web application including database, user interfaces, exchange formats, and scriptable interfaces. Each generator is a template of SQL, JAVA, R, or HTML code that would require much effort to write by hand. This 'model-driven' method ensures reuse of best practices and improves quality because the modeling language and generators are shared between all MOLGENIS applications, so that errors are found quickly and improvements are shared easily by a re-generation. A plug-in mechanism ensures that both the generator suite and generated product can be customized just as much as hand-written software. In recent years we have successfully evaluated the MOLGENIS toolkit for the rapid prototyping of many types of biomedical applications, including next-generation sequencing, GWAS, QTL, proteomics and biobanking. Writing 500 lines of model XML typically replaces 15,000 lines of hand-written programming code, which allows for quick adaptation if the information system is not yet to the biologist's satisfaction. Each application generated with MOLGENIS comes with an optimized database back-end, user interfaces for biologists to manage and exploit their data, programming interfaces for bioinformaticians to script analysis tools in R, Java, SOAP, REST/JSON and RDF, a tab-delimited file format to ease upload and exchange of data, and detailed technical documentation. 

  19. The MOLGENIS toolkit: rapid prototyping of biosoftware at the push of a button

    PubMed Central

    2010-01-01

    Background There is a huge demand on bioinformaticians to provide their biologists with user friendly and scalable software infrastructures to capture, exchange, and exploit the unprecedented amounts of new *omics data. We here present MOLGENIS, a generic, open source, software toolkit to quickly produce the bespoke MOLecular GENetics Information Systems needed. Methods The MOLGENIS toolkit provides bioinformaticians with a simple language to model biological data structures and user interfaces. At the push of a button, MOLGENIS’ generator suite automatically translates these models into a feature-rich, ready-to-use web application including database, user interfaces, exchange formats, and scriptable interfaces. Each generator is a template of SQL, JAVA, R, or HTML code that would require much effort to write by hand. This ‘model-driven’ method ensures reuse of best practices and improves quality because the modeling language and generators are shared between all MOLGENIS applications, so that errors are found quickly and improvements are shared easily by a re-generation. A plug-in mechanism ensures that both the generator suite and generated product can be customized just as much as hand-written software. Results In recent years we have successfully evaluated the MOLGENIS toolkit for the rapid prototyping of many types of biomedical applications, including next-generation sequencing, GWAS, QTL, proteomics and biobanking. Writing 500 lines of model XML typically replaces 15,000 lines of hand-written programming code, which allows for quick adaptation if the information system is not yet to the biologist’s satisfaction. Each application generated with MOLGENIS comes with an optimized database back-end, user interfaces for biologists to manage and exploit their data, programming interfaces for bioinformaticians to script analysis tools in R, Java, SOAP, REST/JSON and RDF, a tab-delimited file format to ease upload and exchange of data, and detailed technical documentation.

  20. 12 CFR 1402.27 - Aggregating requests.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Information § 1402.27 Aggregating requests. A requester may not file multiple requests at the same time, each... in concert, is attempting to break a request down into a series of requests for the purpose of... reasonable is the time period over which the requests have occurred. ...

  1. Behavioral Genetic Toolkits: Toward the Evolutionary Origins of Complex Phenotypes.

    PubMed

    Rittschof, C C; Robinson, G E

    2016-01-01

    The discovery of toolkit genes, which are highly conserved genes that consistently regulate the development of similar morphological phenotypes across diverse species, is one of the most well-known observations in the field of evolutionary developmental biology. Surprisingly, this phenomenon is also relevant for a wide array of behavioral phenotypes, despite the fact that these phenotypes are highly complex and regulated by many genes operating in diverse tissues. In this chapter, we review the use of the toolkit concept in the context of behavior, noting the challenges of comparing behaviors and genes across diverse species, but emphasizing the successes in identifying genetic toolkits for behavior; these successes are largely attributable to the creative research approaches fueled by advances in behavioral genomics. We have two general goals: (1) to acknowledge the groundbreaking progress in this field, which offers new approaches to the difficult but exciting challenge of understanding the evolutionary genetic basis of behaviors, some of the most complex phenotypes known, and (2) to provide a theoretical framework that encompasses the scope of behavioral genetic toolkit studies in order to clearly articulate the research questions relevant to the toolkit concept. We emphasize areas for growth and highlight the emerging approaches that are being used to drive the field forward. Behavioral genetic toolkit research has elevated the use of integrative and comparative approaches in the study of behavior, with potentially broad implications for evolutionary biologists and behavioral ecologists alike.

  2. E-ELT modeling and simulation toolkits: philosophy and progress status

    NASA Astrophysics Data System (ADS)

    Sedghi, B.; Muller, M.; Bonnet, H.; Esselborn, M.; Le Louarn, M.; Clare, R.; Koch, F.

    2011-09-01

    To predict the performance of the E-ELT, three sets of toolkits have been developed at ESO: i) the main structure and associated optical unit dynamics and feedback control toolkit; ii) the active optics and phasing toolkit; and iii) the adaptive optics simulation toolkit. There was a deliberate policy not to integrate all of the systems into one massive model and tool. Instead, the differences in dynamical and control time scales are used to separate the simulation environments and tools. Each toolkit therefore contains an appropriate level of detail of the problem and holds sufficient overlap with the others to ensure the consistency of the results. In this paper, these toolkits are presented together with some examples.

  3. IBiSA_Tools: A Computational Toolkit for Ion-Binding State Analysis in Molecular Dynamics Trajectories of Ion Channels.

    PubMed

    Kasahara, Kota; Kinoshita, Kengo

    2016-01-01

    Ion conduction mechanisms of ion channels are a long-standing conundrum. Although the molecular dynamics (MD) method has been extensively used to simulate ion conduction dynamics at the atomic level, analysis and interpretation of MD results are not straightforward due to complexity of the dynamics. In our previous reports, we proposed an analytical method called ion-binding state analysis to scrutinize and summarize ion conduction mechanisms by taking advantage of a variety of analytical protocols, e.g., the complex network analysis, sequence alignment, and hierarchical clustering. This approach effectively revealed the ion conduction mechanisms and their dependence on the conditions, i.e., ion concentration and membrane voltage. Here, we present an easy-to-use computational toolkit for ion-binding state analysis, called IBiSA_tools. This toolkit consists of a C++ program and a series of Python and R scripts. From the trajectory file of MD simulations and a structure file, users can generate several images and statistics of ion conduction processes. A complex network named ion-binding state graph is generated in a standard graph format (graph modeling language; GML), which can be visualized by standard network analyzers such as Cytoscape. As a tutorial, a trajectory of a 50 ns MD simulation of the Kv1.2 channel is also distributed with the toolkit. Users can trace the entire process of ion-binding state analysis step by step. The novel method for analysis of ion conduction mechanisms of ion channels can be easily used by means of IBiSA_tools. This software is distributed under an open source license at the following URL: http://www.ritsumei.ac.jp/~ktkshr/ibisa_tools/.
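
The ion-binding state graph mentioned above is emitted in GML, a simple nested-text graph format. A minimal sketch of such a serializer (a hypothetical helper, not part of IBiSA_tools; state labels are made up):

```python
def write_gml(nodes, edges):
    """Serialize a graph to minimal GML (graph modeling language) text.

    nodes: list of string labels; edges: list of (label, label) pairs.
    """
    index = {label: i for i, label in enumerate(nodes)}
    lines = ["graph ["]
    for label, i in index.items():
        lines += ["  node [", f"    id {i}", f'    label "{label}"', "  ]"]
    for a, b in edges:
        lines += ["  edge [", f"    source {index[a]}", f"    target {index[b]}", "  ]"]
    lines.append("]")
    return "\n".join(lines)
```

Text in this shape is what standard network analyzers such as Cytoscape can load for visualization.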

  4. IBiSA_Tools: A Computational Toolkit for Ion-Binding State Analysis in Molecular Dynamics Trajectories of Ion Channels

    PubMed Central

    Kasahara, Kota; Kinoshita, Kengo

    2016-01-01

    Ion conduction mechanisms of ion channels are a long-standing conundrum. Although the molecular dynamics (MD) method has been extensively used to simulate ion conduction dynamics at the atomic level, analysis and interpretation of MD results are not straightforward due to complexity of the dynamics. In our previous reports, we proposed an analytical method called ion-binding state analysis to scrutinize and summarize ion conduction mechanisms by taking advantage of a variety of analytical protocols, e.g., the complex network analysis, sequence alignment, and hierarchical clustering. This approach effectively revealed the ion conduction mechanisms and their dependence on the conditions, i.e., ion concentration and membrane voltage. Here, we present an easy-to-use computational toolkit for ion-binding state analysis, called IBiSA_tools. This toolkit consists of a C++ program and a series of Python and R scripts. From the trajectory file of MD simulations and a structure file, users can generate several images and statistics of ion conduction processes. A complex network named ion-binding state graph is generated in a standard graph format (graph modeling language; GML), which can be visualized by standard network analyzers such as Cytoscape. As a tutorial, a trajectory of a 50 ns MD simulation of the Kv1.2 channel is also distributed with the toolkit. Users can trace the entire process of ion-binding state analysis step by step. The novel method for analysis of ion conduction mechanisms of ion channels can be easily used by means of IBiSA_tools. This software is distributed under an open source license at the following URL: http://www.ritsumei.ac.jp/~ktkshr/ibisa_tools/ PMID:27907142

  5. The NOAA Weather and Climate Toolkit

    NASA Astrophysics Data System (ADS)

    Ansari, S.; Hutchins, C.; Del Greco, S.

    2008-12-01

    The NOAA Weather and Climate Toolkit (WCT) is an application that provides simple visualization and data export of weather and climate data archived at the National Climatic Data Center (NCDC) and other organizations. The WCT is built on the Unidata Common Data Model and supports defined feature types such as Grid, Radial, Point, Time Series and Trajectory. Current NCDC datasets supported include NEXRAD Radar data, GOES Satellite imagery, NOMADS Model Data, Integrated Surface Data and the U.S. Drought Monitor (part of the National Integrated Drought Information System (NIDIS)). The WCT Viewer provides tools for displaying custom data overlays, Web Map Services (WMS), animations and basic filters. The export of images and movies is provided in multiple formats. The WCT Data Exporter allows for data export in both vector polygon (Shapefile, Well-Known Text) and raster (GeoTIFF, Arc/Info ASCII Grid, VTK, NetCDF) formats. By decoding and exporting data into multiple common formats, a diverse user community can perform analysis using familiar tools such as ArcGIS, MATLAB and IDL. This brings new users to a vast array of weather and climate data at NCDC.
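
Of the raster formats the WCT Data Exporter supports, Arc/Info ASCII Grid is simple enough to sketch directly: a six-line header followed by space-separated rows. A generic writer (not WCT code; default georeferencing values are placeholders):

```python
def to_ascii_grid(grid, xll=0.0, yll=0.0, cellsize=1.0, nodata=-9999):
    """Render a 2D list (row 0 = northernmost) as Arc/Info ASCII Grid text."""
    header = [
        f"ncols {len(grid[0])}",
        f"nrows {len(grid)}",
        f"xllcorner {xll}",        # x of the lower-left corner
        f"yllcorner {yll}",        # y of the lower-left corner
        f"cellsize {cellsize}",
        f"NODATA_value {nodata}",
    ]
    rows = [" ".join(str(v) for v in row) for row in grid]
    return "\n".join(header + rows) + "\n"
```

Because the format is plain text with an explicit georeferencing header, GIS tools such as ArcGIS can ingest it without any binary decoding.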

  6. The microRNA toolkit of insects

    PubMed Central

    Ylla, Guillem; Fromm, Bastian; Piulachs, Maria-Dolors; Belles, Xavier

    2016-01-01

    Is there a correlation between miRNA diversity and levels of organismic complexity? Exhibiting extraordinary levels of morphological and developmental complexity, insects are the most diverse animal class on earth. Their evolutionary success was in particular shaped by the innovation of holometabolan metamorphosis in endopterygotes. Previously, miRNA evolution had been linked to morphological complexity, but astonishing variation in the currently available miRNA complements of insects made this link unclear. To address this issue, we sequenced the miRNA complement of the hemimetabolan Blattella germanica and reannotated that of two other hemimetabolan species, Locusta migratoria and Acyrthosiphon pisum, and of four holometabolan species, Apis mellifera, Tribolium castaneum, Bombyx mori and Drosophila melanogaster. Our analyses show that the variation of insect miRNAs is an artefact mainly resulting from poor sampling and inaccurate miRNA annotation, and that insects share a conserved microRNA toolkit of 65 families exhibiting very low variation. For example, the evolutionary shift toward a complete metamorphosis was accompanied only by the acquisition of three and the loss of one miRNA families. PMID:27883064

  7. UQ Toolkit v 2.0

    SciTech Connect

    2013-10-03

    The Uncertainty Quantification (UQ) Toolkit is a software library for the characterization and propagation of uncertainties in computational models. For the characterization of uncertainties, Bayesian inference tools are provided to infer uncertain model parameters, as well as Bayesian compressive sensing methods for discovering sparse representations of high-dimensional input-output response surfaces, and also Karhunen-Loève expansions for representing stochastic processes. Uncertain parameters are treated as random variables and represented with Polynomial Chaos expansions (PCEs). The library implements several spectral basis function types (e.g. Hermite basis functions in terms of Gaussian random variables or Legendre basis functions in terms of uniform random variables) that can be used to represent random variables with PCEs. For propagation of uncertainty, tools are provided to propagate PCEs that describe the input uncertainty through the computational model using either intrusive methods (Galerkin projection of equations onto basis functions) or non-intrusive methods (perform deterministic operation at sampled values of the random values and project the obtained results onto basis functions).
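
The non-intrusive projection described above can be shown concretely for a single Gaussian input using probabilists' Hermite basis functions and 3-point Gauss-Hermite quadrature (a generic textbook sketch, not UQTk's API):

```python
import math

# Probabilists' Hermite polynomials He_0, He_1, He_2.
HERMITE = [lambda x: 1.0, lambda x: x, lambda x: x * x - 1.0]

# 3-point Gauss-Hermite quadrature for the standard normal weight
# (exact for polynomial integrands up to degree 5).
NODES = [-math.sqrt(3.0), 0.0, math.sqrt(3.0)]
WEIGHTS = [1.0 / 6.0, 2.0 / 3.0, 1.0 / 6.0]

def pce_coefficients(f, order=2):
    """Non-intrusive projection: c_k = E[f(x) He_k(x)] / E[He_k^2],
    where E[He_k^2] = k! for the probabilists' Hermite basis."""
    coeffs = []
    for k in range(order + 1):
        num = sum(w * f(x) * HERMITE[k](x) for x, w in zip(NODES, WEIGHTS))
        coeffs.append(num / math.factorial(k))
    return coeffs

# f(x) = x^2 has the exact expansion 1*He_0 + 0*He_1 + 1*He_2.
c = pce_coefficients(lambda x: x * x)
```

In practice `f` is the (expensive) computational model evaluated at the quadrature nodes, and the resulting coefficients characterize the output's mean, variance, and higher moments.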

  8. UQ Toolkit v. 3.0

    SciTech Connect

    Sargsyan, Khachik; Safta, Cosmin; Chowdhary, Kenny; de Bord, Sarah; Debusschere, Bert

    2016-09-14

    The Uncertainty Quantification (UQ) Toolkit is a software library for the characterization and propagation of uncertainties in computational models. This library provides Bayesian inference tools to infer uncertain model parameters, Bayesian compressive sensing methods for discovering sparse representations of high-dimensional input-output response surfaces, methods for constructing Karhunen-Loève representations of stochastic processes, and global sensitivity analysis tools used to compute Sobol indices in order to characterize the importance of uncertain inputs or the interactions between them. The basis for many of these methods relies on representing random variables with Polynomial Chaos expansions (PCEs). This library implements several spectral basis function types (e.g. Hermite basis functions in terms of Gaussian random variables or Legendre basis functions in terms of uniform random variables) that can be used to represent random variables with PCEs. For the propagation of uncertainty, tools are provided to propagate PCEs that describe the input uncertainty through the computational model using either intrusive methods (Galerkin projection of equations onto basis functions) or non-intrusive methods (perform deterministic operation at sampled values of the random values and project the obtained results onto basis functions).

  9. Modelling toolkit for simulation of maglev devices

    NASA Astrophysics Data System (ADS)

    Peña-Roche, J.; Badía-Majós, A.

    2017-01-01

    A stand-alone App has been developed, focused on obtaining information about relevant engineering properties of magnetic levitation systems. Our modelling toolkit provides real time simulations of 2D magneto-mechanical quantities for superconductor (SC)/permanent magnet structures. The source code is open and may be customised for a variety of configurations. Ultimately, it relies on the variational statement of the critical state model for the superconducting component and has been verified against experimental data for YBaCuO/NdFeB assemblies. On a quantitative basis, the values of the arising forces, induced superconducting currents, as well as a plot of the magnetic field lines are displayed upon selection of an arbitrary trajectory of the magnet in the vicinity of the SC. The stability issues related to the cooling process, as well as the maximum attainable forces for a given material and geometry, are immediately observed. Due to the complexity of the problem, a strategy based on cluster computing, database compression, and real-time post-processing on the device has been implemented.

  10. VaST: Variability Search Toolkit

    NASA Astrophysics Data System (ADS)

    Sokolovsky, Kirill V.; Lebedev, Alexandr A.

    2017-04-01

    VaST (Variability Search Toolkit) finds variable objects on a series of astronomical images in FITS format. The software performs object detection and aperture photometry using SExtractor (ascl:1010.064) on each image, cross-matches lists of detected stars, performs magnitude calibration with respect to the first (reference) image and constructs a lightcurve for each object. The sigma-magnitude, Stetson's L variability index, Robust Median Statistic (RoMS) and other plots may be used to visually identify variable star candidates. The two distinguishing features of VaST are its ability to perform accurate aperture photometry of images obtained with non-linear detectors and to handle complex image distortions. VaST can be used in cases of unstable PSF (e.g., bad guiding or with digitized wide-field photographic images), and has been successfully applied to images obtained with telescopes ranging from 0.08 to 2.5m in diameter equipped with a variety of detectors including CCD, CMOS, MIC and photographic plates.
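
    The lightcurve-construction step above hinges on cross-matching star lists between images by position. A simplified positional cross-match can be sketched as follows (an illustration in Python, not VaST's own C implementation; the tolerance and interface are invented for the sketch):

```python
def cross_match(ref, obs, tol=1.0):
    """Match each reference star to its nearest detection within `tol` pixels.

    ref, obs: lists of (x, y) positions from two images.
    Returns {ref_index: obs_index}; unmatched reference stars are omitted.
    Chaining such matches across images yields one lightcurve per object.
    """
    matches = {}
    for i, (xr, yr) in enumerate(ref):
        best, best_d2 = None, tol * tol
        for j, (xo, yo) in enumerate(obs):
            d2 = (xr - xo) ** 2 + (yr - yo) ** 2
            if d2 <= best_d2:
                best, best_d2 = j, d2
        if best is not None:
            matches[i] = best
    return matches

# The first star matches within tolerance; the second has no counterpart.
m = cross_match([(10.0, 10.0), (50.0, 50.0)], [(10.3, 9.9), (80.0, 80.0)])
```

    A production matcher would use a spatial index (k-d tree) rather than this O(n·m) loop, and would first solve for the coordinate transformation between frames.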

  11. MAGNET TOOLKIT: DESIGN, IMPLEMENTATION, AND EVALUATION.

    SciTech Connect

    Hay, J. R.; Feng, W. C.; Gardner, M. K.

    2001-01-01

    The current trend in constructing high-performance computing systems is to connect a large number of machines via a fast interconnect or a large-scale network such as the Internet. This approach relies on the performance of the interconnect (or Internet) to enable fast, large-scale distributed computing. A detailed understanding of the communication traffic is required in order to optimize the operation of the entire system. Network researchers traditionally monitor traffic in the network to gain the insight necessary to optimize network operations. Recent work suggests additional insight can be obtained by also monitoring traffic at the application level. The Monitor for Application-Generated Network Traffic toolkit (MAGNeT) we describe here monitors application traffic patterns in production systems, thus enabling more highly optimized networks and interconnects for the next generation of high-performance computing systems. Keywords: monitor, measurement, network protocol, traffic characterization, TCP, MAGNeT, traces, application-generated traffic, virtual supercomputing, network-aware applications, computational grids, high-performance computing.

  12. Security Assessment Simulation Toolkit (SAST) Final Report

    SciTech Connect

    Meitzler, Wayne D.; Ouderkirk, Steven J.; Hughes, Chad O.

    2009-11-15

    The Department of Defense Technical Support Working Group (DoD TSWG) investment in the Pacific Northwest National Laboratory (PNNL) Security Assessment Simulation Toolkit (SAST) research planted a technology seed that germinated into a suite of follow-on Research and Development (R&D) projects culminating in software that is used by multiple DoD organizations. The DoD TSWG technology transfer goal for SAST is already in progress. The Defense Information Systems Agency (DISA), the Defense-wide Information Assurance Program (DIAP), the Marine Corps, Office Of Naval Research (ONR) National Center For Advanced Secure Systems Research (NCASSR) and Office Of Secretary Of Defense International Exercise Program (OSD NII) are currently investing to take SAST to the next level. PNNL currently distributes the software to over 6 government organizations and 30 DoD users. For the past five DoD wide Bulwark Defender exercises, the adoption of this new technology created an expanding role for SAST. In 2009, SAST was also used in the OSD NII International Exercise and is currently scheduled for use in 2010.

  13. Asteroids Outreach Toolkit Development: Using Iterative Feedback In Informal Education

    NASA Astrophysics Data System (ADS)

    White, Vivian; Berendsen, M.; Gurton, S.; Dusenbery, P. B.

    2011-01-01

    The Night Sky Network is a collaboration of close to 350 astronomy clubs across the US that actively engage in public outreach within their communities. Since 2004, the Astronomical Society of the Pacific has been creating outreach ToolKits filled with carefully crafted sets of physical materials designed to help these volunteer clubs explain the wonders of the night sky to the public. The effectiveness of the ToolKit activities and demonstrations is the direct result of a thorough testing and vetting process. Find out how this iterative assessment process can help other programs create useful tools for both formal and informal educators. The current Space Rocks Outreach ToolKit focuses on explaining asteroids, comets, and meteorites to the general public using quick, big-picture activities that get audiences involved. Eight previous ToolKits cover a wide range of topics from the Moon to black holes. In each case, amateur astronomers and the public helped direct the development of the activities along the way through surveys, focus groups, and active field-testing. The resulting activities have been embraced by the larger informal learning community and are enthusiastically being delivered to millions of people across the US and around the world. Each ToolKit is delivered free of charge to active Night Sky Network astronomy clubs. All activity write-ups are available free to download at the website listed here. Amateur astronomers receive frequent questions from the public about Earth impacts, meteors, and comets, so this set of activities will help them explain the dynamics of these phenomena to the public. The Space Rocks ToolKit resources complement the Great Balls of Fire museum exhibit produced by Space Science Institute's National Center for Interactive Learning and scheduled for release in 2011. NSF has funded this national traveling exhibition and outreach ToolKit under Grant DRL-0813528.

  14. Census of Population and Housing, 1980: Summary Tape File 1F, School Districts [machine-readable data file].

    ERIC Educational Resources Information Center

    Bureau of the Census (DOC), Washington, DC. Data User Services Div.

    The 1980 Census of Population and Housing Summary Tape File 1F--the School Districts File--is presented. The file contains complete-count data of population and housing aggregated by school district. Population items tabulated include age, race (provisional data), sex, marital status, Spanish origin (provisional data), household type, and…

  15. The Configuration Space Toolkit (C-Space Toolkit or CSTK) Ver. 2.5 beta

    SciTech Connect

    Chen, Pang-Chieh; Hwang, Yong; Xavier, Patrick; Lewis, Christopher; Lafarge, Robert; Watterberg, Peter

    2010-02-24

    The C-Space Toolkit provides a software library that makes it easier to program motion planning, simulation, robotics, and virtual reality codes using the Configuration Space abstraction. Key functionality (1) enables the user to create representations of movable and stationary rigid geometric objects, and (2) performs fast distance, interference (clash) detection, collision detection, closest-feature pairs, and contact queries in terms of object configuration. Not only can queries be computed at any given point in configuration space, but they can be done exactly over linear-translational path segments and approximately for rotational path segments. Interference detection and distance computations can be done with respect to the Minkowski sum of the original geometry and a piece of convex geometry. The Toolkit takes as raw model input (1) collections of convex polygons that form the boundaries of models and (2) convex polyhedra, cones, cylinders, and discs that are models and model components. Configurations are given in terms of homogeneous transforms. A simple OpenGL-based system for displaying and animating the geometric objects is included in the implementation. This version, 2.5 Beta, incorporates feature additions and enhancements, improvements in algorithms, improved robustness, bug fixes and cleaned-up source code, better compliance with standards and recent programming conventions, changes to the build process for the software, support for more recent hardware and software platforms, and improvements to documentation and source-code comments.
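
    The idea of an exact query over a linear-translational path segment can be illustrated with the simplest possible case: the minimum clearance between a fixed disc and a disc translating along a segment, found in closed form rather than by sampling the path. This is a schematic in Python, not CSTK code; the geometry and interface are invented for the sketch.

```python
import math

def path_clearance(center_a, radius_a, p0, p1, radius_b):
    """Minimum clearance between fixed disc A and disc B translating p0 -> p1.

    Computed exactly over the whole segment (closest approach of A's center
    to the segment), in the spirit of segment-wise interference queries.
    A negative return value means the two discs collide somewhere on the path.
    """
    ax, ay = center_a
    dx, dy = p1[0] - p0[0], p1[1] - p0[1]
    seg2 = dx * dx + dy * dy
    # Parameter in [0, 1] of B's closest approach to A along the segment
    if seg2 == 0.0:
        t = 0.0
    else:
        t = max(0.0, min(1.0, ((ax - p0[0]) * dx + (ay - p0[1]) * dy) / seg2))
    cx, cy = p0[0] + t * dx, p0[1] + t * dy
    return math.hypot(ax - cx, ay - cy) - (radius_a + radius_b)

# B slides past A along y = 0; closest approach leaves 1 unit of clearance.
clear = path_clearance((0.0, 2.0), 0.5, (-5.0, 0.0), (5.0, 0.0), 0.5)
```

    CSTK performs the analogous computation for general convex polyhedra under rigid-body motion, where the closed form is replaced by closest-feature tracking.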

  16. A Toolkit for Eye Recognition of LAMOST Spectroscopy

    NASA Astrophysics Data System (ADS)

    Yuan, H.; Zhang, H.; Zhang, Y.; Lei, Y.; Dong, Y.; Zhao, Y.

    2014-05-01

    The Large sky Area Multi-Object fiber Spectroscopic Telescope (LAMOST, also named the Guo Shou Jing Telescope) finished the pilot survey and began the normal survey at the end of September 2012. Millions of targets have already been observed, including thousands of quasar candidates. Because of the difficulty of automatic identification of quasar spectra, eye recognition is always necessary and efficient. However, identifying massive numbers of spectra by eye is a huge job. In order to improve the efficiency and effectiveness of spectral identification, a toolkit for eye recognition of LAMOST spectroscopy was developed. Spectral cross-correlation templates from the Sloan Digital Sky Survey (SDSS) are applied as references, including O star, O/B transition star, B star, A star, F/A transition star, F star, G star, K star, M1 star, M3 star, M5 star, M8 star, L1 star, magnetic white dwarf, carbon star, white dwarf, B white dwarf, low-metallicity K sub-dwarf, "early-type" galaxy, galaxy, "later-type" galaxy, Luminous Red Galaxy, QSO, QSO with some BAL activity, and high-luminosity QSO. By adjusting the redshift and flux ratio of the template spectra in an interactive graphic interface, the spectral type of the target can be discriminated in an easy and feasible way, and the redshift is estimated at the same time with a precision of about one part in a thousand. The advantage of the tool in dealing with low-quality spectra is indicated. Spectra from the Pilot Survey of LAMOST are applied as examples, and spectra from SDSS are also tested for comparison. Target spectra in both image format and FITS format are supported. For convenience, several spectra-access methods are provided. All the spectra from the LAMOST pilot survey can be located and acquired via the VOTable files on the internet as suggested by the International Virtual Observatory Alliance (IVOA). After the construction of the Simple Spectral Access Protocol (SSAP) service by the Chinese Astronomical Data Center (CAsDC), spectra can be
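
    The core interactive operation described above, shifting a rest-frame template by (1 + z) and rescaling its flux to overlay the target spectrum, reduces to two elementwise transformations. A minimal sketch in Python (the function name and interface are illustrative, not part of the toolkit):

```python
def shift_template(wavelengths, fluxes, z, scale):
    """Redshift a rest-frame template and rescale its flux.

    Observed wavelength = rest wavelength * (1 + z); the flux ratio `scale`
    brings the template to the target's flux level for visual comparison.
    """
    shifted_wl = [w * (1.0 + z) for w in wavelengths]
    scaled_fx = [f * scale for f in fluxes]
    return shifted_wl, scaled_fx

# A template at rest wavelengths 4000 and 5000 Angstrom, shifted to z = 0.1
wl, fx = shift_template([4000.0, 5000.0], [1.0, 2.0], z=0.1, scale=0.5)
```

    In the toolkit the user adjusts z and the flux ratio with sliders until the template features align with the target, which yields the classification and redshift simultaneously.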

  17. Hydrogen bonding asymmetric star-shape derivative of bile acid leads to supramolecular fibrillar aggregates that wrap into micrometer spheres

    PubMed Central

    Myllymäki, Teemu T. T.; Yang, Hongjun; Liljeström, Ville; Kostiainen, Mauri A.; Malho, Jani-Markus; Zhu, X. X.

    2016-01-01

    We report that star-shaped molecules with cholic acid cores asymmetrically grafted by low molecular weight polymers with hydrogen bonding end-groups undergo aggregation to nanofibers, which subsequently wrap into micrometer spherical aggregates with low density cores. Therein the facially amphiphilic cholic acid (CA) is functionalized by four flexible allyl glycidyl ether (AGE) side chains, which are terminated with hydrogen bonding 2-ureido-4[1H]pyrimidinone (UPy) end-groups as connected by hexyl spacers, denoted as CA(AGE6-C6H12-UPy)4. This wedge-shaped molecule is expected to allow the formation of a rich variety of solvent-dependent structures due to the complex interplay of interactions, enabled by its polar/nonpolar surface-active structure, the hydrophobicity of the CA in aqueous medium, and the possibility to control hydrogen bonding between UPy molecules by solvent selection. In DMSO, the surfactant-like CA(AGE6-C6H12-UPy)4 self-assembles into nanometer scale micelles, as expected due to its nonpolar CA apexes, solubilized AGE6-C6H12-UPy chains, and suppressed mutual hydrogen bonds between the UPys. Dialysis in water leads to nanofibers with lateral dimensions of 20–50 nm. This is explained by promoted aggregation as the hydrogen bonds between UPy molecules start to become activated, the reduced solvent dispersibility of the AGE-chains, and the hydrophobicity of CA. Finally, in pure water the nanofibers wrap into micrometer spheres having low density cores. In this case, strong complementary hydrogen bonds between UPy molecules of different molecules can form, thus promoting lateral interactions between the nanofibers, as allowed by the hydrophobic hexyl spacers. The wrapping is illustrated by transmission electron microscopy tomographic 3D reconstructions. More generally, we foresee hierarchically structured matter bridging the length scales from molecular to micrometer scale by sequentially triggering supramolecular interactions. PMID:27491728

  18. The MPI Bioinformatics Toolkit for protein sequence analysis.

    PubMed

    Biegert, Andreas; Mayer, Christian; Remmert, Michael; Söding, Johannes; Lupas, Andrei N

    2006-07-01

    The MPI Bioinformatics Toolkit is an interactive web service which offers access to a great variety of public and in-house bioinformatics tools. They are grouped into different sections that support sequence searches, multiple alignment, secondary and tertiary structure prediction and classification. Several public tools are offered in customized versions that extend their functionality. For example, PSI-BLAST can be run against regularly updated standard databases, customized user databases or selectable sets of genomes. Another tool, Quick2D, integrates the results of various secondary structure, transmembrane and disorder prediction programs into one view. The Toolkit provides a friendly and intuitive user interface with an online help facility. As a key feature, various tools are interconnected so that the results of one tool can be forwarded to other tools. One could run PSI-BLAST, parse out a multiple alignment of selected hits and send the results to a cluster analysis tool. The Toolkit framework and the tools developed in-house will be packaged and freely available under the GNU Lesser General Public Licence (LGPL). The Toolkit can be accessed at http://toolkit.tuebingen.mpg.de.

  19. A medical imaging and visualization toolkit in Java

    NASA Astrophysics Data System (ADS)

    Huang, Su; Baimouratov, Rafail; Xiao, Pengdong; Ananthasubramaniam, Anand; Nowinski, Wieslaw L.

    2004-05-01

    Medical imaging research and clinical applications usually require combination and integration of different technology, from image processing to realistic visualization to user-friendly interaction. Researchers with different backgrounds and from various research areas have been using numerous types of hardware, software, and environments to produce their research results. It is not unusual that students must build their working and testing tools from scratch again and again. A generic and flexible medical imaging and visualization toolkit would be helpful in medical research and educational institutes to reduce redundant development work and hence improve research efficiency. In our lab, we have developed a Medical Imaging and Visualization Toolkit (BIL-kit), which is a set of comprehensive libraries as well as a number of interactive tools. It covers a wide range of fundamental functions, from image conversion and transformation, image segmentation and analysis, to geometric model generation and manipulation, all the way up to 3D visualization and interactive simulation. The toolkit design and implementation emphasize reusability and flexibility. BIL-kit is implemented in the Java language because of its platform independence, so that the toolkit will work in hybrid and dynamic research and educational environments. This also allows the toolkit to extend its usage to web-based application development. BIL-kit is a suitable platform for researchers and students to develop visualization and simulation prototypes, and it can also be used for the development of clinical applications.

  20. The Einstein Toolkit: a community computational infrastructure for relativistic astrophysics

    NASA Astrophysics Data System (ADS)

    Löffler, Frank; Faber, Joshua; Bentivegna, Eloisa; Bode, Tanja; Diener, Peter; Haas, Roland; Hinder, Ian; Mundim, Bruno C.; Ott, Christian D.; Schnetter, Erik; Allen, Gabrielle; Campanelli, Manuela; Laguna, Pablo

    2012-06-01

    We describe the Einstein Toolkit, a community-driven, freely accessible computational infrastructure intended for use in numerical relativity, relativistic astrophysics, and other applications. The toolkit, developed by a collaboration involving researchers from multiple institutions around the world, combines a core set of components needed to simulate astrophysical objects such as black holes, compact objects, and collapsing stars, as well as a full suite of analysis tools. The Einstein Toolkit is currently based on the Cactus framework for high-performance computing and the Carpet adaptive mesh refinement driver. It implements spacetime evolution via the BSSN evolution system and general relativistic hydrodynamics in a finite-volume discretization. The toolkit is under continuous development and contains many new code components that have been publicly released for the first time and are described in this paper. We discuss the motivation behind the release of the toolkit, the philosophy underlying its development, and the goals of the project. A summary of the implemented numerical techniques is included, as are results of numerical tests covering a variety of sample astrophysical problems.

  1. Teaching and learning "on the run": ready-to-use toolkits in busy clinical settings.

    PubMed

    Cleary, Michelle; Walter, Garry

    2010-06-01

    Clinicians should strongly consider using toolkits in their workplaces with students on clinical placement. These toolkits could include brief quizzes, crossword puzzles, vignettes, role-playing, storytelling, or reflective activities to engage students in context-specific, collaborative learning.

  2. Dosimetry applications in GATE Monte Carlo toolkit.

    PubMed

    Papadimitroulas, Panagiotis

    2017-02-21

    Monte Carlo (MC) simulations are a well-established method for studying physical processes in medical physics. The purpose of this review is to present GATE dosimetry applications on diagnostic and therapeutic simulated protocols. There is a significant need for accurate quantification of the absorbed dose in several specific applications such as preclinical and pediatric studies. GATE is an open-source MC toolkit for simulating imaging, radiotherapy (RT) and dosimetry applications in a user-friendly environment, which is well validated and widely accepted by the scientific community. In RT applications, during treatment planning, it is essential to accurately assess the deposited energy and the absorbed dose per tissue/organ of interest, as well as the local statistical uncertainty. Several types of realistic dosimetric applications are described including: molecular imaging, radio-immunotherapy, radiotherapy and brachytherapy. GATE has been efficiently used in several applications, such as Dose Point Kernels, S-values, Brachytherapy parameters, and has been compared against various MC codes which are considered as standard tools for decades. Furthermore, the presented studies show reliable modeling of particle beams when comparing experimental with simulated data. Examples of different dosimetric protocols are reported for individualized dosimetry and simulations combining imaging and therapy dose monitoring, with the use of modern computational phantoms. Personalization of medical protocols can be achieved by combining GATE MC simulations with anthropomorphic computational models and clinical anatomical data. This is a review study, covering several dosimetric applications of GATE, and the different tools used for modeling realistic clinical acquisitions with accurate dose assessment. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  3. 78 FR 45464 - Broadband Data Improvement Act; Eligible Entities Aggregate Form 477 Data

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-29

    ... Communications Commission adopts rules interpreting and implementing sections of the Broadband Data Improvement.... Once loaded onto a computer, the files containing aggregate data shall be password protected immediately. The aggregate data may not be stored on a computer after being analyzed. Consequently, aggregate...

  4. An epigenetic toolkit allows for diverse genome architectures in eukaryotes.

    PubMed

    Maurer-Alcalá, Xyrus X; Katz, Laura A

    2015-12-01

    Genome architecture varies considerably among eukaryotes in terms of both size and structure (e.g. distribution of sequences within the genome, elimination of DNA during formation of somatic nuclei). The diversity in eukaryotic genome architectures and the dynamic processes that they undergo are only possible due to a well-developed epigenetic toolkit, which probably existed in the Last Eukaryotic Common Ancestor (LECA). This toolkit may have arisen as a means of navigating the genomic conflict that arose from the expansion of transposable elements within the ancestral eukaryotic genome. This toolkit has been coopted to support the dynamic nature of genomes in lineages across the eukaryotic tree of life. Here we highlight how the changes in genome architecture in diverse eukaryotes are regulated by epigenetic processes, such as DNA elimination, genome rearrangements, and adaptive changes to genome architecture. The ability to epigenetically modify and regulate genomes has contributed greatly to the diversity of eukaryotes observed today. Copyright © 2015 Elsevier Ltd. All rights reserved.

  5. Validation of Power Output for the WIND Toolkit

    SciTech Connect

    King, J.; Clifton, A.; Hodge, B. M.

    2014-09-01

    Renewable energy integration studies require wind data sets of high quality with realistic representations of the variability, ramping characteristics, and forecast performance for current wind power plants. The Wind Integration National Data Set (WIND) Toolkit is meant to be an update for and expansion of the original data sets created for the weather years from 2004 through 2006 during the Western Wind and Solar Integration Study and the Eastern Wind Integration Study. The WIND Toolkit expands these data sets to include the entire continental United States, increasing the total number of sites represented, and it includes the weather years from 2007 through 2012. In addition, the WIND Toolkit has a finer resolution for both the temporal and geographic dimensions. Three separate data sets will be created: a meteorological data set, a wind power data set, and a forecast data set. This report describes the validation of the wind power data set.

  6. Measure Up Pressure Down: Provider Toolkit to Improve Hypertension Control.

    PubMed

    Torres, Jennifer

    2016-05-01

    Hypertension is one of the most important risk factors for heart disease, stroke, kidney failure, and diabetes complications. Nearly one in three American adults has high blood pressure, and the cost associated with treating this condition is staggering. The Measure Up Pressure Down: Provider Toolkit to Improve Hypertension Control is a resource developed by the American Medical Group Foundation in partnership with the American Medical Group Association. The goal of this toolkit is to mobilize health care practitioners to work together through team-based approaches to achieve an 80% control rate of high blood pressure among their patient population. The toolkit can be used by health educators, clinic administrators, physicians, students, and other clinic staff as a step-by-step resource for developing the infrastructure needed to better identify and treat individuals with high blood pressure or other chronic conditions.

  7. An epigenetic toolkit allows for diverse genome architectures in eukaryotes

    PubMed Central

    Maurer-Alcalá, Xyrus X.; Katz, Laura A.

    2015-01-01

    Genome architecture varies considerably among eukaryotes in terms of both size and structure (e.g. distribution of sequences within the genome, elimination of DNA during formation of somatic nuclei). The diversity in eukaryotic genome architectures and the dynamic processes that they undergo are only possible due to the well-developed nature of an epigenetic toolkit, which likely existed in the Last Eukaryotic Common Ancestor (LECA). This toolkit may have arisen as a means of navigating the genomic conflict that arose from the expansion of transposable elements within the ancestral eukaryotic genome. This toolkit has been coopted to support the dynamic nature of genomes in lineages across the eukaryotic tree of life. Here we highlight how the changes in genome architecture in diverse eukaryotes are regulated by epigenetic processes by focusing on DNA elimination, genome rearrangements, and adaptive changes to genome architecture. The ability to epigenetically modify and regulate genomes has contributed greatly to the diversity of eukaryotes observed today. PMID:26649755

  8. Comparison of open-source visual analytics toolkits

    NASA Astrophysics Data System (ADS)

    Harger, John R.; Crossno, Patricia J.

    2012-01-01

    We present the results of the first stage of a two-stage evaluation of open source visual analytics packages. This stage is a broad feature comparison over a range of open source toolkits. Although we had originally intended to restrict ourselves to comparing visual analytics toolkits, we quickly found that very few were available. So we expanded our study to include information visualization, graph analysis, and statistical packages. We examine three aspects of each toolkit: visualization functions, analysis capabilities, and development environments. With respect to development environments, we look at platforms, language bindings, multi-threading/parallelism, user interface frameworks, ease of installation, documentation, and whether the package is still being actively developed.

  9. Cyber Security Audit and Attack Detection Toolkit

    SciTech Connect

    Peterson, Dale

    2012-05-31

    The goal of this project was to develop cyber security audit and attack detection tools for industrial control systems (ICS). Digital Bond developed and released a tool named Bandolier that audits ICS components commonly used in the energy sector against an optimal security configuration. The Portaledge Project developed a capability for the PI Historian, the most widely used Historian in the energy sector, to aggregate security events and detect cyber attacks.

  10. SatelliteDL: a Toolkit for Analysis of Heterogeneous Satellite Datasets

    NASA Astrophysics Data System (ADS)

    Galloy, M. D.; Fillmore, D.

    2014-12-01

    SatelliteDL is an IDL toolkit for the analysis of satellite Earth observations from a diverse set of platforms and sensors. The core function of the toolkit is the spatial and temporal alignment of satellite swath and geostationary data. The design features an abstraction layer that allows for easy inclusion of new datasets in a modular way. Our overarching objective is to create utilities that automate the mundane aspects of satellite data analysis, are extensible and maintainable, and do not place limitations on the analysis itself. IDL has a powerful suite of statistical and visualization tools that can be used in conjunction with SatelliteDL. Toward this end we have constructed SatelliteDL to include (1) HTML and LaTeX API document generation, (2) a unit test framework, (3) automatic message and error logs, (4) HTML and LaTeX plot and table generation, and (5) several real-world examples with bundled datasets available for download. For ease of use, datasets, variables and optional workflows may be specified in a flexible format configuration file. Configuration statements may specify, for example, a region and date range, and the creation of images, plots and statistical summary tables for a long list of variables. SatelliteDL enforces data provenance; all data should be traceable and reproducible. The output NetCDF file metadata holds a complete history of the original datasets and their transformations, and a method exists to reconstruct a configuration file from this information. Release 0.1.0 distributes with ingest methods for GOES, MODIS, VIIRS and CERES radiance data (L1) as well as select 2D atmosphere products (L2) such as aerosol and cloud (MODIS and VIIRS) and radiant flux (CERES). Future releases will provide ingest methods for ocean and land surface products, gridded and time averaged datasets (L3 Daily, Monthly and Yearly), and support for 3D products such as temperature and water vapor profiles. Emphasis will be on NPP Sensor, Environmental and
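
    A configuration file of the kind described above (region, date range, variable list) might be structured as follows. SatelliteDL itself is written in IDL and its actual configuration syntax is not shown in the abstract, so this is a hypothetical INI-style sketch parsed with Python's standard library; every section and key name here is invented for illustration.

```python
import configparser

# Hypothetical configuration: a region, a date range, and output requests.
cfg_text = """
[region]
lat = 30, 50
lon = -110, -90

[dates]
start = 2012-06-01
end   = 2012-06-30

[output]
variables  = aerosol_optical_depth, cloud_fraction
make_plots = yes
"""

cfg = configparser.ConfigParser()
cfg.read_string(cfg_text)

# A driver would read these values and dispatch the requested workflows.
lat_bounds = [float(v) for v in cfg["region"]["lat"].split(",")]
variables = [v.strip() for v in cfg["output"]["variables"].split(",")]
```

    Keeping the full configuration in the output NetCDF metadata, as the abstract describes, is what makes each analysis reproducible from its own result file.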

  11. BuddySuite: Command-line toolkits for manipulating sequences, alignments, and phylogenetic trees.

    PubMed

    Bond, Stephen R; Keat, Karl E; Barreira, Sofia N; Baxevanis, Andreas D

    2017-02-25

    The ability to manipulate sequence, alignment, and phylogenetic tree files has become an increasingly important skill in the life sciences, whether to generate summary information or to prepare data for further downstream analysis. The command line can be an extremely powerful environment for interacting with these resources, but only if the user has the appropriate general-purpose tools on hand. BuddySuite is a collection of four independent yet interrelated command-line toolkits that facilitate each step in the workflow of sequence discovery, curation, alignment, and phylogenetic reconstruction. Most common sequence, alignment, and tree file formats are automatically detected and parsed, and over 100 tools have been implemented for manipulating these data. The project has been engineered to easily accommodate the addition of new tools; it is written in the popular programming language Python and is hosted on the Python Package Index and GitHub to maximize accessibility. Documentation for each BuddySuite tool, including usage examples, is available at http://tiny.cc/buddysuite wiki. All software is open source and freely available through http://research.nhgri.nih.gov/software/BuddySuite.
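
    The automatic format detection mentioned above can be sketched with a simple first-line heuristic. This is an illustration only, not BuddySuite's implementation (which relies on full parsers); the function name and the set of formats checked are chosen for the sketch.

```python
def detect_format(text):
    """Guess a sequence/tree file format from its first non-blank line.

    Recognizes FASTA (lines starting with '>'), Newick trees (parenthesized,
    semicolon-terminated), and NEXUS ('#NEXUS' header); everything else is
    reported as unknown. Real toolkits validate by actually parsing the file.
    """
    for line in text.splitlines():
        line = line.strip()
        if not line:
            continue
        if line.startswith(">"):
            return "fasta"
        if line.startswith("(") and text.rstrip().endswith(";"):
            return "newick"
        if line.startswith("#NEXUS"):
            return "nexus"
        return "unknown"
    return "empty"

fmt = detect_format(">seq1\nACGT\n")
```

    Detecting the format up front is what lets a suite of tools accept any common input and hand results from one tool to the next without manual conversion.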

  12. The PRIDE (Partnership to Improve Diabetes Education) Toolkit

    PubMed Central

    Wolff, Kathleen; Chambers, Laura; Bumol, Stefan; White, Richard O.; Gregory, Becky Pratt; Davis, Dianne; Rothman, Russell L.

    2016-01-01

    Purpose Patients with low literacy, low numeracy, and/or linguistic needs can experience challenges understanding diabetes information and applying concepts to their self-management. The authors designed a toolkit of education materials that are sensitive to patients' literacy and numeracy levels, language preferences, and cultural norms and that encourage shared goal setting to improve diabetes self-management and health outcomes. The Partnership to Improve Diabetes Education (PRIDE) toolkit was developed to facilitate diabetes self-management education and support. Methods The PRIDE toolkit includes a comprehensive set of 30 interactive education modules in English and Spanish to support diabetes self-management activities. The toolkit builds upon the authors' previously validated Diabetes Literacy and Numeracy Education Toolkit (DLNET) by adding a focus on shared goal setting, addressing the needs of Spanish-speaking patients, and including a broader range of diabetes management topics. Each PRIDE module was evaluated using the Suitability Assessment of Materials (SAM) instrument to determine the material's cultural appropriateness and its sensitivity to the needs of patients with low literacy and low numeracy. Reading grade level was also assessed using the Automated Readability Index (ARI), Coleman-Liau, Flesch-Kincaid, Fry, and SMOG formulas. Conclusions The average reading grade level of the materials was 5.3 (SD 1.0), with a mean SAM of 91.2 (SD 5.4). All of the 30 modules received a “superior” score (SAM >70%) when evaluated by 2 independent raters. The PRIDE toolkit modules can be used by all members of a multidisciplinary team to assist patients with low literacy and low numeracy in managing their diabetes. PMID:26647414

  13. RAVE—a Detector-independent vertex reconstruction toolkit

    NASA Astrophysics Data System (ADS)

    Waltenberger, Wolfgang; Mitaroff, Winfried; Moser, Fabian

    2007-10-01

    A detector-independent toolkit for vertex reconstruction (RAVE) is being developed, along with a standalone framework (VERTIGO) for testing, analyzing and debugging. The core algorithms represent the state of the art for geometric vertex finding and fitting by both linear (Kalman filter) and robust estimation methods. Main design goals are ease of use, flexibility for embedding into existing software frameworks, extensibility, and openness. The implementation is based on modern object-oriented techniques, is coded in C++ with interfaces for Java and Python, and follows an open-source approach. A beta release is available. VERTIGO = "vertex reconstruction toolkit and interface to generic objects".

  14. Deconstructing the toolkit: creativity and risk in the NHS workforce.

    PubMed

    Allen, Von; Brodzinski, Emma

    2009-12-01

    Deconstructing the Toolkit explores the current desire for toolkits that promise failsafe structures to facilitate creative success. The paper examines this cultural phenomenon within the context of the risk-averse workplace-with particular focus on the NHS. The writers draw on Derrida and deconstructionism to reflect upon the principles of creativity and the possibilities for being creative within the workplace. Through reference to The Extra Mile project facilitated by Open Art, the paper examines the importance of engaging with an aesthetic of creativity and embracing a more holistic approach to the problems and potential of the creative process.

  15. A Machine Learning and Optimization Toolkit for the Swarm

    DTIC Science & Technology

    2014-11-17

    Machine Learning and Optimization Toolkit for the Swarm. Ilge Akkaya, Shuhei Emoto... 3. DATES COVERED 00-00-2014 to 00-00-2014 4. TITLE AND SUBTITLE A Machine Learning and Optimization Toolkit for the Swarm 5a. CONTRACT NUMBER... machine learning methodologies by providing the right interfaces between machine learning tools and

  16. Methods for Evaluating Text Extraction Toolkits: An Exploratory Investigation

    DTIC Science & Technology

    2015-01-22

    MTR 140443 R2 MITRE TECHNICAL REPORT: Methods for Evaluating Text Extraction Toolkits: An... JAN 2015 2. REPORT TYPE 3. DATES COVERED 00-00-2015 to 00-00-2015 4. TITLE AND SUBTITLE Methods for Evaluating Text Extraction Toolkits: An... DISTRIBUTION/AVAILABILITY STATEMENT Approved for public release; distribution unlimited 13. SUPPLEMENTARY NOTES 14. ABSTRACT Text extraction

  17. User's manual for the two-dimensional transputer graphics toolkit

    NASA Technical Reports Server (NTRS)

    Ellis, Graham K.

    1988-01-01

    The user manual for the 2-D graphics toolkit for a transputer-based parallel processor is presented. The toolkit consists of a package of 2-D display routines that can be used for simulation visualizations. It supports multiple windows, double buffered screens for animations, and simple graphics transformations such as translation, rotation, and scaling. The display routines are written in occam to take advantage of the multiprocessing features available on transputers. The package is designed to run on a transputer separate from the graphics board.

  18. A clinical research analytics toolkit for cohort study.

    PubMed

    Yu, Yiqin; Zhu, Yu; Sun, Xingzhi; Tao, Ying; Zhang, Shuo; Xu, Linhao; Pan, Yue

    2012-01-01

    This paper presents a clinical informatics toolkit that can assist physicians to conduct cohort studies effectively and efficiently. The toolkit has three key features: 1) support of procedures defined in epidemiology, 2) recommendation of statistical methods in data analysis, and 3) automatic generation of research reports. On one hand, our system can help physicians control research quality by leveraging the integrated knowledge of epidemiology and medical statistics; on the other hand, it can improve productivity by reducing the complexities for physicians during their cohort studies.
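
    The toolkit's second feature, recommendation of statistical methods, can be pictured as a small rule base mapping study characteristics to conventional tests. The rules below are a generic illustration of that idea, not the toolkit's actual logic, which the abstract does not specify:

```python
# Illustrative rule-based recommender: map outcome type, number of
# comparison groups, and pairing to a conventional statistical test.
def recommend_test(outcome_type, n_groups, paired=False):
    """Suggest a conventional statistical test for a cohort comparison."""
    if outcome_type == "continuous":
        if n_groups == 2:
            return "paired t-test" if paired else "two-sample t-test"
        return "one-way ANOVA"
    if outcome_type == "categorical":
        return "McNemar's test" if paired else "chi-squared test"
    if outcome_type == "time-to-event":
        return "log-rank test"
    raise ValueError("unknown outcome type: %r" % outcome_type)

print(recommend_test("continuous", 2))     # two-sample t-test
print(recommend_test("time-to-event", 2))  # log-rank test
```

    A real system would also weigh sample size, distributional assumptions, and confounding before recommending a method.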

  19. PyCogent: a toolkit for making sense from sequence

    PubMed Central

    Knight, Rob; Maxwell, Peter; Birmingham, Amanda; Carnes, Jason; Caporaso, J Gregory; Easton, Brett C; Eaton, Michael; Hamady, Micah; Lindsay, Helen; Liu, Zongzhi; Lozupone, Catherine; McDonald, Daniel; Robeson, Michael; Sammut, Raymond; Smit, Sandra; Wakefield, Matthew J; Widmann, Jeremy; Wikman, Shandy; Wilson, Stephanie; Ying, Hua; Huttley, Gavin A

    2007-01-01

    We have implemented in Python the COmparative GENomic Toolkit, a fully integrated and thoroughly tested framework for novel probabilistic analyses of biological sequences, devising workflows, and generating publication quality graphics. PyCogent includes connectors to remote databases, built-in generalized probabilistic techniques for working with biological sequences, and controllers for third-party applications. The toolkit takes advantage of parallel architectures and runs on a range of hardware and operating systems, and is available under the general public license from . PMID:17708774

  20. RAPID Toolkit Creates Smooth Flow Toward New Projects

    SciTech Connect

    Levine, Aaron; Young, Katherine

    2016-07-01

    Uncertainty about the duration and outcome of the permitting process has historically been seen as a deterrent to investment in renewable energy projects, including new hydropower projects. What if the process were clearer, smoother, faster? That's the purpose of the Regulatory and Permitting Information Desktop (RAPID) Toolkit, developed by the National Renewable Energy Laboratory (NREL) with funding from the U.S. Department of Energy (DOE) Office of Energy Efficiency and Renewable Energy and the Western Governors' Association. Now, the RAPID Toolkit is being expanded to include information about developing and permitting hydropower projects, with initial outreach and information gathering occurring during 2015.

  1. A Toolkit of Systems Gaming Techniques

    NASA Astrophysics Data System (ADS)

    Finnigan, David; McCaughey, Jamie W.

    2017-04-01

    Decision-makers facing natural hazard crises need a broad set of cognitive tools to help them grapple with complexity. Systems gaming can act as a kind of 'flight simulator for decision making' enabling us to step through real life complex scenarios of the kind that beset us in natural disaster situations. Australian science-theatre ensemble Boho Interactive is collaborating with the Earth Observatory Singapore to develop an in-person systems game modelling an unfolding natural hazard crisis (volcanic unrest or an approaching typhoon) impacting an Asian city. Through a combination of interactive mechanisms drawn from boardgaming and participatory theatre, players will make decisions and assign resources in response to the unfolding crisis. In this performance, David Finnigan from Boho will illustrate some of the participatory techniques that Boho use to illustrate key concepts from complex systems science. These activities are part of a toolkit which can be adapted to fit a range of different contexts and scenarios. In this session, David will present short activities that demonstrate a range of systems principles including common-pool resource challenges (the Tragedy of the Commons), interconnectivity, unintended consequences, tipping points and phase transitions, and resilience. The interactive mechanisms for these games are all deliberately lo-fi rather than digital, for three reasons. First, the experience of a tactile, hands-on game is more immediate and engaging. It brings the focus of the participants into the room and facilitates engagement with the concepts and with each other, rather than with individual devices. Second, the mechanics of the game are laid bare. This is a valuable way to illustrate that complex systems are all around us, and are not merely the domain of hi-tech systems. Finally, these games can be used in a wide variety of contexts by removing computer hardware requirements and instead using materials and resources that are easily found in

  2. Incident Management Preparedness and Coordination Toolkit

    SciTech Connect

    Koch, Daniel B.

    2013-04-01

    As with many professions, safety planners and first responders tend to be specialists in certain areas. To be truly useful, tools should be tailored to meet their specific needs. Thus, general software suites aimed at the professional geographic information system (GIS) community might not be the best solution for a first responder with little training in GIS terminology and techniques. On the other hand, commonly used web-based map viewers may not have the capability to be customized for the planning, response, and recovery (PR&R) mission. Data formats should be open and foster easy information flow among local, state, and federal partners. Tools should be free or low-cost to address real-world budget constraints at the local level. They also need to work both with and without a network connection to be robust. The Incident Management Preparedness and Coordination Toolkit (IMPACT) can satisfy many of these needs while working in harmony with established systems at the local, state, and federal levels. The IMPACT software framework, termed the Geospatial Integrated Problem Solving Environment (GIPSE), organizes tasks, tools, and resources for the end user. It uses the concept of software wizards to both customize and extend its functionality. On the Tasks panel are a number of buttons used to initiate various operations. Similar to macros, these task buttons launch scripts that utilize the full functionality of the underlying foundational components such as the SQL spatial database and ORNL-developed map editor. The user is presented with a series of instruction pages which are implemented with HTML for interactivity. On each page are links which initiate specific actions such as creating a map showing various features. Additional tasks may be quickly programmed and added to the panel. The end user can customize the graphical interface to facilitate its use during an emergency. One of the major components of IMPACT is the ORNL Geospatial Viewer (OGV). It is used to

  3. 'Best' Practices for Aggregating Subset Results from Archived Datasets

    NASA Astrophysics Data System (ADS)

    Baskin, W. E.; Perez, J.

    2013-12-01

    In response to the exponential growth in science data analysis and visualization capabilities, data centers have been developing new delivery mechanisms to package and deliver large volumes of aggregated subsets of archived data. New standards are evolving to help data providers and application programmers deal with the growing needs of the science community. These standards evolve from the best practices gleaned from new products and capabilities. The NASA Atmospheric Science Data Center (ASDC) has developed and deployed production provider-specific search and subset web applications for the CALIPSO, CERES, TES, and MOPITT missions. This presentation explores several use cases that leverage aggregated subset results and examines the standards and formats ASDC developers applied to the delivered files, as well as the implementation strategies for subsetting and processing the aggregated products. The following topics will be addressed:
    - Applications of NetCDF CF conventions to aggregated level 2 satellite subsets
    - Data-provider-specific format requirements vs. generalized standards
    - Organization of the file structure of aggregated NetCDF subset output
    - Global attributes of individual subsetted files vs. aggregated results
    - Specific applications and framework used for subsetting and delivering derivative data files
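
    One of the listed topics, global attributes of individual subsetted files vs. aggregated results, implies a reconciliation policy when many per-granule files are merged. A minimal sketch of one such policy, using plain dictionaries to stand in for NetCDF global attributes (the attribute names and values are invented for illustration):

```python
# Sketch: promote attributes shared by every input file to the aggregate;
# keep differing values as a per-input-file list so provenance survives.
def aggregate_attributes(per_file_attrs):
    """Reconcile global attributes from individually subsetted files."""
    keys = set().union(*per_file_attrs)
    shared, per_file = {}, {}
    for key in keys:
        values = [attrs.get(key) for attrs in per_file_attrs]
        if all(v == values[0] for v in values):
            shared[key] = values[0]      # identical everywhere: promote
        else:
            per_file[key] = values       # differs: record per input file
    return shared, per_file

granules = [
    {"Conventions": "CF-1.6", "mission": "CALIPSO", "granule_id": "G001"},
    {"Conventions": "CF-1.6", "mission": "CALIPSO", "granule_id": "G002"},
]
shared, per_file = aggregate_attributes(granules)
print(shared["mission"])       # CALIPSO
print(per_file["granule_id"])  # ['G001', 'G002']
```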

  4. Geospatial Toolkits and Resource Maps for Selected Countries from the National Renewable Energy Laboratory (NREL)

    DOE Data Explorer

    NREL developed the Geospatial Toolkit (GsT), a map-based software application that integrates resource data and geographic information systems (GIS) for integrated resource assessment. A variety of agencies within countries, along with global datasets, provided country-specific data. Originally developed in 2005, the Geospatial Toolkit was completely redesigned and re-released in November 2010 to provide a more modern, easier-to-use interface with considerably faster analytical querying capabilities. Toolkits are available for 21 countries and each one can be downloaded separately. The source code for the toolkit is also available. [Taken and edited from http://www.nrel.gov/international/geospatial_toolkits.html]

  5. The Archivists' Toolkit: Another Step toward Streamlined Archival Processing

    ERIC Educational Resources Information Center

    Westbrook, Bradley D.; Mandell, Lee; Shepherd, Kelcy; Stevens, Brian; Varghese, Jason

    2006-01-01

    The Archivists' Toolkit is a software application currently in development and designed to support the creation and management of archival information. This article summarizes the development of the application, including some of the problems the application is designed to resolve. Primary emphasis is placed on describing the application's…

  6. New MISR Toolkit Version 1.4.1 Available

    Atmospheric Science Data Center

    2014-09-03

    ... of the MISR Toolkit (MTK) is now available from The Open Channel Foundation. The MTK is a simplified programming interface built upon HDF-EOS to access MISR Level 1B2, Level 2, and ancillary data products. It also handles ...

  7. The MPI Bioinformatics Toolkit for protein sequence analysis

    PubMed Central

    Biegert, Andreas; Mayer, Christian; Remmert, Michael; Söding, Johannes; Lupas, Andrei N.

    2006-01-01

    The MPI Bioinformatics Toolkit is an interactive web service which offers access to a great variety of public and in-house bioinformatics tools. They are grouped into different sections that support sequence searches, multiple alignment, secondary and tertiary structure prediction and classification. Several public tools are offered in customized versions that extend their functionality. For example, PSI-BLAST can be run against regularly updated standard databases, customized user databases or selectable sets of genomes. Another tool, Quick2D, integrates the results of various secondary structure, transmembrane and disorder prediction programs into one view. The Toolkit provides a friendly and intuitive user interface with an online help facility. As a key feature, various tools are interconnected so that the results of one tool can be forwarded to other tools. One could run PSI-BLAST, parse out a multiple alignment of selected hits and send the results to a cluster analysis tool. The Toolkit framework and the tools developed in-house will be packaged and freely available under the GNU Lesser General Public Licence (LGPL). The Toolkit can be accessed at . PMID:16845021

  8. The Complete Guide to RTI: An Implementation Toolkit

    ERIC Educational Resources Information Center

    Burton, Dolores; Kappenberg, John

    2012-01-01

    This comprehensive toolkit will bring you up to speed on why RTI is one of the most important educational initiatives in recent history and sets the stage for its future role in teacher education and practice. The authors demonstrate innovative ways to use RTI to inform instruction and guide curriculum development in inclusive classroom settings.…

  9. Jazz: An Extensible Zoomable User Interface Graphics Toolkit in Java

    DTIC Science & Technology

    2000-01-01

    example supports object-oriented 2D graphics, though it has no hierarchies or extensibility. Amulet [19] is a toolkit that supports widgets and custom...McDaniel, R. G., Miller, R. C., Ferrency, A. S., Faulring, A., Kyle, B. D., Mickish, A., Klimovitski, A., & Doane, P. (1997). The Amulet

  10. Practitioner Toolkit: Working with Adult English Language Learners.

    ERIC Educational Resources Information Center

    Lieshoff, Sylvia Cobos; Aguilar, Noemi; McShane, Susan; Burt, Miriam; Peyton, Joy Kreeft; Terrill, Lynda; Van Duzer, Carol

    2004-01-01

    This document is designed to give support to adult education and family literacy instructors who are new to serving adult English language learners and their families in rural, urban, and faith- and community-based programs. The Toolkit is designed to have a positive impact on the teaching and learning in these programs. The results of two…

  11. Using an Assistive Technology Toolkit to Promote Inclusion

    ERIC Educational Resources Information Center

    Judge, Sharon; Floyd, Kim; Jeffs, Tara

    2008-01-01

    Although the use of assistive technology for young children is increasing, the lack of awareness and the lack of training continue to act as major barriers to providers using assistive technology. This article describes an assistive technology toolkit designed for use with young children with disabilities that can be easily assembled and…

  12. Evaluation Toolkit: A Tailored Approach to Evaluation for Parenting Projects.

    ERIC Educational Resources Information Center

    Shaw, Catherine

    This toolkit presents a collection of accessible guidelines, measures, and tools to guide and implement evaluation of parenting education and support interventions. Designed primarily for people who are new to evaluation, it contains additional advice and guidance for those with a higher level of understanding or knowledge and it may also be…

  13. Evaluating Teaching Development Activities in Higher Education: A Toolkit

    ERIC Educational Resources Information Center

    Kneale, Pauline; Winter, Jennie; Turner, Rebecca; Spowart, Lucy; Hughes, Jane; McKenna, Colleen; Muneer, Reema

    2016-01-01

    This toolkit is developed as a resource for providers of teaching-related continuing professional development (CPD) in higher education (HE). It focuses on capturing the longer-term value and impact of CPD for teachers and learners, and moving away from immediate satisfaction measures. It is informed by the literature on evaluating higher…

  14. The Data Toolkit: Ten Tools for Supporting School Improvement

    ERIC Educational Resources Information Center

    Hess, Robert T.; Robbins, Pam

    2012-01-01

    Using data for school improvement is a key goal of Race to the Top, and now is the time to make data-driven school improvement a priority. However, many educators are drowning in data. Boost your professional learning community's ability to translate data into action with this new book from Pam Robbins and Robert T. Hess. "The Data Toolkit"…

  15. 78 FR 58520 - U.S. Environmental Solutions Toolkit

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-24

    ... International Trade Administration U.S. Environmental Solutions Toolkit AGENCY: International Trade... from U.S. businesses capable of exporting their goods or services relevant to (a) arsenic removal... wastewater treatment. The Department of Commerce continues to develop the web-based U.S. Environmental...

  16. Educating Globally Competent Citizens: A Toolkit. Second Edition

    ERIC Educational Resources Information Center

    Elliott-Gower, Steven; Falk, Dennis R.; Shapiro, Martin

    2012-01-01

    Educating Globally Competent Citizens, a product of AASCU's American Democracy Project and its Global Engagement Initiative, introduces readers to a set of global challenges facing society based on the Center for Strategic and International Studies' 7 Revolutions. The toolkit is designed to aid faculty in incorporating global challenges into new…

  17. Simulation toolkit with CMOS detector in the framework of hadrontherapy

    NASA Astrophysics Data System (ADS)

    Rescigno, R.; Finck, Ch.; Juliani, D.; Baudot, J.; Dauvergne, D.; Dedes, G.; Krimmer, J.; Ray, C.; Reithinger, V.; Rousseau, M.; Testa, E.; Winter, M.

    2014-03-01

    Proton imaging can be seen as a powerful technique for on-line monitoring of ion range during carbon ion therapy irradiation. The proton detection technique uses a set of CMOS sensor planes as a three-dimensional tracking system. A simulation toolkit based on GEANT4 and ROOT is presented, including the detector response and the reconstruction algorithm.

  18. Using AASL's "Health and Wellness" and "Crisis Toolkits"

    ERIC Educational Resources Information Center

    Logan, Debra Kay

    2009-01-01

    Whether a school library program is the picture of good health in a state that mandates a professionally staffed library media center in every building or is suffering in a low-wealth district that is facing drastic cuts, the recently launched toolkits by the American Association of School Librarians (AASL) are stocked with useful strategies and…

  19. Cubit Mesh Generation Toolkit V11.1

    SciTech Connect

    HANKS, BYRON; KERR, ROBERT; KNUPP, PATRICK; MAEZ, JONATHAN; WHITE, DAVID; MITCHELL, SCOTT; OWEN, STEVEN; SHEPHERD, JASON; TAUTGES, TIMOTHY; MELANDER, DARRYL; BLACKER, TEDDY; BORDEN, MICHAEL; BREWER, MICHAEL; CLARK, BRETT; FORTIER, LESLIE; KALLAHER, JENNA; PEBAY, PHILIPPE; STATEN, MATTHEW; VINEYARD, CRAIG; GROVER, BENJAMIN; BENZLEY, STEVEN; SIMPSON, CLINTON; NIELSON, ERIC; KOPP, JOEL; STORM, STEVE; NUGENT, MARK; WALTON, KIRK; BORDEN, MIKE; ERNST, CORY; FOWLER, JOHN; KRAFTCHECK, JASON; STEPHENSON, MIKE; YEOU, RAMMAGAY; MERKLEY, KARL; METERS, RAY; DEWET, MARK; RICHARDS, SARA; PENDLEY, KEVIN; MORRIS, RANDY; RICHARDSON, MARK; VYAS, VED; SHOWMAN, SAM; HAYS, ALEX; TIDWELL, BOYD; MILLAR, ALEX

    2009-03-25

    CUBIT prepares models to be used in computer-based simulation of real-world events. CUBIT is a full-featured software toolkit for robust generation of two- and three-dimensional finite element meshes (grids) and geometry preparation. Its main goal is to reduce the time to generate meshes, particularly large hex meshes of complicated, interlocking assemblies.

  20. ELCAT: An E-Learning Content Adaptation Toolkit

    ERIC Educational Resources Information Center

    Clements, Iain; Xu, Zhijie

    2005-01-01

    Purpose: The purpose of this paper is to present an e-learning content adaptation toolkit--ELCAT--that helps to achieve the objectives of the KTP project No. 3509. Design/methodology/approach: The chosen methodology is absolutely practical. The tool was put into motion and results were observed as university and the collaborating company members…

  1. Roles of the Volunteer in Development: Toolkits for Building Capacity.

    ERIC Educational Resources Information Center

    Slater, Marsha; Allsman, Ava; Savage, Ron; Havens, Lani; Blohm, Judee; Raftery, Kate

    This document, which was developed to assist Peace Corps volunteers and those responsible for training them, presents an introductory booklet and six toolkits for use in the training provided to and by volunteers involved in community development. All the materials emphasize long-term participatory approaches to sustainable development and a…

  2. Automated Generation of Web Services for Visualization Toolkits

    NASA Astrophysics Data System (ADS)

    Jensen, P. A.; Yuen, D. A.; Erlebacher, G.; Bollig, E. F.; Kigelman, D. G.; Shukh, E. A.

    2005-12-01

    The recent explosion in the size and complexity of geophysical data and an increasing trend for collaboration across large geographical areas demand the use of remote, full featured visualization toolkits. As the scientific community shifts toward grid computing to handle these increased demands, new web services are needed to assemble powerful distributed applications. Recent research has established the possibility of converting toolkits such as VTK [1] and Matlab [2] into remote visualization services. We are investigating an automated system to allow these toolkits to export their functions as web services under the standardized protocols SOAP and WSDL using pre-existing software (gSOAP [3]) and a custom compiler for Tcl-based scripts. The compiler uses a flexible parser and type inferring mechanism to convert the Tcl into a C++ program that allows the desired Tcl procedures to be exported as SOAP-accessible functions and the VTK rendering window to be captured offscreen and encapsulated for forwarding through a web service. Classes for a client-side Java applet to access the rendering window remotely are also generated. We will use this system to demonstrate the streamlined generation of a standards-compliant web service (suitable for grid deployment) from a Tcl script for VTK. References: [1] The Visualization Toolkit, http://www.vtk.org [2] Matlab, http://www.mathworks.com [3] gSOAP, http://www.cs.fsu.edu/~engelen/soap.html
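
    The generated services expose Tcl procedures as SOAP-accessible functions. As a purely illustrative sketch (the operation name, namespace, and parameters below are invented and are not part of the system described), a SOAP request envelope for such an exported function can be assembled with Python's standard library:

```python
import xml.etree.ElementTree as ET

# Standard SOAP 1.1 envelope namespace; the service namespace and the
# "RenderIsosurface" operation are hypothetical examples.
SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
SVC_NS = "urn:example-vtk-visualization-service"

def build_request(operation, params):
    """Build a SOAP request envelope invoking one exported operation."""
    envelope = ET.Element(ET.QName(SOAP_NS, "Envelope"))
    body = ET.SubElement(envelope, ET.QName(SOAP_NS, "Body"))
    op = ET.SubElement(body, ET.QName(SVC_NS, operation))
    for name, value in params.items():
        ET.SubElement(op, name).text = str(value)  # one element per parameter
    return ET.tostring(envelope, encoding="unicode")

xml = build_request("RenderIsosurface", {"isovalue": 0.5, "dataset": "head.vti"})
print(xml)
```

    A WSDL-described service generated by a system like the one in the abstract would accept such an envelope over HTTP and return the rendered view in the response body.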

  3. Using Toolkits to Achieve STEM Enterprise Learning Outcomes

    ERIC Educational Resources Information Center

    Watts, Carys A.; Wray, Katie

    2012-01-01

    Purpose: The purpose of this paper is to evaluate the effectiveness of using several commercial tools in science, technology, engineering and maths (STEM) subjects for enterprise education at Newcastle University, UK. Design/methodology/approach: The paper provides an overview of existing toolkit use in higher education, before reviewing where and…

  6. A Toolkit to Implement Graduate Attributes in Geography Curricula

    ERIC Educational Resources Information Center

    Spronken-Smith, Rachel; McLean, Angela; Smith, Nell; Bond, Carol; Jenkins, Martin; Marshall, Stephen; Frielick, Stanley

    2016-01-01

    This article uses findings from a project on engagement with graduate outcomes across higher education institutions in New Zealand to produce a toolkit for implementing graduate attributes in geography curricula. Key facets include strong leadership; academic developers to facilitate conversations about graduate attributes and teaching towards…

  10. Capturing and Using Knowledge about the Use of Visualization Toolkits

    SciTech Connect

    Del Rio, Nicholas R.; Pinheiro da Silva, Paulo

    2012-11-02

    When constructing visualization pipelines using toolkits such as Visualization Toolkit (VTK) and Generic Mapping Tools (GMT), developers must understand (1) what toolkit operators will transform their data from its raw state to some required view state and (2) what viewers are available to present the generated view. Traditionally, developers learn about how to construct visualization pipelines by reading documentation and inspecting code examples, which can be costly in terms of the time and effort expended. Once an initial pipeline is constructed, developers may still have to undergo a trial and error process before a satisfactory visualization is generated. This paper presents the Visualization Knowledge Project (VisKo) that is built on a knowledge base of visualization toolkit operators and how they can be piped together to form visualization pipelines. Developers may now rely on VisKo to guide them when constructing visualization pipelines and in some cases, when VisKo has complete knowledge about some set of operators (i.e., sequencing and parameter settings), automatically generate a fully functional visualization pipeline.
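
    The pipeline-generation idea, finding a sequence of operators that transforms data from its raw state to a required view state, can be sketched as a shortest-path search over an operator registry. The operators and data states below are invented for illustration; VisKo's actual knowledge base and query machinery are richer than this:

```python
from collections import deque

# Hypothetical operator registry: each operator maps one data state
# to another, as (input_state, output_state).
OPERATORS = {
    "csv_to_table":    ("csv", "table"),
    "table_to_grid":   ("table", "grid2d"),
    "grid_to_contour": ("grid2d", "contour_map"),
    "grid_to_raster":  ("grid2d", "raster_image"),
}

def plan_pipeline(source, target):
    """Breadth-first search for the shortest operator sequence."""
    queue = deque([(source, [])])
    seen = {source}
    while queue:
        state, path = queue.popleft()
        if state == target:
            return path
        for op, (src, dst) in OPERATORS.items():
            if src == state and dst not in seen:
                seen.add(dst)
                queue.append((dst, path + [op]))
    return None  # no pipeline reaches the requested view

print(plan_pipeline("csv", "contour_map"))
# ['csv_to_table', 'table_to_grid', 'grid_to_contour']
```

    A knowledge base that also records operator parameter settings, as VisKo's does for some operator sets, lets the planner emit a fully functional pipeline rather than just an operator sequence.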

  11. 77 FR 73023 - U.S. Environmental Solutions Toolkit

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-07

    ... that will outline U.S. approaches to a series of environmental problems and highlight participating U.S.... approaches to solving environmental problems and to U.S. companies that can export related technologies. The... International Trade Administration U.S. Environmental Solutions Toolkit AGENCY: International Trade...

  12. 77 FR 73023 - U.S. Environmental Solutions Toolkit

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-07

    ... environmental technologies that will outline U.S. ] approaches to a series of environmental problems and... markets to U.S. approaches to solving environmental problems and to U.S. companies that can export related... International Trade Administration U.S. Environmental Solutions Toolkit AGENCY: International Trade...

  13. A Beginning Rural Principal's Toolkit: A Guide for Success

    ERIC Educational Resources Information Center

    Ashton, Brian; Duncan, Heather E.

    2012-01-01

    The purpose of this article is to explore both the challenges and skills needed to effectively assume a leadership position and thus to create an entry plan or "toolkit" for a new rural school leader. The entry plan acts as a guide beginning principals may use to navigate the unavoidable confusion that comes with leadership. It also assists…

  14. Policy to Performance Toolkit: Transitioning Adults to Opportunity

    ERIC Educational Resources Information Center

    Alamprese, Judith A.; Limardo, Chrys

    2012-01-01

    The "Policy to Performance Toolkit" is designed to provide state adult education staff and key stakeholders with guidance and tools to use in developing, implementing, and monitoring state policies and their associated practices that support an effective state adult basic education (ABE) to postsecondary education and training transition…

  15. The Complete Guide to RTI: An Implementation Toolkit

    ERIC Educational Resources Information Center

    Burton, Dolores; Kappenberg, John

    2012-01-01

    This comprehensive toolkit will bring you up to speed on why RTI is one of the most important educational initiatives in recent history and sets the stage for its future role in teacher education and practice. The authors demonstrate innovative ways to use RTI to inform instruction and guide curriculum development in inclusive classroom settings.…

  16. Dataset of aggregate producers in New Mexico

    USGS Publications Warehouse

    Orris, Greta J.

    2000-01-01

    This report presents data, including latitude and longitude, for aggregate sites in New Mexico that were believed to be active in the period 1997-1999. The data are presented in paper form in Part A of this report and as Microsoft Excel 97 and Data Interchange Format (DIF) files in Part B. The work was undertaken as part of the effort to update information for the National Atlas. This compilation includes data from: the files of U.S. Geological Survey (USGS); company contacts; the New Mexico Bureau of Mines and Mineral Resources, New Mexico Bureau of Mine Inspection, and the Mining and Minerals Division of the New Mexico Energy, Minerals and Natural Resources Department (Hatton and others, 1998); the Bureau of Land Management Information; and direct communications with some of the aggregate operators. Additional information on most of the sites is available in Hatton and others (1998).

  17. Aggregate R-R-V Analysis

    EPA Pesticide Factsheets

    The Excel file contains time series data of flow rates and of concentrations of alachlor, atrazine, ammonia, total phosphorus, and total suspended solids observed in two watersheds in Indiana from 2002 to 2007. The aggregate time series data representative of all these parameters was obtained using a specialized, data-driven technique. The aggregate data is hypothesized in the published paper to represent the overall health of both watersheds with respect to various potential water quality impairments. The time series data for each of the individual water quality parameters were used to compute corresponding risk measures (Rel, Res, and Vul) that are reported in Tables 4 and 5. The aggregation of the risk measures, which is computed from the aggregate time series and the water quality standards in Table 1, is also reported in Tables 4 and 5 of the published paper. Values under the column heading "uncertainty" report uncertainties associated with reconstruction of missing records of the water quality parameters. Long-term records of the water quality parameters were reconstructed in order to estimate the risk measures (R-R-V) and corresponding aggregate risk measures. This dataset is associated with the following publication: Hoque, Y., S. Tripathi, M. Hantush, and R. Govindaraju. Aggregate Measures of Watershed Health from Reconstructed Water Quality Data with Uncertainty. Ed Gregorich JOURNAL OF ENVIRONMENTAL QUALITY. American Society of Agronomy, MADISON, WI,

  18. Thermodynamics of Protein Aggregation

    NASA Astrophysics Data System (ADS)

    Osborne, Kenneth L.; Barz, Bogdan; Bachmann, Michael; Strodel, Birgit

    Amyloid protein aggregation characterizes many neurodegenerative disorders, including Alzheimer's, Parkinson's, and Creutzfeldt-Jakob disease. Evidence suggests that amyloid aggregates may share similar aggregation pathways, implying that simulation of full-length amyloid proteins is not necessary for understanding amyloid formation. In this study we simulate GNNQQNY, the N-terminal prion-determining domain of the yeast protein Sup35, to investigate the thermodynamics of structural transitions during aggregation. We use a coarse-grained model with replica-exchange molecular dynamics to investigate the association of 3-, 6-, and 12-chain GNNQQNY systems and we determine the aggregation pathway by studying aggregation states of GNNQQNY. We find that the aggregation of the hydrophilic GNNQQNY sequence is mainly driven by H-bond formation, leading to the formation of β-sheets from the very beginning of the assembly process. Condensation (aggregation) and ordering take place simultaneously, which is underpinned by the occurrence of a single heat capacity peak only.

  19. Dissemination of Earth Remote Sensing Data for Use in the NOAA/NWS Damage Assessment Toolkit

    NASA Technical Reports Server (NTRS)

    Molthan, Andrew; Burks, Jason; Camp, Parks; McGrath, Kevin; Bell, Jordan

    2015-01-01

    The National Weather Service has developed the Damage Assessment Toolkit (DAT), an application for smartphones and tablets that allows for the collection, geolocation, and aggregation of various damage indicators that are collected during storm surveys. The DAT supports the often labor-intensive process where meteorologists venture into the storm-affected area, allowing them to acquire geotagged photos of the observed damage while also assigning estimated EF-scale categories based upon their observations. Once the data are collected, the DAT infrastructure aggregates the observations into a server that allows other meteorologists to perform quality control and other analysis steps before completing their survey and making the resulting data available to the public. In addition to in-person observations, Earth remote sensing from operational, polar-orbiting satellites can support the damage assessment process by identifying portions of damage tracks that may be missed due to road limitations, access to private property, or time constraints. Products resulting from change detection techniques can identify damage to vegetation and the land surface, aiding in the survey process. In addition, higher-resolution commercial imagery can corroborate ground-based surveys. As part of an ongoing collaboration, NASA and NOAA are working to integrate near real-time Earth remote sensing observations into the NOAA/NWS Damage Assessment Toolkit. This presentation will highlight recent developments in a streamlined approach for disseminating Earth remote sensing data via web mapping services and a new menu interface that has been integrated within the DAT. A review of current and future products will be provided, including products derived from MODIS and VIIRS for preliminary track identification, along with conduits for higher-resolution Landsat, ASTER, and commercial imagery as they become available. In addition to tornado damage

  20. Geant4—a simulation toolkit

    NASA Astrophysics Data System (ADS)

    Agostinelli, S.; Allison, J.; Amako, K.; Apostolakis, J.; Araujo, H.; Arce, P.; Asai, M.; Axen, D.; Banerjee, S.; Barrand, G.; Behner, F.; Bellagamba, L.; Boudreau, J.; Broglia, L.; Brunengo, A.; Burkhardt, H.; Chauvie, S.; Chuma, J.; Chytracek, R.; Cooperman, G.; Cosmo, G.; Degtyarenko, P.; Dell'Acqua, A.; Depaola, G.; Dietrich, D.; Enami, R.; Feliciello, A.; Ferguson, C.; Fesefeldt, H.; Folger, G.; Foppiano, F.; Forti, A.; Garelli, S.; Giani, S.; Giannitrapani, R.; Gibin, D.; Gómez Cadenas, J. J.; González, I.; Gracia Abril, G.; Greeniaus, G.; Greiner, W.; Grichine, V.; Grossheim, A.; Guatelli, S.; Gumplinger, P.; Hamatsu, R.; Hashimoto, K.; Hasui, H.; Heikkinen, A.; Howard, A.; Ivanchenko, V.; Johnson, A.; Jones, F. W.; Kallenbach, J.; Kanaya, N.; Kawabata, M.; Kawabata, Y.; Kawaguti, M.; Kelner, S.; Kent, P.; Kimura, A.; Kodama, T.; Kokoulin, R.; Kossov, M.; Kurashige, H.; Lamanna, E.; Lampén, T.; Lara, V.; Lefebure, V.; Lei, F.; Liendl, M.; Lockman, W.; Longo, F.; Magni, S.; Maire, M.; Medernach, E.; Minamimoto, K.; Mora de Freitas, P.; Morita, Y.; Murakami, K.; Nagamatu, M.; Nartallo, R.; Nieminen, P.; Nishimura, T.; Ohtsubo, K.; Okamura, M.; O'Neale, S.; Oohata, Y.; Paech, K.; Perl, J.; Pfeiffer, A.; Pia, M. G.; Ranjard, F.; Rybin, A.; Sadilov, S.; Di Salvo, E.; Santin, G.; Sasaki, T.; Savvas, N.; Sawada, Y.; Scherer, S.; Sei, S.; Sirotenko, V.; Smith, D.; Starkov, N.; Stoecker, H.; Sulkimo, J.; Takahata, M.; Tanaka, S.; Tcherniaev, E.; Safai Tehrani, E.; Tropeano, M.; Truscott, P.; Uno, H.; Urban, L.; Urban, P.; Verderi, M.; Walkden, A.; Wander, W.; Weber, H.; Wellisch, J. P.; Wenaus, T.; Williams, D. C.; Wright, D.; Yamada, T.; Yoshida, H.; Zschiesche, D.; Geant4 Collaboration

    2003-07-01

    Geant4 is a toolkit for simulating the passage of particles through matter. It includes a complete range of functionality including tracking, geometry, physics models and hits. The physics processes offered cover a comprehensive range, including electromagnetic, hadronic and optical processes, a large set of long-lived particles, materials and elements, over a wide energy range starting, in some cases, from 250 eV and extending in others to the TeV energy range. It has been designed and constructed to expose the physics models utilised, to handle complex geometries, and to enable its easy adaptation for optimal use in different sets of applications. The toolkit is the result of a worldwide collaboration of physicists and software engineers. It has been created exploiting software engineering and object-oriented technology and implemented in the C++ programming language. It has been used in applications in particle physics, nuclear physics, accelerator design, space engineering and medical physics.

  1. Geological hazards: from early warning systems to public health toolkits.

    PubMed

    Samarasundera, Edgar; Hansell, Anna; Leibovici, Didier; Horwell, Claire J; Anand, Suchith; Oppenheimer, Clive

    2014-11-01

    Extreme geological events, such as earthquakes, are a significant global concern and sometimes their consequences can be devastating. Geographic information plays a critical role in health protection regarding hazards, and there are a range of initiatives using geographic information to communicate risk as well as to support early warning systems operated by geologists. Nevertheless we consider there to remain shortfalls in translating information on extreme geological events into health protection tools, and suggest that social scientists have an important role to play in aiding the development of a new generation of toolkits aimed at public health practitioners. This viewpoint piece reviews the state of the art in this domain and proposes potential contributions different stakeholder groups, including social scientists, could bring to the development of new toolkits.

  2. A toolkit for epithermal neutron beam characterisation in BNCT.

    PubMed

    Auterinen, Iiro; Serén, Tom; Uusi-Simola, Jouni; Kosunen, Antti; Savolainen, Sauli

    2004-01-01

    Methods for dosimetry of epithermal neutron beams used in boron neutron capture therapy (BNCT) have been developed and utilised within the Finnish BNCT project as well as within a European project for a code of practise for the dosimetry of BNCT. One outcome has been a travelling toolkit for BNCT dosimetry. It consists of activation detectors and ionisation chambers. The free-beam neutron spectrum is measured with a set of activation foils of different isotopes irradiated both in a Cd-capsule and without it. Neutron flux (thermal and epithermal) distribution in phantoms is measured using activation of Mn and Au foils, and Cu wire. Ionisation chamber (IC) measurements are performed both in-free-beam and in-phantom for determination of the neutron and gamma dose components. This toolkit has also been used at other BNCT facilities in Europe, the USA, Argentina and Japan.

  3. Developing Mixed Reality Educational Applications: The Virtual Touch Toolkit

    PubMed Central

    Mateu, Juan; Lasala, María José; Alamán, Xavier

    2015-01-01

    In this paper, we present Virtual Touch, a toolkit that allows the development of educational activities through a mixed reality environment such that, using various tangible elements, the interconnection of a virtual world with the real world is enabled. The main goal of Virtual Touch is to facilitate the installation, configuration and programming of different types of technologies, abstracting the creator of educational applications from the technical details involving the use of tangible interfaces and virtual worlds. Therefore, it is specially designed to enable teachers to themselves create educational activities for their students in a simple way, taking into account that teachers generally lack advanced knowledge in computer programming and electronics. The toolkit has been used to develop various educational applications that have been tested in two secondary education high schools in Spain. PMID:26334275

  4. Developing Mixed Reality Educational Applications: The Virtual Touch Toolkit.

    PubMed

    Mateu, Juan; Lasala, María José; Alamán, Xavier

    2015-08-31

    In this paper, we present Virtual Touch, a toolkit that allows the development of educational activities through a mixed reality environment such that, using various tangible elements, the interconnection of a virtual world with the real world is enabled. The main goal of Virtual Touch is to facilitate the installation, configuration and programming of different types of technologies, abstracting the creator of educational applications from the technical details involving the use of tangible interfaces and virtual worlds. Therefore, it is specially designed to enable teachers to themselves create educational activities for their students in a simple way, taking into account that teachers generally lack advanced knowledge in computer programming and electronics. The toolkit has been used to develop various educational applications that have been tested in two secondary education high schools in Spain.

  5. On mean type aggregation.

    PubMed

    Yager, R R

    1996-01-01

    We introduce and define the concept of mean aggregation of a collection of n numbers. We point out that the lack of associativity of this operation compounds the problem of extending the mean of n numbers to n+1 numbers. The closely related concepts of self-identity and the centering property are introduced as one imperative for extending mean aggregation operators. The problem of weighted mean aggregation is studied. A new concept of prioritized mean aggregation is then introduced. We next show that the technique of selecting an element based upon the performance of a random experiment can be considered as a mean aggregation operation.
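The self-identity property mentioned in the abstract can be made concrete: for a mean-type aggregate M, feeding the current aggregate back in as the (n+1)-th argument leaves the result unchanged, i.e. M(a1, ..., an, M(a1, ..., an)) = M(a1, ..., an). A minimal sketch using the arithmetic mean (the example values are invented, not Yager's):

```python
def mean(values):
    """Arithmetic mean, the prototypical mean-type aggregation operator."""
    return sum(values) / len(values)

def extend(values, x):
    """Extend an aggregation of n values to an aggregation of n+1 values."""
    return mean(values + [x])

data = [2.0, 4.0, 9.0]
m = mean(data)
# Self-identity: aggregating the current mean back in changes nothing.
assert extend(data, m) == m
# Lack of associativity: a mean of pairwise means differs from the overall mean,
# which is why extending from n to n+1 arguments is not automatic.
assert mean([mean([2.0, 4.0]), 9.0]) != m
```

The second assertion illustrates the extension problem the abstract raises: unlike an associative operator, a mean cannot be built up by repeatedly combining partial results without extra bookkeeping (e.g. carrying the count n).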

  6. A framework for a teaching toolkit in entrepreneurship education.

    PubMed

    Fellnhofer, Katharina

    2017-01-01

    Despite mounting interest in entrepreneurship education (EE), innovative approaches such as multimedia, web-based toolkits including entrepreneurial storytelling have been largely ignored in the EE discipline. Therefore, this conceptual contribution introduces eight propositions as a fruitful basis for assessing a 'learning-through-real-multimedia-entrepreneurial-narratives' pedagogical approach. These recommendations prepare the grounds for a future, empirical investigation of this currently under-researched topic, which could be essential for multiple domains including academic, business and society.

  7. Object Toolkit Version 4.2 Users Manual

    DTIC Science & Technology

    2014-10-31

    Figure 2. Nascap-2k Model of the MESSENGER Spacecraft, Showing Biased Solar Array Surfaces...The Panel Is 4.4 m in Width and 7 m in Length. It Is Seven Elements Wide and Fifteen Elements Long. The Front Side Is Material Solar Cells and the...While all of these figures show spacecraft, the object generated is not limited to spacecraft. A single instrument can be an Object Toolkit object as

  8. Integrated Architectural Level Power-Performance Modeling Toolkit

    DTIC Science & Technology

    2004-08-20

    laptop) systems. We utilize the MET/Turandot toolkit originally developed at IBM TJ Watson Research Center as the underlying PowerPC...microarchitecture performance simulator [3]. Turandot is flexible enough to model a broad range of microarchitectures and has undergone extensive validation [3...In addition, Turandot has been augmented with power models to explore power-performance tradeoffs in an internal IBM tool called PowerTimer [4

  9. A framework for a teaching toolkit in entrepreneurship education

    PubMed Central

    Fellnhofer, Katharina

    2017-01-01

    Despite mounting interest in entrepreneurship education (EE), innovative approaches such as multimedia, web-based toolkits including entrepreneurial storytelling have been largely ignored in the EE discipline. Therefore, this conceptual contribution introduces eight propositions as a fruitful basis for assessing a ‘learning-through-real-multimedia-entrepreneurial-narratives’ pedagogical approach. These recommendations prepare the grounds for a future, empirical investigation of this currently under-researched topic, which could be essential for multiple domains including academic, business and society. PMID:28680372

  10. Business plans--tips from the toolkit 6.

    PubMed

    Steer, Neville

    2010-07-01

    General practice is a business. Most practices can stay afloat by having appointments, billing patients, managing the administration processes and working long hours. What distinguishes the high performance organisation from the average organisation is a business plan. This article examines how to create a simple business plan that can be applied to the general practice setting and is drawn from material contained in The Royal Australian College of General Practitioners' 'General practice management toolkit'.

  11. Risk of resource failure and toolkit variation in small-scale farmers and herders.

    PubMed

    Collard, Mark; Ruttle, April; Buchanan, Briggs; O'Brien, Michael J

    2012-01-01

    Recent work suggests that global variation in toolkit structure among hunter-gatherers is driven by risk of resource failure such that as risk of resource failure increases, toolkits become more diverse and complex. Here we report a study in which we investigated whether the toolkits of small-scale farmers and herders are influenced by risk of resource failure in the same way. In the study, we applied simple linear and multiple regression analysis to data from 45 small-scale food-producing groups to test the risk hypothesis. Our results were not consistent with the hypothesis; none of the risk variables we examined had a significant impact on toolkit diversity or on toolkit complexity. It appears, therefore, that the drivers of toolkit structure differ between hunter-gatherers and small-scale food-producers.

  12. Guide to Using the WIND Toolkit Validation Code

    SciTech Connect

    Lieberman-Cribbin, W.; Draxl, C.; Clifton, A.

    2014-12-01

    In response to the U.S. Department of Energy's goal of using 20% wind energy by 2030, the Wind Integration National Dataset (WIND) Toolkit was created to provide information on wind speed, wind direction, temperature, surface air pressure, and air density at more than 126,000 locations across the United States from 2007 to 2013. The numerical weather prediction model output, gridded at a 2-km spatial and 5-minute temporal resolution, was further converted to detail the wind power production time series of existing and potential wind facility sites. For users of the dataset it is important that the information presented in the WIND Toolkit is accurate and that errors are known, so that corrective steps can be taken. Therefore, we provide validation code written in R that will be made public to provide users with tools to validate data for their own locations. Validation is based on statistical analyses of wind speed, using error metrics such as bias, root-mean-square error, centered root-mean-square error, mean absolute error, and percent error. Plots of diurnal cycles, annual cycles, wind roses, histograms of wind speed, and quantile-quantile plots are created to visualize how well observational data compare to model data. Ideally, validation will confirm beneficial locations to utilize wind energy and encourage regional wind integration studies using the WIND Toolkit.
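The error metrics the abstract lists have standard definitions that can be written down in a few lines. The sketch below is an illustrative stand-in in plain Python rather than the project's R code, using the textbook formulas (which may differ in detail from the actual WIND Toolkit validation code):

```python
import math

def validation_metrics(model, obs):
    """Standard wind-speed error metrics comparing model output to observations."""
    n = len(obs)
    errors = [m - o for m, o in zip(model, obs)]
    bias = sum(errors) / n                                   # mean error
    rmse = math.sqrt(sum(e * e for e in errors) / n)         # root-mean-square error
    # Centered RMSE removes the mean error (bias) before averaging,
    # so rmse**2 == bias**2 + crmse**2.
    crmse = math.sqrt(sum((e - bias) ** 2 for e in errors) / n)
    mae = sum(abs(e) for e in errors) / n                    # mean absolute error
    pct = 100.0 * bias / (sum(obs) / n)                      # percent error vs. mean obs
    return {"bias": bias, "rmse": rmse, "crmse": crmse, "mae": mae, "pct": pct}

metrics = validation_metrics(model=[5.0, 7.0, 6.0], obs=[4.0, 8.0, 6.0])
```

The decomposition rmse² = bias² + crmse² is why the centered RMSE is reported alongside bias: together they separate systematic offset from scatter.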

  13. MAVEN IDL Toolkit: Integrated Data Access and Visualization

    NASA Astrophysics Data System (ADS)

    Larsen, K. W.; Martin, J.; De Wolfe, A. W.; Brain, D. A.

    2014-12-01

    The Mars Atmosphere and Volatile EvolutioN (MAVEN) mission has arrived at Mars and begun its investigations into the state of the upper atmosphere. Because atmospheric processes are subject to a wide variety of internal and external variables, understanding the overall forces driving the composition, structure, and dynamics requires an integrated analysis from all the available data. Eight instruments on the spacecraft are collecting in-situ and remote sensing data on the fields and particles, neutral and ionized, that make up Mars' upper atmosphere. As the scientific questions MAVEN is designed to answer require an understanding of the data from multiple instruments, the project has designed a software toolkit to facilitate the access, analysis, and visualization of the disparate data. The toolkit provides mission scientists with easy access to the data from within the IDL environment, designed to ensure that users are always working with the most recent data available and to eliminate the usual difficulties of data from a variety of data sources and formats. The Toolkit also includes 1-, 2-, and interactive 3-D visualizations to enable the scientists to examine the inter-relations between data from all instruments, as well as from external models. The data and visualizations have been designed to be as flexible and extensible as possible, allowing the scientists to rapidly and easily examine and manipulate the data in the context of the mission and their wider research programs.

  14. The WMTSA Wavelet Toolkit for Data Analysis in the Geosciences

    NASA Astrophysics Data System (ADS)

    Cornish, C. R.; Percival, D. B.; Bretherton, C. S.

    2003-12-01

    Whereas Fourier analysis and similar spectral techniques are widely used for data analysis in the geosciences, their application is based on the assumption that the analyzed signal is stationary and well-sampled. However, many phenomena of interest in the natural environment are transitory and non-stationary. Furthermore, limited sampling of observations results in datasets that are incomplete and vary in sampling rates and durations. Wavelet decomposition techniques do not require the assumption of signal stationarity. Additionally, wavelet analysis methods can accommodate data series of any length, be used for signal filtering and reconstruction, and allow the localization of spectral signatures in time. We present an overview of the WMTSA toolkit, which is an implementation of the wavelet methods for time series analysis presented by Percival and Walden (2000). The WMTSA toolkit is being developed for multiple programming platforms (including Matlab, R, C) and being made available to the greater scientific community to use in their data analysis applications. We will demonstrate an application and results of using the WMTSA toolkit to the study of turbulence in the atmospheric boundary layer. Reference: D. B. Percival and A. T. Walden (2000), Wavelet Methods for Time Series Analysis. Cambridge, England: Cambridge University Press.
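The decomposition-and-reconstruction property the abstract mentions can be illustrated with the simplest wavelet of all. This is a single-level Haar transform sketch with made-up data, not WMTSA code (WMTSA implements the full Percival and Walden machinery, e.g. the MODWT):

```python
import math

def haar_level(x):
    """One level of the Haar DWT: smooth (approximation) and detail coefficients."""
    s = [(a + b) / math.sqrt(2) for a, b in zip(x[0::2], x[1::2])]
    d = [(a - b) / math.sqrt(2) for a, b in zip(x[0::2], x[1::2])]
    return s, d

def haar_inverse(s, d):
    """Perfect reconstruction of the signal from smooth and detail coefficients."""
    x = []
    for a, b in zip(s, d):
        x.append((a + b) / math.sqrt(2))
        x.append((a - b) / math.sqrt(2))
    return x

x = [4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0]
s, d = haar_level(x)
# Reconstruction is exact up to floating-point rounding.
assert all(abs(u - v) < 1e-12 for u, v in zip(haar_inverse(s, d), x))
```

Zeroing small detail coefficients before reconstruction is the basis of wavelet filtering; localization in time comes from each detail coefficient depending only on a short stretch of the signal.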

  15. On combining computational differentiation and toolkits for parallel scientific computing.

    SciTech Connect

    Bischof, C. H.; Buecker, H. M.; Hovland, P. D.

    2000-06-08

    Automatic differentiation is a powerful technique for evaluating derivatives of functions given in the form of a high-level programming language such as Fortran, C, or C++. The program is treated as a potentially very long sequence of elementary statements to which the chain rule of differential calculus is applied over and over again. Combining automatic differentiation and the organizational structure of toolkits for parallel scientific computing provides a mechanism for evaluating derivatives by exploiting mathematical insight on a higher level. In these toolkits, algorithmic structures such as BLAS-like operations, linear and nonlinear solvers, or integrators for ordinary differential equations can be identified by their standardized interfaces and recognized as high-level mathematical objects rather than as a sequence of elementary statements. In this note, the differentiation of a linear solver with respect to some parameter vector is taken as an example. Mathematical insight is used to reformulate this problem into the solution of multiple linear systems that share the same coefficient matrix but differ in their right-hand sides. The experiments reported here use ADIC, a tool for the automatic differentiation of C programs, and PETSc, an object-oriented toolkit for the parallel solution of scientific problems modeled by partial differential equations.
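The reformulation the note describes is concrete: if A x = b(p) and A does not depend on p, differentiating gives A (dx/dp_i) = db/dp_i for each parameter p_i, i.e. one extra linear system per parameter, all sharing the coefficient matrix A. A toy sketch with a hand-rolled 2x2 solver and an invented b(p) (not ADIC or PETSc):

```python
def solve2(A, b):
    """Direct solve of a 2x2 system A x = b via Cramer's rule."""
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return [(b[0] * A[1][1] - A[0][1] * b[1]) / det,
            (A[0][0] * b[1] - b[0] * A[1][0]) / det]

# Example system A x = b(p) with b(p) = [p1 + 2*p2, 3*p1]; A is parameter-free.
A = [[4.0, 1.0],
     [2.0, 3.0]]
db_dp = [[1.0, 3.0],   # db/dp1
         [2.0, 0.0]]   # db/dp2

p = [1.0, 1.0]
b = [p[0] + 2 * p[1], 3 * p[0]]
x = solve2(A, b)
# The key point: derivatives come from re-solving with the SAME coefficient
# matrix and different right-hand sides, one per parameter.
dx_dp1 = solve2(A, db_dp[0])
dx_dp2 = solve2(A, db_dp[1])
```

In practice (e.g. with a factored matrix) the extra solves reuse the factorization of A, which is exactly the higher-level mathematical insight a statement-by-statement application of the chain rule would miss.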

  16. The PAX Toolkit and Its Applications at Tevatron and LHC

    NASA Astrophysics Data System (ADS)

    Kappler, S.; Erdmann, M.; Felzmann, U.; Hirschbuhl, D.; Kirsch, M.; Quast, G.; Schmidt, A.; Weng, J.

    2006-04-01

    At the CHEP03 conference we launched the Physics Analysis eXpert (PAX), a C++ toolkit released for use in advanced high energy physics (HEP) analyses. This toolkit allows users to define a level of abstraction beyond detector reconstruction by providing a general, persistent container model for HEP events. Physics objects such as particles, vertices and collisions can easily be stored, accessed and manipulated. Bookkeeping of relations between these objects (like decay trees, vertex and collision separation, etc.), including deep copies, is fully provided by the relation management. Event container and associated objects represent a uniform interface for algorithms and facilitate the parallel development and evaluation of different physics interpretations of individual events. So-called analysis factories, which actively identify and distinguish different physics processes and study systematic uncertainties, can easily be realized with the PAX toolkit. PAX is officially released to experiments at the Tevatron and LHC. Being explored by a growing user community, it is applied in a number of complex physics analyses, two of which are presented here. We report the successful application in studies of t-tbar production at the Tevatron and Higgs searches in the t-tbar-Higgs channel at the LHC, and give a short outlook on further developments.

  17. The GBIF integrated publishing toolkit: facilitating the efficient publishing of biodiversity data on the internet.

    PubMed

    Robertson, Tim; Döring, Markus; Guralnick, Robert; Bloom, David; Wieczorek, John; Braak, Kyle; Otegui, Javier; Russell, Laura; Desmet, Peter

    2014-01-01

    The planet is experiencing an ongoing global biodiversity crisis. Measuring the magnitude and rate of change more effectively requires access to organized, easily discoverable, and digitally-formatted biodiversity data, both legacy and new, from across the globe. Assembling this coherent digital representation of biodiversity requires the integration of data that have historically been analog, dispersed, and heterogeneous. The Integrated Publishing Toolkit (IPT) is a software package developed to support biodiversity dataset publication in a common format. The IPT's two primary functions are to 1) encode existing species occurrence datasets and checklists, such as records from natural history collections or observations, in the Darwin Core standard to enhance interoperability of data, and 2) publish and archive data and metadata for broad use in a Darwin Core Archive, a set of files following a standard format. Here we discuss the key need for the IPT, how it has developed in response to community input, and how it continues to evolve to streamline and enhance the interoperability, discoverability, and mobilization of new data types beyond basic Darwin Core records. We close with a discussion of how the IPT has impacted the biodiversity research community and how it enhances data publishing in more traditional journal venues, along with new features implemented in the latest version of the IPT and future plans for more enhancements.
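The Darwin Core Archive format the abstract describes is concrete enough to sketch: a zip file holding a delimited data table plus a meta.xml descriptor that maps each column to a Darwin Core term URI. The sketch below is a minimal illustration of that layout, not IPT code; the records are invented, though the term URIs follow the real Darwin Core namespace:

```python
import csv
import io
import zipfile

# Invented example records; the IPT itself adds metadata (EML), validation,
# and registration with GBIF on top of this basic archive structure.
RECORDS = [
    {"occurrenceID": "occ-1", "scientificName": "Puma concolor", "country": "US"},
    {"occurrenceID": "occ-2", "scientificName": "Lynx rufus", "country": "CA"},
]
FIELDS = ["occurrenceID", "scientificName", "country"]
TERM_BASE = "http://rs.tdwg.org/dwc/terms/"

def build_archive(path):
    """Write a minimal Darwin Core Archive: occurrence.txt + meta.xml in a zip."""
    data = io.StringIO()
    writer = csv.DictWriter(data, fieldnames=FIELDS, delimiter="\t")
    writer.writeheader()
    writer.writerows(RECORDS)
    fields_xml = "\n".join(
        f'    <field index="{i}" term="{TERM_BASE}{name}"/>'
        for i, name in enumerate(FIELDS))
    meta = f"""<archive xmlns="http://rs.tdwg.org/dwc/text/">
  <core encoding="UTF-8" fieldsTerminatedBy="\\t" linesTerminatedBy="\\n"
        ignoreHeaderLines="1" rowType="{TERM_BASE}Occurrence">
    <files><location>occurrence.txt</location></files>
    <id index="0"/>
{fields_xml}
  </core>
</archive>"""
    with zipfile.ZipFile(path, "w") as zf:
        zf.writestr("occurrence.txt", data.getvalue())
        zf.writestr("meta.xml", meta)

build_archive("dwca.zip")
```

Because the column-to-term mapping lives in meta.xml rather than in the data file, any consumer can interpret the table without prior knowledge of the publisher's column naming.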

  18. PAT: a protein analysis toolkit for integrated biocomputing on the web

    PubMed Central

    Gracy, Jérôme; Chiche, Laurent

    2005-01-01

    PAT, for Protein Analysis Toolkit, is an integrated biocomputing server. The main goal of its design was to facilitate the combination of different processing tools for complex protein analyses and to simplify the automation of repetitive tasks. The PAT server provides a standardized web interface to a wide range of protein analysis tools. It is designed as a streamlined analysis environment that implements many features which strongly simplify studies dealing with protein sequences and structures and improve productivity. PAT is able to read and write data in many bioinformatics formats and to create any desired pipeline by seamlessly sending the output of a tool to the input of another tool. PAT can retrieve protein entries from identifier-based queries by using pre-computed database indexes. Users can easily formulate complex queries combining different analysis tools with few mouse clicks, or via a dedicated macro language, and a web session manager provides direct access to any temporary file generated during the user session. PAT is freely accessible on the Internet at . PMID:15980554

  19. FASTAptamer: A Bioinformatic Toolkit for High-throughput Sequence Analysis of Combinatorial Selections

    PubMed Central

    Alam, Khalid K; Chang, Jonathan L; Burke, Donald H

    2015-01-01

    High-throughput sequence (HTS) analysis of combinatorial selection populations accelerates lead discovery and optimization and offers dynamic insight into selection processes. An underlying principle is that selection enriches high-fitness sequences as a fraction of the population, whereas low-fitness sequences are depleted. HTS analysis readily provides the requisite numerical information by tracking the evolutionary trajectory of individual sequences in response to selection pressures. Unlike genomic data, for which a number of software solutions exist, user-friendly tools are not readily available for the combinatorial selections field, leading many users to create custom software. FASTAptamer was designed to address the sequence-level analysis needs of the field. The open source FASTAptamer toolkit counts, normalizes and ranks read counts in a FASTQ file, compares populations for sequence distribution, generates clusters of sequence families, calculates fold-enrichment of sequences throughout the course of a selection and searches for degenerate sequence motifs. While originally designed for aptamer selections, FASTAptamer can be applied to any selection strategy that can utilize next-generation DNA sequencing, such as ribozyme or deoxyribozyme selections, in vivo mutagenesis and various surface display technologies (peptide, antibody fragment, mRNA, etc.). FASTAptamer software, sample data and a user's guide are available for download at http://burkelab.missouri.edu/fastaptamer.html. PMID:25734917
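The count/normalize/rank step the abstract attributes to FASTAptamer can be sketched in a few lines of plain Python. This is an illustrative re-implementation of the idea, not FASTAptamer itself; reads per million is used here as one common normalization, and the output tuple layout is my own:

```python
from collections import Counter

def count_fastq(lines):
    """Count identical reads in FASTQ text (4 lines per record: header, sequence,
    '+', quality); return (sequence, reads, reads_per_million, rank) tuples,
    ranked from most to least abundant."""
    seqs = [lines[i + 1].strip() for i in range(0, len(lines), 4)]
    counts = Counter(seqs)
    total = sum(counts.values())
    ranked = sorted(counts.items(), key=lambda kv: kv[1], reverse=True)
    return [(seq, n, 1e6 * n / total, rank)
            for rank, (seq, n) in enumerate(ranked, start=1)]

# Tiny invented FASTQ input: three reads, two distinct sequences.
fastq = ["@r1", "ACGT", "+", "IIII",
         "@r2", "ACGT", "+", "IIII",
         "@r3", "GGCC", "+", "IIII"]
for seq, reads, rpm, rank in count_fastq(fastq):
    print(rank, seq, reads, round(rpm))
```

Normalizing to reads per million is what makes counts comparable across selection rounds of different sequencing depth, which is the basis for the fold-enrichment calculation the abstract mentions.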

  20. The GBIF Integrated Publishing Toolkit: Facilitating the Efficient Publishing of Biodiversity Data on the Internet

    PubMed Central

    Robertson, Tim; Döring, Markus; Guralnick, Robert; Bloom, David; Wieczorek, John; Braak, Kyle; Otegui, Javier; Russell, Laura; Desmet, Peter

    2014-01-01

    The planet is experiencing an ongoing global biodiversity crisis. Measuring the magnitude and rate of change more effectively requires access to organized, easily discoverable, and digitally-formatted biodiversity data, both legacy and new, from across the globe. Assembling this coherent digital representation of biodiversity requires the integration of data that have historically been analog, dispersed, and heterogeneous. The Integrated Publishing Toolkit (IPT) is a software package developed to support biodiversity dataset publication in a common format. The IPT’s two primary functions are to 1) encode existing species occurrence datasets and checklists, such as records from natural history collections or observations, in the Darwin Core standard to enhance interoperability of data, and 2) publish and archive data and metadata for broad use in a Darwin Core Archive, a set of files following a standard format. Here we discuss the key need for the IPT, how it has developed in response to community input, and how it continues to evolve to streamline and enhance the interoperability, discoverability, and mobilization of new data types beyond basic Darwin Core records. We close with a discussion of how the IPT has impacted the biodiversity research community and how it enhances data publishing in more traditional journal venues, along with new features implemented in the latest version of the IPT and future plans for further enhancements. PMID:25099149

  1. A Critical Review on the Use of Support Values in Tree Viewers and Bioinformatics Toolkits

    PubMed Central

    Huerta-Cepas, Jaime; Stamatakis, Alexandros

    2017-01-01

    Phylogenetic trees are routinely visualized to present and interpret the evolutionary relationships of species. Most empirical evolutionary data studies contain a visualization of the inferred tree with branch support values. Ambiguous semantics in tree file formats can lead to erroneous tree visualizations and therefore to incorrect interpretations of phylogenetic analyses. Here, we discuss problems that arise when displaying branch values on trees after rerooting. Branch values are typically stored as node labels in the widely-used Newick tree format. However, such values are attributes of branches. Storing them as node labels can therefore yield errors when rerooting trees. This depends on the mostly implicit semantics that tools deploy to interpret node labels. We reviewed ten tree viewers and ten bioinformatics toolkits that can display and reroot trees. We found that 14 out of 20 of these tools do not permit users to select the semantics of node labels. Thus, unaware users might obtain incorrect results when rooting trees. We illustrate such incorrect mappings for several test cases and real examples taken from the literature. This review has already led to improvements in eight tools. We suggest tools should provide options that explicitly force users to define the semantics of node labels. PMID:28369572
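
    The ambiguity the review describes is visible on a small example: in Newick format, the same token after a closing parenthesis can be read either as a node name or as a support value for the branch below that node. A minimal sketch, not taken from any of the reviewed tools:

```python
import re

# In "((A,B)95,(C,D)80)root;", the tokens "95" and "80" sit in the
# node-label position but usually mean support for the branch below
# that node; "root" really is a node name. Nothing in the format says which.
newick = "((A,B)95,(C,D)80)root;"

# Semantics 1: treat every token after ')' as a node name.
node_labels = re.findall(r"\)([^,();]+)", newick)

# Semantics 2: treat the numeric tokens as branch support values instead.
supports = [float(x) for x in node_labels if x.replace(".", "").isdigit()]

# A node name stays with its node after rerooting, but a support value must
# stay with its branch; a viewer that silently picks semantics 1 can
# therefore display "95" on the wrong branch of the rerooted tree.
```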

  2. A Critical Review on the Use of Support Values in Tree Viewers and Bioinformatics Toolkits.

    PubMed

    Czech, Lucas; Huerta-Cepas, Jaime; Stamatakis, Alexandros

    2017-03-22

    Phylogenetic trees are routinely visualized to present and interpret the evolutionary relationships of species. Most empirical evolutionary data studies contain a visualization of the inferred tree with branch support values. Ambiguous semantics in tree file formats can lead to erroneous tree visualizations and therefore to incorrect interpretations of phylogenetic analyses. Here, we discuss problems that arise when displaying branch values on trees after rerooting. Branch values are typically stored as node labels in the widely-used Newick tree format. However, such values are attributes of branches. Storing them as node labels can therefore yield errors when rerooting trees. This depends on the mostly implicit semantics that tools deploy to interpret node labels. We reviewed ten tree viewers and ten bioinformatics toolkits that can display and reroot trees. We found that 14 out of 20 of these tools do not permit users to select the semantics of node labels. Thus, unaware users might obtain incorrect results when rooting trees. We illustrate such incorrect mappings for several test cases and real examples taken from the literature. This review has already led to improvements in eight tools. We suggest tools should provide options that explicitly force users to define the semantics of node labels.

  3. molSimplify: A toolkit for automating discovery in inorganic chemistry.

    PubMed

    Ioannidis, Efthymios I; Gani, Terry Z H; Kulik, Heather J

    2016-08-15

    We present an automated, open source toolkit for the first-principles screening and discovery of new inorganic molecules and intermolecular complexes. Challenges remain in the automatic generation of candidate inorganic molecule structures due to the high variability in coordination and bonding, which we overcome through a divide-and-conquer tactic that flexibly combines force-field preoptimization of organic fragments with alignment to first-principles-trained metal-ligand distances. Exploration of chemical space is enabled through random generation of ligands and intermolecular complexes from large chemical databases. We validate the generated structures with the root mean squared (RMS) gradients evaluated from density functional theory (DFT), which are around 0.02 Ha/au across a large 150 molecule test set. Comparison of molSimplify results to full optimization with the universal force field reveals that RMS DFT gradients are improved by 40%. Seamless generation of input files, preparation and execution of electronic structure calculations, and post-processing for each generated structure aids interpretation of underlying chemical and energetic trends. © 2016 Wiley Periodicals, Inc.
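
    The RMS-gradient check used for validation is a simple aggregate over all Cartesian gradient components of a structure. A minimal sketch with hypothetical per-atom values; the numbers below are illustrative, not taken from the paper's test set.

```python
import math

def rms_gradient(gradients):
    # gradients: list of (gx, gy, gz) per atom, in Ha/au.
    comps = [g for atom in gradients for g in atom]
    return math.sqrt(sum(g * g for g in comps) / len(comps))

# Hypothetical DFT forces for a 3-atom fragment.
grad = [(0.01, 0.0, 0.02), (-0.02, 0.01, 0.0), (0.0, -0.01, -0.01)]
rms = rms_gradient(grad)  # ~0.0115 Ha/au, of the order the paper reports
```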

  4. Matlab based Toolkits used to Interface with Optical Design Software for NASA's James Webb Space Telescope

    NASA Technical Reports Server (NTRS)

    Howard, Joseph

    2007-01-01

    The viewgraph presentation provides an introduction to the James Webb Space Telescope (JWST). The first part provides a brief overview of the Matlab toolkits for the CodeV, OSLO, and Zemax optical design programs, examining purpose, layout, how Matlab gets data from CodeV, function layout, and using cvHELP. The second part provides examples of use with JWST, including wavefront sensitivities and alignment simulations.

  5. Census of Population and Housing, 1980: Summary Tape File 1F, School Districts. Technical Documentation.

    ERIC Educational Resources Information Center

    Bureau of the Census (DOC), Washington, DC. Data User Services Div.

    This report provides technical documentation associated with a 1980 Census of Population and Housing Summary Tape File 1F--the School Districts File. The file contains complete-count data of population and housing aggregated by school district. Population items tabulated include age, race (provisional data), sex, marital status, Spanish origin…

  6. Cross-File Searching: How Vendors Help--And Don't Help--Improve Compatibility.

    ERIC Educational Resources Information Center

    Milstead, Jessica L.

    1999-01-01

    Reports how a cross-section of database producers, search services, and aggregators are using vocabulary management to facilitate cross-file searching. Discusses the range of subject areas and audiences; indexing; vocabulary control within databases; machine aids to indexing; and aids to cross-file searching. A chart contains sources of files and…

  7. 26 CFR 1.1502-75 - Filing of consolidated returns.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... liability of the group for such year relative to what the aggregate tax liability would be if the members of... consolidated tax liability for 1967, relative to what the aggregate tax liability would be if the members of the group filed separate returns for 1967, the difference between the tax liability of the...

  8. 26 CFR 1.1502-75 - Filing of consolidated returns.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... liability of the group for such year relative to what the aggregate tax liability would be if the members of... consolidated tax liability for 1967, relative to what the aggregate tax liability would be if the members of the group filed separate returns for 1967, the difference between the tax liability of the...

  9. 42 CFR 418.309 - Hospice aggregate cap.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... (CONTINUED) MEDICARE PROGRAM HOSPICE CARE Payment for Hospice Care § 418.309 Hospice aggregate cap. A hospice...— (1) In the case in which a beneficiary received care from only one hospice, the hospice includes in... included in the calculation of any hospice cap, and who have filed an election to receive hospice......

  10. 42 CFR 418.309 - Hospice aggregate cap.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... (CONTINUED) MEDICARE PROGRAM (CONTINUED) HOSPICE CARE Payment for Hospice Care § 418.309 Hospice aggregate... calculation— (1) In the case in which a beneficiary received care from only one hospice, the hospice includes... included in the calculation of any hospice cap, and who have filed an election to receive hospice......

  11. 42 CFR 418.309 - Hospice aggregate cap.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... (CONTINUED) MEDICARE PROGRAM (CONTINUED) HOSPICE CARE Payment for Hospice Care § 418.309 Hospice aggregate... calculation— (1) In the case in which a beneficiary received care from only one hospice, the hospice includes... included in the calculation of any hospice cap, and who have filed an election to receive hospice......

  12. Using the PhenX Toolkit to Add Standard Measures to a Study.

    PubMed

    Hendershot, Tabitha; Pan, Huaqin; Haines, Jonathan; Harlan, William R; Marazita, Mary L; McCarty, Catherine A; Ramos, Erin M; Hamilton, Carol M

    2015-07-01

    The PhenX (consensus measures for Phenotypes and eXposures) Toolkit (https://www.phenxtoolkit.org/) offers high-quality, well-established measures of phenotypes and exposures for use by the scientific community. The goal is to promote the use of standard measures, enhance data interoperability, and help investigators identify opportunities for collaborative and translational research. The Toolkit contains 395 measures drawn from 22 research domains (fields of research), along with additional collections of measures for Substance Abuse and Addiction (SAA) research, Mental Health Research (MHR), and Tobacco Regulatory Research (TRR). Additional measures for TRR that are expected to be released in 2015 include Obesity, Eating Disorders, and Sickle Cell Disease. Measures are selected by working groups of domain experts using a consensus process that includes input from the scientific community. The Toolkit provides a description of each PhenX measure, the rationale for including it in the Toolkit, protocol(s) for collecting the measure, and supporting documentation. Users can browse measures in the Toolkit or can search the Toolkit using the Smart Query Tool or a full text search. PhenX Toolkit users select measures of interest to add to their Toolkit. Registered Toolkit users can save their Toolkit and return to it later to revise or complete. They then have options to download a customized Data Collection Worksheet that specifies the data to be collected, and a Data Dictionary that describes each variable included in the Data Collection Worksheet. The Toolkit also has a Register Your Study feature that facilitates cross-study collaboration by allowing users to find other investigators using the same PhenX measures. Copyright © 2015 John Wiley & Sons, Inc.

  13. Adoption of Test Driven Development and Continuous Integration for the Development of the Trick Simulation Toolkit

    NASA Technical Reports Server (NTRS)

    Penn, John M.

    2013-01-01

    This paper describes the adoption of a Test Driven Development approach and a Continuous Integration System in the development of the Trick Simulation Toolkit, a generic simulation development environment for creating high fidelity training and engineering simulations at the NASA/Johnson Space Center and many other NASA facilities. It describes what was learned and the significant benefits seen, such as fast, thorough, and clear test feedback every time code is checked in to the code repository. It also describes a system that encourages development of code that is much more flexible, maintainable, and reliable. The Trick Simulation Toolkit development environment provides a common architecture for user-defined simulations. Trick builds executable simulations using user-supplied simulation-definition files (S_define) and user-supplied "model code". For each Trick-based simulation, Trick automatically provides job scheduling, checkpoint / restore, data-recording, interactive variable manipulation (variable server), and an input-processor. Also included are tools for plotting recorded data and various other supporting tools and libraries. Trick is written in C/C++ and Java and supports both Linux and MacOSX. Prior to adopting this new development approach, Trick testing consisted primarily of running a few large simulations, with the hope that their complexity and scale would exercise most of Trick's code and expose any recently introduced bugs. Unsurprisingly, this approach yielded inconsistent results. It was obvious that a more systematic, thorough approach was required. After seeing examples of some Java-based projects that used the JUnit test framework, similar test frameworks for C and C++ were sought. Several were found, all clearly inspired by JUnit. Googletest, a freely available open-source testing framework, was selected as the most appropriate and capable. 
The new approach was implemented while rewriting the Trick memory management component, to eliminate a
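
    The paper's approach pairs a unit-test framework (Googletest, for Trick's C++ code) with CI runs on every check-in. The same red-green pattern is sketched below in Python's unittest for brevity; MemoryManager and its methods are hypothetical stand-ins, not Trick's real interface.

```python
import unittest

# Hypothetical stand-in for a component like Trick's memory manager;
# not Trick's actual API.
class MemoryManager:
    def __init__(self):
        self._allocs = {}

    def declare(self, name, size):
        # Register a named allocation of the given size in bytes.
        self._allocs[name] = bytearray(size)
        return self._allocs[name]

    def size_of(self, name):
        return len(self._allocs[name])

class MemoryManagerTest(unittest.TestCase):
    # Under CI, tests like this run automatically on every check-in,
    # giving the fast, clear feedback the paper describes.
    def test_declare_records_size(self):
        mm = MemoryManager()
        mm.declare("dyn_array", 16)
        self.assertEqual(mm.size_of("dyn_array"), 16)

# A CI job would run something like: python -m unittest this_module
```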

  14. 12 CFR 723.17 - Are there any exceptions to the aggregate loan limit?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... comprise the largest portion of the credit union's loan portfolio (as evidenced in any call report filed... 12 Banks and Banking 6 2011-01-01 2011-01-01 false Are there any exceptions to the aggregate loan... AFFECTING CREDIT UNIONS MEMBER BUSINESS LOANS § 723.17 Are there any exceptions to the aggregate loan...

  15. 75 FR 11953 - Self-Regulatory Organizations; Notice of Filing and Immediate Effectiveness of Proposed Rule...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-12

    ... filing, the Exchange would amend its Fee Schedule, effective March 1, 2010, to permit the aggregation of... certain fees. A Participant must request the aggregation of affiliate activity by submitting an... implement the aggregation policy effective March 1, 2010. 2. Statutory Basis The Exchange believes that the...

  16. The advanced computational testing and simulation toolkit (ACTS)

    SciTech Connect

    Drummond, L.A.; Marques, O.

    2002-05-21

    During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Distinctively, a number of these are important scientific problems ranging in scale from the atomic to the cosmic. For example, ionization is a phenomenon as ubiquitous in modern society as the glow of fluorescent lights and the etching on silicon computer chips; but it was not until 1999 that researchers finally achieved a complete numerical solution to the simplest example of ionization, the collision of a hydrogen atom with an electron. On the opposite scale, cosmologists have long wondered whether the expansion of the Universe, which began with the Big Bang, would ever reverse itself, ending the Universe in a Big Crunch. In 2000, analysis of new measurements of the cosmic microwave background radiation showed that the geometry of the Universe is flat, and thus the Universe will continue expanding forever. Both of these discoveries depended on high performance computer simulations that utilized computational tools included in the Advanced Computational Testing and Simulation (ACTS) Toolkit. The ACTS Toolkit is an umbrella project that brought together a number of general purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools, which have been developed independently, mainly at DOE laboratories, make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. 
The ACTS Toolkit Project enables the use of these tools by a much wider community of computational scientists, and promotes code portability, reusability, reduction of duplicate efforts

  17. A toolkit for determining historical eco-hydrological interactions

    NASA Astrophysics Data System (ADS)

    Singer, M. B.; Sargeant, C. I.; Evans, C. M.; Vallet-Coulomb, C.

    2016-12-01

    Contemporary climate change is predicted to result in perturbations to hydroclimatic regimes across the globe, with some regions forecast to become warmer and drier. Given that water is a primary determinant of vegetative health and productivity, we can expect shifts in the availability of this critical resource to have significant impacts on forested ecosystems. The subject is particularly complex in environments where multiple sources of water are potentially available to vegetation and which may also exhibit spatial and temporal variability. To anticipate how subsurface hydrological partitioning may evolve in the future and impact overlying vegetation, we require well constrained, historical data and a modelling framework for assessing the dynamics of subsurface hydrology. We outline a toolkit to retrospectively investigate dynamic water use by trees. We describe a synergistic approach, which combines isotope dendrochronology of tree ring cellulose with a biomechanical model, detailed climatic and isotopic data in endmember waters to assess the mean isotopic composition of source water used in annual tree rings. We identify the data requirements and suggest three versions of the toolkit based on data availability. We present sensitivity analyses in order to identify the key variables required to constrain model predictions and then develop empirical relationships for constraining these parameters based on climate records. We demonstrate our methodology within a Mediterranean riparian forest site and show how it can be used along with subsurface hydrological modelling to validate source water determinations, which are fundamental to understanding climatic fluctuations and trends in subsurface hydrology. We suggest that the utility of our toolkit is applicable in riparian zones and in a range of forest environments where distinct isotopic endmembers are present.
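
    At its core, attributing tree-ring source water to isotopically distinct endmembers is a linear mass balance. A minimal two-endmember sketch; all delta values below are illustrative, not taken from the study.

```python
# Two-endmember isotope mixing: the fraction f of endmember A in a sample
# follows from linear mass balance,
#   delta_sample = f * delta_a + (1 - f) * delta_b
def source_fraction(delta_sample, delta_a, delta_b):
    return (delta_sample - delta_b) / (delta_a - delta_b)

# Hypothetical d18O values (per mil) for two subsurface water sources.
groundwater = -8.0   # endmember A
streamwater = -5.0   # endmember B
cellulose_source = -6.5  # inferred source-water composition for one ring

f = source_fraction(cellulose_source, groundwater, streamwater)  # 0.5
```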

  18. Developing Climate Resilience Toolkit Decision Support Training Section

    NASA Astrophysics Data System (ADS)

    Livezey, M. M.; Herring, D.; Keck, J.; Meyers, J. C.

    2014-12-01

    The Climate Resilience Toolkit (CRT) is a Federal government effort to address the U.S. President's Climate Action Plan and Executive Order for Climate Preparedness. The toolkit will provide access to tools and products useful for climate-sensitive decision making. To optimize the user experience, the toolkit will also provide access to training materials. The National Oceanic and Atmospheric Administration (NOAA) has been building a climate training capability for 15 years. The target audience for the training has historically been mainly NOAA staff with some modified training programs for external users and stakeholders. NOAA is now using this climate training capacity for the CRT. To organize the CRT training section, we collaborated with the Association of Climate Change Officers to determine the best strategy and identified four additional complementary skills needed for successful decision making: climate literacy, environmental literacy, risk assessment and management, and strategic execution and monitoring. Developing the climate literacy skills requires knowledge of climate variability and change, as well as an introduction to the suite of available products and services. For the development of an environmental literacy category, specific topics needed include knowledge of climate impacts on specific environmental systems. Climate risk assessment and management introduces a process for decision making and provides knowledge on communication of climate information and integration of climate information in planning processes. The strategic execution and monitoring category provides information on use of NOAA climate products, services, and partnership opportunities for decision making. In order to use the existing training modules, it was necessary to assess their level of complexity, catalog them, and develop guidance for users on a curriculum to take advantage of the training resources to enhance their learning experience. 
With the development of this CRT

  19. Building Emergency Contraception Awareness among Adolescents. A Toolkit for Schools and Community-Based Organizations.

    ERIC Educational Resources Information Center

    Simkin, Linda; Radosh, Alice; Nelsesteun, Kari; Silverstein, Stacy

    This toolkit presents emergency contraception (EC) as a method to help adolescent women avoid pregnancy and abortion after unprotected sexual intercourse. The sections of this toolkit are designed to help increase your knowledge of EC and stay up to date. They provide suggestions for increasing EC awareness in the workplace, whether it is a school…

  20. The Customer Flow Toolkit: A Framework for Designing High Quality Customer Services.

    ERIC Educational Resources Information Center

    New York Association of Training and Employment Professionals, Albany.

    This document presents a toolkit to assist staff involved in the design and development of New York's one-stop system. Section 1 describes the preplanning issues to be addressed and the intended outcomes that serve as the framework for creation of the customer flow toolkit. Section 2 outlines the following strategies to assist in designing local…

  1. Toolkit of Available EPA Green Infrastructure Modeling Software. National Stormwater Calculator

    EPA Science Inventory

    This webinar will present a toolkit consisting of five EPA green infrastructure models and tools, along with communication material. This toolkit can be used as a teaching and quick reference resource for use by planners and developers when making green infrastructure implementat...

  2. A Data Audit and Analysis Toolkit To Support Assessment of the First College Year.

    ERIC Educational Resources Information Center

    Paulson, Karen

    This "toolkit" provides a process by which institutions can identify and use information resources to enhance the experiences and outcomes of first-year students. The toolkit contains a "Technical Manual" designed for use by the technical personnel who will be conducting the data audit and associated analyses. Administrators who want more…

  3. Toolkit for Professional Developers: Training Targets 3-6 Grade Teachers

    ERIC Educational Resources Information Center

    McMunn, Nancy; Dunnivant, Michael; Williamson, Jan; Reagan, Hope

    2004-01-01

    The professional development CAR Toolkit is focused on the assessment of reading process at the text level, rather than at the word level. Most students in grades 3-6 generally need support in comprehending text, not just decoding words. While the assessment of reading methods in the CAR Toolkit will help teachers pinpoint difficulties at the word…

  4. Language Access Toolkit: An Organizing and Advocacy Resource for Community-Based Youth Programs

    ERIC Educational Resources Information Center

    Beyersdorf, Mark Ro

    2013-01-01

    Asian American Legal Defense and Education Fund (AALDEF) developed this language access toolkit to share the expertise and experiences of National Asian American Education Advocates Network (NAAEA) member organizations with other community organizations interested in developing language access campaigns. This toolkit includes an overview of…

  5. Spread the word, not the germs: a toolkit for faith communities.

    PubMed

    Reilly, Janet Resop; Hovarter, Rebecca; Mrochek, Tracy; Mittelstadt-Lock, Kay; Schmitz, Sue; Nett, Sue; Turner, Mary Jo; Moore, Ellen; Howden, Mary; Laabs, Cheryl; Behm, Linda

    2011-01-01

    A volunteer workgroup of public health personnel and parish nurses in Wisconsin collaborated to develop the Infection Control and Emergency Preparedness Toolkit for the Faith Community to help prepare congregations for health emergencies and prevent the spread of disease. This article reports a pilot study of the toolkit with 30 parishes/churches, focusing on the infection control portion of the materials.

  6. Toolkit for Evaluating Alignment of Instructional and Assessment Materials to the Common Core State Standards

    ERIC Educational Resources Information Center

    Achieve, Inc., 2014

    2014-01-01

    In joint partnership, Achieve, The Council of Chief State School Officers, and Student Achievement Partners have developed a Toolkit for Evaluating the Alignment of Instructional and Assessment Materials to the Common Core State Standards (CCSS). The Toolkit is a set of interrelated, freely available instruments for evaluating alignment to the…

  7. Model Analyst’s Toolkit User Guide, Version 7.1.0

    DTIC Science & Technology

    2015-08-01

    Tutorial: What Causes Increased Crime? ...you choose. For example, you might theorize that increased poverty leads to increased crime. MAT lets you combine poverty and crime data to validate... computer to uninstall it. In this tutorial, you will

  8. Toolkit for Evaluating Alignment of Instructional and Assessment Materials to the Common Core State Standards

    ERIC Educational Resources Information Center

    Achieve, Inc., 2014

    2014-01-01

    In joint partnership, Achieve, The Council of Chief State School Officers, and Student Achievement Partners have developed a Toolkit for Evaluating the Alignment of Instructional and Assessment Materials to the Common Core State Standards. The Toolkit is a set of interrelated, freely available instruments for evaluating alignment to the CCSS; each…

  9. Growing and Sustaining Parent Engagement: A Toolkit for Parents and Community Partners

    ERIC Educational Resources Information Center

    Center for the Study of Social Policy, 2010

    2010-01-01

    The Toolkit is a quick and easy guide to help support and sustain parent engagement. It provides how to's for implementing three powerful strategies communities can use to maintain and grow parent engagement work that is already underway: Creating a Parent Engagement 1) Roadmap, 2) Checklist and 3) Support Network. This toolkit includes…

  10. The Development of a Curriculum Toolkit with American Indian and Alaska Native Communities

    ERIC Educational Resources Information Center

    Thompson, Nicole L.; Hare, Dwight; Sempier, Tracie T.; Grace, Cathy

    2008-01-01

    This article explains the creation of the "Growing and Learning with Young Native Children" curriculum toolkit. The curriculum toolkit was designed to give American Indian and Alaska Native early childhood educators who work in a variety of settings the framework for developing a research-based, developmentally appropriate, tribally…

  11. Toolkit for a Workshop on Building a Culture of Data Use. REL 2015-063

    ERIC Educational Resources Information Center

    Gerzon, Nancy; Guckenburg, Sarah

    2015-01-01

    The Culture of Data Use Workshop Toolkit helps school and district teams apply research to practice as they establish and support a culture of data use in their educational setting. The field-tested workshop toolkit guides teams through a set of structured activities to develop an understanding of data-use research in schools and to analyze…

  12. Capacity Building Indicators & Dissemination Strategies: Designing and Delivering Intensive Interventions--A Teacher's Toolkit

    ERIC Educational Resources Information Center

    Center on Instruction, 2012

    2012-01-01

    This toolkit provides activities and resources to assist practitioners in designing and delivering intensive interventions in reading and mathematics for K-12 students with significant learning difficulties and disabilities. Grounded in research, this toolkit is based on the Center on Instruction's "Intensive Interventions for Students Struggling…

  14. Charon Toolkit for Parallel, Implicit Structured-Grid Computations: Functional Design

    NASA Technical Reports Server (NTRS)

    VanderWijngaart, Rob F.; Kutler, Paul (Technical Monitor)

    1997-01-01

    Charon is a software toolkit that enables engineers to develop high-performing message-passing programs in a convenient and piecemeal fashion. Emphasis is on rapid program development and prototyping. In this report a detailed description of the functional design of the toolkit is presented. It is illustrated by the stepwise parallelization of two representative code examples.

  15. Practitioner Data Use in Schools: Workshop Toolkit. REL 2015-043

    ERIC Educational Resources Information Center

    Bocala, Candice; Henry, Susan F.; Mundry, Susan; Morgan, Claire

    2014-01-01

    The "Practitioner Data Use in Schools: Workshop Toolkit" is designed to help practitioners systematically and accurately use data to inform their teaching practice. The toolkit includes an agenda, slide deck, participant workbook, and facilitator's guide and covers the following topics: developing data literacy, engaging in a cycle of…

  16. 78 FR 14774 - U.S. Environmental Solutions Toolkit-Universal Waste

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-07

    ... technologies that will outline U.S. approaches to a series of environmental problems and highlight... Toolkit will refer users in foreign markets to U.S. approaches to solving environmental problems and to U... International Trade Administration U.S. Environmental Solutions Toolkit--Universal Waste AGENCY: International...

  17. 78 FR 14773 - U.S. Environmental Solutions Toolkit-Landfill Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-07

    ... environmental technologies that will outline U.S. approaches to a series of environmental problems and highlight... Toolkit will refer users in foreign markets to U.S. approaches to solving environmental problems and to U... International Trade Administration U.S. Environmental Solutions Toolkit--Landfill Standards AGENCY...

  18. 78 FR 14773 - U.S. Environmental Solutions Toolkit-Medical Waste

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-07

    ... technologies that will outline U.S. approaches to a series of environmental problems and highlight... Toolkit will refer users in foreign markets to U.S. approaches to solving environmental problems and to U... International Trade Administration U.S. Environmental Solutions Toolkit--Medical Waste AGENCY: International...

  19. The RAVE/VERTIGO vertex reconstruction toolkit and framework

    NASA Astrophysics Data System (ADS)

    Waltenberger, W.; Mitaroff, W.; Moser, F.; Pflugfelder, B.; Riedel, H. V.

    2008-07-01

A detector-independent toolkit for vertex reconstruction (RAVE) is being developed, along with a standalone framework (VERTIGO) for testing, analyzing and debugging. The core algorithms represent state-of-the-art for geometric vertex finding and fitting by both linear (Kalman filter) and robust estimation methods. Main design goals are ease of use, flexibility for embedding into existing software frameworks, extensibility, and openness. The implementation is based on modern object-oriented techniques, is coded in C++ with interfaces for Java and Python, and follows an open-source approach. A beta release is available.

  20. A unified toolkit for information and scientific visualization

    NASA Astrophysics Data System (ADS)

    Wylie, Brian; Baumes, Jeffrey

    2009-01-01

    We present an expansion of the popular open source Visualization Toolkit (VTK) to support the ingestion, processing, and display of informatics data. The result is a flexible, component-based pipeline framework for the integration and deployment of algorithms in the scientific and informatics fields. This project, code named "Titan", is one of the first efforts to address the unification of information and scientific visualization in a systematic fashion. The result includes a wide range of informatics-oriented functionality: database access, graph algorithms, graph layouts, views, charts, UI components and more. Further, the data distribution, parallel processing and client/server capabilities of VTK provide an excellent platform for scalable analysis.

  1. Benchmarking the Collocation Stand-Alone Library and Toolkit (CSALT)

    NASA Technical Reports Server (NTRS)

    Hughes, Steven; Knittel, Jeremy; Shoan, Wendy; Kim, Youngkwang; Conway, Claire; Conway, Darrel J.

    2017-01-01

    This paper describes the processes and results of Verification and Validation (V&V) efforts for the Collocation Stand Alone Library and Toolkit (CSALT). We describe the test program and environments, the tools used for independent test data, and comparison results. The V&V effort employs classical problems with known analytic solutions, solutions from other available software tools, and comparisons to benchmarking data available in the public literature. Presenting all test results is beyond the scope of a single paper. Here we present high-level test results for a broad range of problems, and detailed comparisons for selected problems.

  3. The Wind Integration National Dataset (WIND) toolkit (Presentation)

    SciTech Connect

    Draxl, Caroline (NREL)

    2014-01-01

    Regional wind integration studies require detailed wind power output data at many locations to perform simulations of how the power system will operate under high-penetration scenarios. The wind datasets that serve as inputs into the study must realistically reflect the ramping characteristics, spatial and temporal correlations, and capacity factors of the simulated wind plants, and must be time synchronized with available load profiles. As described in this presentation, the WIND Toolkit fulfills these requirements by providing a state-of-the-art national (US) wind resource, power production, and forecast dataset.

  4. A Toolkit to Enable Hydrocarbon Conversion in Aqueous Environments

    PubMed Central

    Brinkman, Eva K.; Schipper, Kira; Bongaerts, Nadine; Voges, Mathias J.; Abate, Alessandro; Wahl, S. Aljoscha

    2012-01-01

    This work puts forward a toolkit that enables the conversion of alkanes by Escherichia coli and presents a proof of principle of its applicability. The toolkit consists of multiple standard interchangeable parts (BioBricks) addressing the conversion of alkanes, regulation of gene expression and survival in toxic hydrocarbon-rich environments. A three-step pathway for alkane degradation was implemented in E. coli to enable the conversion of medium- and long-chain alkanes to their respective alkanols, alkanals and ultimately alkanoic acids. The latter were metabolized via the native β-oxidation pathway. To facilitate the oxidation of medium-chain alkanes (C5-C13) and cycloalkanes (C5-C8), four genes (alkB2, rubA3, rubA4 and rubB) of the alkane hydroxylase system from Gordonia sp. TF6 were transformed into E. coli. For the conversion of long-chain alkanes (C15-C36), the ladA gene from Geobacillus thermodenitrificans was implemented. For the required further steps of the degradation process, ADH and ALDH (originating from G. thermodenitrificans) were introduced. The activity was measured by resting cell assays. For each oxidative step, enzyme activity was observed. To optimize process efficiency, expression was induced only under low-glucose conditions: a substrate-regulated promoter, pCaiF, was used. pCaiF is present in E. coli K12 and regulates the expression of the genes involved in the degradation of non-glucose carbon sources. The last part of the toolkit - targeting survival - was implemented using solvent tolerance genes, PhPFDα and β, both from Pyrococcus horikoshii OT3. Organic solvents can induce cell stress and decreased survivability by negatively affecting protein folding. As chaperones, PhPFDα and β improve the protein folding process, e.g. in the presence of alkanes. The expression of these genes led to improved hydrocarbon tolerance, shown by an increased growth rate (up to 50%) in the presence of 10% n-hexane in the culture

  5. A flexible open-source toolkit for lava flow simulations

    NASA Astrophysics Data System (ADS)

    Mossoux, Sophie; Feltz, Adelin; Poppe, Sam; Canters, Frank; Kervyn, Matthieu

    2014-05-01

    Lava flow hazard modeling is a useful tool for scientists and stakeholders confronted with imminent or long term hazard from basaltic volcanoes. It can improve their understanding of the spatial distribution of volcanic hazard, influence their land use decisions and improve the city evacuation during a volcanic crisis. Although a range of empirical, stochastic and physically-based lava flow models exists, these models are rarely available or require a large amount of physical constraints. We present a GIS toolkit which models lava flow propagation from one or multiple eruptive vents, defined interactively on a Digital Elevation Model (DEM). It combines existing probabilistic (VORIS) and deterministic (FLOWGO) models in order to improve the simulation of lava flow spatial spread and terminal length. Not only is this toolkit open-source, running in Python, which allows users to adapt the code to their needs, but it also allows users to combine the models included in different ways. The lava flow paths are determined based on the probabilistic steepest slope (VORIS model - Felpeto et al., 2001) which can be constrained in order to favour concentrated or dispersed flow fields. Moreover, the toolkit allows including a corrective factor in order for the lava to overcome small topographical obstacles or pits. The lava flow terminal length can be constrained using a fixed length value, a Gaussian probability density function or can be calculated based on the thermo-rheological properties of the open-channel lava flow (FLOWGO model - Harris and Rowland, 2001). These slope-constrained properties allow estimating the velocity of the flow and its heat losses. The lava flow stops when its velocity is zero or the lava temperature reaches the solidus. Recent lava flows of Karthala volcano (Comoros islands) are here used to demonstrate the quality of lava flow simulations with the toolkit, using a quantitative assessment of the match of the simulation with the real lava flows. 
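The toolkit's source is not reproduced in the abstract; as a hedged illustration of the probabilistic steepest-slope rule it borrows from the VORIS model, the toy sketch below (all names invented) traces one lava path on a tiny DEM, choosing each downhill neighbour with probability proportional to the height drop and stopping in a local pit.

```python
import random

def downhill_path(dem, start, rng=random.Random(42)):
    """Trace one stochastic lava path on a grid DEM: at each cell pick a
    downhill neighbour with probability proportional to the height drop
    (a sketch of the probabilistic steepest-slope rule, VORIS-style)."""
    rows, cols = len(dem), len(dem[0])
    path = [start]
    r, c = start
    while True:
        nbrs = [(r + dr, c + dc) for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                if (dr or dc) and 0 <= r + dr < rows and 0 <= c + dc < cols]
        drops = [(dem[r][c] - dem[nr][nc], (nr, nc)) for nr, nc in nbrs]
        downhill = [(d, p) for d, p in drops if d > 0]
        if not downhill:               # local pit or flat: the flow stops
            return path
        weights = [d for d, _ in downhill]
        r, c = rng.choices([p for _, p in downhill], weights=weights)[0]
        path.append((r, c))

dem = [[9, 8, 7],
       [8, 6, 5],
       [7, 5, 3]]                      # heights fall toward cell (2, 2)
path = downhill_path(dem, (0, 0))
```

Whatever random choices are made, every step is strictly downhill, so on this DEM the path always terminates in the pit at (2, 2); the real toolkit adds corrective factors, length limits, and thermo-rheological stopping criteria on top of this core idea.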

  6. GENFIT — a Generic Track-Fitting Toolkit

    NASA Astrophysics Data System (ADS)

    Rauch, Johannes; Schlüter, Tobias

    2015-05-01

    GENFIT is an experiment-independent track-fitting toolkit that combines fitting algorithms, track representations, and measurement geometries into a modular framework. We report on a significantly improved version of GENFIT, based on experience gained in the Belle II, PANDA, and FOPI experiments. Improvements concern the implementation of additional track-fitting algorithms, enhanced implementations of Kalman fitters, enhanced visualization capabilities, and additional implementations of measurement types suited for various kinds of tracking detectors. The data model has been revised, allowing for efficient track merging, smoothing, residual calculation, alignment, and storage.
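GENFIT's fitters operate on full multidimensional track representations; the scalar sketch below (not GENFIT code) shows only the core Kalman update such fitters are built on, here estimating a constant state from noisy measurements.

```python
def kalman_constant(measurements, meas_var, init_est=0.0, init_var=1e6):
    """Scalar Kalman filter: sequentially fold noisy measurements of a
    constant state into an estimate with shrinking variance."""
    x, p = init_est, init_var
    for z in measurements:
        k = p / (p + meas_var)        # Kalman gain
        x = x + k * (z - x)           # state update with the residual
        p = (1.0 - k) * p             # variance update
    return x, p

# five noisy measurements of a true value near 5.0
est, var = kalman_constant([5.1, 4.9, 5.2, 4.8, 5.0], meas_var=0.04)
```

With a diffuse prior (large `init_var`), the estimate converges to the mean of the measurements and the posterior variance shrinks with each update; track fitting generalizes this recursion to state vectors (positions, slopes, curvature) propagated between detector measurement planes.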

  7. TECA: A Parallel Toolkit for Extreme Climate Analysis

    SciTech Connect

    Prabhat, Mr; Ruebel, Oliver; Byna, Surendra; Wu, Kesheng; Li, Fuyu; Wehner, Michael; Bethel, E. Wes

    2012-03-12

    We present TECA, a parallel toolkit for detecting extreme events in large climate datasets. Modern climate datasets expose parallelism across a number of dimensions: spatial locations, timesteps and ensemble members. We design TECA to exploit these modes of parallelism and demonstrate a prototype implementation for detecting and tracking three classes of extreme events: tropical cyclones, extra-tropical cyclones and atmospheric rivers. We process a modern TB-sized CAM5 simulation dataset with TECA, and demonstrate good runtime performance for the three case studies.
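TECA's detectors are far more elaborate than a threshold test, but the parallelism-across-timesteps idea can be sketched with standard-library Python (all names hypothetical): each timestep's grid is scanned independently by a worker pool.

```python
from concurrent.futures import ThreadPoolExecutor

def detect_extremes(grid, threshold):
    """Return (row, col) cells exceeding the threshold in one timestep."""
    return [(r, c) for r, row in enumerate(grid)
            for c, v in enumerate(row) if v > threshold]

def scan_timesteps(timesteps, threshold, workers=4):
    """Exploit parallelism across the timestep dimension, as TECA does."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(lambda g: detect_extremes(g, threshold),
                             timesteps))

# toy "climate field": one 2-D grid per timestep
steps = [[[0.0, 1.5], [0.2, 0.1]],   # timestep 0: one exceedance
         [[2.0, 0.3], [0.0, 3.1]]]   # timestep 1: two exceedances
hits = scan_timesteps(steps, threshold=1.0)
```

Because timesteps are independent, the same pattern scales from a thread pool on one node to MPI ranks across a cluster, which is what makes TB-sized datasets tractable.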

  8. Aggregations in Flatworms.

    ERIC Educational Resources Information Center

    Liffen, C. L.; Hunter, M.

    1980-01-01

    Described is a school project to investigate aggregations in flatworms which may be influenced by light intensity, temperature, and some form of chemical stimulus released by already aggregating flatworms. Such investigations could be adapted to suit many educational levels of science laboratory activities. (DS)

  10. Evaluation of an Extension-Delivered Resource for Accelerating Progress in Childhood Obesity Prevention: The BEPA-Toolkit

    ERIC Educational Resources Information Center

    Gunter, Katherine B.; Abi Nader, Patrick; Armington, Amanda; Hicks, John C.; John, Deborah

    2017-01-01

    The Balanced Energy Physical Activity Toolkit, or the BEPA-Toolkit, supports physical activity (PA) programming via Extension in elementary schools. In a pilot study, we evaluated the effectiveness of the BEPA-Toolkit as used by teachers through Supplemental Nutrition Assistance Program Education partnerships. We surveyed teachers (n = 57)…

  11. Obesity and Tobacco Cessation Toolkits: Practical Tips and Tools to Save Lives.

    PubMed

    Crowe, Susan D; Gregg, Laurie C; DeFrancesco, Mark S

    2016-12-01

    Both obesity and smoking are public health burdens that together contribute to approximately one third of the deaths annually in the United States. In 2015, under the direction of Dr. Mark DeFrancesco, the American College of Obstetricians and Gynecologists convened two workgroups with the purpose of creating toolkits that bring together information that the obstetrician-gynecologist can use to address these preventable health problems. An Obesity Prevention and Treatment Workgroup and a Tobacco and Nicotine Cessation Workgroup developed toolkits on Obesity Prevention and Treatment (www.acog.org/ObesityToolkit) and Tobacco and Nicotine Cessation (www.acog.org/TobaccoToolkit). The toolkits contain specific talking points, counseling methods, and algorithms to address these health concerns in a supportive, efficient, and effective manner. By including these methods in practice, clinicians can help prevent the tragedy of early deaths caused by obesity, tobacco, and nicotine use.

  12. A case control study to improve accuracy of an electronic fall prevention toolkit.

    PubMed

    Dykes, Patricia C; I-Ching, Evita Hou; Soukup, Jane R; Chang, Frank; Lipsitz, Stuart

    2012-01-01

    Patient falls are a serious and commonly reported adverse event in hospitals. In 2009, our team conducted the first randomized controlled trial of a health information technology-based intervention that significantly reduced falls in acute care hospitals. However, some patients on intervention units with access to the electronic toolkit fell. The purpose of this case control study was to use data mining and modeling techniques to identify the factors associated with falls in hospitalized patients when the toolkit was in place. Our ultimate aim was to apply our findings to improve the toolkit logic and to generate practice recommendations. The results of our evaluation suggest that the fall prevention toolkit logic is accurate, but strategies are needed to improve adherence to the fall prevention intervention recommendations generated by the electronic toolkit.

  13. Marine Synechococcus Aggregation

    NASA Astrophysics Data System (ADS)

    Neuer, S.; Deng, W.; Cruz, B. N.; Monks, L.

    2016-02-01

    Cyanobacteria are considered to play an important role in the oceanic biological carbon pump, especially in oligotrophic regions. But as single cells are too small to sink, their carbon export has to be mediated by aggregate formation and possible consumption by zooplankton producing sinking fecal pellets. Here we report results on the aggregation of the ubiquitous marine pico-cyanobacterium Synechococcus as a model organism. We first investigated the mechanism behind such aggregation by studying the potential role of transparent exopolymeric particles (TEP) and the effects of nutrient (nitrogen or phosphorus) limitation on the TEP production and aggregate formation of these pico-cyanobacteria. We further studied the aggregation and subsequent settling in roller tanks and investigated the effects of the clays kaolinite and bentonite in a series of concentrations. Our results show that despite their lowered growth rates, Synechococcus in nutrient-limited cultures had larger cell-normalized TEP production, formed a greater volume of aggregates, and settled at higher velocities than cells from replete cultures. In addition, we found that despite their small size and lack of natural ballasting minerals, Synechococcus cells could still form aggregates and sink at measurable velocities in seawater. Clay minerals increased the number and reduced the size of aggregates, and their ballasting effects increased the sinking velocity and carbon export potential of aggregates. In comparison with Synechococcus, we will also present results on the aggregation of the pico-cyanobacterium Prochlorococcus in roller tanks. These results contribute to our understanding of the physiology of marine Synechococcus as well as their role in the ecology and biogeochemistry of oligotrophic oceans.

  14. Computer files.

    PubMed

    Malik, M

    1995-02-01

    From what has been said, several recommendations can be made for users of small personal computers regardless of which operating system they use. If your computer has a large hard disk not specially required by any single application, organize the disk into a small number of volumes. You will then be using the computer as if it had several smaller disks, which will help you to create a logical file structure. The size of individual volumes has to be selected carefully with respect to the files kept in each volume. Otherwise, you may find that you have too much space in one volume and not enough in another. In each volume, organize the structure of directories and subdirectories logically so that they correspond to the logic of your file content. Be aware that the directories suggested as defaults when installing new software are often not optimal. For instance, it is better to put different graphics packages under a common subdirectory than to install them at the same level as all other packages, including statistics, text processors, etc. Create a special directory for each task for which you use the computer. Note that it is bad practice to keep many different and logically unsorted files in the root directory of any of your volumes; only system and important service files should be kept there. Although a file may be scattered anywhere across the disk, access to it will be faster if it is written over the minimum number of cylinders. From time to time, use special programs that reorganize your files in this way.(ABSTRACT TRUNCATED AT 250 WORDS)
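The directory-organization advice above can be sketched in a few lines of Python (the tree below is an invented example, not one the article prescribes): group related packages under a common parent and give each task its own subdirectory.

```python
import tempfile
from pathlib import Path

# A logical tree: graphics packages grouped under one parent rather than
# scattered at the top level, and one subdirectory per task.
layout = ["system", "graphics/plotter", "graphics/editor",
          "stats", "texts/reports", "texts/letters"]

root = Path(tempfile.mkdtemp())        # stand-in for a disk volume
for sub in layout:
    (root / sub).mkdir(parents=True)

created = sorted(p.relative_to(root).as_posix()
                 for p in root.rglob("*") if p.is_dir())
```

Only `system` sits at the root, matching the recommendation to keep the root directory free of unsorted working files.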

  15. Compress Your Files

    ERIC Educational Resources Information Center

    Branzburg, Jeffrey

    2005-01-01

    File compression enables data to be squeezed together, greatly reducing file size. Why would someone want to do this? Reducing file size enables the sending and receiving of files over the Internet more quickly, the ability to store more files on the hard drive, and the ability pack many related files into one archive (for example, all files…
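The effect described above is easy to demonstrate with Python's standard `zlib` module (this snippet is an illustration, not from the article): repetitive data squeezes down dramatically and decompresses back bit-for-bit.

```python
import zlib

# highly repetitive input compresses very well
text = b"to be or not to be, that is the question. " * 50
packed = zlib.compress(text, level=9)   # level 9 = best compression
restored = zlib.decompress(packed)

ratio = len(packed) / len(text)         # well under 1.0 for this input
```

Archive formats such as ZIP apply the same idea per file while also packing many related files into one container, which is the convenience the article highlights.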

  17. The Bioperl toolkit: Perl modules for the life sciences.

    PubMed

    Stajich, Jason E; Block, David; Boulez, Kris; Brenner, Steven E; Chervitz, Stephen A; Dagdigian, Chris; Fuellen, Georg; Gilbert, James G R; Korf, Ian; Lapp, Hilmar; Lehväslaiho, Heikki; Matsalla, Chad; Mungall, Chris J; Osborne, Brian I; Pocock, Matthew R; Schattner, Peter; Senger, Martin; Stein, Lincoln D; Stupka, Elia; Wilkinson, Mark D; Birney, Ewan

    2002-10-01

    The Bioperl project is an international open-source collaboration of biologists, bioinformaticians, and computer scientists that has evolved over the past 7 yr into the most comprehensive library of Perl modules available for managing and manipulating life-science information. Bioperl provides an easy-to-use, stable, and consistent programming interface for bioinformatics application programmers. The Bioperl modules have been successfully and repeatedly used to reduce otherwise complex tasks to only a few lines of code. The Bioperl object model has been proven to be flexible enough to support enterprise-level applications such as EnsEMBL, while maintaining an easy learning curve for novice Perl programmers. Bioperl is capable of executing analyses and processing results from programs such as BLAST, ClustalW, or the EMBOSS suite. Interoperation with modules written in Python and Java is supported through the evolving BioCORBA bridge. Bioperl provides access to data stores such as GenBank and SwissProt via a flexible series of sequence input/output modules, and to the emerging common sequence data storage format of the Open Bioinformatics Database Access project. This study describes the overall architecture of the toolkit, the problem domains that it addresses, and gives specific examples of how the toolkit can be used to solve common life-sciences problems. We conclude with a discussion of how the open-source nature of the project has contributed to the development effort.
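Bioperl's sequence input/output modules are Perl; to keep one language across this document's examples, here is a minimal analogue of the record-parsing idea in Python (a sketch of FASTA parsing, not Bioperl's actual implementation).

```python
def parse_fasta(lines):
    """Yield (header, sequence) records from FASTA-format lines,
    a minimal Python analogue of Bioperl-style sequence input."""
    header, seq = None, []
    for line in lines:
        line = line.strip()
        if line.startswith(">"):
            if header is not None:
                yield header, "".join(seq)   # emit the previous record
            header, seq = line[1:], []
        elif line:
            seq.append(line)                 # sequence may span lines
    if header is not None:
        yield header, "".join(seq)           # emit the final record

records = list(parse_fasta([">seq1", "ATGC", "GGTA", ">seq2", "TTAA"]))
# → [('seq1', 'ATGCGGTA'), ('seq2', 'TTAA')]
```

In Bioperl the same task is a few lines with its sequence I/O interface, which is the "reduce complex tasks to only a few lines of code" claim in the abstract.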

  18. Developing an evidence-based, multimedia group counseling curriculum toolkit

    PubMed Central

    Brooks, Adam C.; DiGuiseppi, Graham; Laudet, Alexandre; Rosenwasser, Beth; Knoblach, Dan; Carpenedo, Carolyn M.; Carise, Deni; Kirby, Kimberly C.

    2013-01-01

    Training community-based addiction counselors in empirically supported treatments (ESTs) far exceeds the ever-decreasing resources of publicly funded treatment agencies. This feasibility study describes the development and pilot testing of a group counseling toolkit (an approach adapted from the education field) focused on relapse prevention (RP). When counselors (N = 17) used the RP toolkit after 3 hours of training, their content adherence scores on “coping with craving” and “drug refusal skills” showed significant improvement, as indicated by very large effect sizes (Cohen’s d = 1.49 and 1.34, respectively). Counselor skillfulness, in the “adequate-to-average” range at baseline, did not change. Although this feasibility study indicates some benefit to counselor EST acquisition, it is important to note that the impact of the curriculum on client outcomes is unknown. Because a majority of addiction treatment is delivered in group format, a multimedia curriculum approach may assist counselors in applying ESTs in the context of actual service delivery. PMID:22301082

  19. Regulatory and Permitting Information Desktop (RAPID) Toolkit (Poster)

    SciTech Connect

    Young, K. R.; Levine, A.

    2014-09-01

    The Regulatory and Permitting Information Desktop (RAPID) Toolkit combines the former Geothermal Regulatory Roadmap, National Environmental Policy Act (NEPA) Database, and other resources into a Web-based tool that gives the regulatory and utility-scale geothermal developer communities rapid and easy access to permitting information. RAPID currently comprises five tools - Permitting Atlas, Regulatory Roadmap, Resource Library, NEPA Database, and Best Practices. A beta release of an additional tool, the Permitting Wizard, is scheduled for late 2014. Because of the huge amount of information involved, RAPID was developed in a wiki platform to allow industry and regulatory agencies to maintain the content in the future so that it continues to provide relevant and accurate information to users. In 2014, the content was expanded to include regulatory requirements for utility-scale solar and bulk transmission development projects. Going forward, development of the RAPID Toolkit will focus on expanding the capabilities of current tools, developing additional tools, including additional technologies, and continuing to increase stakeholder involvement.

  20. svmPRAT: SVM-based Protein Residue Annotation Toolkit

    PubMed Central

    2009-01-01

    Background Over the last decade several prediction methods have been developed for determining the structural and functional properties of individual protein residues using sequence and sequence-derived information. Most of these methods are based on support vector machines as they provide accurate and generalizable prediction models. Results We present a general purpose protein residue annotation toolkit (svmPRAT) to allow biologists to formulate residue-wise prediction problems. svmPRAT formulates the annotation problem as a classification or regression problem using support vector machines. One of the key features of svmPRAT is its ease of use in incorporating any user-provided information in the form of feature matrices. For every residue svmPRAT captures local information around the residue to create fixed-length feature vectors. svmPRAT implements accurate and fast kernel functions, and also introduces a flexible window-based encoding scheme that accurately captures signals and patterns for training effective predictive models. Conclusions In this work we evaluate svmPRAT on several classification and regression problems including disorder prediction, residue-wise contact order estimation, DNA-binding site prediction, and local structure alphabet prediction. svmPRAT has also been used for the development of the state-of-the-art transmembrane helix prediction method TOPTMH, and the secondary structure prediction method YASSPP. This toolkit provides practitioners an efficient and easy-to-use tool for a wide variety of annotation problems. Availability: http://www.cs.gmu.edu/~mlbio/svmprat PMID:20028521
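The window-based encoding mentioned above can be sketched simply (this is an illustration of the general idea, not svmPRAT's code): each residue is represented by the scores in a fixed window around it, padded at the sequence ends so every vector has the same length.

```python
def window_features(per_residue, w=2, pad=0.0):
    """Encode each residue as the flat vector of scores in a +/- w window,
    zero-padded at the sequence ends, giving fixed-length feature vectors."""
    n = len(per_residue)
    vecs = []
    for i in range(n):
        vec = [per_residue[j] if 0 <= j < n else pad
               for j in range(i - w, i + w + 1)]
        vecs.append(vec)
    return vecs

# three residues, one score each; every vector has length 2*w + 1 = 3
feats = window_features([0.1, 0.5, 0.9], w=1)
# → [[0.0, 0.1, 0.5], [0.1, 0.5, 0.9], [0.5, 0.9, 0.0]]
```

Fixed-length vectors are what make the per-residue problem look like an ordinary SVM classification or regression task; in practice each position would carry a whole feature row (e.g. a PSSM column) rather than a single score.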

  1. Clinical Trial of a Home Safety Toolkit for Alzheimer's Disease

    PubMed Central

    Trudeau, Scott A.; Rudolph, James L.; Trudeau, Paulette A.; Duffy, Mary E.; Berlowitz, Dan

    2013-01-01

    This randomized clinical trial tested a new self-directed educational intervention to improve caregiver competence to create a safer home environment for persons with dementia living in the community. The sample included 108 patient/caregiver dyads: the intervention group (n = 60) received the Home Safety Toolkit (HST), including a new booklet based on health literacy principles, and sample safety items to enhance self-efficacy to make home safety modifications. The control group (n = 48) received customary care. Participants completed measures at baseline and at twelve-week follow-up. Multivariate Analysis of Covariance (MANCOVA) was used to test for significant group differences. All caregiver outcome variables improved in the intervention group more than in the control group. Home safety was significant at P ≤ 0.001, caregiver strain at P ≤ 0.001, and caregiver self-efficacy at P = 0.002. Similarly, the care receiver outcome of risky behaviors and accidents was lower in the intervention group (P ≤ 0.001). The self-directed use of this Home Safety Toolkit activated the primary family caregiver to make the home safer for the person with dementia of Alzheimer's type (DAT) or a related disorder. Improving the competence of informal caregivers is especially important for patients with DAT in light of all stakeholders' reliance on their unpaid care. PMID:24195007

  2. Charged Dust Aggregate Interactions

    NASA Astrophysics Data System (ADS)

    Matthews, Lorin; Hyde, Truell

    2015-11-01

    A proper understanding of the behavior of dust particle aggregates immersed in a complex plasma first requires a knowledge of the basic properties of the system. Among the most important of these are the net electrostatic charge and higher multipole moments on the dust aggregate as well as the manner in which the aggregate interacts with the local electrostatic fields. The formation of elongated, fractal-like aggregates levitating in the sheath electric field of a weakly ionized RF generated plasma discharge has recently been observed experimentally. The resulting data has shown that as aggregates approach one another, they can both accelerate and rotate. At equilibrium, aggregates are observed to levitate with regular spacing, rotating about their long axis aligned parallel to the sheath electric field. Since gas drag tends to slow any such rotation, energy must be constantly fed into the system in order to sustain it. A numerical model designed to analyze this motion provides both the electrostatic charge and higher multipole moments of the aggregate while including the forces due to thermophoresis, neutral gas drag, and the ion wakefield. This model will be used to investigate the ambient conditions leading to the observed interactions. This research is funded by NSF Grant 1414523.

  3. Aggregate and the environment

    USGS Publications Warehouse

    Langer, William H.; Drew, Lawrence J.; Sachs, J.S.

    2004-01-01

    This book is designed to help you understand our aggregate resources-their importance, where they come from, how they are processed for our use, the environmental concerns related to their mining and processing, how those concerns are addressed, and the policies and regulations designed to safeguard workers, neighbors, and the environment from the negative impacts of aggregate mining. We hope this understanding will help prepare you to be involved in decisions that need to be made-individually and as a society-to be good stewards of our aggregate resources and our living planet.

  4. Protein Colloidal Aggregation Project

    NASA Technical Reports Server (NTRS)

    Oliva-Buisson, Yvette J. (Compiler)

    2014-01-01

    To investigate the pathways and kinetics of protein aggregation to allow accurate predictive modeling of the process and evaluation of potential inhibitors to prevalent diseases including cataract formation, chronic traumatic encephalopathy, Alzheimer's Disease, Parkinson's Disease and others.

  5. Propagation of Tau aggregates.

    PubMed

    Goedert, Michel; Spillantini, Maria Grazia

    2017-05-30

    Since 2009, evidence has accumulated to suggest that Tau aggregates form first in a small number of brain cells, from where they propagate to other regions, resulting in neurodegeneration and disease. Propagation of Tau aggregates is often called prion-like, which refers to the capacity of an assembled protein to induce the same abnormal conformation in a protein of the same kind, initiating a self-amplifying cascade. In addition, prion-like encompasses the release of protein aggregates from brain cells and their uptake by neighbouring cells. In mice, the intracerebral injection of Tau inclusions induced the ordered assembly of monomeric Tau, followed by its spreading to distant brain regions. Short fibrils constituted the major species of seed-competent Tau. The existence of several human Tauopathies with distinct fibril morphologies has led to the suggestion that different molecular conformers (or strains) of aggregated Tau exist.

  6. Marine aggregate dynamics

    NASA Astrophysics Data System (ADS)

    The direction and scope of the Office of Naval Research's Marine Aggregate Dynamics Accelerated Research Initiative will be the topic of an open-house style meeting February 14, 7:30-10:00 P.M. in Ballroom D of the Hyatt Regency New Orleans at the Louisiana Superdome. This meeting is scheduled during the AGU/American Society of Limnology and Oceanography Ocean Sciences Meeting February 12-16 in New Orleans.The critical focus of the ARI is the measurement and modeling of the dynamics of the biological, physical, chemical and molecular processes that drive aggregation and produce aggregates. This new ARI will provide funding in Fiscal Years 1991-1995 to identify and quantify mechanisms that determine the distribution, abundance and size spectrum of aggregated particulate matter in the ocean.

  7. Aggregation and Averaging.

    ERIC Educational Resources Information Center

    Siegel, Irving H.

    The arithmetic processes of aggregation and averaging are basic to quantitative investigations of employment, unemployment, and related concepts. In explaining these concepts, this report stresses need for accuracy and consistency in measurements, and describes tools for analyzing alternative measures. (BH)

  8. A lightweight, flow-based toolkit for parallel and distributed bioinformatics pipelines.

    PubMed

    Cieślik, Marcin; Mura, Cameron

    2011-02-25

    Bioinformatic analyses typically proceed as chains of data-processing tasks. A pipeline, or 'workflow', is a well-defined protocol, with a specific structure defined by the topology of data-flow interdependencies, and a particular functionality arising from the data transformations applied at each step. In computer science, the dataflow programming (DFP) paradigm defines software systems constructed in this manner, as networks of message-passing components. Thus, bioinformatic workflows can be naturally mapped onto DFP concepts. To enable the flexible creation and execution of bioinformatics dataflows, we have written a modular framework for parallel pipelines in Python ('PaPy'). A PaPy workflow is created from re-usable components connected by data-pipes into a directed acyclic graph, which together define nested higher-order map functions. The successive functional transformations of input data are evaluated on flexibly pooled compute resources, either local or remote. Input items are processed in batches of adjustable size, allowing one to tune the trade-off between parallelism and lazy-evaluation (memory consumption). An add-on module ('NuBio') facilitates the creation of bioinformatics workflows by providing domain specific data-containers (e.g., for biomolecular sequences, alignments, structures) and functionality (e.g., to parse/write standard file formats). PaPy offers a modular framework for the creation and deployment of parallel and distributed data-processing workflows. Pipelines derive their functionality from user-written, data-coupled components, so PaPy can also be viewed as a lightweight toolkit for extensible, flow-based bioinformatics data-processing. The simplicity and flexibility of distributed PaPy pipelines may help users bridge the gap between traditional desktop/workstation and grid computing. PaPy is freely distributed as open-source Python code at http://muralab.org/PaPy, and includes extensive documentation and annotated usage
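    The dataflow idea behind a framework like PaPy can be sketched with plain Python generators. The helpers below are illustrative stand-ins, not the actual PaPy API, which additionally couples such stages to pooled local or remote worker processes:

```python
# Minimal sketch of a dataflow pipeline: user-written components are chained
# into a lazy stream of per-item transformations, and a batching helper
# controls the parallelism vs. memory (lazy-evaluation) trade-off.
# These helpers are illustrative, not the real PaPy interface.
from itertools import islice

def batched(iterable, size):
    """Yield successive batches of `size` items from an iterable."""
    it = iter(iterable)
    while True:
        batch = list(islice(it, size))
        if not batch:
            return
        yield batch

def pipeline(source, *stages):
    """Apply each stage (a per-item function) in order, streaming lazily."""
    items = iter(source)
    for stage in stages:
        items = map(stage, items)
    return items

# Example: a two-stage "workflow" over toy sequence records.
reads = ["acgt", "ttga", "ccat"]
result = list(pipeline(reads, str.upper, lambda s: s[::-1]))
```

In a real PaPy graph each stage would be a re-usable component and the batch size would tune how much work is in flight at once.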

  9. A lightweight, flow-based toolkit for parallel and distributed bioinformatics pipelines

    PubMed Central

    2011-01-01

    Background Bioinformatic analyses typically proceed as chains of data-processing tasks. A pipeline, or 'workflow', is a well-defined protocol, with a specific structure defined by the topology of data-flow interdependencies, and a particular functionality arising from the data transformations applied at each step. In computer science, the dataflow programming (DFP) paradigm defines software systems constructed in this manner, as networks of message-passing components. Thus, bioinformatic workflows can be naturally mapped onto DFP concepts. Results To enable the flexible creation and execution of bioinformatics dataflows, we have written a modular framework for parallel pipelines in Python ('PaPy'). A PaPy workflow is created from re-usable components connected by data-pipes into a directed acyclic graph, which together define nested higher-order map functions. The successive functional transformations of input data are evaluated on flexibly pooled compute resources, either local or remote. Input items are processed in batches of adjustable size, allowing one to tune the trade-off between parallelism and lazy-evaluation (memory consumption). An add-on module ('NuBio') facilitates the creation of bioinformatics workflows by providing domain specific data-containers (e.g., for biomolecular sequences, alignments, structures) and functionality (e.g., to parse/write standard file formats). Conclusions PaPy offers a modular framework for the creation and deployment of parallel and distributed data-processing workflows. Pipelines derive their functionality from user-written, data-coupled components, so PaPy can also be viewed as a lightweight toolkit for extensible, flow-based bioinformatics data-processing. The simplicity and flexibility of distributed PaPy pipelines may help users bridge the gap between traditional desktop/workstation and grid computing. PaPy is freely distributed as open-source Python code at http://muralab.org/PaPy, and includes extensive

  10. A prototype forensic toolkit for industrial-control-systems incident response

    NASA Astrophysics Data System (ADS)

    Carr, Nickolas B.; Rowe, Neil C.

    2015-05-01

    Industrial control systems (ICSs) are an important part of critical infrastructure in cyberspace. They are especially vulnerable to cyber-attacks because of their legacy hardware and software and the difficulty of changing it. We first survey the history of intrusions into ICSs, the more serious of which involved a continuing adversary presence on an ICS network. We discuss some common vulnerabilities and the categories of possible attacks, noting the frequent use of software written a long time ago. We propose a framework for designing ICS incident response under the constraints that no new software must be required and that interventions cannot impede the continuous processing that is the norm for such systems. We then discuss a prototype toolkit we built using the Windows Management Instrumentation Command-Line tool for host-based analysis and the Bro intrusion-detection software for network-based analysis. Particularly useful techniques we used were learning the historical range of parameters of numeric quantities so as to recognize anomalies, learning the usual addresses of connections to a node, observing Internet addresses (usually rare), observing anomalous network protocols such as unencrypted data transfers, observing unusual scheduled tasks, and comparing key files through registry entries and hash values to find malicious modifications. We tested our methods on actual data from ICSs including publicly-available data, voluntarily-submitted data, and researcher-provided "advanced persistent threat" data. We found instances of interesting behavior in our experiments. Intrusions were generally easy to see because of the repetitive nature of most processing on ICSs, but operators need to be motivated to look.
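    One of the techniques listed above, learning the historical range of a numeric quantity so as to recognize anomalies, can be sketched as follows. The readings and the tolerance parameter are invented for illustration and are not part of the prototype toolkit:

```python
# Sketch of range-based anomaly detection for an ICS parameter:
# learn min/max from historical readings, then flag values outside
# the learned range (optionally widened by a tolerance fraction).
def learn_range(history):
    """Return the (min, max) bounds observed in historical data."""
    return (min(history), max(history))

def is_anomalous(value, bounds, tolerance=0.0):
    """True if value falls outside bounds widened by tolerance * span."""
    lo, hi = bounds
    span = hi - lo
    return value < lo - tolerance * span or value > hi + tolerance * span

history = [48.1, 50.0, 49.6, 51.2, 50.4]   # e.g. past pressure readings
bounds = learn_range(history)
flags = [is_anomalous(v, bounds) for v in [50.1, 73.9]]
```

The repetitive nature of ICS processing noted in the abstract is what makes such a simple learned range effective in practice.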

  11. Aggregation of retail stores

    NASA Astrophysics Data System (ADS)

    Jensen, Pablo; Boisson, Jean; Larralde, Hernán

    2005-06-01

    We propose a simple model to understand the economic factors that induce aggregation of some businesses over small geographical regions. The model incorporates price competition with neighboring stores, transportation costs and the satisfaction probability of finding the desired product. We show that aggregation is more likely for stores selling expensive products and/or stores carrying only a fraction of the business variety. We illustrate our model with empirical data collected in the city of Lyon.
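    The ingredients of such a model can be illustrated numerically: a consumer weighs a store's price, the transport cost of reaching it, and the probability of finding the desired product there. The cost function and all numbers below are illustrative assumptions, not the authors' actual model:

```python
# Toy store-choice calculation combining the three model ingredients:
# price, transport cost, and the satisfaction probability of finding
# the desired product. All parameters are invented for illustration.
def expected_cost(price, distance, p_found, transport_rate=1.0, search_penalty=10.0):
    """Expected cost of visiting a store: travel, plus the price when the
    product is found, plus a penalty for a wasted trip otherwise."""
    travel = transport_rate * distance
    return travel + p_found * price + (1 - p_found) * search_penalty

# Two stores: a cheap, fully stocked but distant one vs. a nearby
# store carrying only part of the variety.
cost_far = expected_cost(price=20.0, distance=5.0, p_found=1.0)
cost_near = expected_cost(price=22.0, distance=1.0, p_found=0.5)
best = "far" if cost_far < cost_near else "near"
```

As the abstract argues, the balance between these terms shifts toward aggregation when products are expensive or stores carry only a fraction of the variety.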

  12. Protein aggregation and prionopathies.

    PubMed

    Renner, M; Melki, R

    2014-06-01

    Prion protein and prion-like proteins share a number of characteristics. From the molecular point of view, they are constitutive proteins that aggregate following conformational changes into insoluble particles. These particles escape the cellular clearance machinery and amplify by recruiting the soluble form of their constituent proteins. The resulting protein aggregates are responsible for a number of neurodegenerative diseases such as Creutzfeldt-Jakob, Alzheimer, Parkinson and Huntington diseases. In addition, there is increasing evidence supporting the inter-cellular trafficking of these aggregates, meaning that they are "transmissible" between cells. There is also evidence that brain homogenates from individuals developing Alzheimer and Parkinson diseases propagate the disease in recipient model animals in a manner similar to brain extracts of patients developing Creutzfeldt-Jakob disease. Thus, the propagation of protein aggregates from cell to cell may be a generic phenomenon that contributes to the evolution of neurodegenerative diseases, with important consequences for human health. Moreover, although the distribution of protein aggregates is characteristic of each disease, new evidence indicates the possibility of overlaps and crosstalk between the different disorders. Despite the increasing evidence supporting prion or prion-like propagation of protein aggregates, many questions remain unanswered regarding the mechanisms of toxicity, and this is currently a field of intensive research. Copyright © 2014 Elsevier Masson SAS. All rights reserved.

  13. Implementing a user-driven online quality improvement toolkit for cancer care.

    PubMed

    Luck, Jeff; York, Laura S; Bowman, Candice; Gale, Randall C; Smith, Nina; Asch, Steven M

    2015-05-01

    Peer-to-peer collaboration within integrated health systems requires a mechanism for sharing quality improvement lessons. The Veterans Health Administration (VA) developed online compendia of tools linked to specific cancer quality indicators. We evaluated awareness and use of the toolkits, variation across facilities, impact of social marketing, and factors influencing toolkit use. A diffusion of innovations conceptual framework guided the collection of user activity data from the Toolkit Series SharePoint site and an online survey of potential Lung Cancer Care Toolkit users. The VA Toolkit Series site had 5,088 unique visitors in its first 22 months; 5% of users accounted for 40% of page views. Social marketing communications were correlated with site usage. Of survey respondents (n = 355), 54% had visited the site, of whom 24% downloaded at least one tool. Respondents' awareness of the lung cancer quality performance of their facility, and facility participation in quality improvement collaboratives, were positively associated with Toolkit Series site use. Facility-level lung cancer tool implementation varied widely across tool types. The VA Toolkit Series achieved widespread use and a high degree of user engagement, although use varied widely across facilities. The most active users were aware of and active in cancer care quality improvement. Toolkit use seemed to be reinforced by other quality improvement activities. A combination of user-driven tool creation and centralized toolkit development seemed to be effective for leveraging health information technology to spread disease-specific quality improvement tools within an integrated health care system. Copyright © 2015 by American Society of Clinical Oncology.

  14. Simulation Toolkit for Renewable Energy Advanced Materials Modeling

    SciTech Connect

    Sides, Scott; Kemper, Travis; Larsen, Ross; Graf, Peter

    2013-11-13

    STREAMM is a collection of python classes and scripts that enables and eases the setup of input files and configuration files for simulations of advanced energy materials. The core STREAMM python classes provide a general framework for storing, manipulating and analyzing atomic/molecular coordinates to be used in quantum chemistry and classical molecular dynamics simulations of soft materials systems. The design focuses on enabling the interoperability of materials simulation codes such as GROMACS, LAMMPS and Gaussian.
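    The kind of container such a framework provides can be sketched as follows. This is a minimal illustration, not the actual STREAMM classes, and the XYZ serialization shown is just one simple interchange format that simulation codes accept:

```python
# Minimal sketch of a structure container for storing, manipulating and
# serializing atomic coordinates. Illustrative only; STREAMM's real
# classes are richer and target GROMACS/LAMMPS/Gaussian interoperability.
class Structure:
    def __init__(self):
        self.atoms = []  # list of (symbol, x, y, z) tuples

    def add_atom(self, symbol, x, y, z):
        self.atoms.append((symbol, float(x), float(y), float(z)))

    def translate(self, dx, dy, dz):
        """Rigidly shift all atomic coordinates."""
        self.atoms = [(s, x + dx, y + dy, z + dz) for s, x, y, z in self.atoms]

    def to_xyz(self):
        """Serialize to the simple XYZ text format."""
        lines = [str(len(self.atoms)), "generated structure"]
        lines += [f"{s} {x:.3f} {y:.3f} {z:.3f}" for s, x, y, z in self.atoms]
        return "\n".join(lines)

mol = Structure()
mol.add_atom("O", 0.0, 0.0, 0.0)
mol.add_atom("H", 0.96, 0.0, 0.0)
mol.translate(1.0, 0.0, 0.0)
```

Keeping coordinates in one neutral container, then writing code-specific input files from it, is the interoperability pattern the abstract describes.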

  15. Demonstration of the Health Literacy Universal Precautions Toolkit: Lessons for Quality Improvement.

    PubMed

    Mabachi, Natabhona M; Cifuentes, Maribel; Barnard, Juliana; Brega, Angela G; Albright, Karen; Weiss, Barry D; Brach, Cindy; West, David

    2016-01-01

    The Agency for Healthcare Research and Quality Health Literacy Universal Precautions Toolkit was developed to help primary care practices assess and make changes to improve communication with and support for patients. Twelve diverse primary care practices implemented assigned tools over a 6-month period. Qualitative results revealed challenges practices experienced during implementation, including competing demands, bureaucratic hurdles, technological challenges, limited quality improvement experience, and limited leadership support. Practices used the Toolkit flexibly and recognized the efficiencies of implementing tools in tandem and in coordination with other quality improvement initiatives. Practices recommended reducing Toolkit density and making specific refinements.

  16. Overview and Meteorological Validation of the Wind Integration National Dataset toolkit

    SciTech Connect

    Draxl, C.; Hodge, B. M.; Clifton, A.; McCaa, J.

    2015-04-13

    The Wind Integration National Dataset (WIND) Toolkit described in this report fulfills these requirements, and constitutes a state-of-the-art national wind resource data set covering the contiguous United States from 2007 to 2013 for use in a variety of next-generation wind integration analyses and wind power planning. The toolkit is a wind resource data set, wind forecast data set, and wind power production and forecast data set derived from the Weather Research and Forecasting (WRF) numerical weather prediction model. WIND Toolkit data are available online for over 116,000 land-based and 10,000 offshore sites representing existing and potential wind facilities.

  17. Clinical Rheumatology Toolkit: A Laptop Computer “Toolkit” for Instruction and Practice of Clinical Rheumatology

    PubMed Central

    Berger, Robert G.; Friedman, Charles P.; Arnett, Jennifer; Winfield, John B.

    1990-01-01

    Over the last two years at the University of North Carolina School of Medicine, a computerized Clinical Rheumatology Toolkit (the Toolkit) for medical students, postgraduate trainees, and medical faculty has been developed to run on laptop computers. The Toolkit is an integrated software package that includes case simulations, differential diagnosis of rheumatic symptoms and signs, clinical note generation, a patient database, MEDLINE searching, and reference management, and is designed to be employed in both clinical patient care and the education of medical trainees. The system has been implemented in the last year and used predominantly by fourth-year medical students.

  18. Use of Remote Sensing Data to Enhance NWS Storm Damage Toolkit

    NASA Technical Reports Server (NTRS)

    Jedlove, Gary J.; Molthan, Andrew L.; White, Kris; Burks, Jason; Stellman, Keith; Smith, Mathew

    2012-01-01

    In the wake of a natural disaster such as a tornado, the National Weather Service (NWS) is required to provide a very detailed and timely storm damage assessment to local, state and federal homeland security officials. The Post-Storm Data Acquisition (PSDA) procedure involves the acquisition and assembly of highly perishable data necessary for accurate post-event analysis and potential integration into a geographic information system (GIS) available to its end users and associated decision makers. Information gained from the process also enables the NWS to increase its knowledge of extreme events, learn how to better use existing equipment, improve NWS warning programs, and provide accurate storm intensity and damage information to the news media and academia. To help collect and manage all of this information, forecasters in NWS Southern Region are currently developing a Storm Damage Assessment Toolkit (SDAT), which incorporates GIS-capable phones and laptops into the PSDA process by tagging damage photography, location, and storm damage details with GPS coordinates for aggregation within the GIS database. However, this tool alone does not fully integrate radar and ground-based storm damage reports, nor does it help to identify undetected storm damage regions. In many cases, information on storm damage location (beginning and ending points, swath width, etc.) from ground surveys is incomplete or difficult to obtain. Geographic factors (terrain and limited roads in rural areas), manpower limitations, and other logistical constraints often prevent the gathering of a comprehensive picture of tornado or hail damage, and may allow damage regions to go undetected. Molthan et al. (2011) have shown that high resolution satellite data can provide additional valuable information on storm damage tracks to augment this database. This paper presents initial development to integrate satellite-derived damage track information into the SDAT for near real-time use by forecasters

  19. Integration of Earth Remote Sensing into the NOAA/NWS Damage Assessment Toolkit

    NASA Technical Reports Server (NTRS)

    Molthan, Andrew; Burks, Jason; Camp, Parks; McGrath, Kevin; Bell, Jordan

    2014-01-01

    Following the occurrence of severe weather, NOAA/NWS meteorologists are tasked with performing a storm damage survey to assess the type and severity of the weather event, primarily focused on the confirmation and assessment of tornadoes. This labor-intensive process requires meteorologists to venture into the affected area; acquire damage indicators through photos, eyewitness accounts, and other documentation; and then aggregate the data in order to make a final determination of the tornado path length, width, maximum intensity, and other characteristics. Earth remote sensing from operational, polar-orbiting satellites can support the damage assessment process by applying change detection techniques to identify portions of damage tracks that are difficult to access due to road limitations or time constraints. In addition, ground-based surveys can be corroborated with higher-resolution commercial imagery. As part of an ongoing collaboration, NASA and NOAA are working to integrate near real-time Earth remote sensing observations into the NOAA/NWS Damage Assessment Toolkit, a handheld application used by meteorologists in the survey process. The team has recently developed a more streamlined approach for delivering data via a web mapping service and menu interface, allowing for caching of imagery before field deployment. Near real-time products have been developed using MODIS and VIIRS imagery and change detection for preliminary track identification, along with conduits for higher-resolution Landsat, ASTER, and commercial imagery as they become available. In addition to tornado damage assessments, the team is also investigating the use of near real-time imagery for identifying hail damage to vegetation, which also results in large swaths of damage, particularly in the central United States during the peak growing season months of June, July, and August. This presentation provides an overview of recent activities

  20. Integration of Earth Remote Sensing into the NOAA/NWS Damage Assessment Toolkit

    NASA Astrophysics Data System (ADS)

    Molthan, A.; Burks, J. E.; Camp, P.; McGrath, K.; Bell, J. R.

    2014-12-01

    Following the occurrence of severe weather, NOAA/NWS meteorologists are tasked with performing a storm damage survey to assess the type and severity of the weather event, primarily focused on the confirmation and assessment of tornadoes. This labor-intensive process requires meteorologists to venture into the affected area; acquire damage indicators through photos, eyewitness accounts, and other documentation; and then aggregate the data in order to make a final determination of the tornado path length, width, maximum intensity, and other characteristics. Earth remote sensing from operational, polar-orbiting satellites can support the damage assessment process by applying change detection techniques to identify portions of damage tracks that are difficult to access due to road limitations or time constraints. In addition, ground-based surveys can be corroborated with higher-resolution commercial imagery. As part of an ongoing collaboration, NASA and NOAA are working to integrate near real-time Earth remote sensing observations into the NOAA/NWS Damage Assessment Toolkit (DAT), a suite of applications used by meteorologists in the survey process, including a handheld application used during field surveys. The team has recently developed a more streamlined approach for delivering data via a web mapping service and menu interface, allowing for caching of imagery before field deployment. Near real-time products have been developed using MODIS and VIIRS imagery and change detection for preliminary track identification, along with conduits for higher-resolution Landsat, ASTER, and commercial imagery as they become available. In addition to tornado damage assessments, the team is also investigating the use of near real-time imagery for identifying hail damage to vegetation, which also results in large swaths of damage, particularly in the central United States during the peak growing season

  1. Fibronectin Aggregation and Assembly

    PubMed Central

    Ohashi, Tomoo; Erickson, Harold P.

    2011-01-01

    The mechanism of fibronectin (FN) assembly and the self-association sites are still unclear and contradictory, although the N-terminal 70-kDa region (I1–9) is commonly accepted as one of the assembly sites. We previously found that I1–9 binds to superfibronectin, which is an artificial FN aggregate induced by anastellin. In the present study, we found that I1–9 bound to the aggregate formed by anastellin and a small FN fragment, III1–2. An engineered disulfide bond in III2, which stabilizes folding, inhibited aggregation, but a disulfide bond in III1 did not. A gelatin precipitation assay showed that I1–9 did not interact with anastellin, III1, III2, III1–2, or several III1–2 mutants including III1–2KADA. (In contrast to previous studies, we found that the III1–2KADA mutant was identical in conformation to wild-type III1–2.) Because I1–9 only bound to the aggregate and the unfolding of III2 played a role in aggregation, we generated a III2 domain that was destabilized by deletion of the G strand. This mutant bound I1–9 as shown by the gelatin precipitation assay and fluorescence resonance energy transfer analysis, and it inhibited FN matrix assembly when added to cell culture. Next, we introduced disulfide mutations into full-length FN. Three disulfide locks in III2, III3, and III11 were required to dramatically reduce anastellin-induced aggregation. When we tested the disulfide mutants in cell culture, only the disulfide bond in III2 reduced the FN matrix. These results suggest that the unfolding of III2 is one of the key factors for FN aggregation and assembly. PMID:21949131

  2. Integrating surgical robots into the next medical toolkit.

    PubMed

    Lai, Fuji; Entin, Eileen

    2006-01-01

    Surgical robots hold much promise for revolutionizing the field of surgery and improving surgical care. However, despite the potential advantages they offer, there are multiple barriers to adoption and integration into practice that may prevent these systems from realizing their full potential benefit. This study elucidated some of the most salient considerations that need to be addressed for integration of new technologies such as robotic systems into the operating room of the future as it evolves into a complex system of systems. We conducted in-depth interviews with operating room team members and other stakeholders to identify potential barriers in areas of workflow, teamwork, training, clinical acceptance, and human-system interaction. The findings of this study will inform an approach for the design and integration of robotics and related computer-assisted technologies into the next medical toolkit for "computer-enhanced surgery" to improve patient safety and healthcare quality.

  3. A personal health information toolkit for health intervention research.

    PubMed

    Kizakevich, Paul N; Eckhoff, Randall; Weger, Stacey; Weeks, Adam; Brown, Janice; Bryant, Stephanie; Bakalov, Vesselina; Zhang, Yuying; Lyden, Jennifer; Spira, James

    2014-01-01

    With the emergence of mobile health (mHealth) apps, there is a growing demand for better tools for developing and evaluating mobile health interventions. Recently we developed the Personal Health Intervention Toolkit (PHIT), a software framework which eases app implementation and facilitates scientific evaluation. PHIT integrates self-report and physiological sensor instruments, evidence-based advisor logic, and self-help interventions such as meditation, health education, and cognitive behavior change. PHIT can be used to facilitate research, interventions for chronic diseases, risky behaviors, sleep, medication adherence, environmental monitoring, momentary data collection, health screening, and clinical decision support. In a series of usability evaluations, participants reported an overall usability score of 4.5 on a 1-5 Likert scale and a score of 85 on the System Usability Scale, indicating a high percentile rank of 95%.
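    The System Usability Scale score cited above is conventionally computed from ten 1-5 Likert items: odd-numbered items contribute (response - 1), even-numbered items contribute (5 - response), and the sum is scaled by 2.5 onto a 0-100 range. The sample responses below are illustrative, not the study's data:

```python
# Standard SUS scoring: ten 1-5 Likert responses mapped to 0-100.
# The response vectors here are invented examples, not PHIT study data.
def sus_score(responses):
    """Compute the System Usability Scale score from 10 Likert responses."""
    assert len(responses) == 10
    total = sum((r - 1) if i % 2 == 0 else (5 - r)  # i=0 is item 1 (odd-numbered)
                for i, r in enumerate(responses))
    return total * 2.5

score = sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1])  # ideal response pattern
```

A score of 85, as reported for PHIT, sits well above the commonly cited SUS average of 68.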

  4. PARAMESH: A Parallel Adaptive Mesh Refinement Community Toolkit

    NASA Technical Reports Server (NTRS)

    MacNeice, Peter; Olson, Kevin M.; Mobarry, Clark; deFainchtein, Rosalinda; Packer, Charles

    1999-01-01

    In this paper, we describe a community toolkit which is designed to provide parallel support with adaptive mesh capability for a large and important class of computational models, those using structured, logically cartesian meshes. The package of Fortran 90 subroutines, called PARAMESH, is designed to provide an application developer with an easy route to extend an existing serial code which uses a logically cartesian structured mesh into a parallel code with adaptive mesh refinement. Alternatively, in its simplest use, and with minimal effort, it can operate as a domain decomposition tool for users who want to parallelize their serial codes, but who do not wish to use adaptivity. The package can provide them with an incremental evolutionary path for their code, converting it first to uniformly refined parallel code, and then later if they so desire, adding adaptivity.
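    The basic domain-decomposition step such a package automates can be illustrated in miniature. PARAMESH itself is Fortran 90 and block-structured, so this one-dimensional Python sketch is only an analogy: split a logically cartesian index range into contiguous blocks, one per process:

```python
# Toy 1-D domain decomposition: divide n_cells indices as evenly as
# possible among n_procs processes. Illustrative analogy only; PARAMESH
# decomposes multi-dimensional block-structured meshes in Fortran 90.
def decompose(n_cells, n_procs):
    """Return (start, stop) half-open index ranges, one per process."""
    base, extra = divmod(n_cells, n_procs)
    ranges, start = [], 0
    for p in range(n_procs):
        stop = start + base + (1 if p < extra else 0)  # spread the remainder
        ranges.append((start, stop))
        start = stop
    return ranges

blocks = decompose(n_cells=10, n_procs=3)
```

In the adaptive case, each such block could be further refined independently, which is the capability PARAMESH layers on top of plain decomposition.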

  5. Upgrading the safety toolkit: Initiatives of the accident analysis subgroup

    SciTech Connect

    O'Kula, K.R.; Chung, D.Y.

    1999-07-01

    Since its inception, the Accident Analysis Subgroup (AAS) of the Energy Facility Contractors Group (EFCOG) has been a leading organization promoting the development and application of appropriate methodologies for safety analysis of US Department of Energy (DOE) installations. The AAS, one of seven subgroups chartered by the EFCOG Safety Analysis Working Group, has performed an oversight function and provided direction to several technical groups. These efforts have been instrumental in the formal evaluation of computer models, improving the pedigree of high-use computer models, and developing the user-friendly Accident Analysis Guidebook (AAG). All of these efforts have improved the analytical toolkit for complying with the DOE orders and standards shaping safety analysis reports (SARs) and related documentation. Major support for these objectives has been provided through DOE/DP-45.

  6. The extended PP1 toolkit: designed to create specificity

    PubMed Central

    Bollen, Mathieu; Peti, Wolfgang; Ragusa, Michael J.; Beullens, Monique

    2011-01-01

    Protein Ser/Thr phosphatase-1 (PP1) catalyzes the majority of eukaryotic protein dephosphorylation reactions in a highly regulated and selective manner. Recent studies have identified an unusually diversified PP1 interactome with the properties of a regulatory toolkit. PP1-interacting proteins (PIPs) function as targeting subunits, substrates and/or inhibitors. As targeting subunits, PIPs contribute to substrate selection by bringing PP1 into the vicinity of specific substrates and by modulating substrate specificity via additional substrate docking sites or blocking substrate-binding channels. Many of the nearly 200 established mammalian PIPs are predicted to be intrinsically disordered, a property that facilitates their binding to a large surface area of PP1 via multiple docking motifs. These novel insights offer perspectives for the therapeutic targeting of PP1 by interfering with the binding of PIPs or substrates. PMID:20399103

  7. Water Security Toolkit User Manual: Version 1.3

    EPA Pesticide Factsheets

    User manual: Data Product/Software The Water Security Toolkit (WST) is a suite of tools that help provide the information necessary to make good decisions resulting in the minimization of further human exposure to contaminants, and the maximization of the effectiveness of intervention strategies. WST assists in the evaluation of multiple response actions in order to select the most beneficial consequence management strategy. It includes hydraulic and water quality modeling software and optimization methodologies to identify: (1) sensor locations to detect contamination, (2) locations in the network in which the contamination was introduced, (3) hydrants to remove contaminated water from the distribution system, (4) locations in the network to inject decontamination agents to inactivate, remove or destroy contaminants, (5) locations in the network to take grab samples to confirm contamination or cleanup and (6) valves to close in order to isolate contaminated areas of the network.
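    Item (1), choosing sensor locations to detect contamination, can be illustrated with a toy greedy set-cover heuristic. WST's actual optimization methodologies are more elaborate, and the node names and scenario sets here are invented:

```python
# Toy greedy sensor placement: repeatedly pick the candidate node whose
# sensor would detect the most not-yet-covered contamination scenarios.
# Illustrative sketch only; not WST's actual optimization formulation.
def greedy_placement(coverage, budget):
    """coverage: {node: set of scenario ids a sensor there would detect}."""
    chosen, covered = [], set()
    for _ in range(budget):
        best = max(coverage, key=lambda n: len(coverage[n] - covered))
        gain = coverage[best] - covered
        if not gain:          # no sensor adds new coverage; stop early
            break
        chosen.append(best)
        covered |= gain
    return chosen, covered

coverage = {"A": {1, 2}, "B": {2, 3, 4}, "C": {4, 5}}
chosen, covered = greedy_placement(coverage, budget=2)
```

Greedy selection is a common baseline for such coverage problems because the marginal-gain objective is submodular, which bounds how far greedy can fall from optimal.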

  8. PHISICS TOOLKIT: MULTI-REACTOR TRANSMUTATION ANALYSIS UTILITY - MRTAU

    SciTech Connect

    Andrea Alfonsi; Cristian Rabiti; Aaron S. Epiney; Yaqi Wang; Joshua Cogliati

    2012-04-01

    The principal idea of this paper is to present the new capabilities available in the PHISICS toolkit, connected with the implementation of the depletion code MRTAU, a generic depletion/decay/burn-up code developed at the Idaho National Laboratory. It is programmed in a modular structure and modern FORTRAN 95/2003. The code tracks the time evolution of the isotopic concentration of a given material, accounting for the nuclear reactions occurring in the presence of a neutron flux as well as natural decay. MRTAU provides two different methods to perform the depletion calculation, letting users choose the one best suited to their needs. Both methodologies and some significant results are reported in this paper.
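    The bookkeeping such a depletion/decay code performs can be sketched for the simplest case: a two-member decay chain stepped forward with the analytic Bateman solution. MRTAU itself also handles flux-driven reactions and long chains, and the decay constants below are illustrative:

```python
# Sketch of isotopic time evolution for a parent -> daughter decay chain,
# stepped with the exact (Bateman) solution of the linear decay equations.
# Decay constants and step sizes are invented for illustration.
import math

def decay_step(n_parent, n_daughter, lam_p, lam_d, dt):
    """Advance a 2-isotope chain by dt (requires lam_p != lam_d)."""
    np_new = n_parent * math.exp(-lam_p * dt)
    # daughter: decay of its own inventory plus in-growth from the parent
    nd_new = (n_daughter * math.exp(-lam_d * dt)
              + n_parent * lam_p / (lam_d - lam_p)
              * (math.exp(-lam_p * dt) - math.exp(-lam_d * dt)))
    return np_new, nd_new

n_p, n_d = 1.0, 0.0            # start with pure parent
for _ in range(10):            # ten steps of dt = 0.1
    n_p, n_d = decay_step(n_p, n_d, lam_p=0.1, lam_d=0.5, dt=0.1)
```

Because the underlying equations are linear, composing exact steps reproduces the exact solution at the final time, which is a useful sanity check on any depletion solver.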

  9. Parametrization of macrolide antibiotics using the force field toolkit.

    PubMed

    Pavlova, Anna; Gumbart, James C

    2015-10-15

    Macrolides are an important class of antibiotics that target the bacterial ribosome. Computer simulations of macrolides have been limited because specific force field parameters had not previously been developed for them. Here, we determine CHARMM-compatible force field parameters for erythromycin, azithromycin, and telithromycin, using the force field toolkit (ffTK) plugin in VMD. Because of their large size, novel approaches for parametrizing them had to be developed. Two methods for determining partial atomic charges, from interactions with TIP3P water and from the electrostatic potential, as well as several approaches for fitting the dihedral parameters, were tested. The performance of the different parameter sets was evaluated by molecular dynamics simulations of the macrolides in the ribosome, with a distinct improvement in the maintenance of key interactions observed after refinement of the initial parameters. Based on the results of the macrolide tests, recommended procedures for parametrizing very large molecules using ffTK are given.

  10. A toolkit for MSDs prevention--WHO and IEA context.

    PubMed

    Caple, David C

    2012-01-01

    Many simple MSD risk management tools have been developed by ergonomists for use by workers and employers with little or no training to undertake injury prevention programs in their workplace. However, there is currently no "toolkit" which places such tools within a holistic, participative ergonomics framework and provides guidance on how best to use individual tools. It is proposed that such a holistic approach should entail initial analysis and evaluation of underlying systems of work and related health and performance indicators, prior to focusing on assessment of MSD risks stemming from particular hazards. Depending on the context, more narrowly focused tools might then be selected to assess risk associated with jobs or tasks identified as problematic. This approach ensures that biomechanical risk factors are considered within a broad context of organizational and psychosocial risk factors. This is consistent with current research evidence on work-related causes of MSDs.

  11. Migration of 1970s Minicomputer Controls to Modern Toolkit Software

    SciTech Connect

    Juras, R.C.; Meigs, M.J.; Sinclair, J.A.; Tatum, B.A.

    1999-11-13

    Controls for accelerators and associated systems at the Holifield Radioactive Ion Beam Facility (HRIBF) at Oak Ridge National Laboratory have been migrated from 1970s-vintage minicomputers to a modern system based on Vista and EPICS toolkit software. Stability and capabilities of EPICS software have motivated increasing use of EPICS for accelerator controls. In addition, very inexpensive subsystems based on EPICS and the EPICS portable CA server running on Linux PCs have been implemented to control an ion source test facility and to control a building-access badge reader system. A new object-oriented, extensible display manager has been developed for EPICS to facilitate the transition to EPICS and will be used in place of MEDM. EPICS device support has been developed for CAMAC serial highway controls.

  12. An expanded nuclear phylogenomic PCR toolkit for Sapindales

    PubMed Central

    Collins, Elizabeth S.; Gostel, Morgan R.; Weeks, Andrea

    2016-01-01

    Premise of the study: We tested PCR amplification of 91 low-copy nuclear gene loci in taxa from Sapindales using primers developed for Bursera simaruba (Burseraceae). Methods and Results: Cross-amplification of these markers among 10 taxa tested was related to their phylogenetic distance from B. simaruba. On average, each Sapindalean taxon yielded product for 53 gene regions (range: 16–90). Arabidopsis thaliana (Brassicales), by contrast, yielded product for two. Single representatives of Anacardiaceae and Rutaceae yielded 34 and 26 products, respectively. Twenty-six primer pairs worked for all Burseraceae species tested if highly divergent Aucoumea klaineana is excluded, and eight of these amplified product in every Sapindalean taxon. Conclusions: Our study demonstrates that customized primers for Bursera can amplify product in a range of Sapindalean taxa. This collection of primer pairs, therefore, is a valuable addition to the toolkit for nuclear phylogenomic analyses of Sapindales and warrants further investigation. PMID:28101434

  13. The interactive learning toolkit: technology and the classroom

    NASA Astrophysics Data System (ADS)

    Lukoff, Brian; Tucker, Laura

    2011-04-01

    Peer Instruction (PI) and Just-in-Time-Teaching (JiTT) have been shown to increase both students' conceptual understanding and problem-solving skills. However, the time investment for the instructor to prepare appropriate conceptual questions and manage student JiTT responses is one of the main implementation hurdles. To overcome this we have developed the Interactive Learning Toolkit (ILT), a course management system specifically designed to support PI and JiTT. We are working to integrate the ILT with a fully interactive classroom system where students can use their laptops and smartphones to respond to ConcepTests in class. The goal is to use technology to engage students in conceptual thinking both in and out of the classroom.

  14. Enhancing the Informatics Evaluation Toolkit with Remote Usability Testing

    PubMed Central

    Dixon, Brian E.

    2009-01-01

    Developing functional clinical informatics products that are also usable remains a challenge. Despite evidence that usability testing should be incorporated into the lifecycle of health information technologies, rarely does this occur. Challenges include poor standards, a lack of knowledge around usability practices, and the expense involved in rigorous testing with a large number of users. Remote usability testing may be a solution for many of these challenges. Remotely testing an application can greatly enhance the number of users who can iteratively interact with a product, and it can reduce the costs associated with usability testing. A case study presents the experiences with remote usability testing when evaluating a Web site designed for health informatics knowledge dissemination. The lessons can inform others seeking to enhance their evaluation toolkits for clinical informatics products. PMID:20351839

  15. Exploratory validation of a multidimensional power wheelchair outcomes toolkit

    PubMed Central

    Ben Mortenson, W.; Demers, Louise; Rushton, Paula W.; Auger, Claudine; Routhier, Francois; Miller, William C.

    2017-01-01

    OBJECTIVES To evaluate the relationship among the measures in a power wheelchair outcomes toolkit. DESIGN We performed path analysis of cross-sectional data from self-report questionnaires and one objective measure. SETTING Data were collected in six Canadian sites. PARTICIPANTS A convenience sample of 128 power wheelchair users. The majority, 69 (53.9%), were female. Multiple sclerosis and spinal cord injury/disease were the most common diagnoses. INTERVENTIONS Not applicable. MAIN OUTCOME MEASURES The power wheelchair version of the Wheelchair Skills Test (4.1) was used to carry out an objective evaluation of capacity to perform 32 wheelchair skills. The Late Life Disability Index measured frequency of participation in 16 life activities. The Life-space Assessment measured independence, extent and frequency of mobility. The Assistive Technology Outcomes Profile for Mobility was used to assess perceived difficulty performing activity and participation using assistive technology. The Wheelchair Use Confidence Scale for powered wheelchair users captured users’ self-efficacy with wheelchair use. RESULTS Wheelchair confidence was independently associated with less difficulty with activity (β=0.08, 0.01<p<0.05) and participation (β=0.39, p<0.01), increased life space (β=0.09, p<0.03) and greater wheelchair skills (β=0.37, p<0.01). Less perceived difficulty with activity was independently associated with increased frequency of participation (β=0.30, p<0.01). Life space mobility was independently associated with increased frequency of participation (β=0.31, p<0.01). Less difficulty with participation was independently associated with greater life-space mobility (β=0.32, p<0.01) and greater frequency of participation (β=0.13, p<0.01). CONCLUSION This study provides empirical support for the measures included as part of the power wheelchair outcomes toolkit. They appear to provide complementary information on a variety of constructs related to power wheelchair use.

  16. NASA Space Radiation Program Integrative Risk Model Toolkit

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Hu, Shaowen; Plante, Ianik; Ponomarev, Artem L.; Sandridge, Chris

    2015-01-01

    NASA Space Radiation Program Element scientists have been actively involved in the development of an integrative risk model toolkit that includes models for acute radiation risk and organ dose projection (ARRBOD), NASA space radiation cancer risk projection (NSCR), hemocyte dose estimation (HemoDose), the GCR event-based risk model code (GERMcode), relativistic ion tracks (RITRACKS), NASA radiation track image (NASARTI), and the On-Line Tool for the Assessment of Radiation in Space (OLTARIS). This session will introduce the components of the risk toolkit with opportunities for hands-on demonstrations. Brief descriptions of each tool follow: ARRBOD for organ dose projection and acute radiation risk calculation from exposure to a solar particle event; NSCR for projection of cancer risk from exposure to space radiation; HemoDose for retrospective dose estimation using multi-type blood cell counts; GERMcode for basic physical and biophysical properties of an ion beam, and biophysical and radiobiological properties of beam transport to the target in the NASA Space Radiation Laboratory beam line; RITRACKS for simulation of heavy-ion and delta-ray track structure, radiation chemistry, DNA structure, and DNA damage at the molecular scale; NASARTI for modeling the effects of space radiation on human cells and tissue by incorporating a physical model of tracks, cell nucleus, and DNA damage foci with image segmentation for automated counting; and OLTARIS, an integrated tool set utilizing HZETRN (High Charge and Energy Transport) intended to help scientists and engineers study the effects of space radiation on shielding materials, electronics, and biological systems.

  17. The Bioperl Toolkit: Perl Modules for the Life Sciences

    PubMed Central

    Stajich, Jason E.; Block, David; Boulez, Kris; Brenner, Steven E.; Chervitz, Stephen A.; Dagdigian, Chris; Fuellen, Georg; Gilbert, James G.R.; Korf, Ian; Lapp, Hilmar; Lehväslaiho, Heikki; Matsalla, Chad; Mungall, Chris J.; Osborne, Brian I.; Pocock, Matthew R.; Schattner, Peter; Senger, Martin; Stein, Lincoln D.; Stupka, Elia; Wilkinson, Mark D.; Birney, Ewan

    2002-01-01

    The Bioperl project is an international open-source collaboration of biologists, bioinformaticians, and computer scientists that has evolved over the past 7 yr into the most comprehensive library of Perl modules available for managing and manipulating life-science information. Bioperl provides an easy-to-use, stable, and consistent programming interface for bioinformatics application programmers. The Bioperl modules have been successfully and repeatedly used to reduce otherwise complex tasks to only a few lines of code. The Bioperl object model has been proven to be flexible enough to support enterprise-level applications such as EnsEMBL, while maintaining an easy learning curve for novice Perl programmers. Bioperl is capable of executing analyses and processing results from programs such as BLAST, ClustalW, or the EMBOSS suite. Interoperation with modules written in Python and Java is supported through the evolving BioCORBA bridge. Bioperl provides access to data stores such as GenBank and SwissProt via a flexible series of sequence input/output modules, and to the emerging common sequence data storage format of the Open Bioinformatics Database Access project. This study describes the overall architecture of the toolkit, the problem domains that it addresses, and gives specific examples of how the toolkit can be used to solve common life-sciences problems. We conclude with a discussion of how the open-source nature of the project has contributed to the development effort. [Supplemental material is available online at www.genome.org. Bioperl is available as open-source software free of charge and is licensed under the Perl Artistic License (http://www.perl.com/pub/a/language/misc/Artistic.html). It is available for download at http://www.bioperl.org. Support inquiries should be addressed to bioperl-l@bioperl.org.] PMID:12368254

  18. Exploratory Validation of a Multidimensional Power Wheelchair Outcomes Toolkit.

    PubMed

    Mortenson, W Ben; Demers, Louise; Rushton, Paula W; Auger, Claudine; Routhier, Francois; Miller, William C

    2015-12-01

    To evaluate the relation among the measures in a power wheelchair outcomes toolkit. We performed path analysis of cross-sectional data from self-report questionnaires and 1 objective measure. Six sites. A convenience sample of power wheelchair users (N=128). Most (n=69; 53.9%) participants were women. Multiple sclerosis and spinal cord injury/disease were the most common diagnoses. Not applicable. The power wheelchair version of the Wheelchair Skills Test version 4.1 was used to carry out an objective evaluation of capacity to perform 32 wheelchair skills. The Late-Life Disability Index measured frequency of participation in 16 life activities. The Life-Space Assessment measured independence, extent, and frequency of mobility. The Assistive Technology Outcomes Profile for Mobility was used to assess perceived difficulty performing activity and participation using assistive technology. The Wheelchair Use Confidence Scale for powered wheelchair users captured users' self-efficacy with wheelchair use. Wheelchair confidence was independently associated with less difficulty with activity (β=.028, P=.002) and participation (β=.225, P<.001), increased life space (β=.095, P<.003), and greater wheelchair skills (β=.30, P<.001). Less perceived difficulty with activity was independently associated with increased frequency of participation (β=.55, P<.001). Life-space mobility was independently associated with increased frequency of participation (β=.167, P<.001). Less difficulty with participation was independently associated with greater life-space mobility (β=.59, P<.001) and greater frequency of participation (β=.13, P<.001). This study provides empirical support for the measures included as part of the power wheelchair outcomes toolkit. They appear to provide complementary information on a variety of constructs related to power wheelchair use. Copyright © 2015 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
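    A path analysis of this kind can be decomposed into a series of ordinary least-squares regressions on standardized variables, whose coefficients are the reported path coefficients (β). A minimal sketch with synthetic data (the variable names merely echo the study's constructs; the numbers are not the study's data):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 128  # same sample size as the study; data below are synthetic

# Synthetic variables loosely echoing the study's constructs
confidence = rng.standard_normal(n)
skills     = 0.4 * confidence + rng.standard_normal(n)
activity   = 0.3 * confidence + 0.2 * skills + rng.standard_normal(n)

def standardized_betas(y, X):
    """Standardize y and each column of X, then solve OLS; the coefficients
    of the standardized regression are the path coefficients (betas)."""
    z = lambda v: (v - v.mean()) / v.std()
    Zx = np.column_stack([z(col) for col in np.atleast_2d(X.T)])
    beta, *_ = np.linalg.lstsq(Zx, z(y), rcond=None)
    return beta

# One equation of the path model: activity regressed on its hypothesized causes
betas = standardized_betas(activity, np.column_stack([confidence, skills]))
```

    A full path analysis repeats this for every endogenous variable in the model and assembles the coefficients into the path diagram.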

  19. Supporting integrated design through interlinked tools: The Labs21 toolkit

    SciTech Connect

    Mathew, Paul; Bell, Geoffrey; Carlisle, Nancy; Sartor, Dale; van Geet, Otto; Lintner, William; Wirdzek, Phil

    2003-09-15

    The sustainable design of complex building types such as laboratories and hospitals can be particularly challenging, given their inherent complexity of systems, health and safety requirements, long-term flexibility and adaptability needs, energy use intensity, and environmental impacts. Tools such as design guides, energy benchmarking, and LEED rating systems are especially helpful to support sustainable design in such buildings. Furthermore, designers need guidance on how to effectively and appropriately use each tool within the context of an integrated design process involving multiple actors with various objectives. Toward this end, the Laboratories for the 21st Century (Labs21) program has developed an interlinked set of tools -- the Labs21 Toolkit -- to support an integrated design process for sustainable laboratories. Labs21 is a voluntary partnership program sponsored by the U.S. Environmental Protection Agency (EPA) and U.S. Department of Energy (DOE) to improve the environmental performance of U.S. laboratories. In this paper, we present the Labs21 Toolkit and illustrate how these tools can be used to support sustainable design within an integrated design process. The toolkit includes core information tools as well as process-related tools, as indicated below.
    Core information tools:
    -A Design Guide, which is a compendium of publications on energy efficiency in laboratories
    -Case Studies that showcase high-performance design features and applications
    -Best Practice Guides that highlight industry-leading sustainable design strategies
    -A web-based Benchmarking Tool to benchmark laboratory energy performance
    Process tools:
    -A Design Intent Tool, which can be used to plan, document, and verify that a facility's design intent is being met at each stage of the design process
    -The Environmental Performance Criteria (EPC), a rating system specifically designed for laboratory facilities that builds on the LEED(TM) system
    -A web-based Process Manual

  20. Observing Convective Aggregation

    NASA Astrophysics Data System (ADS)

    Holloway, Christopher E.; Wing, Allison A.; Bony, Sandrine; Muller, Caroline; Masunaga, Hirohiko; L'Ecuyer, Tristan S.; Turner, David D.; Zuidema, Paquita

    2017-06-01

    Convective self-aggregation, the spontaneous organization of initially scattered convection into isolated convective clusters despite spatially homogeneous boundary conditions and forcing, was first recognized and studied in idealized numerical simulations. While there is a rich history of observational work on convective clustering and organization, there have been only a few studies that have analyzed observations to look specifically for processes related to self-aggregation in models. Here we review observational work in both of these categories and motivate the need for more of this work. We acknowledge that self-aggregation may appear to be far-removed from observed convective organization in terms of time scales, initial conditions, initiation processes, and mean state extremes, but we argue that these differences vary greatly across the diverse range of model simulations in the literature and that these comparisons are already offering important insights into real tropical phenomena. Some preliminary new findings are presented, including results showing that a self-aggregation simulation with square geometry has too broad a distribution of humidity and is too dry in the driest regions when compared with radiosonde records from Nauru, while an elongated channel simulation has realistic representations of atmospheric humidity and its variability. We discuss recent work increasing our understanding of how organized convection and climate change may interact, and how model discrepancies related to this question are prompting interest in observational comparisons. We also propose possible future directions for observational work related to convective aggregation, including novel satellite approaches and a ground-based observational network.

  1. Observing convective aggregation

    NASA Astrophysics Data System (ADS)

    Holloway, Christopher; Wing, Allison; Bony, Sandrine; Muller, Caroline; Masunaga, Hirohiko; L'Ecuyer, Tristan; Turner, David; Zuidema, Paquita

    2017-04-01

    Convective self-aggregation was first recognized and studied in idealized numerical simulations. While there is a rich history of observational work on convective clustering and organization, there have been only a few studies that have analyzed observations to look specifically for processes related to self-aggregation in models. Here we review observational work in both of these categories and motivate the need for more of this work. We acknowledge that self-aggregation may appear to be far-removed from observed convective organization in terms of time scales, initial conditions, initiation processes, and mean state extremes, but we argue that these differences vary greatly across the diverse range of model simulations in the literature and that these comparisons are already offering important insights into real tropical phenomena. Some preliminary new findings are presented, including results showing that a self-aggregation simulation with square geometry has too broad a distribution of humidity and is too dry in the driest regions when compared with radiosonde records from Nauru, while an elongated channel simulation has realistic representations of atmospheric humidity and its variability. We discuss recent work increasing our understanding of how organized convection and climate change may interact, and how model discrepancies related to this question are prompting interest in observational comparisons. We also propose possible future directions for observational work related to convective aggregation, including novel satellite approaches and a ground-based observational network.

  2. Common File Formats.

    PubMed

    Mills, Lauren

    2014-03-21

    An overview of the many file formats commonly used in bioinformatics and genome sequence analysis is presented, including various data file formats, alignment file formats, and annotation file formats. Example workflows illustrate how some of the different file types are typically used.
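    The abstract does not enumerate the specific formats, but FASTA is among the most widely used sequence data formats in bioinformatics; a minimal parser sketch (pure Python, illustrative only):

```python
def parse_fasta(lines):
    """Yield (header, sequence) records from FASTA-formatted lines.

    A record starts with a '>' header line; the sequence may span
    multiple lines until the next header or end of input.
    """
    header, chunks = None, []
    for line in lines:
        line = line.rstrip()
        if line.startswith(">"):
            if header is not None:
                yield header, "".join(chunks)
            header, chunks = line[1:], []
        elif line:
            chunks.append(line)
    if header is not None:
        yield header, "".join(chunks)

example = """>seq1 demo record
ACGT
ACGT
>seq2
TTGA
""".splitlines()

records = list(parse_fasta(example))
```

    Alignment formats such as SAM and annotation formats such as GFF are tab-delimited and yield to a similarly small line-oriented parser, which is one reason these text formats remain dominant in sequence analysis pipelines.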

  3. The effectiveness of toolkits as knowledge translation strategies for integrating evidence into clinical care: a systematic review

    PubMed Central

    Yamada, Janet; Shorkey, Allyson; Barwick, Melanie; Widger, Kimberley; Stevens, Bonnie J

    2015-01-01

    Objectives The aim of this systematic review was to evaluate the effectiveness of toolkits as a knowledge translation (KT) strategy for facilitating the implementation of evidence into clinical care. Toolkits include multiple resources for educating and/or facilitating behaviour change. Design Systematic review of the literature on toolkits. Methods A search was conducted on MEDLINE, EMBASE, PsycINFO and CINAHL. Studies were included if they evaluated the effectiveness of a toolkit to support the integration of evidence into clinical care, and if the KT goal(s) of the study were to inform, share knowledge, build awareness, change practice, change behaviour, and/or clinical outcomes in healthcare settings, inform policy, or to commercialise an innovation. Screening of studies, assessment of methodological quality and data extraction for the included studies were conducted by at least two reviewers. Results 39 relevant studies were included for full review; 8 were rated as moderate to strong methodologically with clinical outcomes that could be somewhat attributed to the toolkit. Three of the eight studies evaluated the toolkit as a single KT intervention, while five embedded the toolkit into a multistrategy intervention. Six of the eight toolkits were partially or mostly effective in changing clinical outcomes and six studies reported on implementation outcomes. The types of resources embedded within toolkits varied but included predominantly educational materials. Conclusions Future toolkits should be informed by high-quality evidence and theory, and should be evaluated using rigorous study designs to explain the factors underlying their effectiveness and successful implementation. PMID:25869686

  4. EPA and Tribal Workgroup Launch Toolkit to Support Tribal Green Building

    EPA Pesticide Factsheets

    SAN FRANCISCO -Today, the U.S. Environmental Protection Agency and its Tribal Green Building Codes Workgroup-which consists of representatives from tribal nations and federal agencies-announced a new toolkit designed to assist tribes to prioritize a

  5. Challenges and Opportunities in Using Automatic Differentiation with Object-Oriented Toolkits for Scientific Computing

    SciTech Connect

    Hovland, P; Lee, S; McInnes, L; Norris, B; Smith, B

    2001-04-17

    The increased use of object-oriented toolkits in large-scale scientific simulation presents new opportunities and challenges for the use of automatic (or algorithmic) differentiation (AD) techniques, especially in the context of optimization. Because object-oriented toolkits use well-defined interfaces and data structures, there is potential for simplifying the AD process. Furthermore, derivative computation can be improved by exploiting high-level information about numerical and computational abstractions. However, challenges to the successful use of AD with these toolkits also exist. Among the greatest challenges is balancing the desire to limit the scope of the AD process with the desire to minimize the work required of a user. The authors discuss their experiences integrating AD with the PETSc, PVODE, and TAO toolkits and their plans for future research and development in this area.
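    As a language-neutral illustration of the core AD idea (the toolkits above apply AD to C and Fortran code with specialized tools), here is a minimal forward-mode sketch using dual numbers and operator overloading; everything below is a pedagogical assumption, not the PETSc/TAO machinery:

```python
from dataclasses import dataclass
import math

@dataclass
class Dual:
    """Dual number a + b*eps with eps**2 = 0: carries a value and its derivative."""
    val: float
    dot: float = 0.0

    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__

    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        # Product rule propagated automatically
        return Dual(self.val * o.val, self.val * o.dot + self.dot * o.val)
    __rmul__ = __mul__

def sin(x):
    """sin lifted to dual numbers: chain rule gives cos(x) * x'."""
    if isinstance(x, Dual):
        return Dual(math.sin(x.val), math.cos(x.val) * x.dot)
    return math.sin(x)

def derivative(f, x):
    """Evaluate f at x with a unit derivative seed; .dot is then f'(x)."""
    return f(Dual(x, 1.0)).dot

# Example: d/dx [x * sin(x)] = sin(x) + x*cos(x)
g = lambda x: x * sin(x)
```

    The point the abstract makes carries over: when a toolkit's numerical types sit behind a well-defined interface, swapping in derivative-carrying types like `Dual` touches far less code than differentiating an unstructured program.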

  6. EPA, Product Stewardship Institute, and University of California Launch Toolkit to Reduce Marine Debris

    EPA Pesticide Factsheets

    SAN FRANCISCO - Today, the U.S. Environmental Protection Agency, the Product Stewardship Institute, and the University of California announced the launch of a new Marine Debris Campus Toolkit designed to help college campuses and other institutions

  7. BTK: an open-source toolkit for fetal brain MR image processing.

    PubMed

    Rousseau, François; Oubel, Estanislao; Pontabry, Julien; Schweitzer, Marc; Studholme, Colin; Koob, Mériam; Dietemann, Jean-Louis

    2013-01-01

    Studies about brain maturation aim at providing a better understanding of brain development and links between brain changes and cognitive development. Such studies are of great interest for aiding diagnosis and for following the clinical course of development and the treatment of illnesses. However, the processing of fetal brain MR images remains complicated, which limits translation from research to the clinical domain. In this article, we describe an open-source image processing toolkit dedicated to these images. The toolkit includes tools for denoising, image reconstruction, super-resolution, and tractography. The BTK resource program (distributed under the CeCILL-B license) is developed in C++ and relies on common medical imaging libraries such as the Insight Toolkit (ITK), the Visualization Toolkit (VTK), and Open Multi-Processing (OpenMP).

  8. Field tests of a participatory ergonomics toolkit for Total Worker Health.

    PubMed

    Nobrega, Suzanne; Kernan, Laura; Plaku-Alakbarova, Bora; Robertson, Michelle; Warren, Nicholas; Henning, Robert

    2017-04-01

    Growing interest in Total Worker Health(®) (TWH) programs to advance worker safety, health and well-being motivated development of a toolkit to guide their implementation. Iterative design of a program toolkit occurred in which participatory ergonomics (PE) served as the primary basis to plan integrated TWH interventions in four diverse organizations. The toolkit provided start-up guides for committee formation and training, and a structured PE process for generating integrated TWH interventions. Process data from program facilitators and participants throughout program implementation were used for iterative toolkit design. Program success depended on organizational commitment to regular design team meetings with a trained facilitator, the availability of subject matter experts on ergonomics and health to support the design process, and retraining whenever committee turnover occurred. A two committee structure (employee Design Team, management Steering Committee) provided advantages over a single, multilevel committee structure, and enhanced the planning, communication, and teamwork skills of participants.

  9. Development and evaluation of a toolkit to assess partnership readiness for community-based participatory research.

    PubMed

    Andrews, Jeannette O; Cox, Melissa J; Newman, Susan D; Meadows, Otha

    2011-01-01

    An earlier investigation by academic and community co-investigators led to the development of the Partnership Readiness for Community-Based Participatory Research (CBPR) Model, which defined major dimensions and key indicators of partnership readiness. As a next step in this process, we used qualitative methods, cognitive pretesting, and expert reviews to develop a working guide, or toolkit, based on the model for academic and community partners to assess and leverage their readiness for CBPR. The 75-page toolkit is designed as a qualitative assessment promoting equal voice and transparent, bi-directional discussions among all the partners. The toolkit is formatted to direct individual partner assessments, followed by team assessments, discussions, and action plans to optimize their goodness of fit, capacity, and operations to conduct CBPR. The toolkit has been piloted with two cohorts in the Medical University of South Carolina's (MUSC) Community Engaged Scholars (CES) Program with promising results from process and outcome evaluation data.

  10. BTK: An Open-Source Toolkit for Fetal Brain MR Image Processing

    PubMed Central

    Rousseau, François; Oubel, Estanislao; Pontabry, Julien; Schweitzer, Marc; Studholme, Colin; Koob, Mériam; Dietemann, Jean-Louis

    2012-01-01

    Studies about brain maturation aim at providing a better understanding of brain development and links between brain changes and cognitive development. Such studies are of great interest for aiding diagnosis and for following the clinical course of development and the treatment of illnesses. However, the processing of fetal brain MR images remains complicated, which limits translation from research to the clinical domain. In this article, we describe an open-source image processing toolkit dedicated to these images. The toolkit includes tools for denoising, image reconstruction, super-resolution, and tractography. The BTK resource program (distributed under the CeCILL-B license) is developed in C++ and relies on common medical imaging libraries such as the Insight Toolkit (ITK), the Visualization Toolkit (VTK), and Open Multi-Processing (OpenMP). PMID:23036854

  11. Integrated Information Support System (IISS). Volume 5. Common Data Model Subsystem. Part 29. Data Aggregators Product Specification.

    DTIC Science & Technology

    1985-11-01

    [OCR fragments from the scanned specification; little of the abstract is recoverable. Legible content identifies reviewer roles (Boeing Military Aircraft Company; D. Appleton Company (DACOM), responsible for IDEF support and a state-of-the-art literature search) and a Data Aggregators module list including SRTCDR (sorting the common data record) and TRMSRT (terminate the sorting), with record handling of the record "INPUT-FILE" (revised 15 Apr 1985; modified 2 and 9 May 1985).]

  12. STAND: Surface Tension for Aggregation Number Determination.

    PubMed

    Garrido, Pablo F; Brocos, Pilar; Amigo, Alfredo; García-Río, Luis; Gracia-Fadrique, Jesús; Piñeiro, Ángel

    2016-04-26

    Taking advantage of the extremely high dependence of surface tension on the concentration of amphiphilic molecules in aqueous solution, a new model based on the double equilibrium between free and aggregated molecules in the liquid phase and between free molecules in the liquid phase and those adsorbed at the air/liquid interface is presented and validated using literature data and fluorescence measurements. A key point of the model is the use of both the Langmuir isotherm and the Gibbs adsorption equation in terms of free molecules instead of the nominal concentration of the solute. The application of the model should be limited to nonionic compounds since it does not consider the presence of counterions. It requires several coupled nonlinear fittings, for which we developed software that is publicly available on our server as a web application. Using this tool, it is straightforward to get the average aggregation number of an amphiphile, the micellization free energy, the adsorption constant, the maximum surface excess (and so the minimum area per molecule), the distribution of solute in the liquid phase between free and aggregated species, and the surface coverage in only a couple of seconds, just by uploading a text file with surface tension vs. concentration data and the corresponding uncertainties.
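    The two relations the model couples are standard; written in terms of the free-monomer concentration c_f of a nonionic surfactant (symbols as conventionally defined, not reproduced from the paper):

```latex
% Langmuir isotherm for the surface coverage \theta, expressed with the
% free-monomer concentration c_f rather than the nominal concentration:
\theta = \frac{\Gamma}{\Gamma_{\max}} = \frac{K c_f}{1 + K c_f}

% Gibbs adsorption equation for a nonionic solute at constant temperature:
\Gamma = -\frac{1}{RT}\,\frac{d\gamma}{d\ln c_f}
```

    Using c_f instead of the nominal concentration is what couples the surface equations to the micellization equilibrium, since above the onset of aggregation c_f no longer tracks the total amount of solute added.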

  13. Technology meets aggregate

    SciTech Connect

    Wilson, C.; Swan, C.

    2007-07-01

    New technology developed at Tufts University and the University of Massachusetts has created synthetic lightweight aggregate ('SLA') from various qualities of fly ash from coal-fired power plants for use in different engineered applications. In pilot-scale manufacturing tests, an SLA containing 80% fly ash and 20% mixed plastic waste from packaging was produced by 'dry blending' the mixed plastic with high-carbon fly ash. A trial run was completed to produce concrete masonry unit (CMU) blocks at a full-scale facility. It has been shown that SLA can be used as a partial substitute for traditional stone aggregate in hot asphalt mix. 1 fig., 2 photos.

  14. TChem - A Software Toolkit for the Analysis of Complex Kinetic Models

    SciTech Connect

    Safta, Cosmin; Najm, Habib N.; Knio, Omar

    2011-05-01

    The TChem toolkit is a software library that enables numerical simulations using complex chemistry and facilitates the analysis of detailed kinetic models. The toolkit provides capabilities for evaluating thermodynamic properties based on NASA polynomials and species production/consumption rates. It incorporates methods that can selectively modify reaction parameters for sensitivity analysis. The library contains several functions that provide analytically computed Jacobian matrices necessary for the efficient time advancement and analysis of detailed kinetic models.
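    TChem's API is not shown in the abstract, but the NASA polynomial form it references is standard: cp/R = a1 + a2 T + a3 T^2 + a4 T^3 + a5 T^4, with enthalpy following by integration. A sketch of evaluating these from the 7-coefficient form (the coefficients below are placeholders, not real species data from a thermo database):

```python
def cp_over_R(a, T):
    """Dimensionless heat capacity from NASA 7-coefficient polynomials:
    cp/R = a1 + a2*T + a3*T^2 + a4*T^3 + a5*T^4."""
    return a[0] + a[1]*T + a[2]*T**2 + a[3]*T**3 + a[4]*T**4

def h_over_RT(a, T):
    """Dimensionless enthalpy, obtained by integrating cp/R:
    H/(RT) = a1 + a2*T/2 + a3*T^2/3 + a4*T^3/4 + a5*T^4/5 + a6/T."""
    return (a[0] + a[1]*T/2 + a[2]*T**2/3 + a[3]*T**3/4
            + a[4]*T**4/5 + a[5]/T)

# Placeholder coefficients a1..a7 (a7 enters entropy, not used here);
# real values come from a thermodynamic database for each species.
a = [3.5, 1.0e-4, -2.0e-8, 0.0, 0.0, -1.0e3, 4.0]
T = 1000.0  # kelvin
cp = cp_over_R(a, T)
h = h_over_RT(a, T)
```

    Real thermo databases supply two such coefficient sets per species, one for a low-temperature and one for a high-temperature range, with the evaluator selecting the set by the crossover temperature.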

  15. Wind Integration National Dataset (WIND) Toolkit; NREL (National Renewable Energy Laboratory)

    SciTech Connect

    Draxl, Caroline; Hodge, Bri-Mathias

    2015-07-14

    A webinar about the Wind Integration National Dataset (WIND) Toolkit was presented by Bri-Mathias Hodge and Caroline Draxl on July 14, 2015. It was hosted by the Southern Alliance for Clean Energy. The toolkit is a grid integration dataset that contains meteorological and power data at a 5-minute resolution across the continental United States for 7 years, as well as hourly power forecasts.

  16. New Mexico aggregate production sites, 1997-1999

    USGS Publications Warehouse

    Orris, Greta J.

    2000-01-01

This report presents data, including latitude and longitude, for aggregate sites in New Mexico that were believed to be active in the period 1997-1999. The data are presented in paper form in Part A of this report and as Microsoft Excel 97 and Data Interchange Format (DIF) files in Part B. The work was undertaken as part of the effort to update information for the National Atlas. This compilation includes data from: the files of the U.S. Geological Survey (USGS); company contacts; the New Mexico Bureau of Mines and Mineral Resources, New Mexico Bureau of Mine Inspection, and the Mining and Minerals Division of the New Mexico Energy, Minerals and Natural Resources Department (Hatton and others, 1998); information from the Bureau of Land Management; and direct communications with some of the aggregate operators. Additional information on most of the sites is available in Hatton and others (1998).

  17. The MPI bioinformatics Toolkit as an integrative platform for advanced protein sequence and structure analysis

    PubMed Central

    Alva, Vikram; Nam, Seung-Zin; Söding, Johannes; Lupas, Andrei N.

    2016-01-01

    The MPI Bioinformatics Toolkit (http://toolkit.tuebingen.mpg.de) is an open, interactive web service for comprehensive and collaborative protein bioinformatic analysis. It offers a wide array of interconnected, state-of-the-art bioinformatics tools to experts and non-experts alike, developed both externally (e.g. BLAST+, HMMER3, MUSCLE) and internally (e.g. HHpred, HHblits, PCOILS). While a beta version of the Toolkit was released 10 years ago, the current production-level release has been available since 2008 and has serviced more than 1.6 million external user queries. The usage of the Toolkit has continued to increase linearly over the years, reaching more than 400 000 queries in 2015. In fact, through the breadth of its tools and their tight interconnection, the Toolkit has become an excellent platform for experimental scientists as well as a useful resource for teaching bioinformatic inquiry to students in the life sciences. In this article, we report on the evolution of the Toolkit over the last ten years, focusing on the expansion of the tool repertoire (e.g. CS-BLAST, HHblits) and on infrastructural work needed to remain operative in a changing web environment. PMID:27131380

  18. Designing a Composable Geometric Toolkit for Versatility in Applications to Simulation Development

    NASA Technical Reports Server (NTRS)

    Reed, Gregory S.; Campbell, Thomas

    2008-01-01

Conceived and implemented through the development of probabilistic risk assessment simulations for Project Constellation, the Geometric Toolkit allows users to create, analyze, and visualize relationships between geometric shapes in three-space using the MATLAB computing environment. The key output of the toolkit is an analysis of how emanations from one "source" geometry (e.g., a leak in a pipe) will affect another "target" geometry (e.g., a heat-sensitive component). It can import computer-aided design (CAD) depictions of a system to be analyzed, allowing the user to reliably and easily represent components within the design and determine the relationships between them, ultimately supporting more technical or physics-based simulations that use the toolkit. We opted to develop a variety of modular, interconnecting software tools to extend the scope of the toolkit, providing the capability to support a range of applications. This concept of simulation composability allows specially developed tools to be reused by assembling them in various combinations. As a result, the concepts described here and implemented in this toolkit have a wide range of applications outside the domain of risk assessment. To that end, the Geometric Toolkit has been evaluated for use in other unrelated applications due to the advantages provided by its underlying design.
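The core geometric question such a toolkit answers — does an emanation from a source point reach a target shape — reduces to intersection tests. A minimal sketch of one such test (a ray against a sphere) is shown below in Python rather than MATLAB; the function name and parameters are hypothetical, not part of the described toolkit.

```python
import math

def ray_hits_sphere(origin, direction, center, radius):
    """True if the ray origin + t*direction (t >= 0) intersects the sphere."""
    o = [oi - ci for oi, ci in zip(origin, center)]   # shift sphere to origin
    a = sum(d * d for d in direction)
    b = 2.0 * sum(oi * di for oi, di in zip(o, direction))
    c = sum(oi * oi for oi in o) - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0:
        return False                                  # no real intersection
    t1 = (-b - math.sqrt(disc)) / (2.0 * a)
    t2 = (-b + math.sqrt(disc)) / (2.0 * a)
    return t1 >= 0 or t2 >= 0                         # hit lies ahead of origin
```

A leak modeled as a cone of rays from a source point can then be tested against each target component in turn, which is the kind of source-to-target analysis the abstract describes.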

  19. The nursing human resource planning best practice toolkit: creating a best practice resource for nursing managers.

    PubMed

    Vincent, Leslie; Beduz, Mary Agnes

    2010-05-01

Evidence of acute nursing shortages in urban hospitals has been surfacing since 2000. Further, new graduate nurses account for more than 50% of total nurse turnover in some hospitals, and between 35% and 60% of new graduates change workplace during their first year. Critical to organizational success, first-line nurse managers must have the knowledge and skills to ensure the accurate projection of nursing resource requirements and to develop proactive recruitment and retention programs that are effective, promote positive nursing socialization, and provide early exposure to the clinical setting. The Nursing Human Resource Planning Best Practice Toolkit project supported the creation of a network of teaching and community hospitals to develop a best practice toolkit in nursing human resource planning targeted at first-line nursing managers. The toolkit includes a framework built on the conceptual building blocks of planning tools, manager interventions, retention and recruitment, and professional practice models. Development of the toolkit involved reviewing the literature for best practices in nursing human resource planning, using a mixed-method approach to data collection including a survey and extensive interviews of managers, and completing a comprehensive scan of human resource practices in the participating organizations. This paper provides an overview of the process used to develop the toolkit, a description of the toolkit contents, and a reflection on the outcomes of the project.

  20. A CRISPR/Cas9 toolkit for multiplex genome editing in plants.

    PubMed

    Xing, Hui-Li; Dong, Li; Wang, Zhi-Ping; Zhang, Hai-Yan; Han, Chun-Yan; Liu, Bing; Wang, Xue-Chen; Chen, Qi-Jun

    2014-11-29

To accelerate the application of the CRISPR/Cas9 (clustered regularly interspaced short palindromic repeats/CRISPR-associated protein 9) system to a variety of plant species, a toolkit with additional plant selectable markers, more gRNA modules, and easier methods for the assembly of one or more gRNA expression cassettes is required. We developed a CRISPR/Cas9 binary vector set based on the pGreen or pCAMBIA backbone, as well as a gRNA (guide RNA) module vector set, as a toolkit for multiplex genome editing in plants. This toolkit requires no restriction enzymes besides BsaI to generate final constructs harboring maize-codon optimized Cas9 and one or more gRNAs with high efficiency in as little as one cloning step. The toolkit was validated using maize protoplasts, transgenic maize lines, and transgenic Arabidopsis lines and was shown to exhibit high efficiency and specificity. More importantly, using this toolkit, targeted mutations of three Arabidopsis genes were detected in transgenic seedlings of the T1 generation. Moreover, the multiple-gene mutations could be inherited by the next generation. We developed a toolkit that facilitates transient or stable expression of the CRISPR/Cas9 system in a variety of plant species, which will facilitate plant research, as it enables high efficiency generation of mutants bearing multiple gene mutations.
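Because final constructs are assembled with BsaI alone, insert sequences must be free of internal BsaI recognition sites (GGTCTC) on either strand. A minimal screening sketch — the function name is hypothetical and not part of the published toolkit:

```python
def find_sites(seq, site="GGTCTC"):
    """Return (position, strand) for each recognition site in seq."""
    comp = {"A": "T", "T": "A", "G": "C", "C": "G"}
    rc = "".join(comp[b] for b in reversed(site))     # reverse complement
    seq = seq.upper()
    hits = []
    for pattern, strand in ((site, "+"), (rc, "-")):
        i = seq.find(pattern)
        while i != -1:
            hits.append((i, strand))
            i = seq.find(pattern, i + 1)
    return sorted(hits)
```

A sequence that returns an empty list is safe to use directly in a one-step BsaI (Golden Gate-style) assembly; otherwise the internal sites must first be removed by silent mutation.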

  1. Aggregates, broccoli and cauliflower

    NASA Astrophysics Data System (ADS)

    Grey, Francois; Kjems, Jørgen K.

    1989-09-01

Naturally grown structures with fractal character, like broccoli and cauliflower, are discussed and compared with DLA-type aggregates. It is suggested that the branching density can be used to characterize the growth process, and an experimental method to determine this parameter is proposed.
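For comparison with such naturally grown structures, a DLA-type aggregate can be generated with a few lines of lattice simulation. A minimal sketch (single seed on a toroidal lattice; all parameters are illustrative, and real studies use off-lattice or larger-scale variants):

```python
import random

def dla(n_particles=40, size=31, seed=1):
    """Grow a diffusion-limited aggregate on a small toroidal lattice."""
    random.seed(seed)
    c = size // 2
    cluster = {(c, c)}                      # seed site at the centre
    moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    while len(cluster) < n_particles + 1:   # seed + n_particles stuck walkers
        x, y = random.randrange(size), random.randrange(size)
        if (x, y) in cluster:
            continue                        # don't spawn on the cluster
        while True:                         # random walk until adjacent
            dx, dy = random.choice(moves)
            x, y = (x + dx) % size, (y + dy) % size
            if any(((x + mx) % size, (y + my) % size) in cluster
                   for mx, my in moves):
                cluster.add((x, y))
                break
    return cluster
```

Counting occupied branches at increasing radii from the seed in such a cluster gives one operational estimate of the branching density discussed in the abstract.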

  2. 43 CFR 4.1352 - Who may file; where to file; when to file.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 43 Public Lands: Interior 1 2010-10-01 2010-10-01 false Who may file; where to file; when to file... Indian Lands) § 4.1352 Who may file; where to file; when to file. (a) The applicant or operator may file... to file a timely request constitutes a waiver of the opportunity for a hearing before OSM makes...

  3. The MUSOS (MUsic SOftware System) Toolkit: A computer-based, open source application for testing memory for melodies.

    PubMed

    Rainsford, M; Palmer, M A; Paine, G

    2017-04-21

Despite numerous innovative studies, rates of replication in the field of music psychology are extremely low (Frieler et al., 2013). Two key methodological challenges for researchers wishing to administer and reproduce studies in music cognition are the difficulty of measuring musical responses, particularly in free-recall studies, and access to a reliable set of novel stimuli unrestricted by copyright or licensing issues. In this article, we propose computer-based administration as a solution to these challenges and present an application for testing memory for melodies. Created using the software Max/MSP (Cycling '74, 2014a), the MUSOS (Music Software System) Toolkit uses a simple modular framework configurable for testing common paradigms such as recall, old-new recognition, and stem completion. The program is accompanied by a stimulus set of 156 novel, copyright-free melodies, in audio and Max/MSP file formats. Two pilot tests were conducted to establish the properties of the accompanying stimulus set that are relevant to music cognition and general memory research. Using this software, a researcher without specialist musical training may administer and accurately measure responses from common paradigms used in the study of memory for music.
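Old-new recognition results of the kind such a toolkit collects are commonly summarized with signal-detection measures. A minimal sketch of a d' calculation with a log-linear correction for extreme rates — this is standard memory-research analysis, not part of MUSOS itself:

```python
from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Signal-detection sensitivity d' from old-new recognition counts.

    Uses a log-linear correction (add 0.5 to each cell count) so that
    perfect hit or false-alarm rates do not produce infinite z-scores.
    """
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate)
```

A participant who recognizes old melodies well while rarely endorsing new ones yields a d' well above zero; chance performance yields d' of zero.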

  4. Ten years of medical imaging standardization and prototypical implementation: the DICOM standard and the OFFIS DICOM toolkit (DCMTK)

    NASA Astrophysics Data System (ADS)

    Eichelberg, Marco; Riesmeier, Joerg; Wilkens, Thomas; Hewett, Andrew J.; Barth, Andreas; Jensch, Peter

    2004-04-01

In 2003, the DICOM standard celebrated its 10th anniversary. Alongside the standard itself, OFFIS' open-source DICOM toolkit DCMTK, which has continuously followed the development of DICOM, also turns 10 years old. On this occasion, this article looks back at the main standardization efforts in DICOM and illustrates related developments in DCMTK. In the development of the DICOM standard, several phases of progress can be distinguished. In the first phase, at the beginning of the 1990s, basic network services for image transfer and retrieval were introduced. The second phase, in the mid-1990s, was characterized by the specification of a file format and of regulations for media interchange. In the later, but partly parallel, third phase, DICOM predominantly dealt with the problem of optimizing workflow within imaging departments. Once images could be exchanged between different systems, efforts concerning image display consistency followed in a fourth phase at the end of the 1990s. In the current fifth phase, security enhancements are being integrated into the standard. In another phase of progress, which took place over a relatively long period concurrently with the others, DICOM Structured Reporting was developed.

  5. Tripal v1.1: a standards-based toolkit for construction of online genetic and genomic databases.

    PubMed

    Sanderson, Lacey-Anne; Ficklin, Stephen P; Cheng, Chun-Huai; Jung, Sook; Feltus, Frank A; Bett, Kirstin E; Main, Dorrie

    2013-01-01

    Tripal is an open-source freely available toolkit for construction of online genomic and genetic databases. It aims to facilitate development of community-driven biological websites by integrating the GMOD Chado database schema with Drupal, a popular website creation and content management software. Tripal provides a suite of tools for interaction with a Chado database and display of content therein. The tools are designed to be generic to support the various ways in which data may be stored in Chado. Previous releases of Tripal have supported organisms, genomic libraries, biological stocks, stock collections and genomic features, their alignments and annotations. Also, Tripal and its extension modules provided loaders for commonly used file formats such as FASTA, GFF, OBO, GAF, BLAST XML, KEGG heir files and InterProScan XML. Default generic templates were provided for common views of biological data, which could be customized using an open Application Programming Interface to change the way data are displayed. Here, we report additional tools and functionality that are part of release v1.1 of Tripal. These include (i) a new bulk loader that allows a site curator to import data stored in a custom tab delimited format; (ii) full support of every Chado table for Drupal Views (a powerful tool allowing site developers to construct novel displays and search pages); (iii) new modules including 'Feature Map', 'Genetic', 'Publication', 'Project', 'Contact' and the 'Natural Diversity' modules. Tutorials, mailing lists, download and set-up instructions, extension modules and other documentation can be found at the Tripal website located at http://tripal.info. DATABASE URL: http://tripal.info/.
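The new bulk loader described above maps columns of a site-defined tab-delimited file onto database fields. The idea can be sketched in a few lines of Python; Tripal itself is a PHP/Drupal application, so the function name and the field names used here are purely illustrative:

```python
import csv
import io

def bulk_load(tsv_text, column_map):
    """Map columns of a tab-delimited file onto named record fields.

    column_map maps a target field name (e.g. a Chado column) to the
    source column header in the custom tab-delimited file.
    """
    rows = csv.reader(io.StringIO(tsv_text), delimiter="\t")
    header = next(rows)
    pos = {src: header.index(src) for src in column_map.values()}
    return [{field: row[pos[src]] for field, src in column_map.items()}
            for row in rows if row]
```

Each returned record could then be inserted into the appropriate table; the point of a template-driven bulk loader is that the curator supplies only the column mapping, not loader code.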

  6. A Qualitative Evaluation of Web-Based Cancer Care Quality Improvement Toolkit Use in the Veterans Health Administration.

    PubMed

    Bowman, Candice; Luck, Jeff; Gale, Randall C; Smith, Nina; York, Laura S; Asch, Steven

    2015-01-01

    Disease severity, complexity, and patient burden highlight cancer care as a target for quality improvement (QI) interventions. The Veterans Health Administration (VHA) implemented a series of disease-specific online cancer care QI toolkits. To describe characteristics of the toolkits, target users, and VHA cancer care facilities that influenced toolkit access and use and assess whether such resources were beneficial for users. Deductive content analysis of detailed notes from 94 telephone interviews with individuals from 48 VHA facilities. We evaluated toolkit access and use across cancer types, participation in learning collaboratives, and affiliation with VHA cancer care facilities. The presence of champions was identified as a strong facilitator of toolkit use, and learning collaboratives were important for spreading information about toolkit availability. Identified barriers included lack of personnel and financial resources and complicated approval processes to support tool use. Online cancer care toolkits are well received across cancer specialties and provider types. Clinicians, administrators, and QI staff may benefit from the availability of toolkits as they become more reliant on rapid access to strategies that support comprehensive delivery of evidence-based care. Toolkits should be considered as a complement to other QI approaches.

  7. Modifications to the accuracy assessment analysis routine MLTCRP to produce an output file

    NASA Technical Reports Server (NTRS)

    Carnes, J. G.

    1978-01-01

Modifications are described that were made to the analysis program MLTCRP in the accuracy assessment software system to produce a disk output file. The output files produced by this modified program are used to aggregate data for regions larger than a single segment.

  8. Census of Population and Housing, 1980: Summary Tape File 3F. Technical Documentation.

    ERIC Educational Resources Information Center

    Bureau of the Census (DOC), Washington, DC. Data User Services Div.

    This report provides technical documentation associated with a 1980 Census of Population and Housing Summary Tape File (STF) 3F--which contains responses to the extended questionnaire summarized in STF 3, aggregated by school district. The file contains sample data inflated to represent the total population, 100% counts, and unweighted sample…

  9. Information Literacy Toolkit: Grades Kindergarten-6 [and] Information Literacy Toolkit: Grades 7 and Up [and] Research Projects: An Information Literacy Planner for Students [with CD-ROM].

    ERIC Educational Resources Information Center

    Ryan, Jenny; Capra, Steph

    The three guides in the new Information Literacy Toolkit Series can help school library media specialists and teachers to promote and teach information literacy skills to young library users and to: collaborate in curriculum planning so that students will develop a cohesive skill set; teach the critical thinking and problem-solving skills that…

  10. A Highly Characterized Yeast Toolkit for Modular, Multipart Assembly.

    PubMed

    Lee, Michael E; DeLoache, William C; Cervantes, Bernardo; Dueber, John E

    2015-09-18

Saccharomyces cerevisiae is an increasingly attractive host for synthetic biology because of its long history in industrial fermentations. However, until recently, most synthetic biology systems have focused on bacteria. While there is a wealth of resources and literature about the biology of yeast, it can be daunting to navigate and extract the tools needed for engineering applications. Here we present a versatile engineering platform for yeast, which contains both a rapid, modular assembly method and a basic set of characterized parts. This platform provides a framework in which to create new designs, as well as data on promoters, terminators, degradation tags, and copy number to inform those designs. Additionally, we describe genome-editing tools for making modifications directly to the yeast chromosomes, which we find preferable to plasmids due to reduced variability in expression. With this toolkit, we strive to simplify the process of engineering yeast by standardizing the physical manipulations and suggesting best practices that together will enable more straightforward translation of materials and data from one group to another. Additionally, by relieving researchers of the burden of technical details, the toolkit allows them to focus on higher-level aspects of experimental design.

  11. Targeting protein function: the expanding toolkit for conditional disruption.

    PubMed

    Campbell, Amy E; Bennett, Daimark

    2016-09-01

    A major objective in biological research is to understand spatial and temporal requirements for any given gene, especially in dynamic processes acting over short periods, such as catalytically driven reactions, subcellular transport, cell division, cell rearrangement and cell migration. The interrogation of such processes requires the use of rapid and flexible methods of interfering with gene function. However, many of the most widely used interventional approaches, such as RNAi or CRISPR (clustered regularly interspaced short palindromic repeats)-Cas9 (CRISPR-associated 9), operate at the level of the gene or its transcripts, meaning that the effects of gene perturbation are exhibited over longer time frames than the process under investigation. There has been much activity over the last few years to address this fundamental problem. In the present review, we describe recent advances in disruption technologies acting at the level of the expressed protein, involving inducible methods of protein cleavage, (in)activation, protein sequestration or degradation. Drawing on examples from model organisms we illustrate the utility of fast-acting techniques and discuss how different components of the molecular toolkit can be employed to dissect previously intractable biochemical processes and cellular behaviours.

  12. The Insight ToolKit image registration framework

    PubMed Central

    Avants, Brian B.; Tustison, Nicholas J.; Stauffer, Michael; Song, Gang; Wu, Baohua; Gee, James C.

    2014-01-01

Publicly available scientific resources help establish evaluation standards, provide a platform for teaching, and improve reproducibility. Version 4 of the Insight ToolKit (ITK4) seeks to establish new standards in publicly available image registration methodology. ITK4 makes several advances in comparison to previous versions of ITK. It supports both multivariate images and objective functions; it also unifies high-dimensional (deformation field) and low-dimensional (affine) transformations, with metrics that are reusable across transform types and with composite transforms that allow arbitrary series of geometric mappings to be chained together seamlessly. Metrics and optimizers take advantage of multi-core resources, when available. Furthermore, ITK4 reduces the parameter optimization burden via principled heuristics that automatically set scaling across disparate parameter types (rotations vs. translations). A related approach also constrains step sizes for gradient-based optimizers. The result is that tuning for different metrics and/or image pairs is rarely necessary, allowing the researcher to focus more easily on the design and comparison of registration strategies. In total, the ITK4 contribution is intended to support reproducible research practices, to provide a more extensive foundation against which to evaluate new work in image registration, and to give application-level programmers a broad suite of tools on which to build. Finally, we contextualize this work with a reference registration evaluation study with application to pediatric brain labeling. PMID:24817849
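The composite-transform idea — arbitrary geometric mappings chained together seamlessly — can be sketched in a few lines. This is a conceptual analogue in Python, not ITK4's actual C++ API; the names are illustrative:

```python
def affine(scale, tx, ty):
    """A toy 2-D transform p -> scale*p + (tx, ty)."""
    return lambda p: (scale * p[0] + tx, scale * p[1] + ty)

def compose(*transforms):
    """Chain transforms so they are applied left to right, like a
    composite transform containing an ordered queue of mappings."""
    def apply(p):
        for t in transforms:
            p = t(p)
        return p
    return apply
```

In ITK4 the same pattern lets an affine initialization and a dense deformation field be treated as one transform, so metrics and optimizers never need to know how many stages the mapping contains.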

  13. The NITE XML Toolkit: flexible annotation for multimodal language data.

    PubMed

    Carletta, Jean; Evert, Stefan; Heid, Ulrich; Kilgour, Jonathan; Robertson, Judy; Voormann, Holger

    2003-08-01

    Multimodal corpora that show humans interacting via language are now relatively easy to collect. Current tools allow one either to apply sets of time-stamped codes to the data and consider their timing and sequencing or to describe some specific linguistic structure that is present in the data, built over the top of some form of transcription. To further our understanding of human communication, the research community needs code sets with both timings and structure, designed flexibly to address the research questions at hand. The NITE XML Toolkit offers library support that software developers can call upon when writing tools for such code sets and, thus, enables richer analyses than have previously been possible. It includes data handling, a query language containing both structural and temporal constructs, components that can be used to build graphical interfaces, sample programs that demonstrate how to use the libraries, a tool for running queries, and an experimental engine that builds interfaces on the basis of declarative specifications.
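A query mixing structural and temporal constructs — for instance, "find gesture codes that overlap speech codes" — can be mimicked over time-stamped annotations. The sketch below is a hypothetical miniature in Python, not NXT's actual query language or data model:

```python
def overlaps(a, b):
    """True if two time-stamped codes overlap in time."""
    return a["start"] < b["end"] and b["start"] < a["end"]

def overlapping_pairs(codes, layer_a, layer_b):
    """All (a, b) pairs where a code in layer_a temporally overlaps
    a code in layer_b -- a combined structural + temporal query."""
    return [(a, b)
            for a in codes if a["layer"] == layer_a
            for b in codes if b["layer"] == layer_b and overlaps(a, b)]

# Illustrative annotations over one short recording
codes = [
    {"layer": "speech",  "start": 0.0, "end": 2.0, "text": "hello"},
    {"layer": "gesture", "start": 1.5, "end": 3.0, "text": "wave"},
    {"layer": "gesture", "start": 4.0, "end": 5.0, "text": "nod"},
]
```

The point of a dedicated query engine is to express such constraints declaratively over large annotated corpora, rather than writing a new loop for every question.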

  14. Using the Browser for Science: A Collaborative Toolkit for Astronomy

    NASA Astrophysics Data System (ADS)

    Connolly, A. J.; Smith, I.; Krughoff, K. S.; Gibson, R.

    2011-07-01

    Astronomical surveys have yielded hundreds of terabytes of catalogs and images that span many decades of the electromagnetic spectrum. Even when observatories provide user-friendly web interfaces, exploring these data resources remains a complex and daunting task. In contrast, gadgets and widgets have become popular in social networking (e.g. iGoogle, Facebook). They provide a simple way to make complex data easily accessible that can be customized based on the interest of the user. With ASCOT (an AStronomical COllaborative Toolkit) we expand on these concepts to provide a customizable and extensible gadget framework for use in science. Unlike iGoogle, where all of the gadgets are independent, the gadgets we develop communicate and share information, enabling users to visualize and interact with data through multiple, simultaneous views. With this approach, web-based applications for accessing and visualizing data can be generated easily and, by linking these tools together, integrated and powerful data analysis and discovery tools can be constructed.
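The gadget-to-gadget communication described here is essentially publish/subscribe: one gadget announces a selection, and the others update their views. A minimal sketch in Python (ASCOT itself is a browser-based JavaScript framework, so the class and topic names here are purely illustrative):

```python
class EventBus:
    """A tiny publish/subscribe hub shared by all gadgets."""

    def __init__(self):
        self._subscribers = {}

    def subscribe(self, topic, callback):
        """Register a callback to run whenever topic is published."""
        self._subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, data):
        """Deliver data to every subscriber of topic."""
        for callback in self._subscribers.get(topic, []):
            callback(data)

# One gadget publishes a sky selection; a subscriber records it.
bus = EventBus()
received = []
bus.subscribe("selection", received.append)
bus.publish("selection", {"ra": 150.1, "dec": 2.2})
```

Because gadgets share a bus rather than calling each other directly, new visualizations can be added without modifying existing ones, which is the extensibility property the abstract emphasizes.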

  15. Scientific Visualization Using the Flow Analysis Software Toolkit (FAST)

    NASA Technical Reports Server (NTRS)

    Bancroft, Gordon V.; Kelaita, Paul G.; Mccabe, R. Kevin; Merritt, Fergus J.; Plessel, Todd C.; Sandstrom, Timothy A.; West, John T.

    1993-01-01

Over the past few years the Flow Analysis Software Toolkit (FAST) has matured into a useful tool for visualizing and analyzing scientific data on high-performance graphics workstations. Originally designed for visualizing the results of fluid dynamics research, FAST has demonstrated its flexibility by being used in several other areas of scientific research. These research areas include earth and space sciences, acid rain and ozone modelling, and automotive design, just to name a few. This paper describes the current status of FAST, including the basic concepts, architecture, existing functionality and features, and some of the known applications for which FAST is being used. A few of the applications, by both NASA and non-NASA agencies, are outlined in more detail. Described in the outlines are the goals of each visualization project, the techniques or 'tricks' used to produce the desired results, and custom modifications to FAST, if any, done to further enhance the analysis. Some of the future directions for FAST are also described.

  16. Using the Model Coupling Toolkit to couple earth system models

    USGS Publications Warehouse

    Warner, J.C.; Perlin, N.; Skyllingstad, E.D.

    2008-01-01

Continued advances in computational resources are providing the opportunity to operate more sophisticated numerical models. Additionally, there is an increasing demand for multidisciplinary studies that include interactions between different physical processes. There is therefore a strong desire to develop coupled modeling systems that utilize existing models and allow efficient data exchange and model control. The basic system entails model "1" running on "M" processors and model "2" running on "N" processors, with efficient exchange of model fields at predetermined synchronization intervals. Here we demonstrate two coupled systems: the coupling of the ocean circulation model Regional Ocean Modeling System (ROMS) to the surface wave model Simulating WAves Nearshore (SWAN), and the coupling of ROMS to the atmospheric model Coupled Ocean Atmosphere Prediction System (COAMPS). Both coupled systems use the Model Coupling Toolkit (MCT) as a mechanism for operation control and inter-model distributed-memory transfer of model variables. In this paper we describe requirements and other options for model coupling, explain the MCT library and the ROMS, SWAN and COAMPS models, describe methods for grid decomposition and sparse matrix interpolation, and provide an example from each coupled system. The methods presented are clearly applicable to the coupling of other types of models.
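The sparse-matrix interpolation step — mapping a field from one model's grid to another's using precomputed weights — can be sketched as follows. MCT itself is a Fortran library, so this pure-Python version is only a conceptual illustration with invented names:

```python
def remap(src_field, weights):
    """Apply a precomputed sparse interpolation matrix to a source field.

    weights maps each destination grid index to a list of
    (source_index, weight) pairs, i.e. one row of the sparse matrix.
    """
    return {dst: sum(w * src_field[s] for s, w in row)
            for dst, row in weights.items()}

# Midpoint interpolation from a 3-point source grid to a 2-point
# destination grid: each destination value averages two neighbours.
weights = {0: [(0, 0.5), (1, 0.5)],
           1: [(1, 0.5), (2, 0.5)]}
```

In a real coupled run the weight rows are distributed across the "M" and "N" processor pools, and MCT handles the communication so that each destination process receives exactly the source values its rows require.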

  17. Integrating the microbiome as a resource in the forensics toolkit.

    PubMed

    Clarke, Thomas H; Gomez, Andres; Singh, Harinder; Nelson, Karen E; Brinkac, Lauren M

    2017-09-01

The introduction of DNA fingerprinting to forensic science rapidly expanded the available evidence that could be garnered from a crime scene and used in court cases. Next-generation sequencing technologies increased the genetic data available for use as evidence by orders of magnitude, and significant additional genetic information is now available for use in forensic science. This includes DNA from the bacteria that live in and on humans, known as the human microbiome. Next-generation sequencing of the human microbiome demonstrates that its bacterial DNA can be used to uniquely identify an individual, provide information about their life and behavioral patterns, determine the body site a sample came from, and estimate postmortem intervals. Bacterial samples from the environment and objects can also be leveraged to address similar questions about the individual(s) who interacted with them. However, the application of this new field in forensic science raises concerns about current methods of sample processing, including sample collection, storage, and the statistical power of published studies. These areas of human microbiome research need to be fully addressed before microbiome data can become a regularly incorporated evidence type and a routine procedure of the forensic toolkit. Here, we summarize the current status of microbiome research as it applies to the forensic field, the mathematical models used to make predictions, and the possible legal and practical difficulties that can limit the application of microbiomes in forensic science.

  18. The AAV Vector Toolkit: Poised at the Clinical Crossroads.

    PubMed

    Asokan, Aravind; Schaffer, David V; Jude Samulski, R

    2012-04-01

    The discovery of naturally occurring adeno-associated virus (AAV) isolates in different animal species and the generation of engineered AAV strains using molecular genetics tools have yielded a versatile AAV vector toolkit. Promising results in preclinical animal models of human disease spurred the much awaited transition toward clinical application, and early successes in phase I/II clinical trials for a broad spectrum of genetic diseases have recently been reported. As the gene therapy community forges ahead with cautious optimism, both preclinical and clinical studies using first generation AAV vectors have highlighted potential challenges. These include cross-species variation in vector tissue tropism and gene transfer efficiency, pre-existing humoral immunity to AAV capsids and vector dose-dependent toxicity in patients. A battery of second generation AAV vectors, engineered through rational and combinatorial approaches to address the aforementioned concerns, are now available. This review will provide an overview of preclinical studies with the ever-expanding AAV vector portfolio in large animal models and an update on new lead AAV vector candidates poised for clinical translation.

  20. Machine learning for a Toolkit for Image Mining

    NASA Technical Reports Server (NTRS)

    Delanoy, Richard L.

    1995-01-01

A prototype user environment is described that enables a user with very limited computer skills to collaborate with a computer algorithm to develop search tools (agents) that can be used for image analysis, creating metadata for tagging images, searching for images in an image database on the basis of image content, or as a component of computer vision algorithms. Agents are learned in an ongoing, two-way dialogue between the user and the algorithm. The user points to mistakes made in classification. The algorithm, in response, attempts to discover which image attributes discriminate between objects of interest and clutter. It then builds a candidate agent and applies it to an input image, producing an 'interest' image highlighting features that are consistent with the set of objects and clutter indicated by the user. The dialogue repeats until the user is satisfied. The prototype environment, called the Toolkit for Image Mining (TIM), is currently capable of learning spectral and textural patterns. Learning exhibits rapid convergence to reasonable levels of performance and, when thoroughly trained, appears to be competitive in discrimination accuracy with other classification techniques.

  1. Toward a VPH/Physiome ToolKit.

    PubMed

    Garny, Alan; Cooper, Jonathan; Hunter, Peter J

    2010-01-01

    The Physiome Project was officially launched in 1997 and has since brought together teams from around the world to work on the development of a computational framework for the modeling of the human body. At the European level, this effort is focused around patient-specific solutions and is known as the Virtual Physiological Human (VPH) Initiative. Such modeling is both multiscale (in space and time) and multiphysics. This, therefore, requires careful interaction and collaboration between the teams involved in the VPH/Physiome effort, if we are to produce computer models that are not only quantitative, but also integrative and predictive. In that context, several technologies and solutions are already available, developed both by groups involved in the VPH/Physiome effort, and by others. They address areas such as data handling/fusion, markup languages, model repositories, ontologies, tools (for simulation, imaging, data fitting, etc.), as well as grid, middleware, and workflow. Here, we provide an overview of resources that should be considered for inclusion in the VPH/Physiome ToolKit (i.e., the set of tools that addresses the needs and requirements of the Physiome Project and VPH Initiative) and discuss some of the challenges that we are still facing.

  2. Rapid parameterization of small molecules using the Force Field Toolkit

    PubMed Central

    Mayne, Christopher G.; Saam, Jan; Schulten, Klaus; Tajkhorshid, Emad; Gumbart, James C.

    2013-01-01

    The inability to rapidly generate accurate and robust parameters for novel chemical matter continues to severely limit the application of molecular dynamics (MD) simulations to many biological systems of interest, especially in fields such as drug discovery. Although the release of generalized versions of common classical force fields, e.g., GAFF and CGenFF, has posited guidelines for parameterization of small molecules, many technical challenges remain that have hampered their wide-scale extension. The Force Field Toolkit (ffTK), described herein, minimizes common barriers to ligand parameterization through algorithm and method development, automation of tedious and error-prone tasks, and graphical user interface design. Distributed as a VMD plugin, ffTK facilitates the traversal of a clear and organized workflow resulting in a complete set of CHARMM-compatible parameters. A variety of tools are provided to generate quantum mechanical target data, set up multidimensional optimization routines, and analyze parameter performance. Parameters developed for a small test set of molecules using ffTK were comparable to existing CGenFF parameters in their ability to reproduce experimentally measured values for pure-solvent properties (<15% error from experiment) and free energy of solvation (±0.5 kcal/mol from experiment). PMID:24000174

  3. Space and Medical Applications of the Geant4 Simulation Toolkit

    NASA Astrophysics Data System (ADS)

    Perl, Joseph

    2008-10-01

    Geant4 is a toolkit to simulate the passage of particles through matter. While Geant4 was developed for High Energy Physics (HEP), applications now include Nuclear, Medical and Space Physics. Medical applications have been increasing rapidly due to the overall growth of Monte Carlo in Medical Physics and the unique qualities of Geant4 as an all-particle code able to handle complex geometry, motion and fields with the flexibility of modern programming and an open free source code. Work has included characterizing beams and sources, treatment planning and imaging. The all-particle nature of Geant4 has made it popular for the newest modes of radiation treatment: Proton and Particle therapy. Geant4 has been used by ESA, NASA and JAXA to study radiation effects to spacecraft and personnel. The flexibility of Geant4 has enabled teams to incorporate it into their own applications (SPENVIS MULASSIS space environment from QinetiQ and ESA, RADSAFE simulation from Vanderbilt University and NASA). We provide an overview of applications and discuss how Geant4 has responded to specific challenges of moving from HEP to Medical and Space Physics, including recent work to extend Geant4's energy range to low dose radiobiology.

  4. The PhenX Toolkit pregnancy and birth collections.

    PubMed

    Whitehead, Nedra S; Hammond, Jane A; Williams, Michelle A; Huggins, Wayne; Hoover, Sonja; Hamilton, Carol M; Ramos, Erin M; Junkins, Heather A; Harlan, William R; Hogue, Carol J

    2012-11-01

    Pregnancy and childbirth are normal conditions, but complications and adverse outcomes are common. Both genetic and environmental factors influence the course of pregnancy. Genetic epidemiologic research into pregnancy outcomes could be strengthened by the use of common measures, which would allow data from different studies to be combined or compared. Here, we introduce perinatal researchers to the PhenX Toolkit and the Collections related to pregnancy and childbirth. The Pregnancy and Birth Collections were drawn from measures in the PhenX Toolkit. The lead author selected a list of measures for each Collection, which was reviewed by the remaining authors and revised on the basis of their comments. We chose the measures we thought were most relevant for perinatal research and had been linked most strongly to perinatal outcomes. The Pregnancy and Birth Health Conditions Collection includes 24 measures related to pregnancy and fertility history, maternal complications, and infant complications. The Pregnancy and Birth Outcome Risk Factors Collection includes 43 measures of chemical, medical, psychosocial, and personal factors associated with pregnancy outcomes. The biological complexity of pregnancy and its sensitivity to environmental and genomic influences suggest that multidisciplinary approaches are needed to generate new insights or practical interventions. To fully exploit new research methods and resources, we encourage the biomedical research community to adopt standard measures to facilitate pooled or meta-analyses. Copyright © 2012 Elsevier Inc. All rights reserved.

  6. Targeting protein function: the expanding toolkit for conditional disruption

    PubMed Central

    Campbell, Amy E.; Bennett, Daimark

    2016-01-01

    A major objective in biological research is to understand spatial and temporal requirements for any given gene, especially in dynamic processes acting over short periods, such as catalytically driven reactions, subcellular transport, cell division, cell rearrangement and cell migration. The interrogation of such processes requires the use of rapid and flexible methods of interfering with gene function. However, many of the most widely used interventional approaches, such as RNAi or CRISPR (clustered regularly interspaced short palindromic repeats)-Cas9 (CRISPR-associated 9), operate at the level of the gene or its transcripts, meaning that the effects of gene perturbation are exhibited over longer time frames than the process under investigation. There has been much activity over the last few years to address this fundamental problem. In the present review, we describe recent advances in disruption technologies acting at the level of the expressed protein, involving inducible methods of protein cleavage, (in)activation, protein sequestration or degradation. Drawing on examples from model organisms we illustrate the utility of fast-acting techniques and discuss how different components of the molecular toolkit can be employed to dissect previously intractable biochemical processes and cellular behaviours. PMID:27574023

  7. First responder tracking and visualization for command and control toolkit

    NASA Astrophysics Data System (ADS)

    Woodley, Robert; Petrov, Plamen; Meisinger, Roger

    2010-04-01

    In order for First Responder Command and Control personnel to visualize incidents at urban building locations, DHS sponsored a small business research program to develop a tool to visualize 3D building interiors and movement of First Responders on site. 21st Century Systems, Inc. (21CSI), has developed a toolkit called Hierarchical Grid Referenced Normalized Display (HiGRND). HiGRND utilizes three components to provide a full spectrum of visualization tools to the First Responder. First, HiGRND visualizes the structure in 3D. Utilities in the 3D environment allow the user to switch between views (2D floor plans, 3D spatial, evacuation routes, etc.) and manually edit fast changing environments. HiGRND accepts CAD drawings and 3D digital objects and renders these in the 3D space. Second, HiGRND has a First Responder tracker that uses the transponder signals from First Responders to locate them in the virtual space. We use the movements of the First Responder to map the interior of structures. Finally, HiGRND can turn 2D blueprints into 3D objects. The 3D extruder extracts walls, symbols, and text from scanned blueprints to create the 3D mesh of the building. HiGRND increases the situational awareness of First Responders and allows them to make better, faster decisions in critical urban situations.
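
    The "3D extruder" step can be illustrated with the basic geometric operation involved: lifting a 2D wall segment recovered from a blueprint into a vertical quad. This is a minimal sketch under that assumption, not HiGRND's implementation.

```python
# Minimal sketch of the "3D extruder" idea: lift a 2D wall segment taken from
# a blueprint into a vertical quad. Illustrative only, not HiGRND code.

def extrude_wall(p1, p2, height):
    """Return the four corners of the vertical quad swept out by raising a
    2D wall segment to the given height (floor pair first, then ceiling)."""
    (x1, y1), (x2, y2) = p1, p2
    return [(x1, y1, 0.0), (x2, y2, 0.0),
            (x2, y2, height), (x1, y1, height)]

# A 4 m wall along the x-axis, extruded to a 2.5 m ceiling height:
quad = extrude_wall((0.0, 0.0), (4.0, 0.0), 2.5)
```

    A full extruder must also segment walls, symbols, and text out of the scanned image first; the extrusion itself is the simple part shown here.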

  8. A Gateway MultiSite Recombination Cloning Toolkit

    PubMed Central

    Petersen, Lena K.; Stowers, R. Steven

    2011-01-01

    The generation of DNA constructs is often a rate-limiting step in conducting biological experiments. Recombination cloning of single DNA fragments using the Gateway system provided an advance over traditional restriction enzyme cloning due to increases in efficiency and reliability. Here we introduce a series of entry clones and a destination vector for use in two, three, and four fragment Gateway MultiSite recombination cloning whose advantages include increased flexibility and versatility. In contrast to Gateway single-fragment cloning approaches where variations are typically incorporated into model system-specific destination vectors, our Gateway MultiSite cloning strategy incorporates variations in easily generated entry clones that are model system-independent. In particular, we present entry clones containing insertions of GAL4, QF, UAS, QUAS, eGFP, and mCherry, among others, and demonstrate their in vivo functionality in Drosophila by using them to generate expression clones including GAL4 and QF drivers for various trp ion channel family members, UAS and QUAS excitatory and inhibitory light-gated ion channels, and QUAS red and green fluorescent synaptic vesicle markers. We thus establish a starter toolkit of modular Gateway MultiSite entry clones potentially adaptable to any model system. An inventory of entry clones and destination vectors for Gateway MultiSite cloning has also been established (www.gatewaymultisite.org). PMID:21931740

  9. A cosmology forecast toolkit — CosmoLib

    NASA Astrophysics Data System (ADS)

    Huang, Zhiqi

    2012-06-01

    The package CosmoLib is a combination of a cosmological Boltzmann code and a simulation toolkit to forecast the constraints on cosmological parameters from future observations. In this paper we describe the released linear-order part of the package. We discuss the stability and performance of the Boltzmann code, which is written in the Newtonian gauge and includes dark energy perturbations. In CosmoLib the integrator that computes the CMB angular power spectrum is optimized for an l-by-l brute-force integration, which is useful for studying inflationary models predicting sharp features in the primordial power spectrum of metric fluctuations. As an application, CosmoLib is used to study the axion monodromy inflation model that predicts cosine oscillations in the primordial power spectrum. In contrast to the previous studies by Aich et al. and Meerburg et al., we found no detection or hint of the oscillations. We pointed out that the CAMB code modified by Aich et al. does not have sufficient numerical accuracy. CosmoLib and its documentation are available at http://www.cita.utoronto.ca/~zqhuang/CosmoLib
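
    Models of this kind are typically parameterized as a power law modulated by a cosine in ln k. The sketch below shows such an axion-monodromy-style spectrum with illustrative parameter values; it is not CosmoLib's code, but it makes clear why a fine, brute-force l-by-l sampling is needed to resolve rapid oscillations.

```python
import math

# Hedged sketch of an oscillatory primordial spectrum of the axion monodromy
# type: a power law modulated by a cosine in ln k. All parameter values are
# illustrative, not CosmoLib's defaults.

def primordial_power(k, A_s=2.1e-9, n_s=0.96, k_star=0.05,
                     delta=0.05, omega=30.0, phase=0.0):
    """P(k) = A_s * (k/k*)**(n_s - 1) * [1 + delta*cos(omega*ln(k/k*) + phase)]"""
    return A_s * (k / k_star) ** (n_s - 1) * (
        1.0 + delta * math.cos(omega * math.log(k / k_star) + phase))

# A fine k-grid is needed to resolve the oscillations, which is why a
# brute-force integration over every multipole l can pay off here.
ks = [0.001 * 1.01 ** i for i in range(500)]
spectrum = [primordial_power(k) for k in ks]
```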

  10. Microgrid Design Toolkit (MDT) Technical Documentation and Component Summaries

    SciTech Connect

    Arguello, Bryan; Gearhart, Jared Lee; Jones, Katherine A.; Eddy, John P.

    2015-09-01

    The Microgrid Design Toolkit (MDT) is a decision support software tool for microgrid designers to use during the microgrid design process. The models that support the two main capabilities in MDT are described. The first capability, the Microgrid Sizing Capability (MSC), is used to determine the size and composition of a new microgrid in the early stages of the design process. MSC is a mixed-integer linear program that is focused on developing a microgrid that is economically viable when connected to the grid. The second capability is focused on refining a microgrid design for operation in islanded mode. This second capability relies on two models: the Technology Management Optimization (TMO) model and Performance Reliability Model (PRM). TMO uses a genetic algorithm to create and refine a collection of candidate microgrid designs. It uses PRM, a simulation based reliability model, to assess the performance of these designs. TMO produces a collection of microgrid designs that perform well with respect to one or more performance metrics.
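
    MSC is a mixed-integer linear program; the toy search below illustrates the same decision it automates (choosing component counts that meet peak load at minimum capital cost), here by brute force. The technologies, capacities, and costs are hypothetical.

```python
from itertools import product

# Toy stand-in for the Microgrid Sizing Capability. MDT formulates sizing as a
# mixed-integer linear program; brute force over unit counts illustrates the
# same decision. All technologies, capacities, and costs are hypothetical.

techs = {  # name: (unit capacity in kW, unit capital cost in $)
    "diesel_genset": (250.0, 100_000),
    "pv_array":      (100.0,  60_000),
    "battery":       (150.0,  80_000),
}
peak_load_kw = 600.0

best = None  # (cost, counts) of the cheapest design covering peak load
for counts in product(range(4), repeat=len(techs)):
    cap = sum(n * techs[t][0] for n, t in zip(counts, techs))
    cost = sum(n * techs[t][1] for n, t in zip(counts, techs))
    if cap >= peak_load_kw and (best is None or cost < best[0]):
        best = (cost, dict(zip(techs, counts)))

cost, design = best  # cheapest mix that meets the 600 kW peak
```

    A real MILP solver handles far larger design spaces plus operating constraints; the brute-force loop only conveys the shape of the optimization.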

  11. Quality improvement projects targeting health care-associated infections: comparing Virtual Collaborative and Toolkit approaches.

    PubMed

    Speroff, Theodore; Ely, E Wes; Greevy, Robert; Weinger, Matthew B; Talbot, Thomas R; Wall, Richard J; Deshpande, Jayant K; France, Daniel J; Nwosu, Sam; Burgess, Hayley; Englebright, Jane; Williams, Mark V; Dittus, Robert S

    2011-05-01

    Collaborative and toolkit approaches have gained traction for improving quality in health care. To determine if a quality improvement virtual collaborative intervention would perform better than a toolkit-only approach at preventing central line-associated bloodstream infections (CLABSIs) and ventilator-associated pneumonias (VAPs). Cluster randomized trial with the Intensive Care Units (ICUs) of 60 hospitals assigned to the Toolkit (n=29) or Virtual Collaborative (n=31) group from January 2006 through September 2007. CLABSI and VAP rates. Follow-up survey on improvement interventions, toolkit utilization, and strategies for implementing improvement. A total of 83% of the Collaborative ICUs implemented all CLABSI interventions compared to 64% of those in the Toolkit group (P = 0.13), implemented daily catheter reviews more often (P = 0.04), and began this intervention sooner (P < 0.01). Eighty-six percent of the Collaborative group implemented the VAP bundle compared to 64% of the Toolkit group (P = 0.06). The CLABSI rate was 2.42 infections per 1000 catheter days at baseline and 2.73 at 18 months (P = 0.59). The VAP rate was 3.97 per 1000 ventilator days at baseline and 4.61 at 18 months (P = 0.50). Neither group improved outcomes over time; there was no differential performance between the 2 groups for either CLABSI rates (P = 0.71) or VAP rates (P = 0.80). The intensive collaborative approach outpaced the simpler toolkit approach in changing processes of care, but neither approach improved outcomes. Incorporating quality improvement methods, such as ICU checklists, into routine care processes is complex, highly context-dependent, and may take longer than 18 months to achieve. Copyright © 2011 Society of Hospital Medicine.
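
    The CLABSI and VAP figures above use the standard convention of events per 1000 device-days, as in this minimal helper; the counts in the example are illustrative, not the study's data.

```python
# Device-associated infection rates are conventionally reported per 1000
# device-days. The counts below are illustrative, not taken from the study.

def rate_per_1000_device_days(events, device_days):
    return 1000.0 * events / device_days

# e.g. 29 CLABSIs observed over 12,000 catheter-days:
rate = rate_per_1000_device_days(29, 12_000)  # ~2.42 per 1000 catheter-days
```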

  12. Reversible Aggregation of Albumin

    NASA Astrophysics Data System (ADS)

    Colby, Ralph H.; Oates, Katherine M. N.; Krause, Wendy E.; Jones, Ronald L.

    2004-03-01

    We explore the interactions in synovial fluid involving the polyelectrolyte sodium hyaluronate (NaHA) and plasma proteins in their native state (albumin and globulins). Rheological measurements on synovial fluid show it to be highly viscoelastic and also rheopectic (stress increases with time in steady shear). Equilibrium dialysis confirms the findings of Ogston and Dubin that there is no association between NaHA and albumin at physiological pH and salt. What we find instead is a reversible aggregation of albumin, with an association energy of order 3kT and commensurate association lifetime of order microseconds. Certain anti-inflammatory drugs are shown to prevent this reversible aggregation. The implications of these findings for synovial fluid and blood rheology are discussed.

  13. Tracking protein aggregate interactions

    PubMed Central

    Bartz, Jason C; Nilsson, K Peter R

    2011-01-01

    Amyloid fibrils share a structural motif consisting of highly ordered β-sheets aligned perpendicular to the fibril axis.1, 2 At each fibril end, β-sheets provide a template for recruiting and converting monomers.3 Different amyloid fibrils often co-occur in the same individual, yet whether a protein aggregate aids or inhibits the assembly of a heterologous protein is unclear. In prion disease, diverse prion aggregate structures, known as strains, are thought to be the basis of disparate disease phenotypes in the same species expressing identical prion protein sequences.4–7 Here we explore the interactions reported to occur when two distinct prion strains occur together in the central nervous system. PMID:21597336

  14. Zooplankton Aggregations Near Sills

    DTIC Science & Technology

    2003-09-30

    frequency echo-sounder system. These data were supplemented with multi-net (BIONESS) trawls, bongo nets, and otter trawls (operated by D. Mackas and group...side. The general composition of the zooplankton aggregations can be deduced from the relative levels of the three echo-sounder frequencies; krill ...Nov. 20th, 2002. Krill layer is evident at 66-90 m, coincident with BIONESS trawl through the region. Figure 2 shows a comparison between

  15. Proteins aggregation and human diseases

    NASA Astrophysics Data System (ADS)

    Hu, Chin-Kun

    2015-04-01

    Many human diseases and the death of most supercentenarians are related to protein aggregation. Neurodegenerative diseases include Alzheimer's disease (AD), Huntington's disease (HD), Parkinson's disease (PD), frontotemporal lobar degeneration, etc. Such diseases are due to progressive loss of structure or function of neurons caused by protein aggregation. For example, AD is considered to be related to aggregation of Aβ40 (peptide with 40 amino acids) and Aβ42 (peptide with 42 amino acids) and HD is considered to be related to aggregation of polyQ (polyglutamine) peptides. In this paper, we briefly review our recent discovery of key factors for protein aggregation. We used a lattice model to study the aggregation rates of proteins and found that the probability for a protein sequence to appear in the conformation of the aggregated state can be used to determine the temperature at which proteins can aggregate most quickly. We used molecular dynamics and simple models of polymer chains to study relaxation and aggregation of proteins under various conditions and found that when the bending-angle dependent and torsion-angle dependent interactions are zero or very small, then protein chains tend to aggregate at lower temperatures. All atom models were used to identify a key peptide chain for the aggregation of insulin chains and to find that two polyQ chains prefer anti-parallel conformation. It is pointed out that in many cases, protein aggregation does not result from protein mis-folding. A potential drug from Chinese medicine was found for Alzheimer's disease.

  16. Report filing in histopathology.

    PubMed Central

    Blenkinsopp, W K

    1977-01-01

    An assessment of alternative methods of filing histopathology report forms in alphabetical order showed that orthodox card index filing is satisfactory up to about 100000 reports but, because of the need for long-term retrieval, when the reports filed exceed this number they should be copied on jacketed microfilm and a new card index file begun. PMID:591645

  17. Next Generation of the Java Image Science Toolkit (JIST): Visualization and Validation

    PubMed Central

    Li, Bo; Bryan, Frederick; Landman, Bennett A.

    2013-01-01

    Modern medical imaging analyses often involve the concatenation of multiple steps, and neuroimaging analysis is no exception. The Java Image Science Toolkit (JIST) has provided a framework for both end users and engineers to synthesize processing modules into tailored, automatic multi-step processing pipelines (“layouts”) and rapid prototyping of module development. Since its release, JIST has facilitated substantial neuroimaging research and fulfilled much of its intended goal. However, key weaknesses must be addressed for JIST to more fully realize its potential and become accessible to an even broader community base. Herein, we identify three core challenges facing traditional JIST (JIST-I) and present the solutions in the next generation JIST (JIST-II). First, in response to community demand, we have introduced seamless data visualization; users can now click ‘show this data’ through the program interfaces and avoid the need to locate files on disk. Second, JIST is an open-source community effort by design: any developer may add modules to the distribution and extend existing functionality for release. However, the large number of developers and different use cases introduced instability into the overall JIST-I framework, causing users to freeze on different, incompatible versions of JIST-I, and the JIST community began to fracture. JIST-II addresses the problem of compilation instability by performing continuous integration checks nightly to ensure community-implemented changes do not negatively impact overall JIST-II functionality. Third, JIST-II allows developers and users to ensure that functionality is preserved by running functionality checks nightly using the continuous integration framework. With JIST-II, users can submit layout test cases and quality control criteria through a new GUI. These test cases capture all runtime parameters and help to ensure that the module produces results within tolerance, despite changes in the underlying
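
    The nightly tolerance checks described above amount to comparing a module's fresh output against a stored reference within a quality-control tolerance. The function below is a hedged sketch of that pattern; the names are illustrative, not the JIST API.

```python
# Hedged sketch of a tolerance-based regression check of the kind a nightly
# continuous-integration run performs on submitted test cases. Function and
# variable names are illustrative, not the JIST API.

def within_tolerance(result, reference, rel_tol=1e-3):
    """Pass when every output value matches the stored reference to within a
    relative tolerance, so code changes cannot silently alter results."""
    return all(abs(r - ref) <= rel_tol * max(abs(ref), 1e-12)
               for r, ref in zip(result, reference))

reference = [1.000, 2.500, 0.125]    # captured when the test case was created
result    = [1.0003, 2.4999, 0.125]  # produced by tonight's build
ok = within_tolerance(result, reference)
```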

  18. Age-Dependent Protein Aggregation Initiates Amyloid-β Aggregation

    PubMed Central

    Groh, Nicole; Bühler, Anika; Huang, Chaolie; Li, Ka Wan; van Nierop, Pim; Smit, August B.; Fändrich, Marcus; Baumann, Frank; David, Della C.

    2017-01-01

    Aging is the most important risk factor for neurodegenerative diseases associated with pathological protein aggregation such as Alzheimer’s disease. Although aging is an important player, it remains unknown which molecular changes are relevant for disease initiation. Recently, it has become apparent that widespread protein aggregation is a common feature of aging. Indeed, several studies demonstrate that 100s of proteins become highly insoluble with age, in the absence of obvious disease processes. Yet it remains unclear how these misfolded proteins aggregating with age affect neurodegenerative diseases. Importantly, several of these aggregation-prone proteins are found as minor components in disease-associated hallmark aggregates such as amyloid-β plaques or neurofibrillary tangles. This co-localization raises the possibility that age-dependent protein aggregation directly contributes to pathological aggregation. Here, we show for the first time that highly insoluble proteins from aged Caenorhabditis elegans or aged mouse brains, but not from young individuals, can initiate amyloid-β aggregation in vitro. We tested the seeding potential at four different ages across the adult lifespan of C. elegans. Significantly, protein aggregates formed during the early stages of aging did not act as seeds for amyloid-β aggregation. Instead, we found that changes in protein aggregation occurring during middle-age initiated amyloid-β aggregation. Mass spectrometry analysis revealed several late-aggregating proteins that were previously identified as minor components of amyloid-β plaques and neurofibrillary tangles such as 14-3-3, Ubiquitin-like modifier-activating enzyme 1 and Lamin A/C, highlighting these as strong candidates for cross-seeding. Overall, we demonstrate that widespread protein misfolding and aggregation with age could be critical for the initiation of pathogenesis, and thus should be targeted by therapeutic strategies to alleviate neurodegenerative

  19. Development of an Online Toolkit for Measuring Performance in Health Emergency Response Exercises.

    PubMed

    Agboola, Foluso; Bernard, Dorothy; Savoia, Elena; Biddinger, Paul D

    2015-10-01

    Exercises that simulate emergency scenarios are accepted widely as an essential component of a robust Emergency Preparedness program. Unfortunately, the variability in the quality of the exercises conducted, and the lack of standardized processes to measure performance, has limited the value of exercises in measuring preparedness. In order to help health organizations improve the quality and standardization of the performance data they collect during simulated emergencies, a model online exercise evaluation toolkit was developed using performance measures tested in over 60 Emergency Preparedness exercises. The exercise evaluation toolkit contains three major components: (1) a database of measures that can be used to assess performance during an emergency response exercise; (2) a standardized data collection tool (form); and (3) a program that populates the data collection tool with the measures that have been selected by the user from the database. The evaluation toolkit was pilot tested from January through September 2014 in collaboration with 14 partnering organizations representing 10 public health agencies and four health care agencies from eight states across the US. Exercise planners from the partnering organizations were asked to use the toolkit for their exercise evaluation process and were interviewed to provide feedback on the use of the toolkit, the generated evaluation tool, and the usefulness of the data being gathered for the development of the exercise after-action report. Ninety-three percent (93%) of exercise planners reported that they found the online database of performance measures appropriate for the creation of exercise evaluation forms, and they stated that they would use it again for future exercises. Seventy-two percent (72%) liked the exercise evaluation form that was generated from the toolkit, and 93% reported that the data collected by the use of the evaluation form were useful in gauging their organization's performance during the

  20. PsyToolkit: a software package for programming psychological experiments using Linux.

    PubMed

    Stoet, Gijsbert

    2010-11-01

    PsyToolkit is a set of software tools for programming psychological experiments on Linux computers. Given that PsyToolkit is freely available under the GNU General Public License, open source, and designed such that it can easily be modified and extended for individual needs, it is suitable not only for technically oriented Linux users, but also for students, researchers on small budgets, and universities in developing countries. The software includes a high-level scripting language, a library for the programming language C, and a questionnaire presenter. The software easily integrates with other open source tools, such as the statistical software package R. PsyToolkit is designed to work with external hardware (including IoLab and Cedrus response keyboards and two common digital input/output boards) and to support millisecond timing precision. Four in-depth examples explain the basic functionality of PsyToolkit. Example 1 demonstrates a stimulus-response compatibility experiment. Example 2 demonstrates a novel mouse-controlled visual search experiment. Example 3 shows how to control light-emitting diodes using PsyToolkit, and Example 4 shows how to build a light-detection sensor. The last two examples explain the electronic hardware setup such that they can even be used with other software packages.
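
    The logic of a stimulus-response compatibility experiment like Example 1 can be sketched as trial-list generation. The sketch is written in Python rather than PsyToolkit's own scripting language, and the condition names are illustrative.

```python
import random

# Logic sketch of a stimulus-response compatibility trial list, written in
# Python rather than PsyToolkit's scripting language. Names are illustrative.

def make_trials(n, seed=0):
    """Build n trials: a stimulus appears left or right; on 'compatible'
    trials the correct key is on the same side, on 'incompatible' trials
    it is on the opposite side."""
    rng = random.Random(seed)
    opposite = {"left": "right", "right": "left"}
    trials = []
    for _ in range(n):
        side = rng.choice(["left", "right"])
        condition = rng.choice(["compatible", "incompatible"])
        key = side if condition == "compatible" else opposite[side]
        trials.append({"stimulus": side, "condition": condition,
                       "correct_key": key})
    return trials

trials = make_trials(40)
```

    In the real experiment each trial would be presented with millisecond-precision timing and the response latency recorded; only the condition bookkeeping is shown here.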

  1. The development of an artificial organic networks toolkit for LabVIEW.

    PubMed

    Ponce, Hiram; Ponce, Pedro; Molina, Arturo

    2015-03-15

    Two of the most challenging problems that scientists and researchers face when they want to experiment with new cutting-edge algorithms are the time consumed in encoding them and the difficulty of linking them with other technologies and devices. In that sense, this article introduces the artificial organic networks toolkit for LabVIEW™ (AON-TL) from the implementation point of view. The toolkit is based on the framework provided by the artificial organic networks technique, giving it the potential to add new algorithms in the future based on this technique. Moreover, the toolkit inherits both the rapid prototyping and the easy-to-use characteristics of the LabVIEW™ software (e.g., graphical programming, transparent usage of other software and devices, built-in event-driven programming for user interfaces), to make it simple for the end-user. In fact, the article describes the global architecture of the toolkit, with particular emphasis on the software implementation of the so-called artificial hydrocarbon networks algorithm. Lastly, the article includes two case studies for engineering purposes (i.e., sensor characterization) and chemistry applications (i.e., blood-brain barrier partitioning data model) to show the usage of the toolkit and the potential scalability of the artificial organic networks technique.

  2. New Careers in Nursing Scholar Alumni Toolkit: Development of an Innovative Resource for Transition to Practice.

    PubMed

    Mauro, Ann Marie P; Escallier, Lori A; Rosario-Sim, Maria G

    2016-01-01

    The transition from student to professional nurse is challenging and may be more difficult for underrepresented minority nurses. The Robert Wood Johnson Foundation New Careers in Nursing (NCIN) program supported development of a toolkit that would serve as a transition-to-practice resource to promote retention of NCIN alumni and other new nurses. Thirteen recent NCIN alumni (54% male, 23% Hispanic/Latino, 23% African Americans) from 3 schools gave preliminary content feedback. An e-mail survey was sent to a convenience sample of 29 recent NCIN alumni who evaluated the draft toolkit using a Likert scale (poor = 1; excellent = 5). Twenty NCIN alumni draft toolkit reviewers (response rate 69%) were primarily female (80%) and Hispanic/Latino (40%). Individual chapters' mean overall rating of 4.67 demonstrated strong validation. Mean scores for overall toolkit content (4.57), usability (4.5), relevance (4.79), and quality (4.71) were also excellent. Qualitative comments were analyzed using thematic content analysis and supported the toolkit's relevance and utility. A multilevel peer review process was also conducted. Peer reviewer feedback resulted in a 6-chapter document that offers resources for successful transition to practice and lays the groundwork for continued professional growth. Future research is needed to determine the ideal time to introduce this resource. Copyright © 2016 Elsevier Inc. All rights reserved.

  3. The medical exploration toolkit: an efficient support for visual computing in surgical planning and training.

    PubMed

    Mühler, Konrad; Tietjen, Christian; Ritter, Felix; Preim, Bernhard

    2010-01-01

    Application development is often guided by the usage of software libraries and toolkits. For medical applications, the toolkits currently available focus on image analysis and volume rendering. Advanced interactive visualizations and user interface issues are not adequately supported. Hence, we present a toolkit for application development in the field of medical intervention planning, training, and presentation--the MEDICALEXPLORATIONTOOLKIT (METK). The METK is based on the rapid prototyping platform MeVisLab and offers a large variety of facilities for an easy and efficient application development process. We present dedicated techniques for advanced medical visualizations, exploration, standardized documentation, and interface widgets for common tasks. These include, e.g., advanced animation facilities, viewpoint selection, several illustrative rendering techniques, and new techniques for object selection in 3D surface models. No extended programming skills are needed for application building, since a graphical programming approach can be used. The toolkit is freely available and well documented to facilitate its use and extension.

  4. Educational RIS/PACS simulator integrated with the HIPAA compliant auditing (HCA) toolkit

    NASA Astrophysics Data System (ADS)

    Zhou, Zheng; Liu, Brent J.; Huang, H. K.; Zhang, J.

    2005-04-01

    Health Insurance Portability and Accountability Act (HIPAA), a guideline for healthcare privacy and security, has been officially instituted recently. HIPAA mandates that healthcare providers follow its privacy and security rules, one of which is to have the ability to generate audit trails on the data access for any specific patient on demand. Although most current medical imaging systems such as PACS utilize logs to record their activities, there is a lack of formal methodology to interpret these large volumes of log data and generate HIPAA compliant audit trails. In this paper, we present a HIPAA compliant auditing (HCA) toolkit for auditing the image data flow of PACS. The toolkit can extract pertinent auditing information from the logs of various PACS components and store the information in a centralized auditing database. The HIPAA compliant audit trails can be generated based on the database, which can also be utilized for data analysis to facilitate the dynamic monitoring of the data flow of PACS. In order to demonstrate the HCA toolkit in a PACS environment, it was integrated with the PACS Simulator, which was presented as an educational tool at SPIE in 2003 and 2004. With the integration of the HCA toolkit with the PACS Simulator, users can learn HIPAA audit concepts and how to generate audit trails of image data access in PACS, as well as trace the image data flow of PACS Simulator through the toolkit.

  5. SatelliteDL - An IDL Toolkit for the Analysis of Satellite Earth Observations - GOES, MODIS, VIIRS and CERES

    NASA Astrophysics Data System (ADS)

    Fillmore, D. W.; Galloy, M. D.; Kindig, D.

    2013-12-01

    SatelliteDL is an IDL toolkit for the analysis of satellite Earth observations from a diverse set of platforms and sensors. The design features an abstraction layer that allows for easy inclusion of new datasets in a modular way. The core function of the toolkit is the spatial and temporal alignment of satellite swath and geostationary data. IDL has a powerful suite of statistical and visualization tools that can be used in conjunction with SatelliteDL. Our overarching objective is to create utilities that automate the mundane aspects of satellite data analysis, are extensible and maintainable, and do not place limitations on the analysis itself. Toward this end we have constructed SatelliteDL to include (1) HTML and LaTeX API document generation, (2) a unit test framework, (3) automatic message and error logs, (4) HTML and LaTeX plot and table generation, and (5) several real world examples with bundled datasets available for download. For ease of use, datasets, variables and optional workflows may be specified in a flexible format configuration file. Configuration statements may specify, for example, a region and date range, and the creation of images, plots and statistical summary tables for a long list of variables. SatelliteDL enforces data provenance; all data should be traceable and reproducible. The output NetCDF file metadata holds a complete history of the original datasets and their transformations, and a method exists to reconstruct a configuration file from this information. Release 0.1.0 of SatelliteDL is anticipated for the 2013 Fall AGU conference. It will be distributed with ingest methods for GOES, MODIS, VIIRS and CERES radiance data (L1) as well as select 2D atmosphere products (L2) such as aerosol and cloud (MODIS and VIIRS) and radiant flux (CERES). 
Future releases will provide ingest methods for ocean and land surface products, gridded and time averaged datasets (L3 Daily, Monthly and Yearly), and support for 3D products such as temperature and
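The abstract describes a declarative configuration file naming a region, a date range, variables, and requested outputs, but not its syntax. As a loose sketch only (in Python rather than the toolkit's IDL, with every key name invented rather than taken from SatelliteDL), such a configuration and a minimal up-front validation step might look like:

```python
# Hypothetical configuration in the spirit of the abstract's description.
# All key names are invented for illustration; the real format may differ.
from datetime import date

config = {
    "region": {"lat": (30.0, 50.0), "lon": (-110.0, -90.0)},
    "dates": (date(2013, 6, 1), date(2013, 6, 30)),
    "variables": ["cloud_fraction", "aerosol_optical_depth"],
    "outputs": ["images", "plots", "summary_table"],
}

def validate(cfg):
    """Sanity-check a configuration before a (hypothetical) analysis run."""
    lat_lo, lat_hi = cfg["region"]["lat"]
    lon_lo, lon_hi = cfg["region"]["lon"]
    start, end = cfg["dates"]
    assert -90 <= lat_lo < lat_hi <= 90, "latitude bounds out of order"
    assert -180 <= lon_lo < lon_hi <= 180, "longitude bounds out of order"
    assert start <= end, "date range reversed"
    assert cfg["variables"], "at least one variable is required"
    return True
```

A run driven entirely by such a declarative configuration is easy to log and replay, which fits the abstract's emphasis on provenance and on reconstructing the configuration from output metadata.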

  6. Parameter Sweep and Optimization of Loosely Coupled Simulations Using the DAKOTA Toolkit

    SciTech Connect

    Elwasif, Wael R; Bernholdt, David E; Pannala, Sreekanth; Allu, Srikanth; Foley, Samantha S

    2012-01-01

    The increasing availability of large scale computing capabilities has accelerated the development of high-fidelity coupled simulations. Such simulations typically involve the integration of models that implement various aspects of the complex phenomena under investigation. Coupled simulations are playing an integral role in fields such as climate modeling, earth systems modeling, rocket simulations, computational chemistry, fusion research, and many other computational fields. Model coupling provides scientists with systematic ways to virtually explore the physical, mathematical, and computational aspects of the problem. Such exploration is rarely done using a single execution of a simulation, but rather by aggregating the results from many simulation runs that, together, serve to bring to light novel knowledge about the system under investigation. Furthermore, it is often the case (particularly in engineering disciplines) that the study of the underlying system takes the form of an optimization regime, where the control parameter space is explored to optimize an objective function that captures system realizability, cost, performance, or a combination thereof. Novel and flexible frameworks that facilitate the integration of the disparate models into a holistic simulation are used to perform this research, while making efficient use of the available computational resources. In this paper, we describe the integration of the DAKOTA optimization and parameter sweep toolkit with the Integrated Plasma Simulator (IPS), a component-based framework for loosely coupled simulations. The integration allows DAKOTA to exploit the internal task and resource management of the IPS to dynamically instantiate simulation instances within a single IPS instance, allowing for greater control over the trade-off between efficiency of resource utilization and time to completion. 
We present a case study showing the use of the combined DAKOTA-IPS system to aid in the design of a lithium ion
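The optimization regime described above, exploring a control parameter space and aggregating per-run objective values, can be sketched generically. This is not DAKOTA's or the IPS's API; the simulate function below is an invented stand-in for one coupled simulation run:

```python
# Generic parameter sweep over a stand-in "simulation", collecting the
# objective value of every run and reporting the best point found.
# A real setup would dispatch each run to the coupled-simulation framework.
from itertools import product

def simulate(thickness, porosity):
    # Hypothetical objective for one run (smaller is better).
    return (thickness - 0.3) ** 2 + (porosity - 0.5) ** 2

def sweep(thicknesses, porosities):
    """Evaluate every parameter combination; return the best point and value."""
    results = {(t, p): simulate(t, p)
               for t, p in product(thicknesses, porosities)}
    best = min(results, key=results.get)
    return best, results[best]

best_point, best_value = sweep([0.1, 0.2, 0.3, 0.4], [0.4, 0.5, 0.6])
```

Because each run is independent, this loop is what frameworks like the IPS parallelize: the sweep enumerates points while the framework schedules the runs onto available resources.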

  7. SMART: Soil Moisture and Runoff Toolkit for Semi-distributed Hydrologic Modeling

    NASA Astrophysics Data System (ADS)

    Ajami, H.; Khan, U.; Tuteja, N. K.; Sharma, A.

    2015-12-01

    A new GIS based semi-distributed hydrologic modeling framework is developed for simulating runoff, evapotranspiration and soil moisture at large catchment scale. The framework is based upon the delineation of contiguous and topologically connected Hydrologic Response Units (HRUs). The HRU delineation methodology consists of delineating first order sub-basins and landforms. To reduce the number of computational elements, simulations are performed across a series of cross sections or equivalent cross sections (ECS) in each first order sub-basin using a 2-dimensional, Richards' equation based distributed hydrological model. Delineation of ECSs is performed by weighting the topographic and physiographic properties of part or all of the first-order sub-basins and has the advantage of reducing the computational time while maintaining reasonable accuracy in simulated hydrologic fluxes. Simulated fluxes from every cross section or ECS are weighted by the respective area from which the cross sections or ECSs were formulated and then aggregated to obtain the catchment scale fluxes. SMART pre- and post-processing scripts are written in MATLAB to automate the cross section delineation, model simulations across multiple cross sections, and post-processing of outputs for visualization. The MATLAB Parallel Processing Toolbox is used for simultaneous simulations of cross sections, further reducing computational time. The SMART pre-processing workflow consists of the following steps: 1) delineation of first order sub-basins using a digital elevation model, 2) hillslope delineation, 3) landform delineation in every first order sub-basin based on topographic and geomorphic properties of a group of sub-basins or the entire catchment, 4) formulation of cross sections as well as ECSs and 5) extraction of vegetation and soil parameters using spatially distributed land cover and soil information for the 2-d distributed hydrological model. 
The post-processing tools generate streamflow at the
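The area-weighted aggregation of per-cross-section fluxes described above reduces to a weighted mean. A minimal sketch (in Python rather than the toolkit's MATLAB, with invented names):

```python
def catchment_flux(fluxes, areas):
    """Aggregate per-cross-section fluxes into a catchment-scale flux,
    weighting each flux by the area its cross section represents."""
    assert len(fluxes) == len(areas) and areas, "one area per flux required"
    total_area = sum(areas)
    return sum(f * a for f, a in zip(fluxes, areas)) / total_area

# Two cross sections with equal representative areas average evenly:
flux = catchment_flux([1.0, 3.0], [2.0, 2.0])  # 2.0
```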

  8. 43 CFR 4.1381 - Who may file; when to file; where to file.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    43 Public Lands: Interior, 1 (2010-10-01). § 4.1381 Who may file; when to file; where to file. (a) Any person who receives a written decision issued by OSM under 30 CFR 773.28 on a challenge to an ownership or control listing or finding may file a request...

  9. Non-Arrhenius protein aggregation.

    PubMed

    Wang, Wei; Roberts, Christopher J

    2013-07-01

    Protein aggregation presents one of the key challenges in the development of protein biotherapeutics. It affects not only product quality but also potentially impacts safety, as protein aggregates have been shown to be linked with cytotoxicity and patient immunogenicity. Therefore, investigations of protein aggregation remain a major focus in pharmaceutical companies and academic institutions. Due to the complexity of the aggregation process and temperature-dependent conformational stability, temperature-induced protein aggregation is often non-Arrhenius over even relatively small temperature windows relevant for product development, and this makes low-temperature extrapolation difficult based simply on accelerated stability studies at high temperatures. This review discusses the non-Arrhenius nature of the temperature dependence of protein aggregation, explores possible causes, and considers inherent hurdles for accurately extrapolating aggregation rates from conventional industrial approaches for selecting accelerated conditions and from conventional or more advanced methods of analyzing the resulting rate data.
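The extrapolation problem discussed above can be made concrete with the Arrhenius law itself, ln k = ln A − Ea/(RT). The sketch below fits that two-parameter line to two accelerated-stability rates and extrapolates to a storage temperature; if the effective activation energy changes with temperature (the non-Arrhenius behavior this review addresses), the true low-temperature rate can differ greatly from the extrapolated value. All rate values here are invented for illustration:

```python
import math

R = 8.314  # gas constant, J/(mol K)

def arrhenius_fit(T1, k1, T2, k2):
    """Solve ln k = ln A - Ea/(R T) through two (temperature, rate) points."""
    Ea = R * math.log(k1 / k2) / (1.0 / T2 - 1.0 / T1)
    ln_A = math.log(k1) + Ea / (R * T1)
    return ln_A, Ea

def arrhenius_rate(ln_A, Ea, T):
    """Rate predicted by the fitted Arrhenius line at temperature T (K)."""
    return math.exp(ln_A - Ea / (R * T))

# Hypothetical aggregation rates (per day) measured at 50 C and 40 C:
ln_A, Ea = arrhenius_fit(323.15, 1e-2, 313.15, 2e-3)
# Extrapolated rate at 5 C storage; non-Arrhenius kinetics break this step.
k_extrapolated = arrhenius_rate(ln_A, Ea, 278.15)
```

The fit itself is exact through the two high-temperature points; the review's point is that the last line, the long extrapolation, is the step that fails when the apparent activation energy is itself temperature dependent.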

  10. Ultrastructure of acetylcholine receptor aggregates parallels mechanisms of aggregation

    PubMed Central

    Kunkel, Dennis D; Lee, Lara K; Stollberg, Jes

    2001-01-01

    Background Acetylcholine receptors become aggregated at the developing neuromuscular synapse shortly after contact by a motorneuron in one of the earliest manifestations of synaptic development. While a major physiological signal for receptor aggregation (agrin) is known, the mechanism(s) by which muscle cells respond to this and other stimuli have yet to be worked out in detail. The question of mechanism is addressed in the present study via a quantitative examination of ultrastructural receptor arrangement within aggregates. Results In receptor rich cell membranes resulting from stimulation by agrin or laminin, or in control membrane showing spontaneous receptor aggregation, receptors were found to be closer to neighboring receptors than would be expected at random. This indicates that aggregation proceeds heterogeneously: nanoaggregates, too small for detection in the light microscope, underlie developing microaggregates of receptors in all three cases. In contrast, the structural arrangement of receptors within nanoaggregates was found to depend on the aggregation stimulus. In laminin induced nanoaggregates receptors were found to be arranged in an unstructured manner, in contrast to the hexagonal array of about 10 nm spacing found for agrin induced nanoaggregates. Spontaneous aggregates displayed an intermediate amount of order, and this was found to be due to two distinct populations of nanoaggregates. Conclusions The observations support earlier studies indicating that the mechanisms by which agrin and laminin-1 induced receptor aggregates form are distinct and, for the first time, relate mechanisms underlying spontaneous aggregate formation to aggregate structure. PMID:11749670

  11. BioWarehouse: a bioinformatics database warehouse toolkit

    PubMed Central

    Lee, Thomas J; Pouliot, Yannick; Wagner, Valerie; Gupta, Priyanka; Stringer-Calvert, David WJ; Tenenbaum, Jessica D; Karp, Peter D

    2006-01-01

    Background This article addresses the problem of interoperation of heterogeneous bioinformatics databases. Results We introduce BioWarehouse, an open source toolkit for constructing bioinformatics database warehouses using the MySQL and Oracle relational database managers. BioWarehouse integrates its component databases into a common representational framework within a single database management system, enabling multi-database queries using the Structured Query Language (SQL) while also facilitating a variety of database integration tasks such as comparative analysis and data mining. BioWarehouse currently supports the integration of a pathway-centric set of databases including ENZYME, KEGG, and BioCyc, and in addition the UniProt, GenBank, NCBI Taxonomy, and CMR databases, and the Gene Ontology. Loader tools, written in C and Java, parse and load these databases into a relational database schema. The loaders also apply a degree of semantic normalization to their respective source data, decreasing semantic heterogeneity. The schema supports the following bioinformatics datatypes: chemical compounds, biochemical reactions, metabolic pathways, proteins, genes, nucleic acid sequences, features on protein and nucleic-acid sequences, organisms, organism taxonomies, and controlled vocabularies. As an application example, we applied BioWarehouse to determine the fraction of biochemically characterized enzyme activities for which no sequences exist in the public sequence databases. The answer is that no sequence exists for 36% of enzyme activities for which EC numbers have been assigned. These gaps in sequence data significantly limit the accuracy of genome annotation and metabolic pathway prediction, and are a barrier for metabolic engineering. Complex queries of this type provide examples of the value of the data warehousing approach to bioinformatics research. 
Conclusion BioWarehouse embodies significant progress on the database integration problem for
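The example query above, which enzyme activities lack any sequence, becomes a plain SQL anti-join once the sources share one schema, which is the point of the warehouse approach. The toy sketch below uses in-memory SQLite and invented table names; BioWarehouse's actual schema, on MySQL or Oracle, is far richer:

```python
import sqlite3

# Invented miniature schema: one table of enzyme activities (EC numbers),
# one of protein sequences annotated with an EC number.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE enzyme_activity (ec_number TEXT PRIMARY KEY);
    CREATE TABLE protein_sequence (id INTEGER PRIMARY KEY, ec_number TEXT);
    INSERT INTO enzyme_activity VALUES ('1.1.1.1'), ('2.7.1.1'), ('4.2.1.20');
    INSERT INTO protein_sequence (ec_number) VALUES ('1.1.1.1'), ('2.7.1.1');
""")

# Anti-join: count activities with no matching sequence record.
missing = db.execute("""
    SELECT COUNT(*) FROM enzyme_activity ea
    WHERE NOT EXISTS (
        SELECT 1 FROM protein_sequence ps WHERE ps.ec_number = ea.ec_number
    )
""").fetchone()[0]
```

Without a shared schema, the same question would require per-database parsers and ad hoc joining code, which is exactly the heterogeneity the loaders are described as normalizing away.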

  12. Managing demand for laboratory tests: a laboratory toolkit.

    PubMed

    Fryer, Anthony A; Smellie, W Stuart A

    2013-01-01

    Healthcare budgets worldwide are facing increasing pressure to reduce costs and improve efficiency, while maintaining quality. Laboratory testing has not escaped this pressure, particularly since pathology investigations cost the National Health Service £2.5 billion per year. Indeed, the Carter Review, a UK Department of Health-commissioned review of pathology services in England, estimated that 20% of this could be saved by improving pathology services, despite an average annual increase of 8%-10% in workload. One area of increasing importance is managing the demands for pathology tests and reducing inappropriate requesting. The Carter Review estimated that 25% of pathology tests were unnecessary, representing a huge potential waste. Certainly, the large variability in levels of requesting between general practitioners suggests that inappropriate requesting is widespread. Unlocking the key to this variation and implementing measures to reduce inappropriate requesting would have major implications for patients and healthcare resources alike. This article reviews the approaches to demand management. Specifically, it aims to (a) define demand management and inappropriate requesting, (b) assess the drivers for demand management, (c) examine the various approaches used, illustrating the potential of electronic requesting and (d) provide a wider context. It will cover issues, such as educational approaches, information technology opportunities and challenges, vetting, duplicate request identification and management, the role of key performance indicators, profile composition and assessment of downstream impact of inappropriate requesting. Currently, many laboratories are exploring demand management using a plethora of disparate approaches. Hence, this review seeks to provide a 'toolkit' with the view to allowing laboratories to develop a standardised demand management strategy.

  13. Cal-Adapt: California's Climate Data Resource and Interactive Toolkit

    NASA Astrophysics Data System (ADS)

    Thomas, N.; Mukhtyar, S.; Wilhelm, S.; Galey, B.; Lehmer, E.

    2016-12-01

    Cal-Adapt is a web-based application that provides an interactive toolkit and information clearinghouse to help agencies, communities, local planners, resource managers, and the public understand climate change risks and impacts at the local level. The website offers interactive, visually compelling, and useful data visualization tools that show how climate change might affect California using downscaled continental climate data. Cal-Adapt is supporting California's Fourth Climate Change Assessment through providing access to the wealth of modeled and observed data and adaptation-related information produced by California's scientific community. The site has been developed by UC Berkeley's Geospatial Innovation Facility (GIF) in collaboration with the California Energy Commission's (CEC) Research Program. The Cal-Adapt website allows decision makers, scientists and residents of California to turn research results and climate projections into effective adaptation decisions and policies. Since its release to the public in June 2011, Cal-Adapt has been visited by more than 94,000 unique visitors from over 180 countries, all 50 U.S. states, and 689 California localities. We will present several key visualizations that have been employed by Cal-Adapt's users to support their efforts to understand local impacts of climate change, indicate the breadth of data available, and delineate specific use cases. Recently, CEC and GIF have been developing and releasing Cal-Adapt 2.0, which includes updates and enhancements that are increasing its ease of use, information value, visualization tools, and data accessibility. We showcase how Cal-Adapt is evolving in response to feedback from a variety of sources to present finer-resolution downscaled data, and offer an open API that allows other organizations to access Cal-Adapt climate data and build domain-specific visualization and planning tools. 
Through a combination of locally relevant information, visualization tools, and access to

  14. Mission Analysis, Operations, and Navigation Toolkit Environment (Monte) Version 040

    NASA Technical Reports Server (NTRS)

    Sunseri, Richard F.; Wu, Hsi-Cheng; Evans, Scott E.; Evans, James R.; Drain, Theodore R.; Guevara, Michelle M.

    2012-01-01

    Monte is a software set designed for use in mission design and spacecraft navigation operations. The system can process measurement data, design optimal trajectories and maneuvers, and do orbit determination, all in one application. For the first time, a single software set can be used for mission design and navigation operations. This eliminates problems due to different models and fidelities used in legacy mission design and navigation software. The unique features of Monte 040 include a blowdown thruster model for GRAIL (Gravity Recovery and Interior Laboratory) with associated pressure models, as well as an updated, optimal-search capability (COSMIC) that facilitated mission design for ARTEMIS. Existing legacy software lacked the capabilities necessary for these two missions. There is also a mean orbital element propagator and an osculating to mean element converter that allows long-term orbital stability analysis for the first time in compiled code. The optimized trajectory search tool COSMIC allows users to place constraints and controls on their searches without any restrictions. Constraints may be user-defined and depend on trajectory information either forward or backwards in time. In addition, a long-term orbit stability analysis tool (morbiter) existed previously as a set of scripts on top of Monte. Monte is becoming the primary tool for navigation operations, a core competency at JPL. The mission design capabilities in Monte are becoming mature enough for use in project proposals as well as post-phase A mission design. Monte has three distinct advantages over existing software. First, it is being developed in a modern paradigm: object-oriented C++ and Python. Second, the software has been developed as a toolkit, which allows users to customize their own applications and allows the development team to implement requirements quickly, efficiently, and with minimal bugs. 
Finally, the software is managed in accordance with the CMMI (Capability Maturity Model

  15. GATE: a simulation toolkit for PET and SPECT.

    PubMed

    Jan, S; Santin, G; Strul, D; Staelens, S; Assié, K; Autret, D; Avner, S; Barbier, R; Bardiès, M; Bloomfield, P M; Brasse, D; Breton, V; Bruyndonckx, P; Buvat, I; Chatziioannou, A F; Choi, Y; Chung, Y H; Comtat, C; Donnarieix, D; Ferrer, L; Glick, S J; Groiselle, C J; Guez, D; Honore, P F; Kerhoas-Cavata, S; Kirov, A S; Kohli, V; Koole, M; Krieguer, M; van der Laan, D J; Lamare, F; Largeron, G; Lartizien, C; Lazaro, D; Maas, M C; Maigne, L; Mayet, F; Melot, F; Merheb, C; Pennacchio, E; Perez, J; Pietrzyk, U; Rannou, F R; Rey, M; Schaart, D R; Schmidtlein, C R; Simon, L; Song, T Y; Vieira, J M; Visvikis, D; Van de Walle, R; Wieërs, E; Morel, C

    2004-10-07

    Monte Carlo simulation is an essential tool in emission tomography that can assist in the design of new medical imaging devices, the optimization of acquisition protocols and the development or assessment of image reconstruction algorithms and correction techniques. GATE, the Geant4 Application for Tomographic Emission, encapsulates the Geant4 libraries to achieve a modular, versatile, scripted simulation toolkit adapted to the field of nuclear medicine. In particular, GATE allows the description of time-dependent phenomena such as source or detector movement, and source decay kinetics. This feature makes it possible to simulate time curves under realistic acquisition conditions and to test dynamic reconstruction algorithms. This paper gives a detailed description of the design and development of GATE by the OpenGATE collaboration, whose continuing objective is to improve, document and validate GATE by simulating commercially available imaging systems for PET and SPECT. A large effort is also invested in the ability and the flexibility to model novel detection systems or systems still under design. A public release of GATE licensed under the GNU Lesser General Public License can be downloaded at http://www-lphe.epfl.ch/GATE/. Two benchmarks developed for PET and SPECT to test the installation of GATE and to serve as a tutorial for the users are presented. Extensive validation of the GATE simulation platform has been started, comparing simulations and measurements on commercially available acquisition systems. References to those results are listed. The future prospects towards the gridification of GATE and its extension to other domains such as dosimetry are also discussed.

  16. A toolkit to assess Medical Reserve Corps units' performance.

    PubMed

    Savoia, Elena; Massin-Short, Sarah; Higdon, Melissa Ann; Tallon, Lindsay; Matechi, Emmanuel; Stoto, Michael A

    2010-10-01

    The Medical Reserve Corps (MRC) is a national network of community-based units created to promote the local identification, recruitment, training, and activation of volunteers to assist local health departments in public health activities. This study aimed to develop a toolkit for MRC coordinators to assess and monitor volunteer units' performance and identify barriers limiting volunteerism. In 2008 and 2009, MRC volunteers asked to participate in influenza clinics were surveyed in 7 different locations throughout the United States. Two survey instruments were used to assess the performance of the volunteers who were able to participate, the specific barriers that prevented some volunteers from participating, and the overall attitudes of those who participated and those who did not. Validity and reliability of the instruments were assessed through the use of factor analysis and Cronbach's alpha. Two survey instruments were developed: the Volunteer Self-Assessment Questionnaire and the Barriers to Volunteering Questionnaire. Data were collected from a total of 1059 subjects, 758 participated in the influenza clinics and 301 were unable to attend. Data from the 2 instruments were determined to be suitable for factor analysis. Factor solutions and inter-item correlations supported the hypothesized domain structure for both survey questionnaires. Results on volunteers' performance were consistent with observations of both local health departments' staff and external observers. The survey instruments developed for this study appear to be valid and reliable means to assess the performance and attitudes of MRC volunteers and barriers to their participation. This study found these instruments to have face and content validity and practicality. MRC coordinators can use these questionnaires to monitor their ability to engage volunteers in public health activities.
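Reliability in the study above is assessed with Cronbach's alpha, α = k/(k−1) · (1 − Σσᵢ² / σ²_total) over k questionnaire items. A minimal sketch of that statistic, with made-up scores rather than the study's data:

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha; items is a list of equal-length per-item score lists."""
    k = len(items)
    # Per-respondent total score across all items:
    totals = [sum(scores) for scores in zip(*items)]
    item_variance = sum(pvariance(scores) for scores in items)
    return (k / (k - 1)) * (1 - item_variance / pvariance(totals))

# Two perfectly consistent items yield alpha = 1.0:
alpha = cronbach_alpha([[1, 2, 3], [1, 2, 3]])
```

Values near 1 indicate high internal consistency among items, which is what supports treating a questionnaire's items as measuring a single construct.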

  17. The influence of erythrocyte aggregation on induced platelet aggregation.

    PubMed

    Ott, C; Lardi, E; Schulzki, T; Reinhart, W H

    2010-01-01

    Red blood cells (RBCs) affect platelet aggregation in flowing blood (primary hemostasis). We tested the hypothesis that RBC aggregation could influence platelet aggregation. RBC aggregation was altered in vitro by: (i) changing plasma aggregatory properties with 3.7 g% dextran 40 (D40), 3.0 g% dextran 70 (D70) or 1.55 g% dextran 500 (D500); (ii) changing RBC aggregatory properties by incubating RBCs in 50 mU/ml neuraminidase for 60 min (reduction of the surface sialic acid content, thus reducing electrostatic repulsion) and subsequent RBC resuspension in platelet rich plasma (PRP) containing 1 g% dextran 70. RBC aggregation was assessed with the sedimentation rate (ESR). Platelet aggregation was measured: (i) in flowing whole blood with a platelet function analyzer PFA-100(R), which simulates in vivo conditions with RBCs flowing in the center and platelets along the wall, where they adhere to collagen and aggregate; and (ii) in a Chrono-log 700 Aggregometer, which measures changes of impedance by platelet aggregation in whole blood or changes in light transmission in PRP. We found that RBC aggregation increased with increasing molecular weight of dextran (ESR: 4 +/- 3 mm/h, 34 +/- 14 mm/h and 89 +/- 23 mm/h for D40, D70 and D500, respectively, p < 0.0001) and with neuraminidase-treated RBCs (76 +/- 27 mm/h vs 27 +/- 8 mm/h, respectively, p < 0.0001). Platelet aggregation measured in whole blood under flow conditions (PFA-100) and without flow (Chrono-log Aggregometer) was not affected by RBC aggregation. Our data suggest that RBC aggregation does not affect platelet aggregation in vitro and plays no role in primary hemostasis.

  18. Making Graphene Resist Aggregation

    NASA Astrophysics Data System (ADS)

    Luo, Jiayan

    Graphene-based sheets have stimulated great interest in many scientific disciplines and shown promise for wide potential applications. Among various ways of creating single atomic layer carbon sheets, a promising route for bulk production is to first chemically exfoliate graphite powders to graphene oxide (GO) sheets, followed by reduction to form chemically modified graphene (CMG). Due to the strong van der Waals attraction between graphene sheets, CMG tends to aggregate. The restacking of sheets is largely uncontrollable and irreversible, thus it reduces their processability and compromises properties such as accessible surface area. Strategies based on colloidal chemistry have been applied to keep CMG dispersed in solvents by introducing electrostatic repulsion to overcome the van der Waals attraction or adding spacers to increase the inter-sheet spacing. In this dissertation, two very different ideas that can prevent CMG aggregation without extensively modifying the material or introducing foreign spacer materials are introduced. The van der Waals potential decreases with reduced overlapping area between sheets. For CMG, reducing the lateral dimension from micrometer to nanometer scale should greatly enhance their colloidal stability with additional advantages of increased charge density and decreased probability to interact. The enhanced colloidal stability of GO and CMG nanocolloids makes them especially promising for spectroscopy based bio-sensing applications. For potential applications in a compact bulk solid form, the sheets were converted into paper-ball like structure using capillary compression in evaporating aerosol droplets. The crumpled graphene balls are stabilized by locally folded pi-pi stacked ridges, and do not unfold or collapse during common processing steps. They can tightly pack without greatly reducing the surface area. This form of graphene leads to scalable performance in energy storage. 
For example, planar sheets tend to aggregate and

  19. Structure of Viral Aggregates

    NASA Astrophysics Data System (ADS)

    Barr, Stephen; Luijten, Erik

    2010-03-01

    The aggregation of virus particles is a particular form of colloidal self-assembly, since viruses of a given type are monodisperse and have identical, anisotropic surface charge distributions. In small-angle X-ray scattering experiments, the Qbeta virus was found to organize into different crystal structures in the presence of divalent salt and non-adsorbing polymer. Since a simple isotropic potential cannot explain the occurrence of all observed phases, we employ computer simulations to investigate how the surface charge distribution affects the virus interactions. Using a detailed model of the virus particle, we find an asymmetric ion distribution around the virus which gives rise to the different phases observed.

  20. Implementing Project Based Survey Research Skills to Grade Six ELP Students with "The Survey Toolkit" and "TinkerPlots"[R

    ERIC Educational Resources Information Center

    Walsh, Thomas, Jr.

    2011-01-01

    "Survey Toolkit Collecting Information, Analyzing Data and Writing Reports" (Walsh, 2009a) is discussed as a survey research curriculum used by the author's sixth grade students. The report describes the implementation of "The Survey Toolkit" curriculum and "TinkerPlots"[R] software to provide instruction to students learning a project based…

  1. Instructional Improvement Cycle: A Teacher's Toolkit for Collecting and Analyzing Data on Instructional Strategies. REL 2015-080

    ERIC Educational Resources Information Center

    Cherasaro, Trudy L.; Reale, Marianne L.; Haystead, Mark; Marzano, Robert J.

    2015-01-01

    This toolkit, developed by Regional Educational Laboratory (REL) Central in collaboration with York Public Schools in Nebraska, provides a process and tools to help teachers use data from their classroom assessments to evaluate promising practices. The toolkit provides teachers with guidance on how to deliberately apply and study one classroom…

  3. Toolkit of Resources for Engaging Parents and Community as Partners in Education. Part 4: Engaging All in Data Conversations

    ERIC Educational Resources Information Center

    Regional Educational Laboratory Pacific, 2015

    2015-01-01

    This toolkit is designed to guide school staff in strengthening partnerships with families and community members to support student learning. This toolkit offers an integrated approach to family and community engagement, bringing together research, promising practices, and a wide range of useful tools and resources with explanations and directions…

  4. Toolkit of Resources for Engaging Parents and Community as Partners in Education. Part 2: Building a Cultural Bridge

    ERIC Educational Resources Information Center

    Regional Educational Laboratory Pacific, 2015

    2015-01-01

    This toolkit is designed to guide school staff in strengthening partnerships with families and community members to support student learning. This toolkit offers an integrated approach to family and community engagement, bringing together research, promising practices, and a wide range of useful tools and resources with explanations and directions…

  5. WATERSHED HEALTH ASSESSMENT TOOLS-INVESTIGATING FISHERIES (WHAT-IF): A MODELING TOOLKIT FOR WATERSHED AND FISHERIES MANAGEMENT

    EPA Science Inventory

    The Watershed Health Assessment Tools-Investigating Fisheries (WHAT-IF) is a decision-analysis modeling toolkit for personal computers that supports watershed and fisheries management. The WHAT-IF toolkit includes a relational database, help-system functions and documentation, a...

  7. A Platform to Build Mobile Health Apps: The Personal Health Intervention Toolkit (PHIT).

    PubMed

    Eckhoff, Randall Peter; Kizakevich, Paul Nicholas; Bakalov, Vesselina; Zhang, Yuying; Bryant, Stephanie Patrice; Hobbs, Maria Ann

    2015-06-01

    Personal Health Intervention Toolkit (PHIT) is an advanced cross-platform software framework targeted at personal self-help research on mobile devices. Following the subjective and objective measurement, assessment, and plan methodology for health assessment and intervention recommendations, the PHIT platform lets researchers quickly build mobile health research Android and iOS apps. They can (1) create complex data-collection instruments using a simple extensible markup language (XML) schema; (2) use Bluetooth wireless sensors; (3) create targeted self-help interventions based on collected data via XML-coded logic; (4) facilitate cross-study reuse from the library of existing instruments and interventions such as stress, anxiety, sleep quality, and substance abuse; and (5) monitor longitudinal intervention studies via daily upload to a Web-based dashboard portal. For physiological data, Bluetooth sensors collect real-time data with on-device processing. For example, using the BinarHeartSensor, the PHIT platform processes the heart rate data into heart rate variability measures, and plots these data as time-series waveforms. Subjective data instruments are user data-entry screens, comprising a series of forms with validation and processing logic. The PHIT instrument library consists of over 70 reusable instruments for various domains including cognitive, environmental, psychiatric, psychosocial, and substance abuse. Many are standardized instruments, such as the Alcohol Use Disorder Identification Test, Patient Health Questionnaire-8, and Post-Traumatic Stress Disorder Checklist. Autonomous instruments such as battery and global positioning system location support continuous background data collection. All data are acquired using a schedule appropriate to the app's deployment. The PHIT intelligent virtual advisor (iVA) is an expert system logic layer, which analyzes the data in real time on the device. 
This data analysis results in a tailored app of interventions…
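The heart rate variability measures mentioned above are conventionally the time-domain statistics SDNN and RMSSD computed over beat-to-beat (RR) intervals. A generic sketch of that computation, not the PHIT/BinarHeartSensor implementation:

```python
import math

def hrv_time_domain(rr_ms):
    """Standard time-domain HRV measures from RR (beat-to-beat)
    intervals in milliseconds: SDNN (overall variability, sample
    standard deviation of the intervals) and RMSSD (short-term
    variability, root mean square of successive differences)."""
    n = len(rr_ms)
    mean_rr = sum(rr_ms) / n
    sdnn = math.sqrt(sum((x - mean_rr) ** 2 for x in rr_ms) / (n - 1))
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    rmssd = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return {"mean_rr": mean_rr, "sdnn": sdnn, "rmssd": rmssd}

# hypothetical RR series from a wearable sensor
measures = hrv_time_domain([812, 798, 825, 790, 805, 820, 795])
```

On-device processing like this is cheap enough to run per-beat, which is what makes real-time plotting of HRV waveforms feasible on a phone.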

  8. A Platform to Build Mobile Health Apps: The Personal Health Intervention Toolkit (PHIT)

    PubMed Central

    2015-01-01

    Personal Health Intervention Toolkit (PHIT) is an advanced cross-platform software framework targeted at personal self-help research on mobile devices. Following the subjective and objective measurement, assessment, and plan methodology for health assessment and intervention recommendations, the PHIT platform lets researchers quickly build mobile health research Android and iOS apps. They can (1) create complex data-collection instruments using a simple extensible markup language (XML) schema; (2) use Bluetooth wireless sensors; (3) create targeted self-help interventions based on collected data via XML-coded logic; (4) facilitate cross-study reuse from the library of existing instruments and interventions such as stress, anxiety, sleep quality, and substance abuse; and (5) monitor longitudinal intervention studies via daily upload to a Web-based dashboard portal. For physiological data, Bluetooth sensors collect real-time data with on-device processing. For example, using the BinarHeartSensor, the PHIT platform processes the heart rate data into heart rate variability measures, and plots these data as time-series waveforms. Subjective data instruments are user data-entry screens, comprising a series of forms with validation and processing logic. The PHIT instrument library consists of over 70 reusable instruments for various domains including cognitive, environmental, psychiatric, psychosocial, and substance abuse. Many are standardized instruments, such as the Alcohol Use Disorder Identification Test, Patient Health Questionnaire-8, and Post-Traumatic Stress Disorder Checklist. Autonomous instruments such as battery and global positioning system location support continuous background data collection. All data are acquired using a schedule appropriate to the app’s deployment. The PHIT intelligent virtual advisor (iVA) is an expert system logic layer, which analyzes the data in real time on the device. 
This data analysis results in a tailored app of interventions…

  9. IGSTK: Framework and example application using an open source toolkit for image-guided surgery applications

    NASA Astrophysics Data System (ADS)

    Cheng, Peng; Zhang, Hui; Kim, Hee-su; Gary, Kevin; Blake, M. Brian; Gobbi, David; Aylward, Stephen; Jomier, Julien; Enquobahrie, Andinet; Avila, Rick; Ibanez, Luis; Cleary, Kevin

    2006-03-01

    Open source software has tremendous potential for improving the productivity of research labs and enabling the development of new medical applications. The Image-Guided Surgery Toolkit (IGSTK) is an open source software toolkit based on ITK, VTK, and FLTK, and uses the cross-platform tools CMAKE and DART to support common operating systems such as Linux, Windows, and MacOS. IGSTK integrates the basic components needed in surgical guidance applications and provides a common platform for fast prototyping and development of robust image-guided applications. This paper gives an overview of the IGSTK framework and current status of development followed by an example needle biopsy application to demonstrate how to develop an image-guided application using this toolkit.

  10. Kekule.js: An Open Source JavaScript Chemoinformatics Toolkit.

    PubMed

    Jiang, Chen; Jin, Xi; Dong, Ying; Chen, Ming

    2016-06-27

    Kekule.js is an open-source, object-oriented JavaScript toolkit for chemoinformatics. It provides methods for many common tasks in molecular informatics, including chemical data input/output (I/O), two- and three-dimensional (2D/3D) rendering of chemical structure, stereo identification, ring perception, structure comparison, and substructure search. Encapsulated widgets to display and edit chemical structures directly in web context are also supplied. Developed with web standards, the toolkit is ideal for building chemoinformatics applications over the Internet. Moreover, it is highly platform-independent and can also be used in desktop or mobile environments. Some initial applications, such as plugins for inputting chemical structures on the web and uses in chemistry education, have been developed based on the toolkit.
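Of the tasks listed, substructure search is the algorithmically richest: it is a labeled subgraph-isomorphism test over the molecular graph. A naive backtracking sketch of the idea in Python (conceptual only; it does not use the Kekule.js API, and production toolkits use far more optimized algorithms):

```python
def substructure_match(pattern, target):
    """Naive backtracking search: does `pattern` occur as a labeled
    subgraph of `target`?  Graphs are dicts: node -> (label, set of
    neighbour nodes).  A conceptual sketch of what a chemoinformatics
    substructure search does, not how Kekule.js implements it."""
    p_nodes = list(pattern)

    def extend(mapping):
        if len(mapping) == len(p_nodes):
            return True
        pn = p_nodes[len(mapping)]
        p_label, p_nbrs = pattern[pn]
        for tn, (t_label, t_nbrs) in target.items():
            if tn in mapping.values() or t_label != p_label:
                continue
            # every already-mapped pattern neighbour must stay adjacent
            if all(mapping[q] in t_nbrs for q in p_nbrs if q in mapping):
                mapping[pn] = tn
                if extend(mapping):
                    return True
                del mapping[pn]
        return False

    return extend({})

# ethanol's heavy-atom graph C-C-O contains a C-O pattern but no C-N
ethanol = {1: ("C", {2}), 2: ("C", {1, 3}), 3: ("O", {2})}
c_o = {"a": ("C", {"b"}), "b": ("O", {"a"})}
c_n = {"a": ("C", {"b"}), "b": ("N", {"a"})}
```

Real toolkits additionally match bond orders, charges, and stereochemistry, and prune the search with fingerprint pre-screens.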

  11. AutoMicromanager: a microscopy scripting toolkit for LABVIEW and other programming environments.

    PubMed

    Ashcroft, Brian Alan; Oosterkamp, Tjerk

    2010-11-01

    We present a scripting toolkit for the acquisition and analysis of a wide variety of imaging data by integrating the ease of use of various programming environments such as LABVIEW, IGOR PRO, MATLAB, SCILAB, and others. This toolkit is designed to let the user quickly program a variety of standard microscopy components for custom microscopy applications, allowing much more flexibility than other packages. Included are both programming tools and graphical user interface classes, providing a standard, consistent, and easy-to-maintain scripting environment. This programming toolkit allows easy access to most commonly used cameras, stages, and shutters through the Micromanager project, so the scripter can focus on their custom application instead of boilerplate code generation.

  12. The Visualization Toolkit (VTK): Rewriting the rendering code for modern graphics cards

    NASA Astrophysics Data System (ADS)

    Hanwell, Marcus D.; Martin, Kenneth M.; Chaudhary, Aashish; Avila, Lisa S.

    2015-09-01

    The Visualization Toolkit (VTK) is an open source, permissively licensed, cross-platform toolkit for scientific data processing, visualization, and data analysis. It is over two decades old, originally developed for a very different graphics card architecture. Modern graphics cards feature fully programmable, highly parallelized architectures with large core counts. VTK's rendering code was rewritten to take advantage of modern graphics cards, maintaining most of the toolkit's programming interfaces. This offers the opportunity to compare the performance of old and new rendering code on the same systems/cards. Significant improvements in rendering speeds and memory footprints mean that scientific data can be visualized in greater detail than ever before. The widespread use of VTK means that these improvements will reap significant benefits.

  13. The GeoViz Toolkit: Using component-oriented coordination methods for geographic visualization and analysis

    PubMed Central

    Hardisty, Frank; Robinson, Anthony C.

    2010-01-01

    In this paper we present the GeoViz Toolkit, an open-source, internet-delivered program for geographic visualization and analysis that features a diverse set of software components which can be flexibly combined by users who do not have programming expertise. The design and architecture of the GeoViz Toolkit allows us to address three key research challenges in geovisualization: allowing end users to create their own geovisualization and analysis component set on-the-fly, integrating geovisualization methods with spatial analysis methods, and making geovisualization applications sharable between users. Each of these tasks necessitates a robust yet flexible approach to inter-tool coordination. The coordination strategy we developed for the GeoViz Toolkit, called Introspective Observer Coordination, leverages and combines key advances in software engineering from the last decade: automatic introspection of objects, software design patterns, and reflective invocation of methods. PMID:21731423
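The three ingredients of Introspective Observer Coordination (automatic introspection, the observer pattern, and reflective invocation) can be sketched in a few lines. The `on_<event>` naming convention below is invented for illustration; the actual GeoViz Toolkit is written in Java and discovers listener interfaces via beans-style introspection rather than method-name prefixes:

```python
class Coordinator:
    """Minimal sketch of introspection-based coordination: components
    are inspected for `on_<event>` methods and wired up reflectively,
    so visualization tools coordinate without knowing about each
    other and without any hand-written glue code."""

    def __init__(self):
        self.listeners = {}  # event name -> list of bound methods

    def register(self, component):
        # automatic introspection: scan the component for listeners
        for name in dir(component):
            if name.startswith("on_"):
                self.listeners.setdefault(name[3:], []).append(
                    getattr(component, name))

    def fire(self, event, payload):
        # reflective invocation of every matching listener
        for method in self.listeners.get(event, []):
            method(payload)

class MapView:
    def __init__(self):
        self.selection = None
    def on_selection(self, ids):
        self.selection = ids

class ScatterPlot:
    def __init__(self):
        self.selection = None
    def on_selection(self, ids):
        self.selection = ids

coord = Coordinator()
mapview, plot = MapView(), ScatterPlot()
coord.register(mapview)
coord.register(plot)
coord.fire("selection", [3, 7])  # selecting on any tool updates both
```

Because the coordinator discovers capabilities at registration time, end users can assemble a new component set on the fly, which is exactly the design goal the paper describes.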

  14. Looking at research consent forms through a participant-centered lens: the PRISM readability toolkit.

    PubMed

    Ridpath, Jessica R; Wiese, Cheryl J; Greene, Sarah M

    2009-01-01

    Communicating in lay language is an underdeveloped skill among many researchers, a limitation that contributes to low readability among research consent forms and may hinder participant understanding of study procedures and risks. We present the Project to Review and Improve Study Materials (PRISM) and its centerpiece, the PRISM Readability Toolkit. The toolkit provides strategies for creating study materials that are readable and participant centered, focusing on consent forms but also addressing other participant materials. Based on plain language principles, this free resource includes a flexible menu of tools, such as an editing checklist, before-and-after examples, easy-to-read template language, and a list of alternative words. Among PRISM's ongoing goals is to test the toolkit with population groups.
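A common quantitative check behind plain-language editing of the kind PRISM promotes is a readability formula such as the Flesch-Kincaid grade level, 0.39*(words per sentence) + 11.8*(syllables per word) - 15.59. A rough sketch with a crude syllable heuristic (the PRISM toolkit itself is a set of editing strategies and templates, not this code):

```python
import re

def count_syllables(word):
    """Crude vowel-group heuristic; real readability tools use
    pronunciation dictionaries, so treat results as approximate."""
    groups = re.findall(r"[aeiouy]+", word.lower())
    n = len(groups)
    if word.lower().endswith("e") and n > 1:
        n -= 1  # drop a typical silent final 'e'
    return max(n, 1)

def flesch_kincaid_grade(text):
    """Flesch-Kincaid grade level:
    0.39*(words/sentences) + 11.8*(syllables/words) - 15.59."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z]+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * (len(words) / len(sentences))
            + 11.8 * (syllables / len(words)) - 15.59)

# short sentences and short words score far lower than dense prose
plain = flesch_kincaid_grade("You can stop at any time. Your care will not change.")
dense = flesch_kincaid_grade(
    "Participation may be discontinued whenever continuation is deemed inadvisable.")
```

Scoring a consent form before and after applying the toolkit's editing checklist gives a quick, if rough, measure of how much the reading level dropped.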

  15. AutoMicromanager: A microscopy scripting toolkit for LABVIEW and other programming environments

    NASA Astrophysics Data System (ADS)

    Ashcroft, Brian Alan; Oosterkamp, Tjerk

    2010-11-01

    We present a scripting toolkit for the acquisition and analysis of a wide variety of imaging data by integrating the ease of use of various programming environments such as LABVIEW, IGOR PRO, MATLAB, SCILAB, and others. This toolkit is designed to let the user quickly program a variety of standard microscopy components for custom microscopy applications, allowing much more flexibility than other packages. Included are both programming tools and graphical user interface classes, providing a standard, consistent, and easy-to-maintain scripting environment. This programming toolkit allows easy access to most commonly used cameras, stages, and shutters through the Micromanager project, so the scripter can focus on their custom application instead of boilerplate code generation.

  16. The MITK image guided therapy toolkit and its application for augmented reality in laparoscopic prostate surgery

    NASA Astrophysics Data System (ADS)

    Baumhauer, Matthias; Neuhaus, Jochen; Fritzsche, Klaus; Meinzer, Hans-Peter

    2010-02-01

    Image Guided Therapy (IGT) confronts researchers with high demands in system design, prototype implementation, and evaluation. The lack of standardized software tools, such as algorithm implementations, tracking device and tool setups, and data processing methods, escalates the labor of system development and sustainable system evaluation. In this paper, a new toolkit component of the Medical Imaging and Interaction Toolkit (MITK), the MITK-IGT, and its exemplary application for computer-assisted prostate surgery are presented. MITK-IGT aims at integrating software tools, algorithms, and tracking device interfaces into the MITK toolkit to provide a comprehensive software framework for computer-aided diagnosis support, therapy planning, treatment support, and radiological follow-up. An exemplary application of the MITK-IGT framework is introduced with a surgical navigation system for laparoscopic prostate surgery. It illustrates the broad range of application possibilities provided by the framework, as well as its simple extensibility with custom algorithms and other software modules.

  17. Improving the Effectiveness of Medication Review: Guidance from the Health Literacy Universal Precautions Toolkit.

    PubMed

    Weiss, Barry D; Brega, Angela G; LeBlanc, William G; Mabachi, Natabhona M; Barnard, Juliana; Albright, Karen; Cifuentes, Maribel; Brach, Cindy; West, David R

    2016-01-01

    Although routine medication reviews in primary care practice are recommended to identify drug therapy problems, it is often difficult to get patients to bring all their medications to office visits. The objective of this study was to determine whether the medication review tool in the Agency for Healthcare Research and Quality Health Literacy Universal Precautions Toolkit can help to improve medication reviews in primary care practices. The toolkit's "Brown Bag Medication Review" was implemented in a rural private practice in Missouri and an urban teaching practice in California. Practices recorded outcomes of medication reviews with 45 patients before toolkit implementation and then changed their medication review processes based on guidance in the toolkit. Six months later we conducted interviews with practice staff to identify changes made as a result of implementing the tool, and practices recorded outcomes of medication reviews with 41 additional patients. Data analyses compared differences in whether all medications were brought to visits, the number of medications reviewed, drug therapy problems identified, and changes in medication regimens before and after implementation. Interviews revealed that practices made the changes recommended in the toolkit to encourage patients to bring medications to office visits. Evaluation before and after implementation revealed a 3-fold increase in the percentage of patients who brought all their prescription medications and a 6-fold increase in the number of prescription medications brought to office visits. The percentage of reviews in which drug therapy problems were identified doubled, as did the percentage of medication regimens revised. Use of the Health Literacy Universal Precautions Toolkit can help to identify drug therapy problems. © Copyright 2016 by the American Board of Family Medicine.

  18. Improving the Effectiveness of Medication Review: Guidance from the Health Literacy Universal Precautions Toolkit

    PubMed Central

    Weiss, Barry D.; Brega, Angela G.; LeBlanc, William G.; Mabachi, Natabhona M.; Barnard, Juliana; Albright, Karen; Cifuentes, Maribel; Brach, Cindy; West, David R.

    2016-01-01

    Background Although routine medication reviews in primary care practice are recommended to identify drug therapy problems, it is often difficult to get patients to bring all their medications to office visits. The objective of this study was to determine whether the medication review tool in the Agency for Healthcare Research and Quality Health Literacy Universal Precautions Toolkit can help to improve medication reviews in primary care practices. Methods The toolkit's “Brown Bag Medication Review” was implemented in a rural private practice in Missouri and an urban teaching practice in California. Practices recorded outcomes of medication reviews with 45 patients before toolkit implementation and then changed their medication review processes based on guidance in the toolkit. Six months later we conducted interviews with practice staff to identify changes made as a result of implementing the tool, and practices recorded outcomes of medication reviews with 41 additional patients. Data analyses compared differences in whether all medications were brought to visits, the number of medications reviewed, drug therapy problems identified, and changes in medication regimens before and after implementation. Results Interviews revealed that practices made the changes recommended in the toolkit to encourage patients to bring medications to office visits. Evaluation before and after implementation revealed a 3-fold increase in the percentage of patients who brought all their prescription medications and a 6-fold increase in the number of prescription medications brought to office visits. The percentage of reviews in which drug therapy problems were identified doubled, as did the percentage of medication regimens revised. Conclusions Use of the Health Literacy Universal Precautions Toolkit can help to identify drug therapy problems. PMID:26769873

  19. Patient-Centered Personal Health Record and Portal Implementation Toolkit for Ambulatory Clinics: A Feasibility Study.

    PubMed

    Nahm, Eun-Shim; Diblasi, Catherine; Gonzales, Eva; Silver, Kristi; Zhu, Shijun; Sagherian, Knar; Kongs, Katherine

    2017-04-01

    Personal health records and patient portals have been shown to be effective in managing chronic illnesses. Despite recent nationwide implementation efforts, the personal health record and patient portal adoption rates among patients are low, and the lack of support for patients using the programs remains a critical gap in most implementation processes. In this study, we implemented the Patient-Centered Personal Health Record and Patient Portal Implementation Toolkit in a large diabetes/endocrinology center and assessed its preliminary impact on personal health record and patient portal knowledge, self-efficacy, patient-provider communication, and adherence to treatment plans. Patient-Centered Personal Health Record and Patient Portal Implementation Toolkit is composed of Patient-Centered Personal Health Record and Patient Portal Implementation Toolkit-General, clinic-level resources for clinicians, staff, and patients, and Patient-Centered Personal Health Record and Patient Portal Implementation Toolkit Plus, an optional 4-week online resource program for patients ("MyHealthPortal"). First, Patient-Centered Personal Health Record and Patient Portal Implementation Toolkit-General was implemented, and all clinicians and staff were educated about the center's personal health record and patient portal. Then general patient education was initiated, while a randomized controlled trial was conducted to test the preliminary effects of "MyHealthPortal" using a small sample (n = 74) with three observations (baseline and 4 and 12 weeks). The intervention group showed significantly greater improvement than the control group in patient-provider communication at 4 weeks (t(56) = 3.00, P = .004). For other variables, the intervention group tended to show greater improvement; however, the differences were not significant. In this preliminary study, Patient-Centered Personal Health Record and Patient Portal Implementation Toolkit showed potential for filling the gap in the current…

  20. Taurine and platelet aggregation

    SciTech Connect

    Nauss-Karol, C.; VanderWende, C.; Gaut, Z.N.

    1986-03-01

    Taurine is a putative neurotransmitter or neuromodulator. The endogenous taurine concentration in human platelets, determined by amino acid analysis, is 15 μM/g. In spite of this high level, taurine is actively accumulated. Uptake is saturable, Na⁺- and temperature-dependent, and suppressed by metabolic inhibitors, structural analogues, and several classes of centrally active substances. High-, medium-, and low-affinity transport processes have been characterized, and the platelet may represent a model system for taurine transport in the CNS. When platelets were incubated with ¹⁴C-taurine for 30 minutes, then resuspended in fresh medium and reincubated for one hour, essentially all of the taurine was retained within the cells. Taurine, at concentrations ranging from 10-1000 μM, had no effect on platelet aggregation induced by ADP or epinephrine. However, taurine may have a role in platelet aggregation, since 35-39% of the taurine taken up by human platelets appears to be secreted during the release reaction induced by low concentrations of epinephrine or ADP, respectively. This release phenomenon would imply that part of the taurine taken up is stored directly in the dense bodies of the platelet.
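Saturable, carrier-mediated uptake of the kind described is conventionally modeled as a sum of Michaelis-Menten components, one per affinity class: v = Vmax*S/(Km + S). A sketch with invented (Km, Vmax) values, since the abstract reports high-, medium-, and low-affinity processes without numeric parameters:

```python
def uptake_rate(s_uM, components=((5.0, 1.0), (150.0, 4.0), (2000.0, 12.0))):
    """Saturable carrier-mediated uptake as a sum of Michaelis-Menten
    terms, one per transport component: v = Vmax*S/(Km + S).
    The (Km in uM, Vmax in arbitrary units) pairs are illustrative
    stand-ins for the high-, medium-, and low-affinity processes."""
    return sum(vmax * s_uM / (km + s_uM) for km, vmax in components)

# at low substrate the high-affinity carrier dominates; at high
# substrate the total rate saturates near the summed Vmax (17 here)
low = uptake_rate(1.0)
high = uptake_rate(1_000_000.0)
```

Fitting such a multi-component curve to uptake-vs-concentration data is how distinct transport affinities are typically resolved experimentally.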