Science.gov

Sample records for files aggregation toolkit

  1. Small file aggregation in a parallel computing system

    DOEpatents

    Faibish, Sorin; Bent, John M.; Tzelnic, Percy; Grider, Gary; Zhang, Jingwang

    2014-09-02

    Techniques are provided for small file aggregation in a parallel computing system. An exemplary method for storing a plurality of files generated by a plurality of processes in a parallel computing system comprises aggregating the plurality of files into a single aggregated file; and generating metadata for the single aggregated file. The metadata comprises an offset and a length of each of the plurality of files in the single aggregated file. The metadata can be used to unpack one or more of the files from the single aggregated file.
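The offset/length metadata scheme the abstract describes can be sketched in a few lines: pack each small file into one blob while recording where it starts and how long it is, then use that metadata to unpack any individual file. This is an illustrative sketch under those assumptions, not the patented implementation; all names here are invented.

```python
import io

def aggregate(files):
    """Pack many small files into one blob, recording (offset, length) metadata.

    `files` maps a file name to its byte content. Returns the aggregated
    bytes plus a metadata dict mirroring the offset/length scheme in the
    abstract.
    """
    blob = io.BytesIO()
    metadata = {}
    for name, data in files.items():
        metadata[name] = {"offset": blob.tell(), "length": len(data)}
        blob.write(data)
    return blob.getvalue(), metadata

def unpack(blob, metadata, name):
    """Recover one file from the aggregated blob using its metadata entry."""
    entry = metadata[name]
    return blob[entry["offset"]:entry["offset"] + entry["length"]]
```

In a parallel setting, each process would contribute its files to the aggregation step and only the metadata index needs to be consulted to retrieve any one of them later.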

  2. The Internet Toolkit: File Compression and Archive Utilities.

    ERIC Educational Resources Information Center

    Delfino, Erik

    1993-01-01

    Describes utility programs available for post-transfer processing of files that have been downloaded from the Internet. Highlights include file compression; viewing graphics files; converting binary files to ASCII files; and how to find utility programs. (LRW)

  3. PDB@: an offline toolkit for exploration and analysis of PDB files.

    PubMed

    Mani, Udayakumar; Ravisankar, Sadhana; Ramakrishnan, Sai Mukund

    2013-12-01

    Protein Data Bank (PDB) is a freely accessible archive of the 3-D structural data of biological molecules. Structure-based studies offer a unique vantage point for inferring the properties of a protein molecule from structural data. This is too big a task to be done manually. Moreover, there is no single tool, software package or server that comprehensively analyses all structure-based properties. The objective of the present work is to develop an offline computational toolkit, PDB@, containing built-in algorithms that help categorize the structural properties of a protein molecule. The user has the facility to view and edit the PDB file as needed. Some features of the present work are unique, and others are improvements over existing tools. Also, the representation of protein properties in both graphical and textual formats helps in predicting all the necessary details of a protein molecule on a single platform.
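Toolkits like this start from the PDB file's fixed-column record layout. As a minimal sketch (not PDB@'s code), a single ATOM record can be decoded by slicing the column ranges defined in the wwPDB format specification:

```python
def parse_atom_line(line):
    """Parse one fixed-column ATOM/HETATM record from a PDB file.

    Column ranges follow the wwPDB format specification; only a few
    commonly used fields are extracted here.
    """
    return {
        "serial": int(line[6:11]),      # atom serial number
        "name": line[12:16].strip(),    # atom name
        "res_name": line[17:20].strip(),# residue name
        "chain": line[21],              # chain identifier
        "res_seq": int(line[22:26]),    # residue sequence number
        "x": float(line[30:38]),        # orthogonal coordinates (Angstroms)
        "y": float(line[38:46]),
        "z": float(line[46:54]),
    }
```

Iterating this over every ATOM line yields the coordinate set that structural property calculations operate on.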

  4. SeqKit: A Cross-Platform and Ultrafast Toolkit for FASTA/Q File Manipulation

    PubMed Central

    Le, Shuai; Li, Yan; Hu, Fuquan

    2016-01-01

    FASTA and FASTQ are basic and ubiquitous formats for storing nucleotide and protein sequences. Common manipulations of FASTA/Q files include converting, searching, filtering, deduplication, splitting, shuffling, and sampling. Existing tools only implement some of these manipulations, not particularly efficiently, and some are only available for certain operating systems. Furthermore, the complicated installation process of required packages and running environments can render these programs less user friendly. This paper describes a cross-platform, ultrafast, comprehensive toolkit for FASTA/Q processing. SeqKit provides executable binary files for all major operating systems, including Windows, Linux, and Mac OS X, and can be used directly without any dependencies or pre-configuration. SeqKit demonstrates competitive performance in execution time and memory usage compared to similar tools. The efficiency and usability of SeqKit enable researchers to rapidly accomplish common FASTA/Q file manipulations. SeqKit is open source and available on GitHub at https://github.com/shenwei356/seqkit. PMID:27706213
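Two of the manipulations listed, parsing and deduplication, are simple enough to sketch directly. The following is an illustrative pure-Python version (SeqKit itself is written in Go and is far faster; the function names here are invented):

```python
def parse_fasta(text):
    """Yield (header, sequence) pairs from FASTA-formatted text."""
    header, seq = None, []
    for line in text.splitlines():
        if line.startswith(">"):
            if header is not None:
                yield header, "".join(seq)
            header, seq = line[1:].strip(), []
        else:
            seq.append(line.strip())
    if header is not None:
        yield header, "".join(seq)

def deduplicate(records):
    """Keep only the first record seen for each distinct sequence."""
    seen, out = set(), []
    for header, seq in records:
        if seq not in seen:
            seen.add(seq)
            out.append((header, seq))
    return out
```

Splitting, shuffling, and sampling reduce to similar one-pass or two-pass loops over the parsed records.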

  5. MISR Toolkit

    Atmospheric Science Data Center

    2014-05-07

    ... of a collection of routines that can be used as command line tools or in the development of larger software tools and applications. ... field into a 2-D data plane (e.g. RetrAppMask[0][5]). Convert MISR product files to IDL ENVI files. ... To use the Toolkit, you may need to download and install the .NET Framework. ...

  6. A next-generation open-source toolkit for FITS file image viewing

    NASA Astrophysics Data System (ADS)

    Jeschke, Eric; Inagaki, Takeshi; Kackley, Russell

    2012-09-01

    The astronomical community has a long tradition of sharing and collaborating on FITS file tools, including viewers. Several excellent viewers such as DS9 and Skycat have been successfully reused again and again. Yet this "first generation" of viewers predates the emergence of a new class of powerful object-oriented scripting languages such as Python, which has quickly become a very popular language for astronomical (and general scientific) use. Integration and extension of these viewers from Python is cumbersome. Furthermore, these viewers are built on older widget toolkits such as Tcl/Tk, which are becoming increasingly difficult to support and extend as time passes. Subaru Telescope's second-generation observation control system (Gen2) is built on a foundation of Python-based technologies and leverages several important astronomically useful packages such as numpy and pyfits. We have written a new flexible core widget for viewing FITS files which is available in versions for both the modern Gtk- and Qt-based desktops. The widget offers seamless integration with pyfits and numpy arrays of FITS data. A full-featured viewer based on this widget has been developed, and supports a plug-in architecture in which new features can be added by scripting simple Python modules. In this paper we describe and demonstrate the capabilities of the new widget and viewer and discuss the architecture of the software, which allows new features and widgets to be easily developed by subclassing a powerful abstract base class. The software will be released as open source.

  7. Basic Internet Software Toolkit.

    ERIC Educational Resources Information Center

    Buchanan, Larry

    1998-01-01

    Once schools are connected to the Internet, the next step is getting network workstations configured for Internet access. This article describes a basic toolkit comprising software currently available on the Internet for free or modest cost. Lists URLs for Web browser, Telnet, FTP, file decompression, portable document format (PDF) reader,…

  8. Geospatial Toolkit

    SciTech Connect

    2010-10-14

    The Geospatial Toolkit is an NREL-developed map-based software application that integrates resource data and other geographic information systems (GIS) data for integrated resource assessment. The non-resource, country-specific data for each toolkit comes from a variety of agencies within each country as well as from global datasets. Originally developed in 2005, the Geospatial Toolkit was completely redesigned and re-released in November 2010 to provide a more modern, easier-to-use interface with considerably faster analytical querying capabilities. The revised version of the Geospatial Toolkit has been released for all original toolkit countries/regions and each software package is made available on NREL's website,

  10. Literacy Toolkit

    ERIC Educational Resources Information Center

    Center for Best Practices in Early Childhood Education, 2005

    2005-01-01

    The toolkit contains print and electronic resources, including (1) "eMERGing Literacy and Technology: Working Together", a 492-page curriculum guide; (2) "LitTECH Interactive Presents: The Beginning of Literacy", a DVD that provides an overview linking technology to the concepts of emerging literacy; (3) "Your Preschool Classroom Computer Center:…

  11. Local Toolkit

    SciTech Connect

    Lindstrom, P.; Yoon, S. E.; Isenburg, M.

    2007-05-31

    The LOCAL Toolkit contains tools and libraries developed under the LLNL LOCAL LDRD project for managing and processing large unstructured data sets, primarily from parallel numerical simulations, such as triangular, tetrahedral, and hexahedral meshes, point sets, and graphs. The tools have three main functionalities: cache-coherent, linear ordering of multidimensional data; lossy and lossless data compression optimized for different data types; and an out-of-core streaming I/O library with simple processing modules for unstructured data.
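A standard way to obtain a cache-coherent linear ordering of multidimensional data, offered here as an illustrative sketch rather than the LOCAL Toolkit's actual method, is the Z-order (Morton) curve: interleaving coordinate bits so that points close in space tend to land close together in the one-dimensional order.

```python
def morton2d(x, y, bits=16):
    """Interleave the bits of non-negative integers x and y into a
    Z-order (Morton) index. Sorting points by this index yields a
    cache-coherent linear ordering of 2-D data."""
    code = 0
    for i in range(bits):
        code |= ((x >> i) & 1) << (2 * i)      # x bits go to even positions
        code |= ((y >> i) & 1) << (2 * i + 1)  # y bits go to odd positions
    return code
```

Sorting mesh vertices or points by `morton2d(x, y)` before streaming them through an out-of-core pipeline keeps spatially adjacent elements in nearby disk blocks and cache lines.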

  12. Tracker Toolkit

    NASA Technical Reports Server (NTRS)

    Lewis, Steven J.; Palacios, David M.

    2013-01-01

    This software can track multiple moving objects within a video stream simultaneously, use visual features to aid in the tracking, and initiate tracks based on object detection in a subregion. A simple programmatic interface allows plugging into larger image-chain modeling suites. It extracts unique visual features to aid in tracking and later analysis, including sub-functionality for extracting visual features about an object identified within an image frame. Tracker Toolkit utilizes a feature extraction algorithm to tag each object with metadata features describing its size, shape, color, and movement. Its functionality is independent of the scale of objects within a scene. The only assumption made about the tracked objects is that they move; there are no constraints on size within the scene, shape, or type of movement. The Tracker Toolkit is also capable of following an arbitrary number of objects in the same scene, identifying and propagating the track of each object from frame to frame. Target objects may be specified for tracking beforehand, or may be dynamically discovered within a tripwire region. Initialization of the Tracker Toolkit algorithm includes two steps: initializing the data structures for tracked target objects, including targets preselected for tracking; and initializing the tripwire region. If no tripwire region is desired, this step is skipped. The tripwire region is an area within the frames that is always checked for new objects, and all new objects discovered within the region will be tracked until lost (by leaving the frame, stopping, or blending into the background).
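The track-propagation and tripwire ideas can be sketched as a per-frame update step. This is a deliberately simplified illustration, not the NASA code; the greedy nearest-neighbour association and all names here are invented for the example.

```python
def in_region(obj, region):
    """True if an object's (x, y) centroid lies inside a rectangular region."""
    x0, y0, x1, y1 = region
    return x0 <= obj["x"] <= x1 and y0 <= obj["y"] <= y1

def update_tracks(tracks, detections, tripwire=None):
    """One frame of a tripwire-style tracker (illustrative sketch).

    Each existing track is greedily associated with its nearest
    detection; unmatched detections inside the optional tripwire
    region start new tracks.
    """
    detections = list(detections)
    for track in tracks:
        if not detections:
            break
        nearest = min(
            detections,
            key=lambda d: (d["x"] - track["x"]) ** 2 + (d["y"] - track["y"]) ** 2,
        )
        track.update(nearest)       # propagate the track to the new position
        detections.remove(nearest)
    if tripwire is not None:
        for d in detections:        # leftover detections may start new tracks
            if in_region(d, tripwire):
                tracks.append(dict(d))
    return tracks
```

A production tracker would add motion prediction, feature matching, and track-loss handling on top of this skeleton.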

  13. CGAT: computational genomics analysis toolkit.

    PubMed

    Sims, David; Ilott, Nicholas E; Sansom, Stephen N; Sudbery, Ian M; Johnson, Jethro S; Fawcett, Katherine A; Berlanga-Taylor, Antonio J; Luna-Valero, Sebastian; Ponting, Chris P; Heger, Andreas

    2014-05-01

    Computational genomics seeks to draw biological inferences from genomic datasets, often by integrating and contextualizing next-generation sequencing data. CGAT provides an extensive suite of tools designed to assist in the analysis of genome scale data from a range of standard file formats. The toolkit enables filtering, comparison, conversion, summarization and annotation of genomic intervals, gene sets and sequences. The tools can both be run from the Unix command line and installed into visual workflow builders, such as Galaxy.

  14. A new open-source Python-based Space Weather data access, visualization, and analysis toolkit

    NASA Astrophysics Data System (ADS)

    de Larquier, S.; Ribeiro, A.; Frissell, N. A.; Spaleta, J.; Kunduri, B.; Thomas, E. G.; Ruohoniemi, J.; Baker, J. B.

    2013-12-01

    Space weather research relies heavily on combining and comparing data from multiple observational platforms. Current frameworks exist to aggregate some of the data sources, most based on file downloads via web or FTP interfaces. Empirical models are mostly Fortran-based and lack interfaces with more convenient scripting languages. In an effort to improve data and model access, the SuperDARN community has been developing a Python-based Space Science Data Visualization Toolkit (DaViTpy). At the center of this development was a redesign of how our data (from 30 years of SuperDARN radars) are made available. Several access solutions are now wrapped into one convenient Python interface which probes local directories, a new remote NoSQL database, and an FTP server to retrieve the requested data based on availability. Motivated by the efficiency of this interface and the inherent need for data from multiple instruments, we implemented similar modules for other space science datasets (POES, OMNI, Kp, AE...), and also included fundamental empirical models with Python interfaces to enhance data analysis (IRI, HWM, MSIS...). All these modules and more are gathered in a single convenient toolkit, which is collaboratively developed and distributed using GitHub and continues to grow. While still in its early stages, we expect this toolkit will facilitate multi-instrument space weather research and improve scientific productivity.
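The availability-based lookup described (probe local directories, then a remote database, then an FTP server) is a tiered-fallback pattern. A minimal sketch, assuming nothing about DaViTpy's actual API, with each tier modeled as a callable:

```python
def fetch(key, sources):
    """Return the first non-None result from an ordered list of data
    sources (e.g. local cache, remote database, FTP mirror).

    Each source is a callable taking `key` and returning the data or
    None when that tier cannot serve the request.
    """
    for source in sources:
        data = source(key)
        if data is not None:
            return data
    raise LookupError(f"{key!r} not available from any source")
```

In practice each tier would also populate the faster tiers on a hit, so repeated requests are served locally.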

  15. The Comprehension Toolkit

    ERIC Educational Resources Information Center

    Harvey, Stephanie; Goudvis, Anne

    2005-01-01

    "The Comprehension Toolkit" focuses on reading, writing, talking, listening, and investigating, to deepen understanding of nonfiction texts. With a focus on strategic thinking, this toolkit's lessons provide a foundation for developing independent readers and learners. It also provides an alternative to the traditional assign and correct…

  16. Student Success Center Toolkit

    ERIC Educational Resources Information Center

    Jobs For the Future, 2014

    2014-01-01

    "Student Success Center Toolkit" is a compilation of materials organized to assist Student Success Center directors as they staff, launch, operate, and sustain Centers. The toolkit features materials created and used by existing Centers, such as staffing and budgeting templates, launch materials, sample meeting agendas, and fundraising…

  17. TOOLKIT, Version 2. 0

    SciTech Connect

    Schroeder, E.; Bagot, B.; McNeill, R.L.

    1990-05-09

    The purpose of this User's Guide is to show by example many of the features of Toolkit II. Some examples will be copies of screens as they appear while running the Toolkit. Other examples will show what the user should enter in various situations; in these instances, what the computer asserts will be in boldface and what the user responds will be in regular type. The User's Guide is divided into four sections. The first section, "FOCUS Databases", will give a broad overview of the Focus administrative databases that are available on the VAX; easy-to-use reports are available for most of them in the Toolkit. The second section, "Getting Started", will cover the steps necessary to log onto the Computer Center VAX cluster and how to start Focus and the Toolkit. The third section, "Using the Toolkit", will discuss some of the features in the Toolkit -- the available reports and how to access them, as well as some utilities. The fourth section, "Helpful Hints", will cover some useful facts about the VAX and Focus as well as some of the more common problems that can occur. The Toolkit is not set in concrete but is continually being revised and improved. If you have any opinions as to changes that you would like to see made to the Toolkit or new features that you would like included, please let us know. Since we do try to respond to the needs of the user and make periodic improvements to the Toolkit, this User's Guide may not correspond exactly to what is available on the computer. In general, changes are made to provide new options or features; rarely is an existing feature deleted.

  18. JAVA Stereo Display Toolkit

    NASA Technical Reports Server (NTRS)

    Edmonds, Karina

    2008-01-01

    This toolkit provides a common interface for displaying graphical user interface (GUI) components in stereo using either specialized stereo display hardware (e.g., liquid crystal shutter or polarized glasses) or anaglyph display (red/blue glasses) on standard workstation displays. An application using this toolkit will work without modification in either environment, allowing stereo software to reach a wider audience without sacrificing high-quality display on dedicated hardware. The toolkit is written in Java for use with the Swing GUI toolkit and has cross-platform compatibility. It hooks into the graphics system, allowing any standard Swing component to be displayed in stereo. It uses the OpenGL graphics library to control the stereo hardware and to perform the rendering. It also supports anaglyph and special stereo hardware using the same API (application-program interface), and has the ability to simulate color stereo in anaglyph mode by combining the red band of the left image with the green/blue bands of the right image. This is a low-level toolkit that simply accomplishes the display of components (including the JadeDisplay image display component). It does not include higher-level functions such as disparity adjustment, 3D cursor, or overlays, all of which can be built using this toolkit.
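The color-anaglyph simulation the abstract describes is a per-pixel band swap: red from the left-eye image, green and blue from the right-eye image. A minimal sketch of that band combination (in Python rather than the toolkit's Java, with images as nested lists of RGB tuples):

```python
def anaglyph(left, right):
    """Simulate color stereo in anaglyph mode: keep the red band of the
    left-eye image and the green/blue bands of the right-eye image.

    `left` and `right` are equal-sized images as nested lists of
    (r, g, b) tuples.
    """
    return [
        [(l[0], r[1], r[2]) for l, r in zip(lrow, rrow)]
        for lrow, rrow in zip(left, right)
    ]
```

Viewed through red/blue glasses, each eye then sees (approximately) only its intended image, which is why the same scene can be shown on a standard monitor.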

  19. System Design Toolkit for Integrated Modular Avionics for Space

    NASA Astrophysics Data System (ADS)

    Hann, Mark; Balbastre Betoret, Patricia; Simo Ten, Jose Enrique; De Ferluc, Regis; Ramachandran, Jinesh

    2015-09-01

    The IMA-SP development process identified that tools were needed to perform the activities of: i) Partitioning and Resource Allocation and ii) System Feasibility Assessment. This paper describes the definition, design, implementation and test of the tool support required to perform the IMA-SP development process activities. This includes the definition of a data model, with associated files and file formats, describing the complete setup of a partitioned system and allowing system feasibility assessment; the development of a prototype of the tool set, which is called the IMA-SP System Design Toolkit (SDT); and the demonstration of the toolkit on a case study.

  20. The Einstein Toolkit

    NASA Astrophysics Data System (ADS)

    Löffler, Frank

    2012-03-01

    The Einstein Toolkit Consortium is developing and supporting open software for relativistic astrophysics. Its aim is to provide the core computational tools that can enable new science, broaden our community, facilitate interdisciplinary research and take advantage of petascale computers and advanced cyberinfrastructure. The Einstein Toolkit currently consists of an open set of over 100 modules for the Cactus framework, primarily for computational relativity, along with associated tools for simulation management and visualization. The toolkit includes solvers for vacuum spacetimes as well as relativistic magneto-hydrodynamics, along with modules for initial data, analysis and computational infrastructure. These modules have been developed and improved over many years by many different researchers. The Einstein Toolkit is supported by a distributed model, combining core support of software, tools, and documentation in its own repositories and through partnerships with other developers who contribute open software and coordinate together on development. As of January 2012 it has 68 registered members from 30 research groups world-wide. This talk will present the current capabilities of the Einstein Toolkit and will point to information on how to leverage it for future research.

  1. Knowledge information management toolkit and method

    DOEpatents

    Hempstead, Antoinette R.; Brown, Kenneth L.

    2006-08-15

    A system is provided for managing user entry and/or modification of knowledge information into a knowledge base file having an integrator support component and a data source access support component. The system includes processing circuitry, memory, a user interface, and a knowledge base toolkit. The memory communicates with the processing circuitry and is configured to store at least one knowledge base. The user interface communicates with the processing circuitry and is configured for user entry and/or modification of knowledge pieces within a knowledge base. The knowledge base toolkit is configured for converting knowledge in at least one knowledge base from a first knowledge base form into a second knowledge base form. A method is also provided.

  2. Water Security Toolkit

    SciTech Connect

    2012-09-11

    The Water Security Toolkit (WST) provides software for modeling and analyzing water distribution systems to minimize the potential impact of contamination incidents. WST wraps capabilities for contaminant transport, impact assessment, and sensor network design with response action plans, including source identification, rerouting, and decontamination, to provide a range of water security planning and real-time applications.

  3. THE EPANET PROGRAMMER'S TOOLKIT FOR ANALYSIS OF WATER DISTRIBUTION SYSTEMS

    EPA Science Inventory

    The EPANET Programmer's Toolkit is a collection of functions that helps simplify computer programming of water distribution network analyses. The functions can be used to read in a pipe network description file, modify selected component properties, run multiple hydraulic and wa...

  4. Parallel Climate Analysis Toolkit (ParCAT)

    SciTech Connect

    Smith, Brian Edward

    2013-06-30

    The parallel analysis toolkit (ParCAT) provides parallel statistical processing of large climate model simulation datasets. ParCAT provides parallel point-wise average calculations, frequency distributions, sums/differences of two datasets, and difference-of-average and average-of-difference for two datasets for arbitrary subsets of simulation time. ParCAT is a command-line utility that can be easily integrated into scripts or embedded in other applications. ParCAT supports CMIP5 post-processed datasets as well as non-CMIP5 post-processed datasets. ParCAT reads and writes standard netCDF files.
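The two paired statistics are worth distinguishing: difference-of-average compares the time means of two datasets, while average-of-difference takes the time mean of their point-wise difference. For aligned, equal-length series the two coincide; they diverge once the time subsets or masks applied to the two datasets differ. A minimal sketch (invented function names, not ParCAT's API):

```python
def average(xs):
    return sum(xs) / len(xs)

def difference_of_averages(a, b):
    """mean(a) - mean(b): compare the two datasets' time means."""
    return average(a) - average(b)

def average_of_differences(a, b):
    """mean(a - b): time mean of the point-wise difference."""
    return average([x - y for x, y in zip(a, b)])
```

In ParCAT these reductions run in parallel across grid points, with the same subsetting applied per dataset.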

  5. Alma Data Mining Toolkit

    NASA Astrophysics Data System (ADS)

    Friedel, Douglas; Looney, Leslie; Teuben, Peter J.; Pound, Marc W.; Rauch, Kevin P.; Mundy, Lee; Harris, Robert J.; Xu, Lisa

    2016-06-01

    ADMIT (ALMA Data Mining Toolkit) is a Python-based pipeline toolkit for the creation and analysis of new science products from ALMA data. ADMIT quickly provides users with a detailed overview of their science products, for example: line identifications, line 'cutout' cubes, moment maps, and emission type analysis (e.g., feature detection). Users can download the small ADMIT pipeline product (< 20MB), analyze the results, then fine-tune and re-run the ADMIT pipeline (or any part thereof) on their own machines and interactively inspect the results. ADMIT has both a web browser and a command line interface available for this purpose. By analyzing multiple data cubes simultaneously, data mining across many astronomical sources and line transitions is possible. Users are also able to enhance the capabilities of ADMIT by creating customized ADMIT tasks satisfying any special processing needs. We will present some of the salient features of ADMIT and example use cases.

  6. Self-assessment toolkit.

    PubMed

    2016-09-01

    A new health and integration toolkit has been launched by NHS Clinical Commissioners, in partnership with the Local Government Association, NHS Confederation and the Association of Directors of Adult Services. The self-assessment tool is designed to help local health and care leaders, through health and well-being boards, to assess their ambition, capability, capacity and readiness to integrate local health and social care services. PMID:27581897

  7. The Weather and Climate Toolkit

    NASA Astrophysics Data System (ADS)

    Ansari, S.; Del Greco, S.; Hankins, B.

    2010-12-01

    The Weather and Climate Toolkit (WCT) is free, platform independent software distributed from NOAA’s National Climatic Data Center (NCDC). The WCT allows the visualization and data export of weather and climate data, including Radar, Satellite and Model data. By leveraging the NetCDF for Java library and Common Data Model, the WCT is extremely scalable and capable of supporting many new datasets in the future. Gridded NetCDF files (regular and irregularly spaced, using Climate-Forecast (CF) conventions) are supported, along with many other formats including GRIB. The WCT provides tools for custom data overlays, Web Map Service (WMS) background maps, animations and basic filtering. The export of images and movies is provided in multiple formats. The WCT Data Export Wizard allows for data export in both vector polygon/point (Shapefile, Well-Known Text) and raster (GeoTIFF, ESRI Grid, VTK, Gridded NetCDF) formats. These data export features promote the interoperability of weather and climate information with various scientific communities and common software packages such as ArcGIS, Google Earth, MatLAB, GrADS and R. The WCT also supports an embedded, integrated Google Earth instance. The Google Earth Browser Plugin allows seamless visualization of data on a native 3-D Google Earth instance linked to the standard 2-D map. Examples include Level-II NEXRAD data for Hurricane Katrina and the GPCP (Global Precipitation Product), visualized in 2-D and in the integrated Google Earth view.

  8. Mission Simulation Toolkit

    NASA Technical Reports Server (NTRS)

    Pisaich, Gregory; Flueckiger, Lorenzo; Neukom, Christian; Wagner, Mike; Buchanan, Eric; Plice, Laura

    2007-01-01

    The Mission Simulation Toolkit (MST) is a flexible software system for autonomy research. It was developed as part of the Mission Simulation Facility (MSF) project that was started in 2001 to facilitate the development of autonomous planetary robotic missions. Autonomy is a key enabling factor for robotic exploration. There has been a large gap between autonomy software (at the research level), and software that is ready for insertion into near-term space missions. The MST bridges this gap by providing a simulation framework and a suite of tools for supporting research and maturation of autonomy. MST uses a distributed framework based on the High Level Architecture (HLA) standard. A key feature of the MST framework is the ability to plug in new models to replace existing ones with the same services. This enables significant simulation flexibility, particularly the mixing and control of fidelity level. In addition, the MST provides automatic code generation from robot interfaces defined with the Unified Modeling Language (UML), methods for maintaining synchronization across distributed simulation systems, XML-based robot description, and an environment server. Finally, the MSF supports a number of third-party products including dynamic models and terrain databases. Although the communication objects and some of the simulation components that are provided with this toolkit are specifically designed for terrestrial surface rovers, the MST can be applied to any other domain, such as aerial, aquatic, or space.

  9. Radiation source search toolkit

    NASA Astrophysics Data System (ADS)

    Young, Jason S.

    The newly developed Radiation Source Search Toolkit (RSST) generates gamma-ray spectroscopy data for use in testing source search algorithms. RSST is designed in a modular fashion to allow for ease of use while still maintaining accuracy in developing the output spectra. Users can define a real-world path for mobile radiation detectors to travel, as well as radiation sources for possible detection. RSST can accept measured or simulated radiation spectrum data for generation into a source search simulation. RSST handles traversing the path, computing distance-related attenuation, and generating the final output spectra. RSST also has the ability to simulate anisotropic shielding, as well as traffic conditions that would impede a ground-based detection platform in a real-world scenario. RSST provides a novel fusion between spectral data and geospatial source search data generation. By utilizing the RSST, researchers can easily generate multiple datasets for testing detection algorithms without the need for actual radiation sources and mobile detector platforms.
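The distance-related attenuation step has a simple core: for an unshielded point source, the count rate at a detector falls off with the inverse square of the distance. An illustrative model only (RSST also handles spectra, shielding, and traffic, all omitted here; every name below is invented):

```python
import math

def counts_at(source_counts, sx, sy, dx, dy, min_r=1.0):
    """Scale a point source's count rate by inverse-square distance
    falloff as a mobile detector at (dx, dy) passes a source at (sx, sy).

    `min_r` clamps the distance to avoid a singularity at the source.
    """
    r = max(math.hypot(dx - sx, dy - sy), min_r)
    return source_counts / (r * r)
```

Evaluating this at each waypoint of a detector path yields the geospatial count profile that a source search algorithm would then have to localize.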

  10. The MIS Pipeline Toolkit

    NASA Astrophysics Data System (ADS)

    Teuben, Peter J.; Pound, M. W.; Storm, S.; Mundy, L. G.; Salter, D. M.; Lee, K.; Kwon, W.; Fernandez Lopez, M.; Plunkett, A.

    2013-01-01

    A pipeline toolkit was developed to help organize, reduce, and analyze a large number of near-identical datasets. This is a very general problem, for which many different solutions have been implemented. In this poster we present one such solution that lends itself to users of the Unix command line, using the Unix "make" utility, and adapts easily to observational as well as theoretical projects. Two examples are given, one from the CARMA CLASSy survey, and another from a simulated kinematic survey of early galaxy-forming disks. The CLASSy survey (discussed in more detail in three accompanying posters) consists of 5 different star-forming regions, observed with CARMA, each containing roughly 10-20 datasets in continuum and 3 different molecular lines, that need to be combined into final data cubes and maps. The strength of such a pipeline toolkit shows itself as new data accumulate: the data reduction steps are improved and easily re-applied to previously taken data. For this we employed a master script that was run nightly, and collaborators submitted improved scripts and/or pipeline parameters that control these scripts. MIS is freely available for download.

  11. Multiphysics Application Coupling Toolkit

    SciTech Connect

    Campbell, Michael T.

    2013-12-02

    This particular consortium implementation of the software integration infrastructure will, in large part, refactor portions of the Rocstar multiphysics infrastructure. Development of this infrastructure originated at the University of Illinois DOE ASCI Center for Simulation of Advanced Rockets (CSAR) to support the center's massively parallel multiphysics simulation application, Rocstar, and has continued at IllinoisRocstar, a small company formed near the end of the University-based program. IllinoisRocstar is now licensing these new developments as free, open source, in the hope of improving its own and others' access to infrastructure that can be readily utilized in developing coupled or composite software systems, with particular attention to more rapid production and utilization of multiphysics applications in the HPC environment. There are two major pieces to the consortium implementation: the Application Component Toolkit (ACT) and the Multiphysics Application Coupling Toolkit (MPACT). The current development focus is the ACT, which is (will be) the substrate for MPACT. The ACT itself is built up from the components described in the technical approach. In particular, the ACT has the following major components: 1. The Component Object Manager (COM): The COM package provides encapsulation of user applications and their data. COM also provides the inter-component function call mechanism. 2. The System Integration Manager (SIM): The SIM package provides constructs and mechanisms for orchestrating composite systems of multiply integrated pieces.

  12. NAIF Toolkit - Extended

    NASA Technical Reports Server (NTRS)

    Acton, Charles H., Jr.; Bachman, Nathaniel J.; Semenov, Boris V.; Wright, Edward D.

    2010-01-01

    The Navigation Ancillary Information Facility (NAIF) at JPL, acting under the direction of NASA's Office of Space Science, has built a data system named SPICE (Spacecraft Planet Instrument C-matrix Events) to assist scientists in planning and interpreting scientific observations (see figure). SPICE provides geometric and some other ancillary information needed to recover the full value of science instrument data, including correlation of individual instrument data sets with data from other instruments on the same or other spacecraft. This data system is used to produce space mission observation geometry data sets known as SPICE kernels. It is also used to read SPICE kernels and to compute derived quantities such as positions, orientations, lighting angles, etc. The SPICE toolkit consists of a subroutine/function library, executable programs (both large applications and simple utilities that focus on kernel management), and simple examples of using SPICE toolkit subroutines. This software is very accurate, thoroughly tested, and portable to all computers. It is extremely stable and reusable on all missions. Since the previous version, three significant capabilities have been added: an Interactive Data Language (IDL) interface, a MATLAB interface, and a geometric event finder subsystem.

  13. Multiphysics Application Coupling Toolkit

    2013-12-02

    This particular consortium implementation of the software integration infrastructure will, in large part, refactor portions of the Rocstar multiphysics infrastructure. Development of this infrastructure originated at the University of Illinois DOE ASCI Center for Simulation of Advanced Rockets (CSAR) to support the center's massively parallel multiphysics simulation application, Rocstar, and has continued at IllinoisRocstar, a small company formed near the end of the University-based program. IllinoisRocstar is now licensing these new developments as free, open source software, in hopes of improving their own and others' access to infrastructure that can be readily utilized in developing coupled or composite software systems, with particular attention to more rapid production and utilization of multiphysics applications in the HPC environment. There are two major pieces to the consortium implementation: the Application Component Toolkit (ACT) and the Multiphysics Application Coupling Toolkit (MPACT). The current development focus is the ACT, which is (will be) the substrate for MPACT. The ACT itself is built up from the components described in the technical approach. In particular, the ACT has the following major components: 1. The Component Object Manager (COM): the COM package provides encapsulation of user applications and their data. COM also provides the inter-component function call mechanism. 2. The System Integration Manager (SIM): the SIM package provides constructs and mechanisms for orchestrating composite systems of multiply integrated pieces.
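
    The registry-and-dispatch idea behind an inter-component function call mechanism can be sketched in a few lines. The class and method names below are hypothetical, for illustration only; they are not the actual ACT/COM interface.

```python
class ComponentManager:
    """Toy sketch of a COM-like registry: components register named
    functions and data, and callers invoke them through the manager
    without linking directly against each other."""
    def __init__(self):
        self._functions = {}
        self._data = {}

    def register_function(self, component, name, fn):
        self._functions[(component, name)] = fn

    def register_data(self, component, name, value):
        self._data[(component, name)] = value

    def call(self, component, name, *args):
        # Inter-component function call: lookup by (component, name).
        return self._functions[(component, name)](*args)

    def get_data(self, component, name):
        return self._data[(component, name)]

com = ComponentManager()
com.register_data("fluids", "pressure", 101.3)
com.register_function("solids", "scale", lambda x: 2 * x)
# One component's function applied to another component's data.
print(com.call("solids", "scale", com.get_data("fluids", "pressure")))  # -> 202.6
```

    The point of the indirection is that "fluids" and "solids" never import one another; only the manager knows both.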

  14. An object oriented fully 3D tomography visual toolkit.

    PubMed

    Agostinelli, S; Paoli, G

    2001-04-01

    In this paper we present a modern object oriented component object model (COM) C++ toolkit dedicated to fully 3D cone-beam tomography. The toolkit allows the display and visual manipulation of analytical phantoms, projection sets and volumetric data through a standard Windows graphical user interface. Data input/output is performed using proprietary file formats, but import/export of industry standard file formats, including raw binary, Windows bitmap and AVI, ACR/NEMA DICOM 3 and NCSA HDF, is available. At the time of writing, built-in data manipulators include a basic phantom ray-tracer and a Matrox Genesis frame grabbing facility. A COM plug-in interface is provided for user-defined custom backprojector algorithms: a simple Feldkamp ActiveX control, including source code, is provided as an example; our fast Feldkamp plug-in is also available.

  15. Einstein Toolkit for Relativistic Astrophysics

    NASA Astrophysics Data System (ADS)

    Collaborative Effort

    2011-02-01

    The Einstein Toolkit is a collection of software components and tools for simulating and analyzing general relativistic astrophysical systems. Such systems include gravitational wave space-times, collisions of compact objects such as black holes or neutron stars, accretion onto compact objects, core collapse supernovae and Gamma-Ray Bursts. The Einstein Toolkit builds on numerous software efforts in the numerical relativity community including CactusEinstein, Whisky, and Carpet. The Einstein Toolkit currently uses the Cactus Framework as the underlying computational infrastructure that provides large-scale parallelization, general computational components, and a model for collaborative, portable code development.

  16. A Prototype Search Toolkit

    NASA Astrophysics Data System (ADS)

    Knepper, Margaret M.; Fox, Kevin L.; Frieder, Ophir

    Information overload is now a reality. We no longer worry about obtaining a sufficient volume of data; we now are concerned with sifting and understanding the massive volumes of data available to us. To do so, we developed an integrated information processing toolkit that provides the user with a variety of ways to view their information. The views include keyword search results, a domain specific ranking system that allows for adaptively capturing topic vocabularies to customize and focus the search results, navigation pages for browsing, and a geospatial and temporal component to visualize results in time and space, and provide “what if” scenario playing. Integrating the information from different tools and sources gives the user additional information and another way to analyze the data. An example of the integration is illustrated on reports of the avian influenza (bird flu).

  17. Mesh Quality Improvement Toolkit

    2002-11-15

    MESQUITE is a linkable software library to be used by simulation and mesh generation tools to improve the quality of meshes. Mesh quality is improved by node movement and/or local topological modifications. Various aspects of mesh quality, such as smoothness, element shape, size, and orientation, are controlled by choosing the appropriate mesh quality metric, objective function template, and numerical optimization solver. MESQUITE uses the TSTT mesh interface specification to provide an interoperable toolkit that can be used by applications which adopt the standard. A flexible code design makes it easy for meshing researchers to add additional mesh quality metrics, templates, and solvers to develop new quality improvement algorithms by making use of the MESQUITE infrastructure.
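
    Node movement, one of the quality-improvement mechanisms mentioned above, can be illustrated with simple Laplacian smoothing: each free node is moved toward the centroid of its neighbors while boundary nodes stay fixed. This is a generic sketch of the idea, not MESQUITE's API or its metric-driven optimization.

```python
def laplacian_smooth(coords, neighbors, free_nodes, iterations=10):
    """Move each free node to the centroid of its neighbors; all other
    nodes stay fixed. coords: {node: (x, y)}, neighbors: {node: [nodes]}."""
    coords = dict(coords)
    for _ in range(iterations):
        for n in free_nodes:
            nbrs = neighbors[n]
            coords[n] = (
                sum(coords[m][0] for m in nbrs) / len(nbrs),
                sum(coords[m][1] for m in nbrs) / len(nbrs),
            )
    return coords

# Four fixed corner nodes and one badly placed interior node.
coords = {0: (0, 0), 1: (2, 0), 2: (2, 2), 3: (0, 2), 4: (1.9, 1.9)}
neighbors = {4: [0, 1, 2, 3]}
smoothed = laplacian_smooth(coords, neighbors, free_nodes=[4])
print(smoothed[4])  # interior node moves to the centroid (1.0, 1.0)
```

    Metric-driven smoothers like MESQUITE's replace the centroid rule with a numerical optimizer minimizing a chosen quality metric.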

  18. TOOLKIT FOR ADVANCED OPTIMIZATION

    2000-10-13

    The TAO project focuses on the development of software for large scale optimization problems. TAO uses an object-oriented design to create a flexible toolkit with strong emphasis on the reuse of external tools where appropriate. Our design enables bi-directional connection to lower level linear algebra support (for example, parallel sparse matrix data structures) as well as higher level application frameworks. The Toolkit for Advanced Optimization (TAO) is aimed at the solution of large-scale optimization problems on high-performance architectures. Our main goals are portability, performance, scalable parallelism, and an interface independent of the architecture. TAO is suitable for both single-processor and massively-parallel architectures. The current version of TAO has algorithms for unconstrained and bound-constrained optimization.
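
    For bound-constrained problems of the kind TAO targets, one of the simplest methods is projected gradient descent: take a gradient step, then project back into the feasible box. The scalar sketch below is a toy illustration of that idea, not TAO's interface or its actual algorithms.

```python
def projected_gradient(grad, x, lower, upper, step=0.1, iters=100):
    """Minimal sketch of bound-constrained minimization: gradient step
    followed by projection onto the box [lower, upper]."""
    for _ in range(iters):
        x = x - step * grad(x)
        x = max(lower, min(upper, x))  # project back into the bounds
    return x

# Minimize (x - 3)^2 subject to 0 <= x <= 2: constrained minimum at x = 2.
xstar = projected_gradient(lambda x: 2 * (x - 3), x=0.0, lower=0.0, upper=2.0)
print(xstar)  # -> 2.0
```

    Production solvers replace the fixed step with line searches and exploit parallel linear algebra, but the step-then-project structure is the same.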

  19. Introducing the Ginga FITS Viewer and Toolkit

    NASA Astrophysics Data System (ADS)

    Jeschke, E.; Inagaki, T.; Kackley, R.

    2013-10-01

    We introduce Ginga, a new open-source FITS viewer and toolkit based on Python astronomical packages such as pyfits, numpy, scipy, matplotlib, and pywcs. For developers, we present a set of Python classes for viewing FITS files under the modern Gtk and Qt widget sets and a more full-featured viewer that has a plugin architecture. We further describe how plugins can be written to extend the viewer with many different capabilities. The software may be of interest to software developers who are looking for a solution for integrating FITS visualization into their Python programs and end users interested in a new and different FITS viewer that is not based on Tcl/Tk widget technology. The software has been released under a BSD license.
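
    A plugin architecture of the sort described can be sketched as a registry that maps plugin names to classes and dispatches to them at runtime. The names below are hypothetical and do not reflect Ginga's actual plugin API.

```python
class Viewer:
    """Toy sketch of a plugin architecture: plugins register themselves
    under a name and the viewer dispatches to them on demand."""
    def __init__(self):
        self.plugins = {}

    def register(self, name):
        def deco(cls):
            self.plugins[name] = cls
            return cls
        return deco

    def start(self, name, *args):
        # Instantiate the registered plugin and run it.
        return self.plugins[name]().run(*args)

viewer = Viewer()

@viewer.register("histogram")
class Histogram:
    def run(self, pixels):
        counts = {}
        for p in pixels:
            counts[p] = counts.get(p, 0) + 1
        return counts

print(viewer.start("histogram", [0, 1, 1, 2]))  # -> {0: 1, 1: 2, 2: 1}
```

    New capabilities are added by registering new classes; the viewer core never changes.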

  20. BIT: Biosignal Igniter Toolkit.

    PubMed

    da Silva, Hugo Plácido; Lourenço, André; Fred, Ana; Martins, Raúl

    2014-06-01

    The study of biosignals has had a transforming role in multiple aspects of our society, which go well beyond the health sciences domains with which they were traditionally associated. While biomedical engineering is a classical discipline where the topic is amply covered, today biosignals are a matter of interest for students, researchers and hobbyists in areas including computer science, informatics, and electrical engineering, among others. Regardless of the context, the use of biosignals in experimental activities and practical projects is heavily bounded by cost and by limited access to adequate support materials. In this paper we present an accessible yet versatile toolkit, composed of low-cost hardware and software, which was created to reinforce the engagement of different people in the field of biosignals. The hardware consists of a modular wireless biosignal acquisition system that can be used to support classroom activities, interface with other devices, or perform rapid prototyping of end-user applications. The software comprises a set of programming APIs, a biosignal processing toolbox, and a framework for real time data acquisition and postprocessing.

  1. Simplifying operations with an uplink/downlink integration toolkit

    NASA Technical Reports Server (NTRS)

    Murphy, Susan C.; Miller, Kevin J.; Guerrero, Ana Maria; Joe, Chester; Louie, John J.; Aguilera, Christine

    1994-01-01

    The Operations Engineering Lab (OEL) at JPL has developed a simple, generic toolkit to integrate the uplink/downlink processes, (often called closing the loop), in JPL's Multimission Ground Data System. This toolkit provides capabilities for integrating telemetry verification points with predicted spacecraft commands and ground events in the Mission Sequence Of Events (SOE) document. In the JPL ground data system, the uplink processing functions and the downlink processing functions are separate subsystems that are not well integrated because of the nature of planetary missions with large one-way light times for spacecraft-to-ground communication. Our new closed-loop monitoring tool allows an analyst or mission controller to view and save uplink commands and ground events with their corresponding downlinked telemetry values regardless of the delay in downlink telemetry and without requiring real-time intervention by the user. An SOE document is a time-ordered list of all the planned ground and spacecraft events, including all commands, sequence loads, ground events, significant mission activities, spacecraft status, and resource allocations. The SOE document is generated by expansion and integration of spacecraft sequence files, ground station allocations, navigation files, and other ground event files. This SOE generation process has been automated within the OEL and includes a graphical, object-oriented SOE editor and real-time viewing tool running under X/Motif. The SOE toolkit was used as the framework for the integrated implementation. The SOE is used by flight engineers to coordinate their operations tasks, serving as a predict data set in ground operations and mission control. The closed-loop SOE toolkit allows simple, automated integration of predicted uplink events with correlated telemetry points in a single SOE document for on-screen viewing and archiving. It automatically interfaces with existing real-time or non real-time sources of information, to
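
    The core of closed-loop monitoring, pairing each predicted uplink event with its downlinked telemetry value whenever it arrives, can be sketched as a simple join on verification-point IDs. The field names below are invented for illustration and are not the actual SOE schema.

```python
def close_the_loop(soe_events, telemetry):
    """Toy sketch of closed-loop monitoring: attach each downlinked
    telemetry value to its predicted SOE event by verification-point ID,
    however late (or out of order) the telemetry arrives."""
    by_point = {t["point"]: t for t in telemetry}
    return [
        {**ev, "observed": by_point.get(ev["verify_point"], {}).get("value")}
        for ev in soe_events
    ]

soe = [
    {"time": "001T10:00", "event": "CMD_HEATER_ON", "verify_point": "HTR_STATE"},
    {"time": "001T10:05", "event": "CMD_TAPE_REC", "verify_point": "REC_STATE"},
]
# Telemetry downlinked hours later, out of order, due to one-way light time.
tlm = [{"point": "REC_STATE", "value": "ON"}, {"point": "HTR_STATE", "value": "ON"}]
report = close_the_loop(soe, tlm)
print(report[0]["observed"])  # -> ON
```

    Because the join is keyed rather than time-ordered, no real-time intervention is needed when telemetry is delayed.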

  2. ParCAT: A Parallel Climate Analysis Toolkit

    NASA Astrophysics Data System (ADS)

    Haugen, B.; Smith, B.; Steed, C.; Ricciuto, D. M.; Thornton, P. E.; Shipman, G.

    2012-12-01

    Climate science has employed increasingly complex models and simulations to analyze the past and predict the future of our climate. The size and dimensionality of climate simulation data has been growing with the complexity of the models. This growth in data is creating a widening gap between the data being produced and the tools necessary to analyze large, high dimensional data sets. With single run data sets increasing into 10's, 100's and even 1000's of gigabytes, parallel computing tools are becoming a necessity in order to analyze and compare climate simulation data. The Parallel Climate Analysis Toolkit (ParCAT) provides basic tools that efficiently use parallel computing techniques to narrow the gap between data set size and analysis tools. ParCAT was created as a collaborative effort between climate scientists and computer scientists in order to provide efficient parallel implementations of the computing tools that are of use to climate scientists. Some of the basic functionalities included in the toolkit are the ability to compute spatio-temporal means and variances, differences between two runs and histograms of the values in a data set. ParCAT is designed to facilitate the "heavy lifting" that is required for large, multidimensional data sets. The toolkit does not focus on performing the final visualizations and presentation of results but rather, reducing large data sets to smaller, more manageable summaries. The output from ParCAT is provided in commonly used file formats (NetCDF, CSV, ASCII) to allow for simple integration with other tools. The toolkit is currently implemented as a command line utility, but will likely also provide a C library for developers interested in tighter software integration. Elements of the toolkit are already being incorporated into projects such as UV-CDAT and CMDX. There is also an effort underway to implement portions of the CCSM Land Model Diagnostics package using ParCAT in conjunction with Python and gnuplot. Par
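
    The map-reduce pattern behind a parallel statistic such as a mean can be sketched in a few lines: each worker reduces its chunk to a partial (sum, count), and the partials combine into the global result. This stdlib sketch (threads standing in for ParCAT's parallelism) illustrates the structure only, not ParCAT's implementation.

```python
from concurrent.futures import ThreadPoolExecutor

def chunked_mean(values, workers=4):
    """Reduce a large data set to a summary the map-reduce way: each
    worker computes a partial (sum, count) over its chunk, then the
    partials are combined into a global mean."""
    size = max(1, len(values) // workers)
    chunks = [values[i:i + size] for i in range(0, len(values), size)]

    def partial(chunk):
        return (sum(chunk), len(chunk))

    with ThreadPoolExecutor(max_workers=workers) as pool:
        partials = list(pool.map(partial, chunks))
    total = sum(s for s, _ in partials)
    count = sum(c for _, c in partials)
    return total / count

print(chunked_mean(list(range(101))))  # mean of 0..100 -> 50.0
```

    The same shape scales to spatio-temporal means, variances, and histograms over multi-gigabyte runs when the chunks live on separate nodes.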

  3. A Scalable Analysis Toolkit

    NASA Technical Reports Server (NTRS)

    Aiken, Alexander

    2001-01-01

    The Scalable Analysis Toolkit (SAT) project aimed to demonstrate that it is feasible and useful to statically detect software bugs in very large systems. The technical focus of the project was on a relatively new class of constraint-based techniques for software analysis, where the desired facts about programs (e.g., the presence of a particular bug) are phrased as constraint problems to be solved. At the beginning of this project, the most successful forms of formal software analysis were limited forms of automatic theorem proving (as exemplified by the analyses used in language type systems and optimizing compilers), semi-automatic theorem proving for full verification, and model checking. With a few notable exceptions, these approaches had not been demonstrated to scale to software systems of even 50,000 lines of code. Realistic approaches to large-scale software analysis cannot hope to make every conceivable formal method scale. Thus, the SAT approach is to mix different methods in one application by using coarse and fast but still adequate methods at the largest scales, and reserving the use of more precise but also more expensive methods at smaller scales for critical aspects (that is, aspects critical to the analysis problem under consideration) of a software system. The principled method proposed for combining a heterogeneous collection of formal systems with different scalability characteristics is mixed constraints. This idea had been used previously in small-scale applications with encouraging results: using mostly coarse methods and narrowly targeted precise methods, useful information (meaning the discovery of bugs in real programs) was obtained with excellent scalability.

  4. Pizza.py Toolkit

    2006-01-01

    Pizza.py is a loosely integrated collection of tools, many of which provide support for the LAMMPS molecular dynamics and ChemCell cell modeling packages. There are tools to create input files, convert between file formats, process log and dump files, create plots, and visualize and animate simulation snapshots. Software packages that are wrapped by Pizza.py, so they can be invoked from within Python, include GnuPlot, MatLab, Raster3d, and RasMol. Pizza.py is written in Python and runs on any platform that supports Python. Pizza.py enhances the standard Python interpreter in a few simple ways. Its tools are Python modules which can be invoked interactively, from scripts, or from GUIs when appropriate. Some of the tools require additional Python packages to be installed as part of the user's Python. Others are wrappers on software packages (as listed above) which must be available on the user's system. It is easy to modify or extend Pizza.py with new functionality or new tools, which need not have anything to do with LAMMPS or ChemCell.

  5. Pizza.py Toolkit

    SciTech Connect

    Plimpton, Steve; Jones, Matt; Crozier, Paul

    2006-01-01

    Pizza.py is a loosely integrated collection of tools, many of which provide support for the LAMMPS molecular dynamics and ChemCell cell modeling packages. There are tools to create input files, convert between file formats, process log and dump files, create plots, and visualize and animate simulation snapshots. Software packages that are wrapped by Pizza.py, so they can be invoked from within Python, include GnuPlot, MatLab, Raster3d, and RasMol. Pizza.py is written in Python and runs on any platform that supports Python. Pizza.py enhances the standard Python interpreter in a few simple ways. Its tools are Python modules which can be invoked interactively, from scripts, or from GUIs when appropriate. Some of the tools require additional Python packages to be installed as part of the user's Python. Others are wrappers on software packages (as listed above) which must be available on the user's system. It is easy to modify or extend Pizza.py with new functionality or new tools, which need not have anything to do with LAMMPS or ChemCell.

  6. Sierra Toolkit computational mesh conceptual model.

    SciTech Connect

    Baur, David G.; Edwards, Harold Carter; Cochran, William K.; Williams, Alan B.; Sjaardema, Gregory D.

    2010-03-01

    The Sierra Toolkit computational mesh is a software library intended to support massively parallel multi-physics computations on dynamically changing unstructured meshes. This domain of intended use is inherently complex due to distributed memory parallelism, parallel scalability, heterogeneity of physics, heterogeneous discretization of an unstructured mesh, and runtime adaptation of the mesh. Management of this inherent complexity begins with a conceptual analysis and modeling of this domain of intended use; i.e., development of a domain model. The Sierra Toolkit computational mesh software library is designed and implemented based upon this domain model. Software developers using, maintaining, or extending the Sierra Toolkit computational mesh library must be familiar with the concepts/domain model presented in this report.

  7. ProtoMD: A prototyping toolkit for multiscale molecular dynamics

    NASA Astrophysics Data System (ADS)

    Somogyi, Endre; Mansour, Andrew Abi; Ortoleva, Peter J.

    2016-05-01

    ProtoMD is a toolkit that facilitates the development of algorithms for multiscale molecular dynamics (MD) simulations. It is designed for multiscale methods which capture the dynamic transfer of information across multiple spatial scales, such as the atomic to the mesoscopic scale, via coevolving microscopic and coarse-grained (CG) variables. ProtoMD can also be used to calibrate parameters needed in traditional CG-MD methods. The toolkit integrates 'GROMACS wrapper' to initiate MD simulations, and 'MDAnalysis' to analyze and manipulate trajectory files. It facilitates experimentation with a spectrum of coarse-grained variables, prototyping rare events (such as chemical reactions), or simulating nanocharacterization experiments such as terahertz spectroscopy, AFM, nanopore, and time-of-flight mass spectroscopy. ProtoMD is written in Python and is freely available under the GNU General Public License from github.com/CTCNano/proto_md.

  8. MCS Systems Administration Toolkit

    2001-09-30

    This package contains a number of systems administration utilities to assist a team of system administrators in managing a computer environment by automating routine tasks and centralizing information. Included are utilities to help install software on a network of computers and programs to make an image of a disk drive, to manage and distribute configuration files for a number of systems, and to run self-tests on systems, as well as an example of using a database to manage host information and various utilities.

  9. A Toolkit for Teacher Engagement

    ERIC Educational Resources Information Center

    Grantmakers for Education, 2014

    2014-01-01

    Teachers are critical to the success of education grantmaking strategies, yet in talking with them we discovered that the world of philanthropy is often a mystery. GFE's Toolkit for Teacher Engagement aims to assist funders in authentically and effectively involving teachers in the education reform and innovation process. Built directly from the…

  10. pypet: A Python Toolkit for Data Management of Parameter Explorations

    PubMed Central

    Meyer, Robert; Obermayer, Klaus

    2016-01-01

    pypet (Python parameter exploration toolkit) is a new multi-platform Python toolkit for managing numerical simulations. Sampling the space of model parameters is a key aspect of simulations and numerical experiments. pypet is designed to allow easy and arbitrary sampling of trajectories through a parameter space beyond simple grid searches. pypet collects and stores both simulation parameters and results in a single HDF5 file. This collective storage allows fast and convenient loading of data for further analyses. pypet provides various additional features such as multiprocessing and parallelization of simulations, dynamic loading of data, integration of git version control, and supervision of experiments via the electronic lab notebook Sumatra. pypet supports a rich set of data formats, including native Python types, Numpy and Scipy data, Pandas DataFrames, and BRIAN(2) quantities. Besides these formats, users can easily extend the toolkit to allow customized data types. pypet is a flexible tool suited for both short Python scripts and large scale projects. pypet's various features, especially the tight link between parameters and results, promote reproducible research in computational neuroscience and simulation-based disciplines.
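
    The distinction between a grid search and an arbitrary trajectory through parameter space can be shown in a few lines of plain Python. This is a conceptual sketch only, not pypet's API (which manages such trajectories and stores parameters and results together in HDF5).

```python
import itertools

# Grid search: every combination of the parameter values (cartesian product).
rates = [0.1, 0.2]
sizes = [10, 20, 30]
grid = list(itertools.product(rates, sizes))  # 2 x 3 = 6 runs

# Arbitrary trajectory: an explicit list of (rate, size) points, e.g. from
# random or adaptive sampling -- the kind of exploration beyond plain grids
# that pypet's trajectories are designed to record.
trajectory = [(0.1, 10), (0.15, 25), (0.3, 12)]

# Toy "simulation" keyed by its parameters, keeping the tight link
# between parameters and results.
results = {point: point[0] * point[1] for point in trajectory}
print(len(grid), len(trajectory))  # -> 6 3
```

    Keeping each result keyed by the exact parameter point that produced it is what makes later analyses and reproduction straightforward.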

  11. pypet: A Python Toolkit for Data Management of Parameter Explorations

    PubMed Central

    Meyer, Robert; Obermayer, Klaus

    2016-01-01

    pypet (Python parameter exploration toolkit) is a new multi-platform Python toolkit for managing numerical simulations. Sampling the space of model parameters is a key aspect of simulations and numerical experiments. pypet is designed to allow easy and arbitrary sampling of trajectories through a parameter space beyond simple grid searches. pypet collects and stores both simulation parameters and results in a single HDF5 file. This collective storage allows fast and convenient loading of data for further analyses. pypet provides various additional features such as multiprocessing and parallelization of simulations, dynamic loading of data, integration of git version control, and supervision of experiments via the electronic lab notebook Sumatra. pypet supports a rich set of data formats, including native Python types, Numpy and Scipy data, Pandas DataFrames, and BRIAN(2) quantities. Besides these formats, users can easily extend the toolkit to allow customized data types. pypet is a flexible tool suited for both short Python scripts and large scale projects. pypet's various features, especially the tight link between parameters and results, promote reproducible research in computational neuroscience and simulation-based disciplines. PMID:27610080

  12. pypet: A Python Toolkit for Data Management of Parameter Explorations.

    PubMed

    Meyer, Robert; Obermayer, Klaus

    2016-01-01

    pypet (Python parameter exploration toolkit) is a new multi-platform Python toolkit for managing numerical simulations. Sampling the space of model parameters is a key aspect of simulations and numerical experiments. pypet is designed to allow easy and arbitrary sampling of trajectories through a parameter space beyond simple grid searches. pypet collects and stores both simulation parameters and results in a single HDF5 file. This collective storage allows fast and convenient loading of data for further analyses. pypet provides various additional features such as multiprocessing and parallelization of simulations, dynamic loading of data, integration of git version control, and supervision of experiments via the electronic lab notebook Sumatra. pypet supports a rich set of data formats, including native Python types, Numpy and Scipy data, Pandas DataFrames, and BRIAN(2) quantities. Besides these formats, users can easily extend the toolkit to allow customized data types. pypet is a flexible tool suited for both short Python scripts and large scale projects. pypet's various features, especially the tight link between parameters and results, promote reproducible research in computational neuroscience and simulation-based disciplines.

  13. pypet: A Python Toolkit for Data Management of Parameter Explorations.

    PubMed

    Meyer, Robert; Obermayer, Klaus

    2016-01-01

    pypet (Python parameter exploration toolkit) is a new multi-platform Python toolkit for managing numerical simulations. Sampling the space of model parameters is a key aspect of simulations and numerical experiments. pypet is designed to allow easy and arbitrary sampling of trajectories through a parameter space beyond simple grid searches. pypet collects and stores both simulation parameters and results in a single HDF5 file. This collective storage allows fast and convenient loading of data for further analyses. pypet provides various additional features such as multiprocessing and parallelization of simulations, dynamic loading of data, integration of git version control, and supervision of experiments via the electronic lab notebook Sumatra. pypet supports a rich set of data formats, including native Python types, Numpy and Scipy data, Pandas DataFrames, and BRIAN(2) quantities. Besides these formats, users can easily extend the toolkit to allow customized data types. pypet is a flexible tool suited for both short Python scripts and large scale projects. pypet's various features, especially the tight link between parameters and results, promote reproducible research in computational neuroscience and simulation-based disciplines. PMID:27610080

  14. Parallel Power Grid Simulation Toolkit

    SciTech Connect

    Smith, Steve; Kelley, Brian; Banks, Lawrence; Top, Philip; Woodward, Carol

    2015-09-14

    ParGrid is a 'wrapper' that integrates a coupled Power Grid Simulation toolkit consisting of a library to manage the synchronization and communication of independent simulations. The included library code in ParGrid, named FSKIT, is intended to support the coupling of multiple continuous and discrete event parallel simulations. The code is designed using modern object oriented C++ methods utilizing C++11 and current Boost libraries to ensure compatibility with multiple operating systems and environments.

  15. User's guide for SDDS toolkit Version 1.4

    SciTech Connect

    Borland, M.

    1995-07-06

    The Self Describing Data Sets (SDDS) file protocol is the basis for a powerful and expanding toolkit of over 40 generic programs. These programs are used for simulation postprocessing, graphics, data preparation, program interfacing, and experimental data analysis. This document describes Version 1.4 of the SDDS command-line toolkit. Those wishing to write programs using SDDS should consult the Application Programmer's Guide for SDDS Version 1.4. The first section of the present document is shared with this reference. This document does not describe SDDS-compliant EPICS applications, of which there are presently 25.
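
    The idea of a self-describing data set, a header that names and types the columns so that generic tools can process any file without prior knowledge of it, can be sketched with a toy text format. This is illustrative only and is not the actual SDDS protocol.

```python
import io

def write_table(stream, columns, rows):
    """Write a tiny self-describing table: the header names and types
    the columns, so any generic reader can parse the data that follows."""
    stream.write("# columns: " + ",".join(f"{n}:{t}" for n, t in columns) + "\n")
    for row in rows:
        stream.write(" ".join(str(v) for v in row) + "\n")

def read_table(stream):
    """Generic reader: learns the column names and types from the header."""
    header = stream.readline().removeprefix("# columns: ").strip()
    cols = [c.split(":") for c in header.split(",")]
    casts = {"int": int, "float": float, "str": str}
    rows = []
    for line in stream:
        rows.append(tuple(casts[t](v) for (n, t), v in zip(cols, line.split())))
    return [n for n, _ in cols], rows

buf = io.StringIO()
write_table(buf, [("step", "int"), ("energy", "float")], [(1, 0.5), (2, 0.25)])
buf.seek(0)
names, rows = read_table(buf)
print(names, rows)  # -> ['step', 'energy'] [(1, 0.5), (2, 0.25)]
```

    Because the reader learns the schema from the file itself, one generic program can postprocess output from many different simulations, which is the property the SDDS toolkit exploits.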

  16. The Connectome Viewer Toolkit: An Open Source Framework to Manage, Analyze, and Visualize Connectomes

    PubMed Central

    Gerhard, Stephan; Daducci, Alessandro; Lemkaddem, Alia; Meuli, Reto; Thiran, Jean-Philippe; Hagmann, Patric

    2011-01-01

    Advanced neuroinformatics tools are required for methods of connectome mapping, analysis, and visualization. The inherent multi-modality of connectome datasets poses new challenges for data organization, integration, and sharing. We have designed and implemented the Connectome Viewer Toolkit – a set of free and extensible open source neuroimaging tools written in Python. The key components of the toolkit are as follows: (1) The Connectome File Format is an XML-based container format to standardize multi-modal data integration and structured metadata annotation. (2) The Connectome File Format Library enables management and sharing of connectome files. (3) The Connectome Viewer is an integrated research and development environment for visualization and analysis of multi-modal connectome data. The Connectome Viewer's plugin architecture supports extensions with network analysis packages and an interactive scripting shell, to enable easy development and community contributions. Integration with tools from the scientific Python community allows the leveraging of numerous existing libraries for powerful connectome data mining, exploration, and comparison. We demonstrate the applicability of the Connectome Viewer Toolkit using Diffusion MRI datasets processed by the Connectome Mapper. The Connectome Viewer Toolkit is available from http://www.cmtk.org/ PMID:21713110
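
    An XML-based container for structured metadata, in the spirit of the Connectome File Format, can be sketched with Python's standard library. The element names below are made up for illustration; they are not the real CFF schema.

```python
import xml.etree.ElementTree as ET

# Illustrative only -- these element and attribute names are invented,
# not the actual Connectome File Format schema.
root = ET.Element("connectome", {"version": "1.0"})
net = ET.SubElement(root, "network", {"name": "subject01", "nodes": "83"})
meta = ET.SubElement(net, "metadata")
ET.SubElement(meta, "modality").text = "Diffusion MRI"

# Serialize the container, then parse it back as a consumer would.
xml_bytes = ET.tostring(root)
parsed = ET.fromstring(xml_bytes)
print(parsed.find("network").get("name"))  # -> subject01
```

    Structured, machine-readable metadata of this kind is what lets multi-modal data sets be integrated and shared without per-file conventions.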

  17. The REACH Youth Program Learning Toolkit

    ERIC Educational Resources Information Center

    Sierra Health Foundation, 2011

    2011-01-01

    Believing in the value of using video documentaries and data as learning tools, members of the REACH technical assistance team collaborated to develop this toolkit. The learning toolkit was designed using and/or incorporating components of the "Engaging Youth in Community Change: Outcomes and Lessons Learned from Sierra Health Foundation's REACH…

  18. Python-ARM Radar Toolkit

    SciTech Connect

    Jonathan Helmus, Scott Collis

    2013-03-17

    The Python-ARM Radar Toolkit (Py-ART) is a collection of radar quality control and retrieval codes which all work on two unifying Python objects: the PyRadar and PyGrid objects. By building ingests for several popular radar formats and then abstracting the interface, Py-ART greatly simplifies data processing compared with several other available utilities. In addition, Py-ART makes use of NumPy arrays as its primary storage mechanism, enabling use of existing and extensive community software tools.

  19. Design Optimization Toolkit: Users' Manual

    SciTech Connect

    Aguilo Valentin, Miguel Alejandro

    2014-07-01

    The Design Optimization Toolkit (DOTk) is a stand-alone C++ software package intended to solve complex design optimization problems. The DOTk software package provides a range of solution methods suited for gradient/nongradient-based optimization, large scale constrained optimization, and topology optimization. DOTk was designed to have a flexible user interface to allow easy access to DOTk solution methods from external engineering software packages. This inherent flexibility makes DOTk minimally intrusive to other engineering software packages. As part of this inherent flexibility, the DOTk software package provides an easy-to-use MATLAB interface that enables users to call DOTk solution methods directly from the MATLAB command window.

  20. An Introduction to the Einstein Toolkit

    NASA Astrophysics Data System (ADS)

    Zilhão, Miguel; Löffler, Frank

    2013-09-01

    We give an introduction to the Einstein Toolkit, a mature, open-source computational infrastructure for numerical relativity based on the Cactus Framework, for the target group of new users. This toolkit is composed of several different modules, is developed by researchers from different institutions throughout the world and is in active continuous development. Documentation for the toolkit and its several modules is often scattered across different locations, a difficulty new users may at times have to struggle with. Scientific papers exist describing the toolkit and its methods in detail, but they might be overwhelming at first. With these lecture notes we hope to provide an initial overview for new users. We cover how to obtain, compile and run the toolkit, and give an overview of some of the tools and modules provided with it.

  1. The SCRAM tool-kit

    NASA Astrophysics Data System (ADS)

    Tamir, David; Flanigan, Lee A.; Weeks, Jack L.; Siewert, Thomas A.; Kimbrough, Andrew G.; McClure, Sidney R.

    1994-01-01

    This paper proposes a new series of on-orbit capabilities to support the near-term Hubble Space Telescope, Extended Duration Orbiter, Long Duration Orbiter, Space Station Freedom, other orbital platforms, and even the future manned Lunar/Mars missions. These proposed capabilities form a toolkit termed Space Construction, Repair, and Maintenance (SCRAM). SCRAM addresses both Intra-Vehicular Activity (IVA) and Extra-Vehicular Activity (EVA) needs. SCRAM provides a variety of tools which enable welding, brazing, cutting, coating, heating, and cleaning, as well as corresponding nondestructive examination. Near-term IVA-SCRAM applications include repair and modification of fluid lines, structure, and laboratory equipment inside a shirt-sleeve environment (i.e. inside Spacelab or Space Station). Near-term EVA-SCRAM applications include construction of fluid lines and structural members, repair of punctures by orbital debris, and refurbishment of surfaces eroded by contaminants. The SCRAM tool-kit also promises future EVA applications involving mass-production tasks automated by robotics and artificial intelligence, for construction of large truss, aerobrake, and nuclear reactor shadow shield structures. The leading candidate tool processes for SCRAM, currently undergoing research and development, include Electron Beam, Gas Tungsten Arc, Plasma Arc, and Laser Beam. A series of strategic space flight experiments would make SCRAM available to help conquer the space frontier.

  2. [A biomedical signal processing toolkit programmed by Java].

    PubMed

    Xie, Haiyuan

    2012-09-01

    A new biomedical signal processing toolkit has been developed according to the characteristics of biomedical signals. The toolkit is programmed in Java and is used for basic digital signal processing, random signal processing, and related tasks. All the methods in the toolkit have been tested, and the program is robust. The features of the toolkit are explained in detail; it is easy to use and practical.

  3. The DLESE Evaluation Toolkit Project

    NASA Astrophysics Data System (ADS)

    Buhr, S. M.; Barker, L. J.; Marlino, M.

    2002-12-01

    The Evaluation Toolkit and Community project is a new Digital Library for Earth System Education (DLESE) collection designed to raise awareness of project evaluation within the geoscience education community, and to enable principal investigators, teachers, and evaluators to implement project evaluation more readily. This new resource is grounded in the needs of geoscience educators, and will provide a virtual home for a geoscience education evaluation community. The goals of the project are to 1) provide a robust collection of evaluation resources useful for Earth systems educators, 2) establish a forum and community for evaluation dialogue within DLESE, and 3) disseminate the resources through the DLESE infrastructure and through professional society workshops and proceedings. Collaboration and expertise in education, geoscience and evaluation are necessary if we are to conduct the best possible geoscience education. The Toolkit allows users to engage in evaluation at whichever level best suits their needs, get more evaluation professional development if desired, and access the expertise of other segments of the community. To date, a test web site has been built and populated, initial community feedback from the DLESE and broader community is being garnered, and we have begun to heighten awareness of geoscience education evaluation within our community. The web site contains features that allow users to access professional development about evaluation, search and find evaluation resources, submit resources, find or offer evaluation services, sign up for upcoming workshops, take the user survey, and submit calendar items. The evaluation resource matrix currently contains resources that have met our initial review. The resources are currently organized by type; they will become searchable on multiple dimensions of project type, audience, objectives and evaluation resource type as efforts to develop a collection-specific search engine mature. The peer review

  4. Pydpiper: a flexible toolkit for constructing novel registration pipelines

    PubMed Central

    Friedel, Miriam; van Eede, Matthijs C.; Pipitone, Jon; Chakravarty, M. Mallar; Lerch, Jason P.

    2014-01-01

    Using neuroimaging technologies to elucidate the relationship between genotype and phenotype and brain and behavior will be a key contribution to biomedical research in the twenty-first century. Among the many methods for analyzing neuroimaging data, image registration deserves particular attention due to its wide range of applications. Finding strategies to register together many images and analyze the differences between them can be a challenge, particularly given that different experimental designs require different registration strategies. Moreover, writing software that can handle different types of image registration pipelines in a flexible, reusable and extensible way can be challenging. In response to this challenge, we have created Pydpiper, a neuroimaging registration toolkit written in Python. Pydpiper is an open-source, freely available software package that provides multiple modules for various image registration applications. Pydpiper offers five key innovations. Specifically: (1) a robust file handling class that allows access to outputs from all stages of registration at any point in the pipeline; (2) the ability of the framework to eliminate duplicate stages; (3) reusable, easy to subclass modules; (4) a development toolkit written for non-developers; (5) four complete applications that run complex image registration pipelines “out-of-the-box.” In this paper, we will discuss both the general Pydpiper framework and the various ways in which component modules can be pieced together to easily create new registration pipelines. This will include a discussion of the core principles motivating code development and a comparison of Pydpiper with other available toolkits. We also provide a comprehensive, line-by-line example to orient users with limited programming knowledge and highlight some of the most useful features of Pydpiper. In addition, we will present the four current applications of the code. PMID:25126069
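    Innovation (2), the elimination of duplicate stages, can be sketched simply: if a stage's identity is its full command line, the pipeline only needs to refuse commands it has already queued. The `Pipeline` class below is an illustrative toy, not Pydpiper's actual implementation:

    ```python
    # Illustrative sketch (not Pydpiper's real classes): a pipeline that
    # de-duplicates stages by treating the full command line as the
    # stage's identity, so an identical registration step submitted by
    # two different modules is only scheduled once.

    class Pipeline:
        def __init__(self):
            self.stages = []          # ordered list of unique stages
            self._seen = set()        # command lines already queued

        def add_stage(self, cmd):
            """Queue a stage unless an identical command already exists."""
            if cmd in self._seen:
                return False          # duplicate: eliminated
            self._seen.add(cmd)
            self.stages.append(cmd)
            return True

    p = Pipeline()
    p.add_stage("register img1.mnc img2.mnc -o xfm12.xfm")
    p.add_stage("register img1.mnc img3.mnc -o xfm13.xfm")
    p.add_stage("register img1.mnc img2.mnc -o xfm12.xfm")  # duplicate
    print(len(p.stages))  # -> 2
    ```

    In a real system the identity check would also account for input-file hashes, but the set-membership idea is the same.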

  5. Flightspeed Integral Image Analysis Toolkit

    NASA Technical Reports Server (NTRS)

    Thompson, David R.

    2009-01-01

    The Flightspeed Integral Image Analysis Toolkit (FIIAT) is a C library that provides image analysis functions in a single, portable package. It provides basic low-level filtering, texture analysis, and subwindow descriptors for applications dealing with image interpretation and object recognition. Designed with spaceflight in mind, it addresses ease of integration (minimal external dependencies); fast, real-time operation using integer arithmetic where possible (useful for platforms lacking a dedicated floating-point processor); implementation entirely in C (easily modified); mostly static memory allocation; and 8-bit image data. The basic goal of the FIIAT library is to compute meaningful numerical descriptors for images or rectangular image regions. These n-vectors can then be used directly for novelty detection or pattern recognition, or as a feature space for higher-level pattern recognition tasks. The library provides routines for leveraging training data to derive descriptors that are most useful for a specific data set. Its runtime algorithms exploit a structure known as the "integral image." This is a caching method that permits fast summation of values within rectangular regions of an image. This integral frame facilitates a wide range of fast image-processing functions. This toolkit has applicability to a wide range of autonomous image analysis tasks in the space-flight domain, including novelty detection, object and scene classification, target detection for autonomous instrument placement, and science analysis of geomorphology. It makes real-time texture and pattern recognition possible for platforms with severe computational constraints. The software provides an order of magnitude speed increase over alternative software libraries currently in use by the research community. FIIAT can commercially support intelligent video cameras used in intelligent surveillance. It is also useful for object recognition by robots or other autonomous vehicles.
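    The integral-image structure the abstract names is a standard technique and is easy to state concretely. FIIAT itself is written in C; the sketch below shows the same idea in Python: a summed-area table `S` where `S[i][j]` holds the sum of all pixels above and to the left, after which any rectangle sum costs four lookups.

    ```python
    # The "integral image" (summed-area table) trick FIIAT is built on:
    # precompute cumulative sums once, then sum any rectangle in O(1).

    def integral_image(img):
        h, w = len(img), len(img[0])
        S = [[0] * (w + 1) for _ in range(h + 1)]   # one-pixel zero border
        for i in range(h):
            for j in range(w):
                S[i + 1][j + 1] = (img[i][j] + S[i][j + 1]
                                   + S[i + 1][j] - S[i][j])
        return S

    def rect_sum(S, top, left, bottom, right):
        """Sum of img[top..bottom][left..right], inclusive, in O(1)."""
        return (S[bottom + 1][right + 1] - S[top][right + 1]
                - S[bottom + 1][left] + S[top][left])

    img = [[1, 2, 3],
           [4, 5, 6],
           [7, 8, 9]]
    S = integral_image(img)
    print(rect_sum(S, 0, 0, 2, 2))  # -> 45 (whole image)
    print(rect_sum(S, 1, 1, 2, 2))  # -> 28 (5+6+8+9)
    ```

    Because every rectangle sum is constant-time after one linear pass, descriptors built from many subwindow sums (as in texture analysis) become cheap enough for real-time use.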

  6. Development of an Integrated Human Factors Toolkit

    NASA Technical Reports Server (NTRS)

    Resnick, Marc L.

    2003-01-01

    An effective integration of human abilities and limitations is crucial to the success of all NASA missions. The Integrated Human Factors Toolkit facilitates this integration by assisting system designers and analysts to select the human factors tools that are most appropriate for the needs of each project. The HF Toolkit contains information about a broad variety of human factors tools addressing human requirements in the physical, information processing and human reliability domains. Analysis of each tool includes consideration of the most appropriate design stage, the amount of expertise in human factors that is required, the amount of experience with the tool and the target job tasks that are needed, and other factors that are critical for successful use of the tool. The benefits of the Toolkit include improved safety, reliability and effectiveness of NASA systems throughout the agency. This report outlines the initial stages of development for the Integrated Human Factors Toolkit.

  7. A universal postprocessing toolkit for accelerator simulation and data analysis.

    SciTech Connect

    Borland, M.

    1998-12-16

    The Self-Describing Data Sets (SDDS) toolkit comprises about 70 generally-applicable programs sharing a common data protocol. At the Advanced Photon Source (APS), SDDS performs the vast majority of operational data collection and processing, most data display functions, and many control functions. In addition, a number of accelerator simulation codes use SDDS for all post-processing and data display. This has three principal advantages: first, simulation codes need not provide customized post-processing tools, thus simplifying development and maintenance. Second, users can enhance code capabilities without changing the code itself, by adding SDDS-based pre- and post-processing. Third, multiple codes can be used together more easily, by employing SDDS for data transfer and adaptation. Given its broad applicability, the SDDS file protocol is surprisingly simple, making it quite easy for simulations to generate SDDS-compliant data. This paper discusses the philosophy behind SDDS, contrasting it with some recent trends, and outlines the capabilities of the toolkit. The paper also gives examples of using SDDS for accelerator simulation.
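    The core idea, a file that carries its own column names and types so generic tools need no per-file knowledge, can be illustrated with a toy format. Note this is only an analogy: the real SDDS protocol has its own header syntax (not reproduced here), and the helper names below are invented:

    ```python
    # Toy illustration of the self-describing idea behind SDDS (this is
    # NOT the actual SDDS header syntax): column names and types travel
    # in the file itself, so one generic reader handles any such file.

    def write_table(path, columns, types, rows):
        with open(path, "w") as f:
            f.write("# columns: " + ",".join(columns) + "\n")
            f.write("# types: " + ",".join(types) + "\n")
            for row in rows:
                f.write(" ".join(str(v) for v in row) + "\n")

    def read_table(path):
        with open(path) as f:
            columns = f.readline().split(":", 1)[1].strip().split(",")
            types = f.readline().split(":", 1)[1].strip().split(",")
            cast = {"long": int, "double": float}
            rows = [[cast[t](v) for t, v in zip(types, line.split())]
                    for line in f]
        return columns, rows

    write_table("turns.txt", ["turn", "x"], ["long", "double"],
                [(1, 0.5), (2, 0.25)])
    cols, rows = read_table("turns.txt")
    print(cols, rows)  # -> ['turn', 'x'] [[1, 0.5], [2, 0.25]]
    ```

    A whole family of generic filter/plot/sort programs can then be written against the reader alone, which is exactly how the ~70 SDDS programs share one protocol.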

  8. ISO/IEEE 11073 PHD message generation toolkit to standardize healthcare device.

    PubMed

    Lim, Joon-Ho; Park, Chanyong; Park, Soo-Jun; Lee, Kyu-Chul

    2011-01-01

    As the senior population increases, various healthcare devices and services are being developed, such as fall-detection devices and home hypertension management services. However, to vitalize the market for healthcare devices and services, standardization for interoperability between devices and services must come first. To achieve this standardization goal, the IEEE 11073 Personal Health Device (PHD) group has standardized many healthcare devices, but until now there are few devices compatible with the PHD standard. One of the main reasons is that it is not easy for device manufacturers to implement a standard communication module by analyzing standard documents of over 600 pages. In this paper, we propose a standard message generation toolkit to easily standardize existing non-standard healthcare devices. The proposed toolkit generates standard PHD messages from the input device information, and the generated messages are adapted to the device with the standard state machine file. For the experiments, we develop a reference H/W and test the proposed toolkit with three healthcare devices: a blood pressure monitor, a weighing scale, and a glucose meter. The proposed toolkit has the advantage that even if the user does not know the standard in detail, the user can easily standardize non-standard healthcare devices.

  9. Integrated Systems Health Management (ISHM) Toolkit

    NASA Technical Reports Server (NTRS)

    Venkatesh, Meera; Kapadia, Ravi; Walker, Mark; Wilkins, Kim

    2013-01-01

    A framework of software components has been implemented to facilitate the development of ISHM systems according to a methodology based on Reliability Centered Maintenance (RCM). This framework is collectively referred to as the Toolkit and was developed using General Atomics' Health MAP (TM) technology. The toolkit is intended to provide assistance to software developers of mission-critical system health monitoring applications in the specification, implementation, configuration, and deployment of such applications. In addition to software tools designed to facilitate these objectives, the toolkit also provides direction to software developers in accordance with an ISHM specification and development methodology. The development tools are based on an RCM approach for the development of ISHM systems. This approach focuses on defining, detecting, and predicting the likelihood of system functional failures and their undesirable consequences.

  10. Desensitized Optimal Filtering and Sensor Fusion Toolkit

    NASA Technical Reports Server (NTRS)

    Karlgaard, Christopher D.

    2015-01-01

    Analytical Mechanics Associates, Inc., has developed a software toolkit that filters and processes navigational data from multiple sensor sources. A key component of the toolkit is a trajectory optimization technique that reduces the sensitivity of Kalman filters with respect to model parameter uncertainties. The sensor fusion toolkit also integrates recent advances in adaptive Kalman and sigma-point filters for problems with non-Gaussian error statistics. This Phase II effort provides new filtering and sensor fusion techniques in a convenient package that can be used as a stand-alone application for ground support and/or onboard use. Its modular architecture enables ready integration with existing tools. A suite of sensor models and noise distributions as well as Monte Carlo analysis capability are included to enable statistical performance evaluations.
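    For context, the object being desensitized is the ordinary Kalman recursion, whose gain depends on assumed process and measurement noise parameters. The minimal scalar sketch below shows that base filter only; the toolkit's actual desensitized formulation is not reproduced here, and the parameter values are illustrative:

    ```python
    # Minimal scalar Kalman filter estimating a constant from noisy
    # measurements. The gain k depends on the assumed noise parameters
    # q and r; it is this parameter dependence that desensitized
    # (sensitivity-reducing) designs target.

    def kalman_constant(measurements, q=1e-5, r=0.1, x0=0.0, p0=1.0):
        x, p = x0, p0
        for z in measurements:
            p = p + q                 # predict: constant state, add process noise
            k = p / (p + r)           # Kalman gain
            x = x + k * (z - x)       # update with measurement residual
            p = (1.0 - k) * p         # updated error covariance
        return x

    est = kalman_constant([1.2, 0.8, 1.1, 0.9, 1.0] * 20)
    print(round(est, 2))  # settles close to the true value 1.0
    ```

    If q or r is badly mistuned, the gain schedule (and hence the estimate) degrades; reducing that sensitivity is the motivation for the trajectory optimization the abstract describes.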

  11. Comparison of open source visual analytics toolkits.

    SciTech Connect

    Crossno, Patricia Joyce; Harger, John R.

    2010-11-01

    We present the results of the first stage of a two-stage evaluation of open source visual analytics packages. This stage is a broad feature comparison over a range of open source toolkits. Although we had originally intended to restrict ourselves to comparing visual analytics toolkits, we quickly found that very few were available. So we expanded our study to include information visualization, graph analysis, and statistical packages. We examine three aspects of each toolkit: visualization functions, analysis capabilities, and development environments. With respect to development environments, we look at platforms, language bindings, multi-threading/parallelism, user interface frameworks, ease of installation, documentation, and whether the package is still being actively developed.

  12. The Ames MER Microscopic Imager Toolkit

    NASA Technical Reports Server (NTRS)

    Sargent, Randy; Deans, Matthew; Kunz, Clayton; Sims, Michael; Herkenhoff, Ken

    2005-01-01

    The Mars Exploration Rovers, Spirit and Opportunity, have spent several successful months on Mars, returning gigabytes of images and spectral data to scientists on Earth. One of the instruments on the MER rovers, the Athena Microscopic Imager (MI), is a fixed focus, megapixel camera providing a ±3 mm depth of field and a 31×31 mm field of view at a working distance of 63 mm from the lens to the object being imaged. In order to maximize the science return from this instrument, we developed the Ames MI Toolkit and supported its use during the primary mission. The MI Toolkit is a set of programs that operate on collections of MI images, with the goal of making the data more understandable to the scientists on the ground. Because of the limited depth of field of the camera, and the often highly variable topography of the terrain being imaged, MI images of a given rock are often taken as a stack, with the Instrument Deployment Device (IDD) moving along a computed normal vector, pausing every few millimeters for the MI to acquire an image. The MI Toolkit provides image registration and focal section merging, which combine these images to form a single, maximally in-focus image, while compensating for changes in lighting as well as parallax due to the motion of the camera. The MI Toolkit also provides a 3-D reconstruction of the surface being imaged using stereo and can embed 2-D MI images as texture maps into 3-D meshes produced by other imagers on board the rover to provide context. The 2-D images and 3-D meshes output from the Toolkit are easily viewed by scientists using other mission tools, such as Viz or the MI Browser. This paper describes the MI Toolkit in detail, as well as our experience using it with scientists at JPL during the primary MER mission.
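    The focal-section-merge step has a simple core: after the frames are registered, each output pixel is taken from whichever frame of the stack is locally sharpest there. The sketch below shows only that selection idea (the real toolkit also performs the registration and the lighting/parallax compensation first), with a crude gradient-based sharpness score chosen for illustration:

    ```python
    # Sketch of focal-section merging: per pixel, keep the value from the
    # stack frame with the highest local contrast. The real MI Toolkit
    # registers frames and compensates lighting/parallax before this step.

    def sharpness(img, i, j):
        """Local contrast score: sum of absolute differences to neighbors."""
        h, w = len(img), len(img[0])
        s = 0
        for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < h and 0 <= nj < w:
                s += abs(img[i][j] - img[ni][nj])
        return s

    def merge_stack(stack):
        h, w = len(stack[0]), len(stack[0][0])
        return [[max(stack, key=lambda im: sharpness(im, i, j))[i][j]
                 for j in range(w)] for i in range(h)]

    blurry = [[5, 5, 5], [5, 5, 5], [5, 5, 5]]    # out of focus: flat
    sharp  = [[0, 9, 0], [9, 0, 9], [0, 9, 0]]    # in focus: high contrast
    print(merge_stack([blurry, sharp]) == sharp)  # -> True
    ```

    Production focus stacking typically uses a Laplacian-of-Gaussian sharpness measure and blends across frame boundaries, but the per-pixel argmax is the heart of the method.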

  13. The Ames MER microscopic imager toolkit

    USGS Publications Warehouse

    Sargent, R.; Deans, Matthew; Kunz, C.; Sims, M.; Herkenhoff, K.

    2005-01-01

    The Mars Exploration Rovers, Spirit and Opportunity, have spent several successful months on Mars, returning gigabytes of images and spectral data to scientists on Earth. One of the instruments on the MER rovers, the Athena Microscopic Imager (MI), is a fixed focus, megapixel camera providing a ±3 mm depth of field and a 31×31 mm field of view at a working distance of 63 mm from the lens to the object being imaged. In order to maximize the science return from this instrument, we developed the Ames MI Toolkit and supported its use during the primary mission. The MI Toolkit is a set of programs that operate on collections of MI images, with the goal of making the data more understandable to the scientists on the ground. Because of the limited depth of field of the camera, and the often highly variable topography of the terrain being imaged, MI images of a given rock are often taken as a stack, with the Instrument Deployment Device (IDD) moving along a computed normal vector, pausing every few millimeters for the MI to acquire an image. The MI Toolkit provides image registration and focal section merging, which combine these images to form a single, maximally in-focus image, while compensating for changes in lighting as well as parallax due to the motion of the camera. The MI Toolkit also provides a 3-D reconstruction of the surface being imaged using stereo and can embed 2-D MI images as texture maps into 3-D meshes produced by other imagers on board the rover to provide context. The 2-D images and 3-D meshes output from the Toolkit are easily viewed by scientists using other mission tools, such as Viz or the MI Browser. This paper describes the MI Toolkit in detail, as well as our experience using it with scientists at JPL during the primary MER mission. © 2005 IEEE.

  14. "Handy Manny" and the Emergent Literacy Technology Toolkit

    ERIC Educational Resources Information Center

    Hourcade, Jack J.; Parette, Howard P., Jr.; Boeckmann, Nichole; Blum, Craig

    2010-01-01

    This paper outlines the use of a technology toolkit to support emergent literacy curriculum and instruction in early childhood education settings. Components of the toolkit include hardware and software that can facilitate key emergent literacy skills. Implementation of the comprehensive technology toolkit enhances the development of these…

  15. TRSkit: A Simple Digital Library Toolkit

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.; Esler, Sandra L.

    1997-01-01

    This paper introduces TRSkit, a simple and effective toolkit for building digital libraries on the World Wide Web. The toolkit was developed for the creation of the Langley Technical Report Server and the NASA Technical Report Server, but is applicable to most simple distribution paradigms. TRSkit contains a handful of freely available software components designed to be run under the UNIX operating system and served via the World Wide Web. The intended customer is the person who must continuously and synchronously distribute anywhere from 100 to 100,000s of information units and does not have extensive resources to devote to the problem.

  16. Autism Speaks Toolkits: Resources for Busy Physicians.

    PubMed

    Bellando, Jayne; Fussell, Jill J; Lopez, Maya

    2016-02-01

    Given the increased prevalence of autism spectrum disorders (ASD), it is likely that busy primary care providers (PCPs) are providing care to individuals with ASD in their practices. Autism Speaks provides a wealth of educational, medical, and treatment/intervention information resources for PCPs and families, including at least 32 toolkits. This article serves to familiarize PCPs and families with the different toolkits that are available on the Autism Speaks website. This article is intended to increase physicians' knowledge of the issues that families with children with ASD frequently encounter and to increase their ability to share evidence-based information to guide treatment and care for affected families in their practice.

  17. WIND Toolkit Power Data Site Index

    DOE Data Explorer

    Draxl, Caroline; Mathias-Hodge, Bri

    2016-10-19

    This spreadsheet contains per-site metadata for the WIND Toolkit sites and serves as an index for the raw data hosted on Globus connect (nrel#globus:/globusro/met_data). Aside from the metadata, per site average power and capacity factor are given. This data was prepared by 3TIER under contract by NREL and is public domain. Authoritative documentation on the creation of the underlying dataset is at: Final Report on the Creation of the Wind Integration National Dataset (WIND) Toolkit and API: http://www.nrel.gov/docs/fy16osti/66189.pdf

  18. SIERRA Toolkit v. 1.0

    2010-02-24

    The SIERRA Toolkit is a collection of libraries to facilitate the development of parallel engineering analysis applications. These libraries supply basic core services that an engineering analysis application may need, such as a parallel distributed and dynamic mesh database (for unstructured meshes), mechanics algorithm support (parallel infrastructure only), interfaces to parallel solvers, parallel mesh and data I/O, and various utilities (timers, diagnostic tools, etc.). The toolkit is intended to reduce the effort required to develop an engineering analysis application by removing the need to develop core capabilities that most every application would require.

  19. Construction aggregates

    USGS Publications Warehouse

    Tepordei, V.V.

    1995-01-01

    Part of the 1994 Industrial Minerals Review. The production, consumption, and applications of construction aggregates are reviewed. In 1994, the production of construction aggregates, which includes crushed stone and construction sand and gravel combined, increased 7.7 percent to 2.14 Gt compared with the previous year. These record production levels are mostly a result of funding for highway construction work provided by the Intermodal Surface Transportation Efficiency Act of 1991. Demand is expected to increase for construction aggregates in 1995.

  20. The NetLogger Toolkit V2.0

    2003-03-28

    The NetLogger Toolkit is designed to monitor, under actual operating conditions, the behavior of all the elements of the application-to-application communication path in order to determine exactly where time is spent within a complex system. Using NetLogger, distributed application components are modified to produce timestamped logs of "interesting" events at all the critical points of the distributed system. Events from each component are correlated, which allows one to characterize the performance of all aspects of the system and network in detail. The NetLogger Toolkit itself consists of four components: an API and library of functions to simplify the generation of application-level event logs, a set of tools for collecting and sorting log files, an event archive system, and a tool for visualization and analysis of the log files. In order to instrument an application to produce event logs, the application developer inserts calls to the NetLogger API at all the critical points in the code, then links the application with the NetLogger library. All the tools in the NetLogger Toolkit share a common log format and assume the existence of accurate and synchronized system clocks. NetLogger messages can be logged using an easy-to-read text-based format based on the IETF-proposed ULM format, or a binary format that can still be used through the same API but that is several times faster and smaller, with performance comparable to or better than binary message formats such as MPI, XDR, SDDF-Binary, and PBIO. The NetLogger binary format is both highly efficient and self-describing, thus optimized for the dynamic message construction and parsing of application instrumentation. NetLogger includes an "activation" API that allows NetLogger logging to be turned on, off, or modified by changing an external file. This is useful for activating logging in daemons/services (e.g. the GridFTP server). The NetLogger reliability API provides the ability to specify backup logging locations and
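    The instrumentation pattern itself, emitting a timestamped key=value event line at each critical point so events from different components can later be correlated by clock, is easy to sketch. The helper below is a hypothetical illustration in that ULM-like text spirit, not the actual NetLogger API:

    ```python
    import io
    import time

    # Hypothetical sketch of NetLogger-style instrumentation (not the real
    # NetLogger API): each critical point writes one timestamped key=value
    # event line; with synchronized clocks, lines from many components can
    # be merged and sorted by DATE to see where time is spent.

    def log_event(stream, event, **fields):
        parts = ["DATE=%.6f" % time.time(), "EVENT=%s" % event]
        parts += ["%s=%s" % (k, v) for k, v in sorted(fields.items())]
        stream.write(" ".join(parts) + "\n")

    log = io.StringIO()                       # stands in for a log file
    log_event(log, "transfer.start", host="nodeA", bytes=1048576)
    log_event(log, "transfer.end", host="nodeA")
    print(log.getvalue())
    ```

    The interval between a `*.start` and its matching `*.end` event is exactly the per-element latency the toolkit is designed to expose.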

  1. Services development toolkit for Open Research Data (Promarket)

    NASA Astrophysics Data System (ADS)

    Som de Cerff, W.; Schwichtenberg, H.; Gemünd, A.; Claus, S.; Reichwald, J.; Denvil, S.; Mazetti, P.; Nativi, S.

    2012-04-01

    According to the declaration of the Organisation for Economic Co-operation and Development (OECD) on Open Access, "OECD Principles and Guidelines for Access to Research Data from Public Funding", research data should be available for everyone, and Europe follows these directions (Digital Agenda, N. Kroes). Data being 'open' does not mean directly applicable: research data are often complex to use and difficult to interpret by non-experts. Also, if extra services are needed, e.g. certain delivery guarantees, SLAs need to be negotiated. Comparable to OSS development models, where software is open and services and support are paid for, there is a large potential for commercial activities and services around this open and free research data. For example, climate, weather, or instrument data can generate business value when offered as easy and reliable services for app integration. The project will design a toolkit for developers in research data centres. The tools will allow them to develop services that provide research data and to map business processes, e.g. automatic service-level agreements, onto their services, making open research data attractive for commercial and academic use by the centre and others. It will enable them to build and develop open, reliable, and scalable services and end products, e.g. accessible from end-user devices such as smart phones. Researchers, interested citizens, or company developers will be able to access open data as an "easy-to-use" service and aggregate it with other services. The project will address a broad range of developers and give them a toolkit in well-known settings: portable, scalable, open, and usable in public development environments and tools. This topic will be addressed technically by utilizing service-oriented approaches based on open standards and protocols, combined with new programming models and techniques.

  2. Integrated System Health Management Development Toolkit

    NASA Technical Reports Server (NTRS)

    Figueroa, Jorge; Smith, Harvey; Morris, Jon

    2009-01-01

    This software toolkit is designed to model complex systems for the implementation of embedded Integrated System Health Management (ISHM) capability, which focuses on determining the condition (health) of every element in a complex system (detect anomalies, diagnose causes, and predict future anomalies), and to provide data, information, and knowledge (DIaK) to control systems for safe and effective operation.

  3. Sandia multispectral analyst remote sensing toolkit (SMART).

    SciTech Connect

    Post, Brian Nelson; Smith, Jody Lynn; Geib, Peter L.; Nandy, Prabal; Wang, Nancy Nairong

    2003-03-01

    This remote sensing science and exploitation work focused on exploitation algorithms and methods targeted at the analyst. SMART is a 'plug-in' to commercial remote sensing software that provides algorithms to enhance the utility of the Multispectral Thermal Imager (MTI) and other multispectral satellite data. This toolkit has been licensed to 22 government organizations.

  4. Healthy People 2010: Oral Health Toolkit

    ERIC Educational Resources Information Center

    Isman, Beverly

    2007-01-01

    The purpose of this Toolkit is to provide guidance, technical tools, and resources to help states, territories, tribes and communities develop and implement successful oral health components of Healthy People 2010 plans as well as other oral health plans. These plans are useful for: (1) promoting, implementing and tracking oral health objectives;…

  5. A Toolkit for the Effective Teaching Assistant

    ERIC Educational Resources Information Center

    Tyrer, Richard; Gunn, Stuart; Lee, Chris; Parker, Maureen; Pittman, Mary; Townsend, Mark

    2004-01-01

    This book offers the notion of a "toolkit" to allow Teaching Assistants (TAs) and colleagues to review and revise their thinking and practice about real issues and challenges in managing individuals, groups, colleagues and themselves in school. In a rapidly changing educational environment the book focuses on combining the underpinning knowledge…

  6. Portable Extensible Toolkit for Scientific Computation

    1995-06-30

    PETSc 2.0 is a software toolkit for portable, parallel (and serial) numerical solution of partial differential equations and minimization problems. It includes software for the solution of linear and nonlinear systems of equations. These codes are written in a data-structure-neutral manner to enable easy reuse and flexibility.

  7. Global Arrays Parallel Programming Toolkit

    SciTech Connect

    Nieplocha, Jaroslaw; Krishnan, Manoj Kumar; Palmer, Bruce J.; Tipparaju, Vinod; Harrison, Robert J.; Chavarría-Miranda, Daniel

    2011-01-01

    The two predominant classes of programming models for parallel computing are distributed memory and shared memory. Both have advantages and shortcomings. The shared-memory model is much easier to use, but it ignores data locality and placement. Given the hierarchical nature of the memory subsystems in modern computers, this characteristic can have a negative impact on performance and scalability. Careful code restructuring to increase data reuse and replacing fine-grain loads/stores with block access to shared data can address the problem and yield performance for shared memory that is competitive with message passing. However, this performance comes at the cost of compromising the ease of use that the shared-memory model advertises. Distributed-memory models, such as message passing or one-sided communication, offer performance and scalability but are difficult to program. The Global Arrays toolkit attempts to offer the best features of both models. It implements a shared-memory programming model in which data locality is managed by the programmer, through calls to functions that transfer data between a global address space (a distributed array) and local storage. In this respect, the GA model has similarities to the distributed shared-memory models that provide an explicit acquire/release protocol. However, the GA model acknowledges that remote data is slower to access than local data, and it allows data locality to be specified by the programmer and hence managed. GA is related to global address space languages such as UPC, Titanium, and, to a lesser extent, Co-Array Fortran. In addition, by providing a set of data-parallel operations, GA is also related to data-parallel languages such as HPF, ZPL, and Data Parallel C. However, the Global Array programming model is implemented as a library that works with most languages used for technical computing and does not rely on compiler technology for achieving
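    The get/compute/put style of the GA model can be illustrated with a toy Python class. This is a minimal sketch of the programming model only; the names are invented and this is not the real Global Arrays C/Fortran API.

```python
# Toy illustration of a Global Arrays-style programming model:
# a logically shared array is physically distributed in blocks,
# and the programmer moves data explicitly between global and
# local storage. Names are invented; this is not the GA API.

class ToyGlobalArray:
    def __init__(self, n, nblocks):
        self.block = n // nblocks
        # each "process" owns one contiguous block
        self.blocks = [[0.0] * self.block for _ in range(nblocks)]

    def owner(self, i):
        return i // self.block            # data locality is explicit

    def get(self, lo, hi):
        """Copy global elements [lo, hi) into a local buffer."""
        return [self.blocks[self.owner(i)][i % self.block]
                for i in range(lo, hi)]

    def put(self, lo, values):
        """Write a local buffer back into the global array."""
        for k, v in enumerate(values):
            i = lo + k
            self.blocks[self.owner(i)][i % self.block] = v

ga = ToyGlobalArray(8, nblocks=2)
ga.put(2, [1.0, 2.0, 3.0, 4.0])           # spans both owners' blocks
local = ga.get(0, 8)                       # gather into local storage
```

    Because `owner()` is visible to the programmer, locality-aware code can operate on its own block directly and fall back to `get`/`put` only for remote data.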

  8. The Reconstruction Toolkit (RTK), an open-source cone-beam CT reconstruction toolkit based on the Insight Toolkit (ITK)

    NASA Astrophysics Data System (ADS)

    Rit, S.; Vila Oliva, M.; Brousmiche, S.; Labarbe, R.; Sarrut, D.; Sharp, G. C.

    2014-03-01

    We propose the Reconstruction Toolkit (RTK, http://www.openrtk.org), an open-source toolkit for fast cone-beam CT reconstruction, based on the Insight Toolkit (ITK) and using GPU code extracted from Plastimatch. RTK is developed by an open consortium (see affiliations) under the non-contaminating Apache 2.0 license. The quality of the platform is checked daily with regression tests in partnership with Kitware, the company supporting ITK. Several features are already available: Elekta, Varian and IBA inputs, multi-threaded Feldkamp-Davis-Kress reconstruction on CPU and GPU, Parker short scan weighting, multi-threaded CPU and GPU forward projectors, etc. Each feature is accessible either through command line tools or through C++ classes that can be included in independent software. A MIDAS community has been opened to share CatPhan datasets of several vendors (Elekta, Varian and IBA). RTK will be used in the upcoming cone-beam CT scanner developed by IBA for proton therapy rooms. Many features are under development: new input format support, iterative reconstruction, hybrid Monte Carlo / deterministic CBCT simulation, etc. RTK has been built to freely share tomographic reconstruction developments between researchers and is open for new contributions.

  9. Construction aggregates

    USGS Publications Warehouse

    Langer, W.H.; Tepordei, V.V.; Bolen, W.P.

    2000-01-01

    Construction aggregates consist primarily of crushed stone and construction sand and gravel. Total estimated production of construction aggregates increased in 1999 by about 2% to 2.39 Gt (2.64 billion st) compared with 1998. This record production level continued an expansion that began in 1992. By commodities, crushed stone production increased 3.3%, while sand and gravel production increased by about 0.5%.

  10. Construction aggregates

    USGS Publications Warehouse

    Tepordei, V.V.

    1994-01-01

    Part of a special section on industrial minerals in 1993. The 1993 production of construction aggregates increased 6.3 percent over the 1992 figure, to reach 2.01 Gt. This represents the highest estimated annual production of combined crushed stone and construction sand and gravel ever recorded in the U.S. The outlook for construction aggregates and the issues facing the industry are discussed.

  11. Observation option toolkit for acute otitis media.

    PubMed

    Rosenfeld, R M

    2001-04-01

    The observation option for acute otitis media (AOM) refers to deferring antibiotic treatment of selected children for up to 3 days, during which time management is limited to analgesics and symptomatic relief. With appropriate follow-up, complications are not increased, and clinical outcomes compare favorably with routine initial antibiotic therapy. Although used commonly in the Netherlands and certain Scandinavian countries, this approach has not gained wide acceptance elsewhere in Europe or in the United States. This article describes an evidence-based toolkit developed by the New York Region Otitis Project for judicious use of the observation option. The toolkit is not intended to endorse the observation option as a preferred method of management, nor is it intended as a rigid practice guideline to supplant clinician judgement. Rather, it presents busy clinicians with the tools needed to implement the observation option in everyday patient care should they so desire.

  12. A toolkit for detecting technical surprise.

    SciTech Connect

    Trahan, Michael Wayne; Foehse, Mark C.

    2010-10-01

    The detection of a scientific or technological surprise within a secretive country or institute is very difficult. The ability to detect such surprises would allow analysts to identify the capabilities that could be a military or economic threat to national security. Sandia's current approach utilizing ThreatView has been successful in revealing potential technological surprises. However, as data sets become larger, it becomes critical to use algorithms as filters along with the visualization environments. Our two-year LDRD had two primary goals. First, we developed a tool, a Self-Organizing Map (SOM), to extend ThreatView and improve our understanding of the issues involved in working with textual data sets. Second, we developed a toolkit for detecting indicators of technical surprise in textual data sets. Our toolkit has been successfully used to perform technology assessments for the Science & Technology Intelligence (S&TI) program.

  13. MCS Large Cluster Systems Software Toolkit

    SciTech Connect

    Evard, Remy; Navarro, John-Paul; Nurmi, Daniel

    2002-11-01

    This package contains a number of systems utilities for managing a set of computers joined in a "cluster". The utilities assist a team of systems administrators in managing the cluster by automating routine tasks, centralizing information, and monitoring individual computers within the cluster. Included in the toolkit are scripts used to boot a computer from a floppy, a program to turn on and off the power to a system, and a system for using a database to organize cluster information.

  14. chemf: A purely functional chemistry toolkit

    PubMed Central

    2012-01-01

    Background Although programming in a type-safe and referentially transparent style offers several advantages over working with mutable data structures and side effects, this style of programming has not seen much use in chemistry-related software. Since functional programming languages were designed with referential transparency in mind, these languages offer a lot of support when writing immutable data structures and side-effect-free code. We therefore started implementing our own toolkit based on the above programming paradigms in a modern, versatile programming language. Results We present our initial results with functional programming in chemistry by first describing an immutable data structure for molecular graphs together with a couple of simple algorithms to calculate basic molecular properties, before writing a complete SMILES parser in accordance with the OpenSMILES specification. Along the way we show how to deal with input validation, error handling, bulk operations, and parallelization in a purely functional way. At the end we also analyze and improve our algorithms and data structures in terms of performance and compare them to existing toolkits, both object-oriented and purely functional. All code was written in Scala, a modern multi-paradigm programming language with strong support for functional programming and a highly sophisticated type system. Conclusions We have successfully made the first important steps towards a purely functional chemistry toolkit. The data structures and algorithms presented in this article perform well, while at the same time they can be safely used in parallelized applications, such as computer-aided drug design experiments, without further adjustments. This stands in contrast to existing object-oriented toolkits, where thread safety of data structures and algorithms is a deliberate design decision that can be hard to implement. Finally, the level of type safety achieved by Scala greatly increased the reliability of our code.
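    The central idea of an immutable molecular graph can be sketched briefly. The paper's implementation is in Scala; the Python fragment below is only an illustration of the immutability pattern, not chemf's actual data structure.

```python
from dataclasses import dataclass

# Sketch of an immutable molecular-graph value type in the spirit
# of the chemf approach (the paper uses Scala; this is a Python
# illustration, not chemf's actual data structure).

@dataclass(frozen=True)
class Molecule:
    atoms: tuple          # element symbols, e.g. ("C", "C", "O")
    bonds: frozenset      # frozenset of frozenset({i, j}) index pairs

    def add_bond(self, i, j):
        """Returns a NEW molecule; the original is never mutated."""
        return Molecule(self.atoms, self.bonds | {frozenset((i, j))})

    def degree(self, i):
        """Number of bonds incident on atom i."""
        return sum(1 for b in self.bonds if i in b)

skeleton = Molecule(("C", "C", "O"), frozenset())
bonded = skeleton.add_bond(0, 1).add_bond(1, 2)
```

    Because every update returns a fresh value and the original is untouched, such structures can be shared freely between threads, which is the thread-safety argument the abstract makes.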

  15. HVAC Fault Detection and Diagnosis Toolkit

    2004-12-31

    This toolkit supports component-level model-based fault detection methods in commercial building HVAC systems. The toolbox consists of five basic modules: a parameter estimator for model calibration, a preprocessor, an AHU model simulator, a steady-state detector, and a comparator. Each of these modules and the fuzzy logic rules for fault diagnosis are described in detail. The toolbox is written in C++ and also invokes the SPARK simulation program.
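    The simulator / steady-state detector / comparator chain described above can be sketched in a few lines. All numbers, names, and thresholds below are invented for illustration; this is not the toolkit's C++ implementation.

```python
# Minimal sketch of component-level model-based fault detection:
# a steady-state detector gates a comparator that checks measured
# data against an AHU model prediction. Invented thresholds and
# variable names; not the actual HVAC FDD toolkit.

def steady_state(samples, tol=0.5):
    """Crude steady-state detector: small spread over a window."""
    return max(samples) - min(samples) <= tol

def comparator(measured, predicted, threshold=2.0):
    """Flag a fault when the measurement deviates from the model."""
    return abs(measured - predicted) > threshold

supply_temp_window = [13.0, 13.2, 13.1, 12.9]   # deg C, measured
model_prediction = 13.0                          # AHU model output

fault = None
if steady_state(supply_temp_window):             # only compare at steady state
    fault = comparator(supply_temp_window[-1], model_prediction)
```

    Gating the comparator on steady state avoids false alarms during transients, which is why the detector sits between the preprocessor and the comparator in the module chain.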

  16. Application experiences with the Globus toolkit.

    SciTech Connect

    Brunett, S.

    1998-06-09

    The Globus grid toolkit is a collection of software components designed to support the development of applications for high-performance distributed computing environments, or "computational grids" [14]. The Globus toolkit is an implementation of a "bag of services" architecture, which provides application and tool developers not with a monolithic system but rather with a set of stand-alone services. Each Globus component provides a basic service, such as authentication, resource allocation, information, communication, fault detection, or remote data access. Different applications and tools can combine these services in different ways to construct "grid-enabled" systems. The Globus toolkit has been used to construct the Globus Ubiquitous Supercomputing Testbed, or GUSTO: a large-scale testbed spanning 20 sites and including over 4000 compute nodes, for a total compute power of over 2 TFLOPS. Over the past six months, we and others have used this testbed to conduct a variety of application experiments, including multi-user collaborative environments (tele-immersion), computational steering, distributed supercomputing, and high-throughput computing. The goal of this paper is to review what has been learned from these experiments regarding the effectiveness of the toolkit approach. To this end, we describe two of the application experiments in detail, noting what worked well and what worked less well. The two applications are a distributed supercomputing application, SF-Express, in which multiple supercomputers are harnessed to perform large distributed interactive simulations; and a tele-immersion application, CAVERNsoft, in which the focus is on connecting multiple people to a distributed simulated world.

  17. Mission Operations and Navigation Toolkit Environment

    NASA Technical Reports Server (NTRS)

    Sunseri, Richard F.; Wu, Hsi-Cheng; Hanna, Robert A.; Mossey, Michael P.; Duncan, Courtney B.; Evans, Scott E.; Evans, James R.; Drain, Theodore R.; Guevara, Michelle M.; Martin Mur, Tomas J.; Attiyah, Ahlam A.

    2009-01-01

    MONTE (Mission Operations and Navigation Toolkit Environment) Release 7.3 is an extensible software system designed to support trajectory and navigation analysis/design for space missions. MONTE is intended to replace the current navigation and trajectory analysis software systems, which, at the time of this reporting, are used by JPL's Navigation and Mission Design section. The software provides an integrated, simplified, and flexible system that can be easily maintained to serve the needs of future missions in need of navigation services.

  18. An Overview of the Geant4 Toolkit

    NASA Astrophysics Data System (ADS)

    Apostolakis, John; Wright, Dennis H.

    2007-03-01

    Geant4 is a toolkit for the simulation of the transport of radiation through matter. With a flexible kernel and a choice of physics modeling options, it has been tailored to the requirements of a wide range of applications. With the toolkit a user can describe a setup's or detector's geometry and materials, navigate inside it, simulate the physical interactions using a choice of physics engines, underlying physics cross-sections and models, and visualise and store results. Physics models describing electromagnetic and hadronic interactions are provided, as are decays and processes for optical photons. Several models, with different precision and performance, are available for many processes. The toolkit includes coherent physics model configurations, which are called physics lists. Users can choose an existing physics list or create their own, depending on their requirements and the application area. A clear structure and readable code enable the user to investigate the origin of physics results. Application areas include detector simulation and background simulation in high energy physics experiments, simulation of accelerator setups, studies in medical imaging and treatment, and the study of the effects of solar radiation on spacecraft instruments.

  19. VIDE: The Void IDentification and Examination toolkit

    NASA Astrophysics Data System (ADS)

    Sutter, P. M.; Lavaux, G.; Hamaus, N.; Pisani, A.; Wandelt, B. D.; Warren, M.; Villaescusa-Navarro, F.; Zivick, P.; Mao, Q.; Thompson, B. B.

    2015-03-01

    We present VIDE, the Void IDentification and Examination toolkit, an open-source Python/C++ code for finding cosmic voids in galaxy redshift surveys and N-body simulations, characterizing their properties, and providing a platform for more detailed analysis. At its core, VIDE uses a substantially enhanced version of ZOBOV (Neyrinck 2008) to calculate a Voronoi tessellation for estimating the density field and performing a watershed transform to construct voids. Additionally, VIDE provides significant functionality for both pre- and post-processing: for example, VIDE can work with volume- or magnitude-limited galaxy samples with arbitrary survey geometries, or dark matter particles or halo catalogs in a variety of common formats. It can also randomly subsample inputs and includes a Halo Occupation Distribution model for constructing mock galaxy populations. VIDE uses the watershed levels to place voids in a hierarchical tree, outputs a summary of void properties in plain ASCII, and provides a Python API to perform many analysis tasks, such as loading and manipulating void catalogs and particle members, filtering, plotting, computing clustering statistics, stacking, comparing catalogs, and fitting density profiles. While centered around ZOBOV, the toolkit is designed to be as modular as possible and to accommodate other void finders. VIDE has been in development for several years and has already been used to produce a wealth of results, which we summarize in this work to highlight the capabilities of the toolkit. VIDE is publicly available at
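    The watershed step at the heart of such void finders assigns each cell to the basin of the local density minimum reached by steepest descent. A 1-D cartoon of that idea, not the actual ZOBOV algorithm:

```python
def watershed_1d(density):
    """Assign each cell to the index of the local minimum reached
    by steepest descent -- a 1-D cartoon of the watershed step
    used by void finders such as ZOBOV (not the real algorithm)."""
    n = len(density)
    labels = []
    for i in range(n):
        j = i
        while True:
            # move to the lower neighbor, if one is lower than j
            nbrs = [k for k in (j - 1, j + 1) if 0 <= k < n]
            best = min(nbrs, key=lambda k: density[k])
            if density[best] < density[j]:
                j = best
            else:
                break
        labels.append(j)   # basin = index of the local minimum
    return labels

# two density dips, at indices 1 and 5 -> two void "basins"
rho = [3.0, 1.0, 2.0, 4.0, 2.5, 0.5, 2.0]
basins = watershed_1d(rho)
```

    In the real 3-D case the cells are Voronoi cells and the basins become the voids, which VIDE then organizes into a hierarchy using the watershed levels.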

  20. Capturing Petascale Application Characteristics with the Sequoia Toolkit

    SciTech Connect

    Vetter, Jeffrey S; Bhatia, Nikhil; Grobelny, Eric M; Roth, Philip C

    2005-09-01

    Characterization of the computation, communication, memory, and I/O demands of current scientific applications is crucial for identifying which technologies will enable petascale scientific computing. In this paper, we present the Sequoia Toolkit for characterizing HPC applications. The Sequoia Toolkit consists of the Sequoia trace capture library and the Sequoia Event Analysis Library, or SEAL, that facilitates the development of tools for analyzing Sequoia event traces. Using the Sequoia Toolkit, we have characterized the behavior of application runs with up to 2048 application processes. To illustrate the use of the Sequoia Toolkit, we present a preliminary characterization of LAMMPS, a molecular dynamics application of great interest to the computational biology community.

  1. An evaluation capacity building toolkit for principal investigators of undergraduate research experiences: A demonstration of transforming theory into practice.

    PubMed

    Rorrer, Audrey S

    2016-04-01

    This paper describes the approach and process undertaken to develop evaluation capacity among the leaders of a federally funded undergraduate research program. An evaluation toolkit was developed for Computer and Information Sciences and Engineering(1) Research Experiences for Undergraduates(2) (CISE REU) programs to address the ongoing need for evaluation capacity among principal investigators who manage program evaluation. The toolkit was the result of collaboration within the CISE REU community with the purpose being to provide targeted instructional resources and tools for quality program evaluation. Challenges were to balance the desire for standardized assessment with the responsibility to account for individual program contexts. Toolkit contents included instructional materials about evaluation practice, a standardized applicant management tool, and a modulated outcomes measure. Resulting benefits from toolkit deployment were having cost effective, sustainable evaluation tools, a community evaluation forum, and aggregate measurement of key program outcomes for the national program. Lessons learned included the imperative of understanding the evaluation context, engaging stakeholders, and building stakeholder trust. Results from project measures are presented along with a discussion of guidelines for facilitating evaluation capacity building that will serve a variety of contexts.

  3. Water Security Toolkit User Manual Version 1.2.

    SciTech Connect

    Klise, Katherine A.; Siirola, John Daniel; Hart, David; Hart, William Eugene; Phillips, Cynthia Ann; Haxton, Terranna; Murray, Regan; Janke, Robert; Taxon, Thomas; Laird, Carl; Seth, Arpan; Hackebeil, Gabriel; McGee, Shawn; Mann, Angelica

    2014-08-01

    The Water Security Toolkit (WST) is a suite of open source software tools that can be used by water utilities to create response strategies to reduce the impact of contamination in a water distribution network. WST includes hydraulic and water quality modeling software, optimization methodologies, and visualization tools to identify: (1) sensor locations to detect contamination, (2) locations in the network in which the contamination was introduced, (3) hydrants to remove contaminated water from the distribution system, (4) locations in the network to inject decontamination agents to inactivate, remove, or destroy contaminants, (5) locations in the network to take grab samples to help identify the source of contamination, and (6) valves to close in order to isolate contaminated areas of the network. This user manual describes the different components of WST, along with examples and case studies. License Notice: The Water Security Toolkit (WST) v.1.2, Copyright (c) 2012 Sandia Corporation. Under the terms of Contract DE-AC04-94AL85000, there is a non-exclusive license for use of this work by or on behalf of the U.S. government. This software is distributed under the Revised BSD License (see below). In addition, WST leverages a variety of third-party software packages, which have separate licensing policies: Acro (Revised BSD License); argparse (Python Software Foundation License); Boost (Boost Software License); Coopr (Revised BSD License); Coverage (BSD License); Distribute (Python Software Foundation License / Zope Public License); EPANET (Public Domain); EPANET-ERD (Revised BSD License); EPANET-MSX (GNU Lesser General Public License (LGPL) v.3); gcovr (Revised BSD License); GRASP (AT&T Commercial License for noncommercial use; includes randomsample and sideconstraints executable files); LZMA SDK (Public Domain); nose (GNU Lesser General Public License (LGPL) v.2.1); ordereddict (MIT License); pip (MIT License); PLY (BSD License); PyEPANET (Revised BSD License); Pyro (MIT License); PyUtilib (Revised BSD License); Py
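    The sensor-placement problem in item (1) is a coverage optimization, which in its simplest form can be attacked greedily. The sketch below is a toy illustration of that idea only; the node names and coverage sets are invented, and WST's actual optimization methodology is far more sophisticated.

```python
# Toy sketch of the sensor-placement idea: greedily pick network
# nodes so that sensors "cover" (detect contamination reaching)
# as many nodes as possible. Invented data; not the WST solver.

# coverage[n] = set of nodes whose contamination a sensor at n detects
coverage = {
    "A": {"A", "B", "C"},
    "B": {"B", "C", "D"},
    "C": {"C"},
    "D": {"D", "E"},
    "E": {"E"},
}

def place_sensors(coverage, budget):
    chosen, covered = [], set()
    for _ in range(budget):
        # pick the node adding the most newly covered nodes
        best = max(coverage, key=lambda n: len(coverage[n] - covered))
        if not coverage[best] - covered:
            break                      # nothing left to gain
        chosen.append(best)
        covered |= coverage[best]
    return chosen, covered

sensors, covered = place_sensors(coverage, budget=2)
```

    Greedy max-coverage carries a classic (1 - 1/e) approximation guarantee, which is why it is a common baseline for sensor-placement formulations.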

  4. The ALMA Data Mining Toolkit I: Archive Setup and User Usage

    NASA Astrophysics Data System (ADS)

    Teuben, P.; Pound, M.; Mundy, L.; Looney, L.; Friedel, D. N.

    2014-05-01

    We report on an ALMA development study and project in which we employ a novel approach to add data and data descriptors to ALMA archive data, allowing further flexible data mining on retrieved data. Our toolkit, ADMIT (the ALMA Data Mining Toolkit), works within the Python-based CASA environment. What is described here is a design study, with some existing toy code to prove the concept. After ingestion of science-ready data cubes, ADMIT will compute a number of basic and advanced data products, and their descriptors. Examples of such data products are cube statistics, line identification tables, line cubes, moment maps, an integrated spectrum, overlap integrals, and feature extraction tables. Together with a descriptive XML file, a small number of visual aids are added to a ZIP file that is deposited into the archive. Large datasets (such as line cubes) will have to be rederived by the user once they have also downloaded the actual ALMA Data Products, or via VO services if available. ADMIT enables users to rederive all of its products with different methods and parameters, and to compare the archive products with their own.
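    Two of the products named above, a moment-0 map and an integrated spectrum, are simple reductions over a spectral-line cube. A minimal sketch with toy data and plain Python, not the ADMIT/CASA implementation:

```python
# A minimal sketch of two ADMIT-style data products: a moment-0
# (integrated-intensity) map and an integrated spectrum computed
# from a spectral-line cube with axes (velocity, y, x).
# Toy data; not the ADMIT/CASA implementation.

NV, NY, NX = 16, 4, 4
DV = 0.5                                      # channel width, km/s

# cube[v][y][x]: zero everywhere except a fake emission line
cube = [[[0.0] * NX for _ in range(NY)] for _ in range(NV)]
for v in range(6, 10):
    for y in range(1, 3):
        for x in range(1, 3):
            cube[v][y][x] = 2.0

# moment-0 map: integrate intensity over the velocity axis
moment0 = [[sum(cube[v][y][x] for v in range(NV)) * DV
            for x in range(NX)] for y in range(NY)]

# integrated spectrum: sum over both spatial axes per channel
spectrum = [sum(cube[v][y][x] for y in range(NY) for x in range(NX))
            for v in range(NV)]
```

    The point of depositing such small derived products (plus descriptors) in the archive is that they are cheap to recompute with different parameters, while the parent cube itself need only be downloaded when necessary.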

  5. Designing and Delivering Intensive Interventions: A Teacher's Toolkit

    ERIC Educational Resources Information Center

    Murray, Christy S.; Coleman, Meghan A.; Vaughn, Sharon; Wanzek, Jeanne; Roberts, Greg

    2012-01-01

    This toolkit provides activities and resources to assist practitioners in designing and delivering intensive interventions in reading and mathematics for K-12 students with significant learning difficulties and disabilities. Grounded in research, this toolkit is based on the Center on Instruction's "Intensive Interventions for Students Struggling…

  6. The Fostering Algebraic Thinking Toolkit: A Guide for Staff Development.

    ERIC Educational Resources Information Center

    Driscoll, Mark; Zawojeski, Judith; Humez, Andrea; Nikula, Johannah; Goldsmith, Lynn; Hammerman, James

    This toolkit contains a set of professional development materials whose goal is to help mathematics teachers in grades 6-10 learn to identify, describe, and foster algebraic thinking in their students. A core belief underlying the Toolkit is that good mathematics teaching begins with understanding how mathematics is learned, so these materials…

  7. Quality Assurance Toolkit for Distance Higher Education Institutions and Programmes

    ERIC Educational Resources Information Center

    Rama, Kondapalli, Ed.; Hope, Andrea, Ed.

    2009-01-01

    The Commonwealth of Learning is proud to partner with the Sri Lankan Ministry of Higher Education and UNESCO to produce this "Quality Assurance Toolkit for Distance Higher Education Institutions and Programmes". The Toolkit has been prepared with three features. First, it is a generic document on quality assurance, complete with a glossary of…

  8. Graph algorithms in the titan toolkit.

    SciTech Connect

    McLendon, William Clarence, III; Wylie, Brian Neil

    2009-10-01

    Graph algorithms are a key component in a wide variety of intelligence analysis activities. The Graph-Based Informatics for Non-Proliferation and Counter-Terrorism project addresses the critical need of making these graph algorithms accessible to Sandia analysts in a manner that is both intuitive and effective. Specifically we describe the design and implementation of an open source toolkit for doing graph analysis, informatics, and visualization that provides Sandia with novel analysis capability for non-proliferation and counter-terrorism.

  10. An Incident Management Preparedness and Coordination Toolkit

    SciTech Connect

    Koch, Daniel B; Payne, Patricia W

    2012-01-01

    Although the use of Geographic Information Systems (GIS) by centrally-located operations staff is well established in the area of emergency response, utilization by first responders in the field is uneven. Cost, complexity, and connectivity are often the deciding factors preventing wider adoption. For the past several years, Oak Ridge National Laboratory (ORNL) has been developing a mobile GIS solution using free and open-source software targeting the needs of front-line personnel. Termed IMPACT, for Incident Management Preparedness and Coordination Toolkit, this ORNL application can complement existing GIS infrastructure and extend its power and capabilities to responders first on the scene of a natural or man-made disaster.

  11. Accelerator physics analysis with an integrated toolkit

    SciTech Connect

    Holt, J.A.; Michelotti, L.; Satogata, T.

    1992-08-01

    Work is in progress on an integrated software toolkit for linear and nonlinear accelerator design, analysis, and simulation. As a first application, the "beamline" and "MXYZPTLK" (differential algebra) class libraries were used with an X Windows graphics library to build a user-friendly, interactive phase space tracker which, additionally, finds periodic orbits. This program was used to analyze a theoretical lattice containing octupoles and decapoles, to find the 20th-order stable and unstable periodic orbits, and to explore the local phase space structure.
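    Tracking phase space through a one-turn map and searching for its fixed points is the essence of such a periodic-orbit finder. The sketch below uses an invented toy map and a crude damped-iteration search; it is only an illustration of the idea, not the beamline/MXYZPTLK implementation.

```python
import math

def one_turn(x, p, mu=2.0, k=0.3):
    """Toy nonlinear one-turn map (rotation plus a sextupole-like
    kick). Stands in for a real lattice map; invented parameters."""
    p = p + k * x * x                    # nonlinear kick
    c, s = math.cos(mu), math.sin(mu)
    return c * x + s * p, -s * x + c * p # linear rotation

def find_periodic_orbit(x0, p0, period=1, iters=200):
    """Crude fixed-point search: damped iteration of the
    period-fold map, relaxing halfway toward the image each step."""
    x, p = x0, p0
    for _ in range(iters):
        xn, pn = x, p
        for _ in range(period):
            xn, pn = one_turn(xn, pn)
        x, p = x + 0.5 * (xn - x), p + 0.5 * (pn - p)
    return x, p

x, p = find_periodic_orbit(0.1, 0.0)     # converges to a fixed point
xn, pn = one_turn(x, p)                  # maps onto itself
```

    A production tracker would use Newton's method on the period-fold map (exactly where differential-algebra Jacobians help), but damped iteration already shows the structure of the search.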

  12. Tips from the toolkit: 1 - know yourself.

    PubMed

    Steer, Neville

    2010-01-01

    High performance organisations review their strategy and business processes as part of usual business operations. If you are new to the field of general practice, do you have a career plan for the next 5-10 years? If you are an experienced general practitioner, are you using much the same business model and processes as when you started out? The following article sets out some ideas you might use to have a fresh approach to your professional career. It is based on The Royal Australian College of General Practitioners' 'General practice management toolkit'. PMID:20369141

  13. A digital toolkit to implement and manage a multisite study.

    PubMed

    Lasater, Kathie; Johnson, Elizabeth; Hodson-Carlton, Kay; Siktberg, Linda; Sideras, Stephanie

    2012-03-01

    Calls for multisite studies are increasing in nursing education. However, the challenge of implementing consistent protocols and maintaining rigorous standards across sites can be daunting. One purpose of a recent multisite, collaborative, simulation study was to evaluate a digital toolkit's effectiveness for managing a multisite study. We describe the digital toolkit composed of Web-based technologies used to manage a study involving five sites including one United Kingdom site. The digital toolkit included a wiki, a project Web site to coordinate the protocols and study materials, software to organize study materials, and a secure location for sharing data. Most of these are familiar tools; however, combined as a toolkit, they became a useful management system. Web-based communication strategies and coordinated technical support served as key adjuncts to foster collaboration. This article also offers practical implications and recommendations for using a digital toolkit in other multisite studies.

  14. Construction aggregates

    USGS Publications Warehouse

    Nelson, T.I.; Bolen, W.P.

    2007-01-01

    Construction aggregates, primarily stone, sand and gravel, are recovered from widespread naturally occurring mineral deposits and processed for use primarily in the construction industry. They are mined, crushed, sorted by size and sold loose or combined with portland cement or asphaltic cement to make concrete products to build roads, houses, buildings, and other structures. Much smaller quantities are used in agriculture, cement manufacture, chemical and metallurgical processes, glass production and many other products.

  15. Construction aggregates

    USGS Publications Warehouse

    Tepordei, V.V.

    1996-01-01

    Part of the Annual Commodities Review 1995. Production of construction aggregates such as crushed stone and construction sand and gravel showed a marginal increase in 1995. Most of the 1995 increases were due to funding for highway construction work. The major areas of concern to the industry included issues relating to wetlands classification and the classification of crystalline silica as a probable human carcinogen. Despite this, an increase in demand is anticipated for 1996.

  16. Construction aggregates

    USGS Publications Warehouse

    Tepordei, V.V.

    1993-01-01

    Part of a special section on the market performance of industrial minerals in 1992. Production of construction aggregates increased by 4.6 percent in 1992. This increase was due, in part, to the increased funding for transportation and infrastructure projects. The U.S. produced about 1.05 Gt of crushed stone and an estimated 734 Mt of construction sand and gravel in 1992. Demand is expected to increase by about 5 percent in 1993.

  17. ADMIT: The ALMA Data Mining Toolkit

    NASA Astrophysics Data System (ADS)

    Teuben, P.; Pound, M.; Mundy, L.; Rauch, K.; Friedel, D.; Looney, L.; Xu, L.; Kern, J.

    2015-09-01

    ADMIT (ALMA Data Mining ToolkiT), a toolkit for the creation of new science products from ALMA data, is being developed as an ALMA Development Project. It is written in Python and, while specifically targeted for a uniform analysis of the ALMA science products that come out of the ALMA pipeline, it is designed to be generally applicable to (radio) astronomical data. It first provides users with a detailed view of their science products created by ADMIT inside the ALMA pipeline: line identifications, line 'cutout' cubes, moment maps, emission type analysis (e.g., feature detection). Using descriptor vectors the ALMA data archive is enriched with useful information to make archive data mining possible. Users can also opt to download the (small) ADMIT pipeline product, then fine-tune and re-run the pipeline and inspect their hopefully improved data. By running many projects in a parallel fashion, data mining between many astronomical sources and line transitions will also be possible. Future implementations of ADMIT may include EVLA and other instruments.

  18. The Virtual Physiological Human ToolKit.

    PubMed

    Cooper, Jonathan; Cervenansky, Frederic; De Fabritiis, Gianni; Fenner, John; Friboulet, Denis; Giorgino, Toni; Manos, Steven; Martelli, Yves; Villà-Freixa, Jordi; Zasada, Stefan; Lloyd, Sharon; McCormack, Keith; Coveney, Peter V

    2010-08-28

    The Virtual Physiological Human (VPH) is a major European e-Science initiative intended to support the development of patient-specific computer models and their application in personalized and predictive healthcare. The VPH Network of Excellence (VPH-NoE) project is tasked with facilitating interaction between the various VPH projects and addressing issues of common concern. A key deliverable is the 'VPH ToolKit'--a collection of tools, methodologies and services to support and enable VPH research, integrating and extending existing work across Europe towards greater interoperability and sustainability. Owing to the diverse nature of the field, a single monolithic 'toolkit' is incapable of addressing the needs of the VPH. Rather, the VPH ToolKit should be considered more as a 'toolbox' of relevant technologies, interacting around a common set of standards. The latter apply to the information used by tools, including any data and the VPH models themselves, and also to the naming and categorizing of entities and concepts involved. Furthermore, the technologies and methodologies available need to be widely disseminated, and relevant tools and services easily found by researchers. The VPH-NoE has thus created an online resource for the VPH community to meet this need. It consists of a database of tools, methods and services for VPH research, with a Web front-end. This has facilities for searching the database, for adding or updating entries, and for providing user feedback on entries. Anyone is welcome to contribute. PMID:20643685

  19. The Best Ever Alarm System Toolkit

    SciTech Connect

    Kasemir, Kay; Chen, Xihui; Danilova, Katia

    2009-01-01

    Learning from our experience with the standard Experimental Physics and Industrial Control System (EPICS) alarm handler (ALH) as well as a similar intermediate approach based on script-generated operator screens, we developed the Best Ever Alarm System Toolkit (BEAST). It is based on Java and Eclipse on the Control System Studio (CSS) platform, using a relational database (RDB) to store the configuration and log actions. It employs a Java Message Service (JMS) for communication between the modular pieces of the toolkit, which include an Alarm Server to maintain the current alarm state, an arbitrary number of Alarm Client user interfaces (GUI), and tools to annunciate alarms or log alarm related actions. Web reports allow us to monitor the alarm system performance and spot deficiencies in the alarm configuration. The Alarm Client GUI not only gives end users various ways to view alarms in tree and table form, but also makes it easy to access the guidance information, the related operator displays and other CSS tools. It also allows the alarm configuration to be modified online from the GUI. Coupled with a good "alarm philosophy" on how to provide useful alarms, we can finally improve the configuration to achieve an effective alarm system.
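    The decoupling the abstract describes, with an Alarm Server and multiple clients communicating over JMS topics, can be sketched as an in-process publish/subscribe loop. This is a minimal illustration in Python standing in for the Java/JMS implementation; the topic names and classes are hypothetical, not BEAST's actual API:

    ```python
    from collections import defaultdict

    class MessageBus:
        """Minimal in-process stand-in for a JMS-style topic bus."""
        def __init__(self):
            self._subscribers = defaultdict(list)

        def subscribe(self, topic, callback):
            self._subscribers[topic].append(callback)

        def publish(self, topic, message):
            for callback in self._subscribers[topic]:
                callback(message)

    class AlarmServer:
        """Maintains the current alarm state and republishes updates."""
        def __init__(self, bus):
            self.bus = bus
            self.state = {}            # pv name -> severity
            bus.subscribe("ALARM_RAW", self.on_raw_alarm)

        def on_raw_alarm(self, msg):
            self.state[msg["pv"]] = msg["severity"]
            self.bus.publish("ALARM_STATE", dict(msg))

    class AlarmClientGUI:
        """Stand-in for a client table view: records what it was told."""
        def __init__(self, bus):
            self.rows = []
            bus.subscribe("ALARM_STATE", self.rows.append)

    bus = MessageBus()
    server = AlarmServer(bus)
    client = AlarmClientGUI(bus)
    bus.publish("ALARM_RAW", {"pv": "Vac:Pump1:Pressure", "severity": "MAJOR"})
    ```

    Because clients only see the bus, any number of GUIs, annunciators, or loggers can subscribe without the server knowing about them, which is the point of routing everything through JMS.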

  20. Applications toolkit for accelerator control and analysis

    SciTech Connect

    Borland, M.

    1997-06-01

    The Advanced Photon Source (APS) has taken a unique approach to creating high-level software applications for accelerator operation and analysis. The approach is based on self-describing data, modular program toolkits, and scripts. Self-describing data provide a communication standard that aids the creation of modular program toolkits by allowing compliant programs to be used in essentially arbitrary combinations. These modular programs can be used as part of an arbitrary number of high-level applications. At APS, a group of about 70 data analysis, manipulation, and display tools is used in concert with about 20 control-system-specific tools to implement applications for commissioning and operations. High-level applications are created using scripts, which are relatively simple interpreted programs. The Tcl/Tk script language is used, allowing creation of graphical user interfaces (GUIs) and a library of algorithms that are separate from the interface. This last factor allows greater automation of control by making it easy to take the human out of the loop. Applications of this methodology to operational tasks such as orbit correction, configuration management, and data review will be discussed.
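    The key idea above, that self-describing data lets compliant tools be chained in arbitrary combinations, can be sketched with datasets that carry their own column metadata. The dict-based format and tool names below are purely illustrative, not the actual APS toolkit interface:

    ```python
    # Each dataset carries its own column descriptions, so any compliant
    # tool can operate on any dataset without prior knowledge of its layout.
    def make_dataset(columns, rows):
        return {"columns": columns, "rows": rows}   # columns: name -> units

    def tool_filter(dataset, predicate):
        """Keep only rows satisfying the predicate; metadata passes through."""
        return make_dataset(dataset["columns"],
                            [r for r in dataset["rows"] if predicate(r)])

    def tool_scale(dataset, column, factor):
        """Scale one named column; metadata passes through."""
        rows = [dict(r, **{column: r[column] * factor}) for r in dataset["rows"]]
        return make_dataset(dataset["columns"], rows)

    # A toy orbit dataset: position s along the ring, horizontal offset x.
    orbit = make_dataset({"s": "m", "x": "mm"},
                         [{"s": 0.0, "x": 1.2}, {"s": 5.0, "x": -0.4}])
    # Tools compose in arbitrary order because each reads the metadata it needs.
    result = tool_scale(tool_filter(orbit, lambda r: abs(r["x"]) > 1.0), "x", 1e-3)
    ```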

  1. Construction aggregates

    USGS Publications Warehouse

    Bolen, W.P.; Tepordei, V.V.

    2001-01-01

    The estimated production during 2000 of construction aggregates, crushed stone, and construction sand and gravel increased by about 2.6% to 2.7 Gt (3 billion st), compared with 1999. The expansion that started in 1992 continued with record production levels for the ninth consecutive year. By commodity, construction sand and gravel production increased by 4.5% to 1.16 Gt (1.28 billion st), while crushed stone production increased by 1.3% to 1.56 Gt (1.72 billion st).

  2. The image-guided surgery toolkit IGSTK: an open source C++ software toolkit.

    PubMed

    Enquobahrie, Andinet; Cheng, Patrick; Gary, Kevin; Ibanez, Luis; Gobbi, David; Lindseth, Frank; Yaniv, Ziv; Aylward, Stephen; Jomier, Julien; Cleary, Kevin

    2007-11-01

    This paper presents an overview of the image-guided surgery toolkit (IGSTK). IGSTK is an open source C++ software library that provides the basic components needed to develop image-guided surgery applications. It is intended for fast prototyping and development of image-guided surgery applications. The toolkit was developed through a collaboration between academic and industry partners. Because IGSTK was designed for safety-critical applications, the development team has adopted lightweight software processes that emphasizes safety and robustness while, at the same time, supporting geographically separated developers. A software process that is philosophically similar to agile software methods was adopted emphasizing iterative, incremental, and test-driven development principles. The guiding principle in the architecture design of IGSTK is patient safety. The IGSTK team implemented a component-based architecture and used state machine software design methodologies to improve the reliability and safety of the components. Every IGSTK component has a well-defined set of features that are governed by state machines. The state machine ensures that the component is always in a valid state and that all state transitions are valid and meaningful. Realizing that the continued success and viability of an open source toolkit depends on a strong user community, the IGSTK team is following several key strategies to build an active user community. These include maintaining a users and developers' mailing list, providing documentation (application programming interface reference document and book), presenting demonstration applications, and delivering tutorial sessions at relevant scientific conferences. PMID:17703338
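    The state-machine guard described above, where a component rejects any request that is not a valid transition from its current state, can be sketched in a few lines. This is a generic illustration in Python of the design pattern, not IGSTK's C++ state machine classes; the component and event names are hypothetical:

    ```python
    class TrackerComponent:
        """Toy component whose operations are valid only in certain states."""
        # state -> {event: next_state}; anything absent is an invalid request.
        TRANSITIONS = {
            "Idle":     {"Initialize": "Ready"},
            "Ready":    {"StartTracking": "Tracking"},
            "Tracking": {"StopTracking": "Ready"},
        }

        def __init__(self):
            self.state = "Idle"
            self.log = []

        def handle(self, event):
            next_state = self.TRANSITIONS.get(self.state, {}).get(event)
            if next_state is None:
                # Invalid requests are rejected instead of corrupting state.
                self.log.append(("rejected", self.state, event))
                return False
            self.log.append(("ok", self.state, event))
            self.state = next_state
            return True

    t = TrackerComponent()
    t.handle("StartTracking")   # rejected: component not initialized yet
    t.handle("Initialize")
    t.handle("StartTracking")
    ```

    Centralizing all transitions in one table is what makes the "always in a valid state" claim checkable: there is no code path that mutates the state outside `handle`.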

  3. Demonstration of the Health Literacy Universal Precautions Toolkit

    PubMed Central

    Mabachi, Natabhona M.; Cifuentes, Maribel; Barnard, Juliana; Brega, Angela G.; Albright, Karen; Weiss, Barry D.; Brach, Cindy; West, David

    2016-01-01

    The Agency for Healthcare Research and Quality Health Literacy Universal Precautions Toolkit was developed to help primary care practices assess and make changes to improve communication with and support for patients. Twelve diverse primary care practices implemented assigned tools over a 6-month period. Qualitative results revealed challenges practices experienced during implementation, including competing demands, bureaucratic hurdles, technological challenges, limited quality improvement experience, and limited leadership support. Practices used the Toolkit flexibly and recognized the efficiencies of implementing tools in tandem and in coordination with other quality improvement initiatives. Practices recommended reducing Toolkit density and making specific refinements. PMID:27232681

  4. Data Exploration Toolkit for serial diffraction experiments

    DOE PAGES

    Zeldin, Oliver B.; Brewster, Aaron S.; Hattne, Johan; Uervirojnangkoorn, Monarin; Lyubimov, Artem Y.; Zhou, Qiangjun; Zhao, Minglei; Weis, William I.; Sauter, Nicholas K.; Brunger, Axel T.

    2015-01-23

    Ultrafast diffraction at X-ray free-electron lasers (XFELs) has the potential to yield new insights into important biological systems that produce radiation-sensitive crystals. An unavoidable feature of the 'diffraction before destruction' nature of these experiments is that images are obtained from many distinct crystals and/or different regions of the same crystal. Combined with other sources of XFEL shot-to-shot variation, this introduces significant heterogeneity into the diffraction data, complicating processing and interpretation. To enable researchers to get the most from their collected data, a toolkit is presented that provides insights into the quality of, and the variation present in, serial crystallography data sets. These tools operate on the unmerged, partial intensity integration results from many individual crystals, and can be used on two levels: firstly to guide the experimental strategy during data collection, and secondly to help users make informed choices during data processing.

  5. NBII-SAIN Data Management Toolkit

    USGS Publications Warehouse

    Burley, Thomas E.; Peine, John D.

    2009-01-01

    percent of the cost of a spatial information system is associated with spatial data collection and management (U.S. General Accounting Office, 2003). These figures indicate that the resources (time, personnel, money) of many agencies and organizations could be used more efficiently and effectively. Dedicated and conscientious data management coordination and documentation is critical for reducing such redundancy. Substantial cost savings and increased efficiency are direct results of a pro-active data management approach. In addition, details of projects as well as data and information are frequently lost as a result of real-world occurrences such as the passing of time, job turnover, and equipment changes and failure. A standardized, well documented database allows resource managers to identify issues, analyze options, and ultimately make better decisions in the context of adaptive management (National Land and Water Resources Audit and the Australia New Zealand Land Information Council on behalf of the Australian National Government, 2003). Many environmentally focused, scientific, or natural resource management organizations collect and create both spatial and non-spatial data in some form. Data management appropriate for those data will be contingent upon the project goal(s) and objectives and thus will vary on a case-by-case basis. This project and the resulting Data Management Toolkit, hereafter referred to as the Toolkit, is therefore not intended to be comprehensive in terms of addressing all of the data management needs of all projects that contain biological, geospatial, and other types of data. The Toolkit emphasizes the idea of connecting a project's data and the related management needs to the defined project goals and objectives from the outset. In that context, the Toolkit presents and describes the fundamental components of sound data and information management that are common to projects involving biological, geospatial, and other related data

  6. Introduction to the Geant4 Simulation toolkit

    SciTech Connect

    Guatelli, S.; Cutajar, D.; Rosenfeld, A. B.; Oborn, B.

    2011-05-05

    Geant4 is a Monte Carlo simulation Toolkit, describing the interactions of particles with matter. Geant4 is widely used in radiation physics research, from High Energy Physics, to medical physics and space science, thanks to its sophisticated physics component, coupled with advanced functionality in geometry description. Geant4 is widely used at the Centre for Medical Radiation Physics (CMRP), at the University of Wollongong, to characterise and optimise novel detector concepts, radiotherapy treatments, and imaging solutions. This lecture consists of an introduction to the Monte Carlo method and to Geant4. Particular attention will be devoted to the Geant4 physics component, and to the physics models describing electromagnetic and hadronic physics interactions. The second part of the lecture will be focused on the methodology to adopt when developing a Geant4 simulation application.

  7. Data Exploration Toolkit for serial diffraction experiments

    PubMed Central

    Zeldin, Oliver B.; Brewster, Aaron S.; Hattne, Johan; Uervirojnangkoorn, Monarin; Lyubimov, Artem Y.; Zhou, Qiangjun; Zhao, Minglei; Weis, William I.; Sauter, Nicholas K.; Brunger, Axel T.

    2015-01-01

    Ultrafast diffraction at X-ray free-electron lasers (XFELs) has the potential to yield new insights into important biological systems that produce radiation-sensitive crystals. An unavoidable feature of the 'diffraction before destruction' nature of these experiments is that images are obtained from many distinct crystals and/or different regions of the same crystal. Combined with other sources of XFEL shot-to-shot variation, this introduces significant heterogeneity into the diffraction data, complicating processing and interpretation. To enable researchers to get the most from their collected data, a toolkit is presented that provides insights into the quality of, and the variation present in, serial crystallography data sets. These tools operate on the unmerged, partial intensity integration results from many individual crystals, and can be used on two levels: firstly to guide the experimental strategy during data collection, and secondly to help users make informed choices during data processing. PMID:25664746
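    One simple example of the per-image triage such a toolkit enables is flagging anomalous shots by a robust statistic over per-image mean intensities. The sketch below uses a generic median/MAD screen; it illustrates the idea only and is not one of the toolkit's actual metrics:

    ```python
    import statistics

    def flag_outlier_images(mean_intensities, cutoff=3.5):
        """Flag images whose mean intensity deviates strongly from the batch,
        using a median/MAD-based modified z-score, which stays robust when a
        few shots are wildly off (unlike a mean/stddev z-score)."""
        med = statistics.median(mean_intensities)
        mad = statistics.median(abs(x - med) for x in mean_intensities)
        if mad == 0:
            return []
        return [i for i, x in enumerate(mean_intensities)
                if abs(0.6745 * (x - med) / mad) > cutoff]

    # Seven normal shots and one anomalously bright one (index 5).
    intensities = [100.0, 98.0, 103.0, 101.0, 99.0, 410.0, 97.0, 102.0]
    bad = flag_outlier_images(intensities)
    ```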

  8. Monitoring Extreme-scale Lustre Toolkit

    SciTech Connect

    Brim, Michael J; Lothian, Josh

    2015-01-01

    We discuss the design and ongoing development of the Monitoring Extreme-scale Lustre Toolkit (MELT), a unified Lustre performance monitoring and analysis infrastructure that provides continuous, low-overhead summary information on the health and performance of Lustre, as well as on-demand, in-depth problem diagnosis and root-cause analysis. The MELT infrastructure leverages a distributed overlay network to enable monitoring of center-wide Lustre filesystems where clients are located across many network domains. We preview interactive command-line utilities that help administrators and users to observe Lustre performance at various levels of resolution, from individual servers or clients to whole filesystems, including job-level reporting. Finally, we discuss our future plans for automating the root-cause analysis of common Lustre performance problems.

  9. Introduction to the Geant4 Simulation toolkit

    NASA Astrophysics Data System (ADS)

    Guatelli, S.; Cutajar, D.; Oborn, B.; Rosenfeld, A. B.

    2011-05-01

    Geant4 is a Monte Carlo simulation Toolkit, describing the interactions of particles with matter. Geant4 is widely used in radiation physics research, from High Energy Physics, to medical physics and space science, thanks to its sophisticated physics component, coupled with advanced functionality in geometry description. Geant4 is widely used at the Centre for Medical Radiation Physics (CMRP), at the University of Wollongong, to characterise and optimise novel detector concepts, radiotherapy treatments, and imaging solutions. This lecture consists of an introduction to the Monte Carlo method and to Geant4. Particular attention will be devoted to the Geant4 physics component, and to the physics models describing electromagnetic and hadronic physics interactions. The second part of the lecture will be focused on the methodology to adopt when developing a Geant4 simulation application.

  10. TEVA-SPOT Toolkit 1.2

    2007-07-26

    The TEVA-SPOT Toolkit (SPOT) supports the design of contaminant warning systems (CWSs) that use real-time sensors to detect contaminants in municipal water distribution networks. Specifically, SPOT provides the capability to select the locations for installing sensors in order to maximize the utility and effectiveness of the CWS. SPOT models the sensor placement process as an optimization problem, and the user can specify a wide range of performance objectives for contaminant warning system design, including population health effects, time to detection, extent of contamination, volume consumed and number of failed detections. For example, a SPOT user can integrate expert knowledge during the design process by specifying required sensor placements or designating network locations as forbidden. Further, cost considerations can be integrated by limiting the design with user-specified installation costs at each location.
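    The placement problem described above, with a sensor budget, required placements and forbidden locations, has the flavor of maximum coverage. The sketch below uses a plain greedy heuristic, a standard approach for this objective but not necessarily SPOT's actual solver; the node names and scenario sets are made up:

    ```python
    def greedy_placement(candidate_coverage, budget, required=(), forbidden=()):
        """Greedy sensor placement. candidate_coverage maps a candidate node to
        the set of contamination scenarios it would detect. Required placements
        are seeded first; forbidden nodes are never chosen. Each round picks the
        node covering the most not-yet-detected scenarios."""
        chosen = list(required)
        covered = set()
        for n in chosen:
            covered |= candidate_coverage[n]
        while len(chosen) < budget:
            best = max((n for n in candidate_coverage
                        if n not in chosen and n not in forbidden),
                       key=lambda n: len(candidate_coverage[n] - covered),
                       default=None)
            if best is None or not (candidate_coverage[best] - covered):
                break                     # nothing left that adds coverage
            chosen.append(best)
            covered |= candidate_coverage[best]
        return chosen, covered

    # Junctions J1..J4 and the scenarios each would detect.
    coverage = {"J1": {1, 2, 3}, "J2": {3, 4}, "J3": {5}, "J4": {1, 4, 5}}
    sensors, detected = greedy_placement(coverage, budget=2)
    ```

    Greedy gives the classic (1 - 1/e) approximation guarantee for this submodular objective, which is why it is a common baseline before resorting to exact integer programming.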

  11. A Perl toolkit for LIMS development

    PubMed Central

    Morris, James A; Gayther, Simon A; Jacobs, Ian J; Jones, Christopher

    2008-01-01

    Background High throughput laboratory techniques generate huge quantities of scientific data. Laboratory Information Management Systems (LIMS) are a necessary requirement, dealing with sample tracking, data storage and data reporting. Commercial LIMS solutions are available, but these can be both costly and overly complex for the task. The development of bespoke LIMS solutions offers a number of advantages, including the flexibility to fulfil all a laboratory's requirements at a fraction of the price of a commercial system. The programming language Perl is well suited to LIMS development because of its powerful but simple-to-use database and web interaction; it is also well known for enabling rapid application development and deployment, and boasts a very active and helpful developer community. The development of an in-house LIMS from scratch, however, can take considerable time and resources, so programming tools that enable the rapid development of LIMS applications are essential, but there are currently no LIMS development tools for Perl. Results We have developed ArrayPipeline, a Perl toolkit providing object oriented methods that facilitate the rapid development of bespoke LIMS applications. The toolkit includes Perl objects that encapsulate key components of a LIMS, providing methods for creating interactive web pages, interacting with databases, error tracking and reporting, and user and session management. The MT_Plate object provides methods for manipulation and management of microtitre plates, while a given LIMS can be encapsulated by extension of the core modules, providing system specific methods for database interaction and web page management. Conclusion This important addition to the Perl developer's library will make the development of in-house LIMS applications quicker and easier, encouraging laboratories to create bespoke LIMS applications to meet their specific data management requirements. PMID:18353174

  12. Charon Message-Passing Toolkit for Scientific Computations

    NASA Technical Reports Server (NTRS)

    VanderWijngaart, Rob F.; Saini, Subhash (Technical Monitor)

    1998-01-01

    The Charon toolkit for piecemeal development of high-efficiency parallel programs for scientific computing is described. The portable toolkit, callable from C and Fortran, provides flexible domain decompositions and high-level distributed constructs for easy translation of serial legacy code or design to distributed environments. Gradual tuning can subsequently be applied to obtain high performance, possibly by using explicit message passing. Charon also features general structured communications that support stencil-based computations with complex recurrences. Through the separation of partitioning and distribution, the toolkit can also be used for blocking of uni-processor code, and for debugging of parallel algorithms on serial machines. An elaborate review of recent parallelization aids is presented to highlight the need for a toolkit like Charon. Some performance results of parallelizing the NAS Parallel Benchmark SP program using Charon are given, showing good scalability.
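    The "flexible domain decompositions" and stencil support mentioned above rest on two ingredients that are easy to sketch: a contiguous block partition of the grid, and ghost (halo) extensions so each block can evaluate stencils at its edges. The conventions below (remainder points to low ranks, one-point halo) are common ones, not Charon's actual API:

    ```python
    def block_partition(n, nprocs):
        """Split n grid points into nprocs nearly equal contiguous blocks,
        returned as (start, stop) index pairs. Remainder points go to the
        lowest-ranked processes, a common convention."""
        base, extra = divmod(n, nprocs)
        bounds = []
        start = 0
        for rank in range(nprocs):
            size = base + (1 if rank < extra else 0)
            bounds.append((start, start + size))
            start += size
        return bounds

    def with_ghost_cells(bounds, n, width=1):
        """Extend each block by `width` ghost points across interior
        boundaries, as stencil-based computations require; the physical
        domain edges [0, n) are never exceeded."""
        return [(max(0, lo - width), min(n, hi + width)) for lo, hi in bounds]

    parts = block_partition(10, 3)
    ghosted = with_ghost_cells(parts, 10)
    ```

    Separating the partition (who owns what) from the ghosted view (what each process reads) mirrors the toolkit's stated separation of partitioning and distribution, which is also what lets the same code run as blocked uniprocessor code for debugging.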

  13. Charon Message-Passing Toolkit for Scientific Computations

    NASA Technical Reports Server (NTRS)

    VanderWijngaart, Rob F.; Saini, Subhash (Technical Monitor)

    1998-01-01

    The Charon toolkit for piecemeal development of high-efficiency parallel programs for scientific computing is described. The portable toolkit, callable from C and Fortran, provides flexible domain decompositions and high-level distributed constructs for easy translation of serial legacy code or design to distributed environments. Gradual tuning can subsequently be applied to obtain high performance, possibly by using explicit message passing. Charon also features general structured communications that support stencil-based computations with complex recurrences. Through the separation of partitioning and distribution, the toolkit can also be used for blocking of uni-processor code, and for debugging of parallel algorithms on serial machines. An elaborate review of recent parallelization aids is presented to highlight the need for a toolkit like Charon. Some performance results of parallelizing the NAS Parallel Benchmark SP program using Charon are given, showing good scalability.

  14. General relativistic magneto-hydrodynamics with the Einstein Toolkit

    NASA Astrophysics Data System (ADS)

    Moesta, Philipp; Mundim, Bruno; Faber, Joshua; Noble, Scott; Bode, Tanja; Haas, Roland; Loeffler, Frank; Ott, Christian; Reisswig, Christian; Schnetter, Erik

    2013-04-01

    The Einstein Toolkit Consortium is developing and supporting open software for relativistic astrophysics. Its aim is to provide the core computational tools that can enable new science, broaden our community, facilitate interdisciplinary research and take advantage of petascale computers and advanced cyberinfrastructure. The Einstein Toolkit currently consists of an open set of over 100 modules for the Cactus framework, primarily for computational relativity along with associated tools for simulation management and visualization. The toolkit includes solvers for vacuum spacetimes as well as relativistic magneto-hydrodynamics. This talk will present the current capabilities of the Einstein Toolkit with a particular focus on recent improvements made to the general relativistic magneto-hydrodynamics modeling, and will point to information on how to leverage it for future research.

  15. Toolkit for evaluating impacts of public participation in scientific research

    NASA Astrophysics Data System (ADS)

    Bonney, R.; Phillips, T.

    2011-12-01

    The Toolkit for Evaluating Impacts of Public Participation in Scientific Research is being developed to meet a major need in the field of visitor studies: To provide project developers and other professionals, especially those with limited knowledge or understanding of evaluation techniques, with a systematic method for assessing project impact that facilitates longitudinal and cross-project comparisons. The need for the toolkit was first identified at the Citizen Science workshop held at the Cornell Lab of Ornithology in 2007 (McEver et al. 2007) and reaffirmed by a CAISE inquiry group that produced the recent report: "Public Participation in Scientific Research: Defining the Field and Assessing its Potential for Informal Science Education" (Bonney et al. 2009). This presentation will introduce the Toolkit, show how it is intended to be used, and describe ways that project directors can use their programmatic goals and toolkit materials to outline a plan for evaluating the impacts of their project.

  16. Handheld access to radiology teaching files: an automated system for format conversion and content creation

    NASA Astrophysics Data System (ADS)

    Raman, Raghav; Raman, Lalithakala; Raman, Bhargav; Gold, Garry; Beaulieu, Christopher F.

    2002-05-01

    Current handheld Personal Digital Assistants (PDAs) can be used to view radiology teaching files. We have developed a toolkit that allows rapid creation of radiology teaching files in handheld formats from existing repositories. Our toolkit incorporated a desktop converter, a web conversion server and an application programming interface (API). Our API was integrated with an existing pilot teaching file database. We evaluated our system by obtaining test DICOM and JPEG images from our PACS system, our pilot database and from personal collections and converting them on a Windows workstation (Microsoft, Redmond, CA) and on other platforms using the web server. Our toolkit anonymized, annotated and categorized images using DICOM header information and data entered by the authors. Image processing was automatically customized for the target handheld device. We used freeware handheld image viewers as well as our custom applications that allowed window/level manipulation and viewing of additional textual information. Our toolkit provides desktop and web access to image conversion tools to produce organized handheld teaching file packages for most handheld devices and our API allows existing teaching file databases to incorporate handheld compatibility. The distribution of radiology teaching files on PDAs can increase the accessibility to radiology teaching.

  17. Spike Train Analysis Toolkit: Enabling Wider Application of Information-Theoretic Techniques to Neurophysiology

    PubMed Central

    Goldberg, David H.; Victor, Jonathan D.; Gardner, Esther P.

    2009-01-01

    Conventional methods widely available for the analysis of spike trains and related neural data include various time- and frequency-domain analyses, such as peri-event and interspike interval histograms, spectral measures, and probability distributions. Information theoretic methods are increasingly recognized as significant tools for the analysis of spike train data. However, developing robust implementations of these methods can be time-consuming, and determining applicability to neural recordings can require expertise. In order to facilitate more widespread adoption of these informative methods by the neuroscience community, we have developed the Spike Train Analysis Toolkit (STAToolkit), a software package which implements, documents, and guides application of several information-theoretic spike train analysis techniques, thus minimizing the effort needed to adopt and use them. This implementation behaves like a typical Matlab toolbox, but the underlying computations are coded in C for portability, optimized for efficiency, and interfaced with Matlab via the MEX framework. STAToolkit runs on any of three major platforms: Windows, Mac OS, and Linux. The toolkit reads input from files with an easy-to-generate text-based, platform-independent format. STAToolkit, including full documentation and test cases, is freely available open source via http://neuroanalysis.org, maintained as a resource for the computational neuroscience and neuroinformatics communities. Use cases drawn from somatosensory and gustatory neurophysiology, and community use of STAToolkit, demonstrate its utility and scope. PMID:19475519
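    To make the information-theoretic approach concrete, the sketch below computes the simplest member of the family such toolkits implement: a plug-in ("direct") entropy estimate over binned spike words. Real toolkit implementations add bias corrections that this bare estimator lacks:

    ```python
    import math
    from collections import Counter

    def word_entropy(spike_times, t_start, t_stop, bin_size, word_len):
        """Plug-in entropy estimate, in bits, of binary spike 'words': the
        spike train is binned into 0/1 symbols, consecutive bins are grouped
        into words of length word_len, and H = -sum p*log2(p) is computed over
        the empirical word distribution."""
        n_bins = int((t_stop - t_start) / bin_size)
        bins = [0] * n_bins
        for t in spike_times:
            i = int((t - t_start) / bin_size)
            if 0 <= i < n_bins:
                bins[i] = 1
        words = [tuple(bins[i:i + word_len])
                 for i in range(0, n_bins - word_len + 1, word_len)]
        counts = Counter(words)
        total = sum(counts.values())
        return -sum((c / total) * math.log2(c / total)
                    for c in counts.values())

    # 3 spikes in 0.8 s, 100 ms bins, 2-bin words -> words (1,0) x3 and (0,0) x1.
    h = word_entropy([0.05, 0.25, 0.45], 0.0, 0.8, 0.1, 2)
    ```

    The plug-in estimator is biased downward for short recordings, which is exactly why packaged, vetted implementations with bias corrections are valuable to non-specialists.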

  18. The Trick Simulation Toolkit: A NASA/Opensource Framework for Running Time Based Physics Models

    NASA Technical Reports Server (NTRS)

    Penn, John M.

    2016-01-01

    The Trick Simulation Toolkit is a simulation development environment used to create high fidelity training and engineering simulations at the NASA Johnson Space Center and many other NASA facilities. Its purpose is to generate a simulation executable from a collection of user-supplied models and a simulation definition file. For each Trick-based simulation, Trick automatically provides job scheduling, numerical integration, the ability to write and restore human readable checkpoints, data recording, interactive variable manipulation, a run-time interpreter, and many other commonly needed capabilities. This allows simulation developers to concentrate on their domain expertise and the algorithms and equations of their models. Also included in Trick are tools for plotting recorded data and various other supporting utilities and libraries. Trick is written in C/C++ and Java and supports both Linux and MacOSX computer operating systems. This paper describes Trick's design and use at NASA Johnson Space Center.
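    The division of labor described above, where the framework supplies the time loop, job scheduling and numerical integration while users supply models, can be sketched as a toy fixed-rate scheduler. All names here are illustrative and the integrator is plain forward Euler; this is not Trick's actual API:

    ```python
    class Simulation:
        """Toy fixed-rate scheduler: user models register per-frame jobs and
        state derivatives; the framework runs the time loop and integrates."""
        def __init__(self, dt):
            self.dt = dt
            self.time = 0.0
            self.state = {}
            self.derivs = {}      # variable name -> deriv(state) callable
            self.jobs = []        # callables run once per frame

        def add_state(self, name, initial, deriv):
            self.state[name] = initial
            self.derivs[name] = deriv

        def add_job(self, job):
            self.jobs.append(job)

        def run(self, t_stop):
            while self.time < t_stop - 1e-12:
                for job in self.jobs:
                    job(self.time, self.state)
                # Evaluate all derivatives against the old state, then update,
                # so the Euler step is consistent across coupled variables.
                rates = {name: deriv(self.state)
                         for name, deriv in self.derivs.items()}
                for name, rate in rates.items():
                    self.state[name] += rate * self.dt
                self.time += self.dt

    # Falling body: dv/dt = -g, dx/dt = v.
    sim = Simulation(dt=0.001)
    sim.add_state("v", 0.0, lambda s: -9.81)
    sim.add_state("x", 100.0, lambda s: s["v"])
    sim.run(1.0)
    ```

    Because the model only supplies derivatives and jobs, swapping in a higher-order integrator or adding data recording is a framework change, not a model change, which is the productivity argument the abstract makes.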

  19. Multi-Physics Demonstration Problem with the SHARP Reactor Simulation Toolkit

    SciTech Connect

    Merzari, E.; Shemon, E. R.; Yu, Y. Q.; Thomas, J. W.; Obabko, A.; Jain, Rajeev; Mahadevan, Vijay; Tautges, Timothy; Solberg, Jerome; Ferencz, Robert Mark; Whitesides, R.

    2015-12-21

    This report describes how to employ SHARP to perform a first-of-a-kind analysis of the core radial expansion phenomenon in an SFR. This effort required significant advances in the framework used to drive the coupled simulations, manipulate the mesh in response to the deformation of the geometry, and generate the necessary modified mesh files. Furthermore, the model geometry is fairly complex, and consistent mesh generation for the three physics modules required significant effort. Fully-integrated simulations of a 7-assembly mini-core test problem have been performed, and the results are presented here. Physics models of a full-core model of the Advanced Burner Test Reactor have also been developed for each of the three physics modules. Standalone results of each of the three physics modules for the ABTR are presented here, which provides a demonstration of the feasibility of the fully-integrated simulation.

  20. Biomechanical ToolKit: Open-source framework to visualize and process biomechanical data.

    PubMed

    Barre, Arnaud; Armand, Stéphane

    2014-04-01

    The C3D file format is widely used in the biomechanical field by companies and laboratories to store motion capture system data. However, few software packages can visualize and modify the entirety of the data in a C3D file. Our objective was to develop an open-source and multi-platform framework to read, write, modify and visualize data from any motion analysis system using the standard (C3D) and proprietary file formats (used by many companies producing motion capture systems). The Biomechanical ToolKit (BTK) was developed to provide cost-effective and efficient tools for the biomechanical community to easily deal with motion analysis data. A large panel of operations is available to read, modify and process data through a C++ API, bindings for high-level languages (Matlab, Octave, and Python), and a standalone application (Mokka). All these tools are open-source and cross-platform and run on all major operating systems (Windows, Linux, MacOS X).

  1. Biomechanical ToolKit: Open-source framework to visualize and process biomechanical data.

    PubMed

    Barre, Arnaud; Armand, Stéphane

    2014-04-01

    The C3D file format is widely used in the biomechanical field by companies and laboratories to store motion capture system data. However, few software packages can visualize and modify the entirety of the data in a C3D file. Our objective was to develop an open-source and multi-platform framework to read, write, modify and visualize data from any motion analysis system using the standard (C3D) and proprietary file formats (used by many companies producing motion capture systems). The Biomechanical ToolKit (BTK) was developed to provide cost-effective and efficient tools for the biomechanical community to easily deal with motion analysis data. A large panel of operations is available to read, modify and process data through a C++ API, bindings for high-level languages (Matlab, Octave, and Python), and a standalone application (Mokka). All these tools are open-source and cross-platform and run on all major operating systems (Windows, Linux, MacOS X). PMID:24548899

  2. The Galley Parallel File System

    NASA Technical Reports Server (NTRS)

    Nieuwejaar, Nils; Kotz, David

    1996-01-01

    Most current multiprocessor file systems are designed to use multiple disks in parallel, using the high aggregate bandwidth to meet the growing I/O requirements of parallel scientific applications. Many multiprocessor file systems provide applications with a conventional Unix-like interface, allowing the application to access multiple disks transparently. This interface conceals the parallelism within the file system, increasing the ease of programmability, but making it difficult or impossible for sophisticated programmers and libraries to use knowledge about their I/O needs to exploit that parallelism. In addition to providing an insufficient interface, most current multiprocessor file systems are optimized for a different workload than they are being asked to support. We introduce Galley, a new parallel file system that is intended to efficiently support realistic scientific multiprocessor workloads. We discuss Galley's file structure and application interface, as well as the performance advantages offered by that interface.
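    The transparent Unix-like interface criticized above hides a layout mapping like the round-robin striping sketched below from the application, which is precisely what prevents I/O-aware code from exploiting the parallelism. This is a generic illustration of striping, not Galley's own (subfile-based) layout:

    ```python
    def stripe_map(logical_offset, stripe_size, n_disks):
        """Map a logical byte offset in a striped file to (disk, local_offset)
        under round-robin striping: consecutive fixed-size stripes of the
        logical file are dealt out to disks 0..n_disks-1 in turn."""
        stripe_index, within = divmod(logical_offset, stripe_size)
        disk = stripe_index % n_disks
        local = (stripe_index // n_disks) * stripe_size + within
        return disk, local

    # With 4 KiB stripes over 3 disks, logical offset 13000 falls in stripe 3,
    # which lands back on disk 0 as its second local stripe.
    disk, local = stripe_map(13000, 4096, 3)
    ```

    An application that knows this mapping can issue one large sequential request per disk; one that sees only the flat logical file cannot, which is the performance argument for exposing structure in the interface.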

  3. E-ELT modeling and simulation toolkits: philosophy and progress status

    NASA Astrophysics Data System (ADS)

    Sedghi, B.; Muller, M.; Bonnet, H.; Esselborn, M.; Le Louarn, M.; Clare, R.; Koch, F.

    2011-09-01

    To predict the performance of the E-ELT three sets of toolkits are developed at ESO: i) The main structure and associated optical unit dynamical and feedback control toolkit, ii) Active optics and phasing toolkit, and iii) adaptive optics simulation toolkit. There was a deliberate policy not to integrate all of the systems into a massive model and tool. The dynamical and control time scale differences are used to separate the simulation environments and tools. Therefore, each toolkit contains an appropriate detail of the problem and holds sufficient overlap with the others to ensure the consistency of the results. In this paper, these toolkits together with some examples are presented.

  4. CART—a chemical annotation retrieval toolkit

    PubMed Central

    Deghou, Samy; Zeller, Georg; Iskar, Murat; Driessen, Marja; Castillo, Mercedes; van Noort, Vera; Bork, Peer

    2016-01-01

    Motivation: Data on bioactivities of drug-like chemicals are rapidly accumulating in public repositories, creating new opportunities for research in computational systems pharmacology. However, integrative analysis of these data sets is difficult due to prevailing ambiguity between chemical names and identifiers and a lack of cross-references between databases. Results: To address this challenge, we have developed CART, a Chemical Annotation Retrieval Toolkit. As a key functionality, it matches an input list of chemical names into a comprehensive reference space to assign unambiguous chemical identifiers. In this unified space, bioactivity annotations can be easily retrieved from databases covering a wide variety of chemical effects on biological systems. Subsequently, CART can determine annotations enriched in the input set of chemicals and display these in tabular format and interactive network visualizations, thereby facilitating integrative analysis of chemical bioactivity data. Availability and Implementation: CART is available as a Galaxy web service (cart.embl.de). Source code and an easy-to-install command line tool can also be obtained from the web site. Contact: bork@embl.de Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27256313
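
    The key matching step described above, resolving ambiguous chemical names into one unambiguous identifier space, can be sketched with a normalized-name lookup (the reference table, identifiers, and function names here are illustrative assumptions, not CART's actual data or API):

```python
# Minimal sketch of matching free-text chemical names to unambiguous
# identifiers via a normalized reference space. IDs are hypothetical.

def normalize(name):
    # Case, whitespace, and punctuation differences cause most mismatches.
    return "".join(c for c in name.lower() if c.isalnum())

REFERENCE = {
    normalize("Acetylsalicylic acid"): "CHEM:0001",  # hypothetical ID
    normalize("aspirin"): "CHEM:0001",               # synonym, same ID
    normalize("Ibuprofen"): "CHEM:0002",
}

def match(names):
    """Return {input name: identifier or None} for an input list."""
    return {n: REFERENCE.get(normalize(n)) for n in names}
```

    Once every input name carries one canonical identifier, annotations from different databases can be joined on that identifier instead of on fragile name strings.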

  5. UQ Toolkit v 2.0

    SciTech Connect

    2013-10-03

    The Uncertainty Quantification (UQ) Toolkit is a software library for the characterization and propagation of uncertainties in computational models. For the characterization of uncertainties, Bayesian inference tools are provided to infer uncertain model parameters, as well as Bayesian compressive sensing methods for discovering sparse representations of high-dimensional input-output response surfaces, and also Karhunen-Loève expansions for representing stochastic processes. Uncertain parameters are treated as random variables and represented with Polynomial Chaos expansions (PCEs). The library implements several spectral basis function types (e.g. Hermite basis functions in terms of Gaussian random variables or Legendre basis functions in terms of uniform random variables) that can be used to represent random variables with PCEs. For propagation of uncertainty, tools are provided to propagate PCEs that describe the input uncertainty through the computational model using either intrusive methods (Galerkin projection of equations onto basis functions) or non-intrusive methods (perform deterministic operations at sampled values of the random variables and project the obtained results onto basis functions).
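
    The non-intrusive propagation route described above can be sketched end to end with plain NumPy (a hedged illustration, not the UQ Toolkit API): sample the Gaussian germ, run the deterministic model at each sample, and project the results onto probabilists' Hermite basis functions.

```python
# Sketch of non-intrusive PCE propagation: X ~ N(1, 0.1) represented as a
# Hermite PCE, pushed through the model y = x^2 by sampling and projection.
import numpy as np
from math import factorial
from numpy.polynomial.hermite_e import hermeval  # probabilists' Hermite He_k

rng = np.random.default_rng(0)
xi = rng.standard_normal(200_000)       # samples of the Gaussian germ

# Input PCE for X = 1 + 0.1*He_1(xi): coefficients [mean, std].
x = hermeval(xi, [1.0, 0.1])

y = x ** 2                               # deterministic model, run per sample

# Project onto He_k; for probabilists' Hermite, E[He_k^2] = k!.
coeffs = [np.mean(y * hermeval(xi, [0.0] * k + [1.0])) / factorial(k)
          for k in range(3)]
# Analytically X^2 = 1.01 + 0.2*He_1 + 0.01*He_2, so coeffs ≈ [1.01, 0.2, 0.01].
```

    The zeroth coefficient recovers the output mean (mu^2 + sigma^2 = 1.01), showing how output statistics fall out of the projected expansion directly.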

  6. Security Assessment Simulation Toolkit (SAST) Final Report

    SciTech Connect

    Meitzler, Wayne D.; Ouderkirk, Steven J.; Hughes, Chad O.

    2009-11-15

    The Department of Defense Technical Support Working Group (DoD TSWG) investment in the Pacific Northwest National Laboratory (PNNL) Security Assessment Simulation Toolkit (SAST) research planted a technology seed that germinated into a suite of follow-on Research and Development (R&D) projects culminating in software that is used by multiple DoD organizations. The DoD TSWG technology transfer goal for SAST is already in progress. The Defense Information Systems Agency (DISA), the Defense-wide Information Assurance Program (DIAP), the Marine Corps, Office Of Naval Research (ONR) National Center For Advanced Secure Systems Research (NCASSR) and Office Of Secretary Of Defense International Exercise Program (OSD NII) are currently investing to take SAST to the next level. PNNL currently distributes the software to over 6 government organizations and 30 DoD users. For the past five DoD wide Bulwark Defender exercises, the adoption of this new technology created an expanding role for SAST. In 2009, SAST was also used in the OSD NII International Exercise and is currently scheduled for use in 2010.

  7. Data Exploration Toolkit for serial diffraction experiments

    SciTech Connect

    Zeldin, Oliver B.; Brewster, Aaron S.; Hattne, Johan; Uervirojnangkoorn, Monarin; Lyubimov, Artem Y.; Zhou, Qiangjun; Zhao, Minglei; Weis, William I.; Sauter, Nicholas K.; Brunger, Axel T.

    2015-02-01

    This paper describes a set of tools allowing experimentalists insight into the variation present within large serial data sets. Ultrafast diffraction at X-ray free-electron lasers (XFELs) has the potential to yield new insights into important biological systems that produce radiation-sensitive crystals. An unavoidable feature of the ‘diffraction before destruction’ nature of these experiments is that images are obtained from many distinct crystals and/or different regions of the same crystal. Combined with other sources of XFEL shot-to-shot variation, this introduces significant heterogeneity into the diffraction data, complicating processing and interpretation. To enable researchers to get the most from their collected data, a toolkit is presented that provides insights into the quality of, and the variation present in, serial crystallography data sets. These tools operate on the unmerged, partial intensity integration results from many individual crystals, and can be used on two levels: firstly to guide the experimental strategy during data collection, and secondly to help users make informed choices during data processing.

  8. The NOAA Weather and Climate Toolkit

    NASA Astrophysics Data System (ADS)

    Ansari, S.; Hutchins, C.; Del Greco, S.

    2008-12-01

    The NOAA Weather and Climate Toolkit (WCT) is an application that provides simple visualization and data export of weather and climate data archived at the National Climatic Data Center (NCDC) and other organizations. The WCT is built on the Unidata Common Data Model and supports defined feature types such as Grid, Radial, Point, Time Series and Trajectory. Current NCDC datasets supported include NEXRAD Radar data, GOES Satellite imagery, NOMADS Model Data, Integrated Surface Data and the U.S. Drought Monitor (part of the National Integrated Drought Information System (NIDIS)). The WCT Viewer provides tools for displaying custom data overlays, Web Map Services (WMS), animations and basic filters. The export of images and movies is provided in multiple formats. The WCT Data Exporter allows for data export in both vector polygon (Shapefile, Well-Known Text) and raster (GeoTIFF, Arc/Info ASCII Grid, VTK, NetCDF) formats. By decoding and exporting data into multiple common formats, a diverse user community can perform analysis using familiar tools such as ArcGIS, MatLAB and IDL. This brings new users to a vast array of weather and climate data at NCDC.
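
    Of the export formats listed, Arc/Info ASCII Grid is simple enough to sketch directly: a six-line header followed by rows of cell values. The writer below is a plain-Python illustration of that format, not the WCT's own exporter; the file name and values are made up.

```python
# Write a minimal Arc/Info ASCII Grid: header (ncols, nrows, xllcorner,
# yllcorner, cellsize, NODATA_value), then rows, northernmost row first.

def write_ascii_grid(path, grid, xll, yll, cellsize, nodata=-9999):
    """grid: list of rows of cell values, top (north) row first."""
    with open(path, "w") as f:
        f.write(f"ncols {len(grid[0])}\n")
        f.write(f"nrows {len(grid)}\n")
        f.write(f"xllcorner {xll}\n")
        f.write(f"yllcorner {yll}\n")
        f.write(f"cellsize {cellsize}\n")
        f.write(f"NODATA_value {nodata}\n")
        for row in grid:
            f.write(" ".join(str(v) for v in row) + "\n")

write_ascii_grid("demo.asc", [[1, 2], [3, 4]], xll=-105.0, yll=39.0, cellsize=0.5)
```

    Because the format is plain text with a self-describing header, tools such as ArcGIS can ingest it without any decoder, which is why it works well as a lowest-common-denominator export target.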

  9. UQ Toolkit v 2.0

    2013-10-03

    The Uncertainty Quantification (UQ) Toolkit is a software library for the characterization and propagation of uncertainties in computational models. For the characterization of uncertainties, Bayesian inference tools are provided to infer uncertain model parameters, as well as Bayesian compressive sensing methods for discovering sparse representations of high-dimensional input-output response surfaces, and also Karhunen-Loève expansions for representing stochastic processes. Uncertain parameters are treated as random variables and represented with Polynomial Chaos expansions (PCEs). The library implements several spectral basis function types (e.g. Hermite basis functions in terms of Gaussian random variables or Legendre basis functions in terms of uniform random variables) that can be used to represent random variables with PCEs. For propagation of uncertainty, tools are provided to propagate PCEs that describe the input uncertainty through the computational model using either intrusive methods (Galerkin projection of equations onto basis functions) or non-intrusive methods (perform deterministic operations at sampled values of the random variables and project the obtained results onto basis functions).

  10. STAR: Software Toolkit for Analysis Research

    SciTech Connect

    Doak, J.; Prommel, J.; Hoffbauer, B.

    1995-09-01

    This paper provides an update on the development of the Software Toolkit for Analysis Research (STAR). The goal of the STAR project is to produce a research tool that facilitates the development and interchange of algorithms for locating phenomena of interest in large quantities of data. The authors have applied this technology to the Space and Atmospheric Burst Reporting System (SAVRS) in support of the Knowledge Fusion Technologies project. Data from satellites were enhanced to contain evidence of nuclear detonations (nudets). The resulting data were analyzed to determine which algorithms were the most effective at detecting nudets; these algorithms will eventually be implemented in hardware to allow processing on board future Global Positioning System satellites. The authors wanted not only to solve specific problems within this domain but also to solve them in a computational environment that would permit the developed tools to be applied to new domains or to different problems in the same domain. This paper presents an overview of the software developed for this knowledge fusion project, along with the results of applying the software to the enhanced satellite data.

  11. Asteroids Outreach Toolkit Development: Using Iterative Feedback In Informal Education

    NASA Astrophysics Data System (ADS)

    White, Vivian; Berendsen, M.; Gurton, S.; Dusenbery, P. B.

    2011-01-01

    The Night Sky Network is a collaboration of close to 350 astronomy clubs across the US that actively engage in public outreach within their communities. Since 2004, the Astronomical Society of the Pacific has been creating outreach ToolKits filled with carefully crafted sets of physical materials designed to help these volunteer clubs explain the wonders of the night sky to the public. The effectiveness of the ToolKit activities and demonstrations is the direct result of a thorough testing and vetting process. Find out how this iterative assessment process can help other programs create useful tools for both formal and informal educators. The current Space Rocks Outreach ToolKit focuses on explaining asteroids, comets, and meteorites to the general public using quick, big-picture activities that get audiences involved. Eight previous ToolKits cover a wide range of topics from the Moon to black holes. In each case, amateur astronomers and the public helped direct the development of the activities along the way through surveys, focus groups, and active field-testing. The resulting activities have been embraced by the larger informal learning community and are enthusiastically being delivered to millions of people across the US and around the world. Each ToolKit is delivered free of charge to active Night Sky Network astronomy clubs. All activity write-ups are available free to download at the website listed here. Amateur astronomers receive frequent questions from the public about Earth impacts, meteors, and comets, so this set of activities will help them explain the dynamics of these phenomena to the public. The Space Rocks ToolKit resources complement the Great Balls of Fire museum exhibit produced by Space Science Institute's National Center for Interactive Learning and scheduled for release in 2011. NSF has funded this national traveling exhibition and outreach ToolKit under Grant DRL-0813528.

  12. The Configuration Space Toolkit (C-Space Toolkit or CSTK) Ver. 2.5 beta

    SciTech Connect

    Chen, Pang-Chieh; Hwang, Yong; Xavier, Patrick; Lewis, Christopher; Lafarge, Robert; & Watterberg, Peter

    2010-02-24

    The C-Space Toolkit provides a software library that makes it easier to program motion planning, simulation, robotics, and virtual reality codes using the Configuration Space abstraction. Key functionality enables the user to (1) create representations of movable and stationary rigid geometric objects, and (2) perform fast distance, interference (clash) detection, collision detection, closest-feature-pair, and contact queries in terms of object configuration. Queries can be computed not only at any given point in configuration space, but also exactly over linear-translational path segments and approximately over rotational path segments. Interference detection and distance computations can be done with respect to the Minkowski sum of the original geometry and a piece of convex geometry. The Toolkit takes as raw model input (1) collections of convex polygons that form the boundaries of models and (2) convex polyhedra, cones, cylinders, and discs that are models and model components. Configurations are given in terms of homogeneous transforms. A simple OpenGL-based system for displaying and animating the geometric objects is included in the implementation. This version, 2.5 Beta, incorporates feature additions and enhancements, improvements in algorithms, improved robustness, bug fixes and cleaned-up source code, better compliance with standards and recent programming conventions, changes to the build process for the software, support for more recent hardware and software platforms, and improvements to documentation and source-code comments.
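
    Since the abstract notes that configurations are given as homogeneous transforms, here is a minimal NumPy sketch of that representation: a 2-D rotation plus translation packed into a 3x3 matrix, applied to an object's vertices and composed by matrix multiplication. This is illustrative only, not the CSTK API.

```python
# A configuration as a homogeneous transform: rotate by theta, translate
# by (tx, ty). Object vertices are homogeneous column vectors.
import numpy as np

def transform2d(theta, tx, ty):
    """3x3 homogeneous transform: rotation by theta, then translation."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, tx],
                     [s,  c, ty],
                     [0,  0, 1.0]])

# Unit square with vertices as homogeneous columns.
square = np.array([[0, 0, 1], [1, 0, 1], [1, 1, 1], [0, 1, 1]], float).T

pose = transform2d(np.pi / 2, 2.0, 0.0)   # one point in configuration space
moved = pose @ square                      # the object at that configuration

# Configurations compose by matrix multiplication, e.g. a relative motion:
step = transform2d(0.0, 0.5, 0.0)
next_pose = step @ pose
```

    Distance and interference queries in configuration terms then amount to transforming each object's geometry by its configuration before testing, which is why the homogeneous-transform representation is convenient.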

  13. The Configuration Space Toolkit (C-Space Toolkit or CSTK) Ver. 2.5 beta

    2010-02-24

    The C-Space Toolkit provides a software library that makes it easier to program motion planning, simulation, robotics, and virtual reality codes using the Configuration Space abstraction. Key functionality enables the user to (1) create representations of movable and stationary rigid geometric objects, and (2) perform fast distance, interference (clash) detection, collision detection, closest-feature-pair, and contact queries in terms of object configuration. Queries can be computed not only at any given point in configuration space, but also exactly over linear-translational path segments and approximately over rotational path segments. Interference detection and distance computations can be done with respect to the Minkowski sum of the original geometry and a piece of convex geometry. The Toolkit takes as raw model input (1) collections of convex polygons that form the boundaries of models and (2) convex polyhedra, cones, cylinders, and discs that are models and model components. Configurations are given in terms of homogeneous transforms. A simple OpenGL-based system for displaying and animating the geometric objects is included in the implementation. This version, 2.5 Beta, incorporates feature additions and enhancements, improvements in algorithms, improved robustness, bug fixes and cleaned-up source code, better compliance with standards and recent programming conventions, changes to the build process for the software, support for more recent hardware and software platforms, and improvements to documentation and source-code comments.

  14. A Toolkit for Eye Recognition of LAMOST Spectroscopy

    NASA Astrophysics Data System (ADS)

    Yuan, H.; Zhang, H.; Zhang, Y.; Lei, Y.; Dong, Y.; Zhao, Y.

    2014-05-01

    The Large sky Area Multi-Object fiber Spectroscopic Telescope (LAMOST, also named the Guo Shou Jing Telescope) finished its pilot survey and began the regular survey at the end of September 2012. Millions of targets have already been observed, including thousands of quasar candidates. Because automatic identification of quasar spectra is difficult, eye recognition remains necessary and efficient; however, identifying massive numbers of spectra by eye is a huge job. To improve the efficiency and effectiveness of spectral identification, a toolkit for eye recognition of LAMOST spectroscopy was developed. Spectral cross-correlation templates from the Sloan Digital Sky Survey (SDSS) are applied as references, including O star, O/B transition star, B star, A star, F/A transition star, F star, G star, K star, M1 star, M3 star, M5 star, M8 star, L1 star, magnetic white dwarf, carbon star, white dwarf, B white dwarf, low-metallicity K sub-dwarf, "Early-type" galaxy, galaxy, "Later-type" galaxy, Luminous Red Galaxy, QSO, QSO with some BAL activity, and High-luminosity QSO. By adjusting the redshift and flux ratio of the template spectra in an interactive graphic interface, the spectral type of the target can be discriminated in an easy and feasible way, and the redshift is estimated at the same time with a precision of about one part in a thousand. The tool's advantage in dealing with low-quality spectra is indicated. Spectra from the LAMOST pilot survey are applied as examples, and spectra from SDSS are also tested for comparison. Target spectra in both image format and FITS format are supported. For convenience, several spectra-access methods are provided. All the spectra from the LAMOST pilot survey can be located and acquired via VOTable files on the internet, as suggested by the International Virtual Observatory Alliance (IVOA). After the construction of the Simple Spectral Access Protocol (SSAP) service by the Chinese Astronomical Data Center (CAsDC), spectra can be
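
    The core interactive operation described above, adjusting the redshift and flux ratio of a template before overlaying it on a target spectrum, reduces to two simple transformations. The sketch below uses generic NumPy with a synthetic template; it is an illustration of the operation, not the toolkit's code, and the wavelengths and fluxes are made up.

```python
# Redshift a rest-frame template by z (wavelengths stretch by 1+z) and
# scale its flux by a chosen ratio so it can be overlaid on a target.
import numpy as np

def shift_and_scale(wave, flux, z, ratio):
    """Return (redshifted wavelengths, scaled fluxes) for a template."""
    return wave * (1.0 + z), flux * ratio

rest_wave = np.array([4000.0, 5000.0, 6000.0])   # Angstroms (illustrative)
rest_flux = np.array([1.0, 2.0, 1.5])

obs_wave, obs_flux = shift_and_scale(rest_wave, rest_flux, z=0.1, ratio=0.5)
```

    Tuning `z` until template features line up with the target's features is what yields the redshift estimate; the quoted millesimal precision corresponds to resolving shifts of order 0.001 in `z`.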

  15. The Einstein Toolkit: a community computational infrastructure for relativistic astrophysics

    NASA Astrophysics Data System (ADS)

    Löffler, Frank; Faber, Joshua; Bentivegna, Eloisa; Bode, Tanja; Diener, Peter; Haas, Roland; Hinder, Ian; Mundim, Bruno C.; Ott, Christian D.; Schnetter, Erik; Allen, Gabrielle; Campanelli, Manuela; Laguna, Pablo

    2012-06-01

    We describe the Einstein Toolkit, a community-driven, freely accessible computational infrastructure intended for use in numerical relativity, relativistic astrophysics, and other applications. The toolkit, developed by a collaboration involving researchers from multiple institutions around the world, combines a core set of components needed to simulate astrophysical objects such as black holes, compact objects, and collapsing stars, as well as a full suite of analysis tools. The Einstein Toolkit is currently based on the Cactus framework for high-performance computing and the Carpet adaptive mesh refinement driver. It implements spacetime evolution via the BSSN evolution system and general relativistic hydrodynamics in a finite-volume discretization. The toolkit is under continuous development and contains many new code components that have been publicly released for the first time and are described in this paper. We discuss the motivation behind the release of the toolkit, the philosophy underlying its development, and the goals of the project. A summary of the implemented numerical techniques is included, as are results of numerical tests covering a variety of sample astrophysical problems.

  16. Developing and testing the health literacy universal precautions toolkit

    PubMed Central

    DeWalt, Darren A.; Broucksou, Kimberly A.; Hawk, Victoria; Brach, Cindy; Hink, Ashley; Rudd, Rima; Callahan, Leigh

    2016-01-01

    The health literacy demands of the healthcare system often exceed the health literacy skills of Americans. This article reviews the development of the Health Literacy Universal Precautions (HLUP) Toolkit, commissioned by the Agency for Healthcare Research and Quality and designed to help primary care practices structure the delivery of care as if every patient may have limited health literacy. The development of the toolkit spanned 2 years and consisted of 3 major tasks: (1) developing individual tools (modules explaining how to use or implement a strategy to minimize the effects of low health literacy), using existing health literacy resources when possible, (2) testing individual tools in clinical practice and assembling them into a prototype toolkit, and (3) testing the prototype toolkit in clinical practice. Testing revealed that practices will use tools that are concise and actionable and are not perceived as being resource intensive. Conducting practice self-assessments and generating enthusiasm among staff were key elements for successful implementation. Implementing practice changes required more time than anticipated and some knowledge of quality improvement techniques. In sum, the HLUP Toolkit holds promise as a means of improving primary care for people with limited health literacy, but further testing is needed. PMID:21402204

  17. The MPI Bioinformatics Toolkit for protein sequence analysis.

    PubMed

    Biegert, Andreas; Mayer, Christian; Remmert, Michael; Söding, Johannes; Lupas, Andrei N

    2006-07-01

    The MPI Bioinformatics Toolkit is an interactive web service which offers access to a great variety of public and in-house bioinformatics tools. They are grouped into different sections that support sequence searches, multiple alignment, secondary and tertiary structure prediction and classification. Several public tools are offered in customized versions that extend their functionality. For example, PSI-BLAST can be run against regularly updated standard databases, customized user databases or selectable sets of genomes. Another tool, Quick2D, integrates the results of various secondary structure, transmembrane and disorder prediction programs into one view. The Toolkit provides a friendly and intuitive user interface with an online help facility. As a key feature, various tools are interconnected so that the results of one tool can be forwarded to other tools. One could run PSI-BLAST, parse out a multiple alignment of selected hits and send the results to a cluster analysis tool. The Toolkit framework and the tools developed in-house will be packaged and freely available under the GNU Lesser General Public Licence (LGPL). The Toolkit can be accessed at http://toolkit.tuebingen.mpg.de.

  18. Cyber Security Audit and Attack Detection Toolkit

    SciTech Connect

    Peterson, Dale

    2012-05-31

    The goal of this project was to develop cyber security audit and attack detection tools for industrial control systems (ICS). Digital Bond developed and released a tool named Bandolier that audits ICS components commonly used in the energy sector against an optimal security configuration. The Portaledge Project developed a capability for the PI Historian, the most widely used Historian in the energy sector, to aggregate security events and detect cyber attacks.

  19. Measure Up Pressure Down: Provider Toolkit to Improve Hypertension Control.

    PubMed

    Torres, Jennifer

    2016-05-01

    Hypertension is one of the most important risk factors for heart disease, stroke, kidney failure, and diabetes complications. Nearly one in three American adults has high blood pressure, and the cost associated with treating this condition is staggering. The Measure Up Pressure Down: Provider Toolkit to Improve Hypertension Control is a resource developed by the American Medical Group Foundation in partnership with the American Medical Group Association. The goal of this toolkit is to mobilize health care practitioners to work together through team-based approaches to achieve an 80% control rate of high blood pressure among their patient population. The toolkit can be used by health educators, clinic administrators, physicians, students, and other clinic staff as a step-by-step resource for developing the infrastructure needed to better identify and treat individuals with high blood pressure or other chronic conditions.

  20. An epigenetic toolkit allows for diverse genome architectures in eukaryotes.

    PubMed

    Maurer-Alcalá, Xyrus X; Katz, Laura A

    2015-12-01

    Genome architecture varies considerably among eukaryotes in terms of both size and structure (e.g. distribution of sequences within the genome, elimination of DNA during formation of somatic nuclei). The diversity in eukaryotic genome architectures and the dynamic processes are only possible due to the well-developed epigenetic toolkit, which probably existed in the Last Eukaryotic Common Ancestor (LECA). This toolkit may have arisen as a means of navigating the genomic conflict that arose from the expansion of transposable elements within the ancestral eukaryotic genome. This toolkit has been coopted to support the dynamic nature of genomes in lineages across the eukaryotic tree of life. Here we highlight how the changes in genome architecture in diverse eukaryotes are regulated by epigenetic processes, such as DNA elimination, genome rearrangements, and adaptive changes to genome architecture. The ability to epigenetically modify and regulate genomes has contributed greatly to the diversity of eukaryotes observed today.

  1. Validation of Power Output for the WIND Toolkit

    SciTech Connect

    King, J.; Clifton, A.; Hodge, B. M.

    2014-09-01

    Renewable energy integration studies require wind data sets of high quality with realistic representations of the variability, ramping characteristics, and forecast performance for current wind power plants. The Wind Integration National Data Set (WIND) Toolkit is meant to be an update for and expansion of the original data sets created for the weather years from 2004 through 2006 during the Western Wind and Solar Integration Study and the Eastern Wind Integration Study. The WIND Toolkit expands these data sets to include the entire continental United States, increasing the total number of sites represented, and it includes the weather years from 2007 through 2012. In addition, the WIND Toolkit has a finer resolution for both the temporal and geographic dimensions. Three separate data sets will be created: a meteorological data set, a wind power data set, and a forecast data set. This report describes the validation of the wind power data set.

  2. Measure Up Pressure Down: Provider Toolkit to Improve Hypertension Control.

    PubMed

    Torres, Jennifer

    2016-05-01

    Hypertension is one of the most important risk factors for heart disease, stroke, kidney failure, and diabetes complications. Nearly one in three American adults has high blood pressure, and the cost associated with treating this condition is staggering. The Measure Up Pressure Down: Provider Toolkit to Improve Hypertension Control is a resource developed by the American Medical Group Foundation in partnership with the American Medical Group Association. The goal of this toolkit is to mobilize health care practitioners to work together through team-based approaches to achieve an 80% control rate of high blood pressure among their patient population. The toolkit can be used by health educators, clinic administrators, physicians, students, and other clinic staff as a step-by-step resource for developing the infrastructure needed to better identify and treat individuals with high blood pressure or other chronic conditions. PMID:27440782

  3. The PRIDE (Partnership to Improve Diabetes Education) Toolkit

    PubMed Central

    Wolff, Kathleen; Chambers, Laura; Bumol, Stefan; White, Richard O.; Gregory, Becky Pratt; Davis, Dianne; Rothman, Russell L.

    2016-01-01

    Purpose Patients with low literacy, low numeracy, and/or linguistic needs can experience challenges understanding diabetes information and applying concepts to their self-management. The authors designed a toolkit of education materials that are sensitive to patients' literacy and numeracy levels, language preferences, and cultural norms and that encourage shared goal setting to improve diabetes self-management and health outcomes. The Partnership to Improve Diabetes Education (PRIDE) toolkit was developed to facilitate diabetes self-management education and support. Methods The PRIDE toolkit includes a comprehensive set of 30 interactive education modules in English and Spanish to support diabetes self-management activities. The toolkit builds upon the authors' previously validated Diabetes Literacy and Numeracy Education Toolkit (DLNET) by adding a focus on shared goal setting, addressing the needs of Spanish-speaking patients, and including a broader range of diabetes management topics. Each PRIDE module was evaluated using the Suitability Assessment of Materials (SAM) instrument to determine the material's cultural appropriateness and its sensitivity to the needs of patients with low literacy and low numeracy. Reading grade level was also assessed using the Automated Readability Index (ARI), Coleman-Liau, Flesch-Kincaid, Fry, and SMOG formulas. Conclusions The average reading grade level of the materials was 5.3 (SD 1.0), with a mean SAM of 91.2 (SD 5.4). All of the 30 modules received a “superior” score (SAM >70%) when evaluated by 2 independent raters. The PRIDE toolkit modules can be used by all members of a multidisciplinary team to assist patients with low literacy and low numeracy in managing their diabetes. PMID:26647414
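
    The reading-grade-level figures reported above come from standard formulas such as the Automated Readability Index (ARI): 4.71 x (characters/words) + 0.5 x (words/sentences) - 21.43. A minimal sketch on plain text follows; real toolkit scoring likely uses more careful tokenization than this.

```python
# Automated Readability Index on a plain-text string. Characters are
# counted as alphanumerics; sentences are approximated by end punctuation.

def ari(text):
    sentences = max(1, sum(text.count(p) for p in ".!?"))
    words = text.split()
    chars = sum(c.isalnum() for c in text)
    return 4.71 * chars / len(words) + 0.5 * len(words) / sentences - 21.43
```

    Short words and short sentences drive the score down, which is why materials aimed at low-literacy readers target grade levels near the 5.3 average reported for the PRIDE modules.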

  4. Deconstructing the toolkit: creativity and risk in the NHS workforce.

    PubMed

    Allen, Von; Brodzinski, Emma

    2009-12-01

    Deconstructing the Toolkit explores the current desire for toolkits that promise failsafe structures to facilitate creative success. The paper examines this cultural phenomenon within the context of the risk-averse workplace, with particular focus on the NHS. The writers draw on Derrida and deconstructionism to reflect upon the principles of creativity and the possibilities for being creative within the workplace. Through reference to The Extra Mile project facilitated by Open Art, the paper examines the importance of engaging with an aesthetic of creativity and embracing a more holistic approach to the problems and potential of the creative process. PMID:19821031

  5. WIRM: An Open Source Toolkit for Building Biomedical Web Applications

    PubMed Central

    Jakobovits, Rex M.; Rosse, Cornelius; Brinkley, James F.

    2002-01-01

    This article describes an innovative software toolkit that allows the creation of web applications that facilitate the acquisition, integration, and dissemination of multimedia biomedical data over the web, thereby reducing the cost of knowledge sharing. There is a lack of high-level web application development tools suitable for use by researchers, clinicians, and educators who are not skilled programmers. Our Web Interfacing Repository Manager (WIRM) is a software toolkit that reduces the complexity of building custom biomedical web applications. WIRM’s visual modeling tools enable domain experts to describe the structure of their knowledge, from which WIRM automatically generates full-featured, customizable content management systems. PMID:12386108

  6. A clinical research analytics toolkit for cohort study.

    PubMed

    Yu, Yiqin; Zhu, Yu; Sun, Xingzhi; Tao, Ying; Zhang, Shuo; Xu, Linhao; Pan, Yue

    2012-01-01

    This paper presents a clinical informatics toolkit that can assist physicians to conduct cohort studies effectively and efficiently. The toolkit has three key features: 1) support of procedures defined in epidemiology, 2) recommendation of statistical methods in data analysis, and 3) automatic generation of research reports. On one hand, our system can help physicians control research quality by leveraging the integrated knowledge of epidemiology and medical statistics; on the other hand, it can improve productivity by reducing the complexities for physicians during their cohort studies.

  7. FAST TRACK COMMUNICATION: A stable toolkit method in quantum control

    NASA Astrophysics Data System (ADS)

    Belhadj, M.; Salomon, J.; Turinici, G.

    2008-09-01

    Recently the 'toolkit' discretization introduced to accelerate the numerical resolution of the time-dependent Schrödinger equation arising in quantum optimal control problems demonstrated good results on a large range of models. However, when coupling this class of methods with the so-called monotonically convergent algorithms, numerical instabilities affect the convergence of the discretized scheme. We present an adaptation of the 'toolkit' method which preserves the monotonicity of the procedure. The theoretical properties of the new algorithm are illustrated by numerical simulations.

  8. RAVE—a Detector-independent vertex reconstruction toolkit

    NASA Astrophysics Data System (ADS)

    Waltenberger, Wolfgang; Mitaroff, Winfried; Moser, Fabian

    2007-10-01

    A detector-independent toolkit for vertex reconstruction (RAVE) is being developed, along with a standalone framework (VERTIGO) for testing, analyzing and debugging. The core algorithms represent the state of the art for geometric vertex finding and fitting by both linear (Kalman filter) and robust estimation methods. Main design goals are ease of use, flexibility for embedding into existing software frameworks, extensibility, and openness. The implementation is based on modern object-oriented techniques, is coded in C++ with interfaces for Java and Python, and follows an open-source approach. A beta release is available. VERTIGO = "vertex reconstruction toolkit and interface to generic objects".
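    The linear (Kalman-filter) estimation mentioned above reduces, in its simplest form, to an inverse-variance weighted combination of track measurements. The sketch below illustrates that core idea in one dimension; it is illustrative only and uses none of RAVE's actual interfaces:

    ```python
    # Illustrative sketch (not RAVE's API): a 1D weighted least-squares "vertex
    # fit" combining track intercepts, each with its own measurement variance.
    # Each measurement updates the estimate with weight 1/variance, which is the
    # essence of the Kalman-filter update for a static parameter.

    def fit_vertex(intercepts, variances):
        """Return the weighted-mean vertex position and its combined variance."""
        weights = [1.0 / v for v in variances]
        total_w = sum(weights)
        position = sum(w * x for w, x in zip(weights, intercepts)) / total_w
        return position, 1.0 / total_w

    # Three hypothetical track intercepts near x = 1.0 with differing precision.
    pos, var = fit_vertex([0.9, 1.1, 1.0], [0.01, 0.01, 0.04])
    ```

    Each additional track tightens the estimate: the combined variance 1/Σ(1/σᵢ²) is always smaller than the smallest single-track variance.
    
    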

  9. PyCogent: a toolkit for making sense from sequence

    PubMed Central

    Knight, Rob; Maxwell, Peter; Birmingham, Amanda; Carnes, Jason; Caporaso, J Gregory; Easton, Brett C; Eaton, Michael; Hamady, Micah; Lindsay, Helen; Liu, Zongzhi; Lozupone, Catherine; McDonald, Daniel; Robeson, Michael; Sammut, Raymond; Smit, Sandra; Wakefield, Matthew J; Widmann, Jeremy; Wikman, Shandy; Wilson, Stephanie; Ying, Hua; Huttley, Gavin A

    2007-01-01

    We have implemented in Python the COmparative GENomic Toolkit, a fully integrated and thoroughly tested framework for novel probabilistic analyses of biological sequences, devising workflows, and generating publication quality graphics. PyCogent includes connectors to remote databases, built-in generalized probabilistic techniques for working with biological sequences, and controllers for third-party applications. The toolkit takes advantage of parallel architectures and runs on a range of hardware and operating systems, and is available under the General Public License from . PMID:17708774

  10. Incident Management Preparedness and Coordination Toolkit

    SciTech Connect

    Koch, Daniel B.

    2013-04-01

    As with many professions, safety planners and first responders tend to be specialists in certain areas. To be truly useful, tools should be tailored to meet their specific needs. Thus, general software suites aimed at the professional geographic information system (GIS) community might not be the best solution for a first responder with little training in GIS terminology and techniques. On the other hand, commonly used web-based map viewers may not have the capability to be customized for the planning, response, and recovery (PR&R) mission. Data formats should be open and foster easy information flow among local, state, and federal partners. Tools should be free or low-cost to address real-world budget constraints at the local level. They also need to work both with and without a network connection to be robust. The Incident Management Preparedness and Coordination Toolkit (IMPACT) can satisfy many of these needs while working in harmony with established systems at the local, state, and federal levels. The IMPACT software framework, termed the Geospatial Integrated Problem Solving Environment (GIPSE), organizes tasks, tools, and resources for the end user. It uses the concept of software wizards to both customize and extend its functionality. On the Tasks panel are a number of buttons used to initiate various operations. Similar to macros, these task buttons launch scripts that utilize the full functionality of the underlying foundational components such as the SQL spatial database and ORNL-developed map editor. The user is presented with a series of instruction pages which are implemented with HTML for interactivity. On each page are links which initiate specific actions such as creating a map showing various features. Additional tasks may be quickly programmed and added to the panel. The end user can customize the graphical interface to facilitate its use during an emergency. One of the major components of IMPACT is the ORNL Geospatial Viewer (OGV).

  12. SatelliteDL: a Toolkit for Analysis of Heterogeneous Satellite Datasets

    NASA Astrophysics Data System (ADS)

    Galloy, M. D.; Fillmore, D.

    2014-12-01

    SatelliteDL is an IDL toolkit for the analysis of satellite Earth observations from a diverse set of platforms and sensors. The core function of the toolkit is the spatial and temporal alignment of satellite swath and geostationary data. The design features an abstraction layer that allows for easy inclusion of new datasets in a modular way. Our overarching objective is to create utilities that automate the mundane aspects of satellite data analysis, are extensible and maintainable, and do not place limitations on the analysis itself. IDL has a powerful suite of statistical and visualization tools that can be used in conjunction with SatelliteDL. Toward this end we have constructed SatelliteDL to include (1) HTML and LaTeX API document generation, (2) a unit test framework, (3) automatic message and error logs, (4) HTML and LaTeX plot and table generation, and (5) several real world examples with bundled datasets available for download. For ease of use, datasets, variables and optional workflows may be specified in a flexible format configuration file. Configuration statements may specify, for example, a region and date range, and the creation of images, plots and statistical summary tables for a long list of variables. SatelliteDL enforces data provenance; all data should be traceable and reproducible. The output NetCDF file metadata holds a complete history of the original datasets and their transformations, and a method exists to reconstruct a configuration file from this information. Release 0.1.0 is distributed with ingest methods for GOES, MODIS, VIIRS and CERES radiance data (L1) as well as select 2D atmosphere products (L2) such as aerosol and cloud (MODIS and VIIRS) and radiant flux (CERES). Future releases will provide ingest methods for ocean and land surface products, gridded and time averaged datasets (L3 Daily, Monthly and Yearly), and support for 3D products such as temperature and water vapor profiles. Emphasis will be on NPP Sensor, Environmental and…
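    The swath-to-geostationary temporal alignment at the core of the toolkit can be sketched as a nearest-in-time match within a tolerance. The snippet below is a hedged illustration in Python (SatelliteDL itself is written in IDL, and these function names and values are invented for the example):

    ```python
    # Illustrative temporal-alignment sketch: for each swath observation time,
    # find the index of the nearest geostationary scan time, or None when no
    # scan falls within the tolerance.
    import bisect

    def align_times(swath_times, geo_times, tolerance):
        """geo_times must be sorted ascending; returns one index (or None) per swath time."""
        matches = []
        for t in swath_times:
            i = bisect.bisect_left(geo_times, t)
            candidates = [j for j in (i - 1, i) if 0 <= j < len(geo_times)]
            best = min(candidates, key=lambda j: abs(geo_times[j] - t))
            matches.append(best if abs(geo_times[best] - t) <= tolerance else None)
        return matches

    # Geo scans every 15 minutes; swath overpasses at arbitrary times (minutes).
    matches = align_times([7.0, 31.0, 70.0], [0.0, 15.0, 30.0, 45.0], tolerance=10.0)
    ```

    The last overpass (70.0) falls outside the 10-minute tolerance of any scan and is reported as unmatched.
    
    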

  13. Geospatial Toolkits and Resource Maps for Selected Countries from the National Renewable Energy Laboratory (NREL)

    DOE Data Explorer

    NREL developed the Geospatial Toolkit (GsT), a map-based software application that integrates resource data and geographic information systems (GIS) for integrated resource assessment. A variety of agencies within countries, along with global datasets, provided country-specific data. Originally developed in 2005, the Geospatial Toolkit was completely redesigned and re-released in November 2010 to provide a more modern, easier-to-use interface with considerably faster analytical querying capabilities. Toolkits are available for 21 countries and each one can be downloaded separately. The source code for the toolkit is also available. [Taken and edited from http://www.nrel.gov/international/geospatial_toolkits.html]

  14. 'Best' Practices for Aggregating Subset Results from Archived Datasets

    NASA Astrophysics Data System (ADS)

    Baskin, W. E.; Perez, J.

    2013-12-01

    In response to the exponential growth in science data analysis and visualization capabilities, data centers have been developing new delivery mechanisms to package and deliver large volumes of aggregated subsets of archived data. New standards are evolving to help data providers and application programmers deal with the growing needs of the science community. These standards evolve from the best practices gleaned from new products and capabilities. The NASA Atmospheric Science Data Center (ASDC) has developed and deployed production provider-specific search and subset web applications for the CALIPSO, CERES, TES, and MOPITT missions. This presentation explores several use cases that leverage aggregated subset results and examines the standards and formats ASDC developers applied to the delivered files, as well as the implementation strategies for subsetting and processing the aggregated products. The following topics will be addressed:
    - Applications of NetCDF CF conventions to aggregated level 2 satellite subsets
    - Data-provider-specific format requirements vs. generalized standards
    - Organization of the file structure of aggregated NetCDF subset output
    - Global attributes of individual subsetted files vs. aggregated results
    - Specific applications and frameworks used for subsetting and delivering derivative data files
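    One recurring design question here, global attributes of individual subsetted files versus aggregated results, can be illustrated with a small sketch: attributes that agree across all input files are promoted to the aggregate, while conflicting ones are retained per file to preserve provenance. This is a hypothetical illustration, not ASDC's implementation:

    ```python
    # Illustrative attribute-merging sketch for aggregated subset output:
    # attributes with a single value across all granules become aggregate-level
    # attributes; differing values are kept as an ordered per-file history.

    def merge_attributes(per_file_attrs):
        keys = set().union(*per_file_attrs)
        common, history = {}, {}
        for k in sorted(keys):
            values = [a.get(k) for a in per_file_attrs]
            if len(set(values)) == 1:
                common[k] = values[0]          # identical everywhere: promote
            else:
                history[k] = values            # differs: keep per-file record
        return common, history

    common, history = merge_attributes([
        {"mission": "CALIPSO", "granule": "G1"},
        {"mission": "CALIPSO", "granule": "G2"},
    ])
    ```

    Keeping the differing values as an ordered list mirrors the provenance goal: the aggregate remains traceable back to each contributing granule.
    
    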

  15. Using AASL's "Health and Wellness" and "Crisis Toolkits"

    ERIC Educational Resources Information Center

    Logan, Debra Kay

    2009-01-01

    Whether a school library program is the picture of good health in a state that mandates a professionally staffed library media center in every building or is suffering in a low-wealth district that is facing drastic cuts, the recently launched toolkits by the American Association of School Librarians (AASL) are stocked with useful strategies and…

  16. Policy to Performance Toolkit: Transitioning Adults to Opportunity

    ERIC Educational Resources Information Center

    Alamprese, Judith A.; Limardo, Chrys

    2012-01-01

    The "Policy to Performance Toolkit" is designed to provide state adult education staff and key stakeholders with guidance and tools to use in developing, implementing, and monitoring state policies and their associated practices that support an effective state adult basic education (ABE) to postsecondary education and training transition…

  17. New MISR Toolkit Version 1.4.1 Available

    Atmospheric Science Data Center

    2014-09-03

    ... of the MISR Toolkit (MTK) is now available from The Open Channel Foundation. The MTK is a simplified programming ... HDF-EOS to access MISR Level 1B2, Level 2, and ancillary data products. It also handles the MISR conventional format. The interface ...

  18. Cubit Mesh Generation Toolkit V11.1

    2009-03-25

    CUBIT prepares models to be used in computer-based simulation of real-world events. CUBIT is a full-featured software toolkit for robust generation of two- and three-dimensional finite element meshes (grids) and geometry preparation. Its main goal is to reduce the time to generate meshes, particularly large hex meshes of complicated, interlocking assemblies.

  19. The Data Toolkit: Ten Tools for Supporting School Improvement

    ERIC Educational Resources Information Center

    Hess, Robert T.; Robbins, Pam

    2012-01-01

    Using data for school improvement is a key goal of Race to the Top, and now is the time to make data-driven school improvement a priority. However, many educators are drowning in data. Boost your professional learning community's ability to translate data into action with this new book from Pam Robbins and Robert T. Hess. "The Data Toolkit"…

  20. The MPI Bioinformatics Toolkit for protein sequence analysis

    PubMed Central

    Biegert, Andreas; Mayer, Christian; Remmert, Michael; Söding, Johannes; Lupas, Andrei N.

    2006-01-01

    The MPI Bioinformatics Toolkit is an interactive web service which offers access to a great variety of public and in-house bioinformatics tools. They are grouped into different sections that support sequence searches, multiple alignment, secondary and tertiary structure prediction and classification. Several public tools are offered in customized versions that extend their functionality. For example, PSI-BLAST can be run against regularly updated standard databases, customized user databases or selectable sets of genomes. Another tool, Quick2D, integrates the results of various secondary structure, transmembrane and disorder prediction programs into one view. The Toolkit provides a friendly and intuitive user interface with an online help facility. As a key feature, various tools are interconnected so that the results of one tool can be forwarded to other tools. One could run PSI-BLAST, parse out a multiple alignment of selected hits and send the results to a cluster analysis tool. The Toolkit framework and the tools developed in-house will be packaged and freely available under the GNU Lesser General Public Licence (LGPL). The Toolkit can be accessed at . PMID:16845021

  1. Building a "Motivation Toolkit" for Teaching Information Literacy.

    ERIC Educational Resources Information Center

    Moyer, Susan L.; Small, Ruth V.

    2001-01-01

    Discusses the need to motivate students to make information literacy programs successful and demonstrates how a middle school library media specialist used Small and Arnone's Motivation Overlay for Information Skills Instruction to build a set of customized toolkits to improve student research that includes the Big6[TM] approach to library and…

  2. A Toolkit to Implement Graduate Attributes in Geography Curricula

    ERIC Educational Resources Information Center

    Spronken-Smith, Rachel; McLean, Angela; Smith, Nell; Bond, Carol; Jenkins, Martin; Marshall, Stephen; Frielick, Stanley

    2016-01-01

    This article uses findings from a project on engagement with graduate outcomes across higher education institutions in New Zealand to produce a toolkit for implementing graduate attributes in geography curricula. Key facets include strong leadership; academic developers to facilitate conversations about graduate attributes and teaching towards…

  3. Manufacturer’s CORBA Interface Testing Toolkit: Overview

    PubMed Central

    Flater, David

    1999-01-01

    The Manufacturer’s CORBA Interface Testing Toolkit (MCITT) is a software package that supports testing of CORBA components and interfaces. It simplifies the testing of complex distributed systems by producing “dummy components” from Interface Testing Language and Component Interaction Specifications and by automating some error-prone programming tasks. It also provides special commands to support conformance, performance, and stress testing.

  4. Using Toolkits to Achieve STEM Enterprise Learning Outcomes

    ERIC Educational Resources Information Center

    Watts, Carys A.; Wray, Katie

    2012-01-01

    Purpose: The purpose of this paper is to evaluate the effectiveness of using several commercial tools in science, technology, engineering and maths (STEM) subjects for enterprise education at Newcastle University, UK. Design/methodology/approach: The paper provides an overview of existing toolkit use in higher education, before reviewing where and…

  5. The Archivists' Toolkit: Another Step toward Streamlined Archival Processing

    ERIC Educational Resources Information Center

    Westbrook, Bradley D.; Mandell, Lee; Shepherd, Kelcy; Stevens, Brian; Varghese, Jason

    2006-01-01

    The Archivists' Toolkit is a software application currently in development and designed to support the creation and management of archival information. This article summarizes the development of the application, including some of the problems the application is designed to resolve. Primary emphasis is placed on describing the application's…

  6. Capturing and Using Knowledge about the Use of Visualization Toolkits

    SciTech Connect

    Del Rio, Nicholas R.; Pinheiro da Silva, Paulo

    2012-11-02

    When constructing visualization pipelines using toolkits such as Visualization Toolkit (VTK) and Generic Mapping Tools (GMT), developers must understand (1) what toolkit operators will transform their data from its raw state to some required view state and (2) what viewers are available to present the generated view. Traditionally, developers learn about how to construct visualization pipelines by reading documentation and inspecting code examples, which can be costly in terms of the time and effort expended. Once an initial pipeline is constructed, developers may still have to undergo a trial and error process before a satisfactory visualization is generated. This paper presents the Visualization Knowledge Project (VisKo) that is built on a knowledge base of visualization toolkit operators and how they can be piped together to form visualization pipelines. Developers may now rely on VisKo to guide them when constructing visualization pipelines and in some cases, when VisKo has complete knowledge about some set of operators (i.e., sequencing and parameter settings), automatically generate a fully functional visualization pipeline.
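    VisKo's pipeline construction can be pictured as a search over a graph whose edges are toolkit operators: each operator transforms one data state into another, and a path from the raw state to a viewable state is a pipeline. The sketch below uses invented operator names and a plain breadth-first search; it illustrates the idea rather than VisKo's actual knowledge base:

    ```python
    # Illustrative pipeline search: operators are edges between data states, and
    # BFS recovers a shortest operator sequence from a source state to a target.
    from collections import deque

    OPERATORS = {                  # operator name -> (input state, output state)
        "contour": ("gridded", "isolines"),
        "rasterize": ("isolines", "image"),
        "interpolate": ("points", "gridded"),
    }

    def find_pipeline(source, target):
        queue = deque([(source, [])])
        seen = {source}
        while queue:
            state, ops = queue.popleft()
            if state == target:
                return ops
            for name, (src, dst) in OPERATORS.items():
                if src == state and dst not in seen:
                    seen.add(dst)
                    queue.append((dst, ops + [name]))
        return None                # no pipeline reaches the target state

    pipeline = find_pipeline("points", "image")
    ```

    With a richer knowledge base the same search would also carry operator parameter settings, which is what lets a system like VisKo emit a fully functional pipeline when its knowledge of the operators is complete.
    
    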

  7. Educating Globally Competent Citizens: A Toolkit. Second Edition

    ERIC Educational Resources Information Center

    Elliott-Gower, Steven; Falk, Dennis R.; Shapiro, Martin

    2012-01-01

    Educating Globally Competent Citizens, a product of AASCU's American Democracy Project and its Global Engagement Initiative, introduces readers to a set of global challenges facing society based on the Center for Strategic and International Studies' 7 Revolutions. The toolkit is designed to aid faculty in incorporating global challenges into new…

  8. Reproductive Health Assessment After Disaster: introduction to the RHAD toolkit.

    PubMed

    Zotti, Marianne E; Williams, Amy M

    2011-08-01

    This article reviews associations between disaster and the reproductive health of women, describes how Hurricane Katrina influenced our understanding about postdisaster reproductive health needs, and introduces a new toolkit that can help health departments assess postdisaster health needs among women of reproductive age.

  9. Simulation toolkit with CMOS detector in the framework of hadrontherapy

    NASA Astrophysics Data System (ADS)

    Rescigno, R.; Finck, Ch.; Juliani, D.; Baudot, J.; Dauvergne, D.; Dedes, G.; Krimmer, J.; Ray, C.; Reithinger, V.; Rousseau, M.; Testa, E.; Winter, M.

    2014-03-01

    Proton imaging can be seen as a powerful technique for on-line monitoring of ion range during carbon ion therapy irradiation. The proton detection technique uses a set of CMOS sensor planes as a three-dimensional tracking system. A simulation toolkit based on GEANT4 and ROOT is presented, including the detector response and the reconstruction algorithm.

  10. A Beginning Rural Principal's Toolkit: A Guide for Success

    ERIC Educational Resources Information Center

    Ashton, Brian; Duncan, Heather E.

    2012-01-01

    The purpose of this article is to explore both the challenges and skills needed to effectively assume a leadership position and thus to create an entry plan or "toolkit" for a new rural school leader. The entry plan acts as a guide beginning principals may use to navigate the unavoidable confusion that comes with leadership. It also assists…

  11. Dataset of aggregate producers in New Mexico

    USGS Publications Warehouse

    Orris, Greta J.

    2000-01-01

    This report presents data, including latitude and longitude, for aggregate sites in New Mexico that were believed to be active in the period 1997-1999. The data are presented in paper form in Part A of this report and as Microsoft Excel 97 and Data Interchange Format (DIF) files in Part B. The work was undertaken as part of the effort to update information for the National Atlas. This compilation includes data from: the files of U.S. Geological Survey (USGS); company contacts; the New Mexico Bureau of Mines and Mineral Resources, New Mexico Bureau of Mine Inspection, and the Mining and Minerals Division of the New Mexico Energy, Minerals and Natural Resources Department (Hatton and others, 1998); the Bureau of Land Management Information; and direct communications with some of the aggregate operators. Additional information on most of the sites is available in Hatton and others (1998).

  12. Thermodynamics of Protein Aggregation

    NASA Astrophysics Data System (ADS)

    Osborne, Kenneth L.; Barz, Bogdan; Bachmann, Michael; Strodel, Birgit

    Amyloid protein aggregation characterizes many neurodegenerative disorders, including Alzheimer's, Parkinson's, and Creutzfeldt-Jakob disease. Evidence suggests that amyloid aggregates may share similar aggregation pathways, implying simulation of full-length amyloid proteins is not necessary for understanding amyloid formation. In this study we simulate GNNQQNY, the N-terminal prion-determining domain of the yeast protein Sup35, to investigate the thermodynamics of structural transitions during aggregation. We use a coarse-grained model with replica-exchange molecular dynamics to investigate the association of 3-, 6-, and 12-chain GNNQQNY systems and we determine the aggregation pathway by studying aggregation states of GNNQQNY. We find that the aggregation of the hydrophilic GNNQQNY sequence is mainly driven by H-bond formation, leading to the formation of β-sheets from the very beginning of the assembly process. Condensation (aggregation) and ordering take place simultaneously, which is underpinned by the occurrence of a single heat capacity peak only.
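    The heat-capacity analysis referred to above rests on the standard fluctuation formula C(T) = (⟨E²⟩ − ⟨E⟩²)/(k_B T²); a single peak in C(T) is the signature of one combined condensation-and-ordering transition. A minimal sketch of that computation from sampled energies (illustrative values, not the study's data):

    ```python
    # Illustrative heat-capacity calculation from canonical energy samples at
    # fixed temperature T, using C = (<E^2> - <E>^2) / (k_B * T^2).

    def heat_capacity(energies, temperature, k_b=1.0):
        n = len(energies)
        mean_e = sum(energies) / n
        mean_e2 = sum(e * e for e in energies) / n
        return (mean_e2 - mean_e ** 2) / (k_b * temperature ** 2)

    # Hypothetical reduced-unit energy samples from one replica at T = 1.0.
    c = heat_capacity([-10.0, -9.0, -11.0, -10.0], temperature=1.0)
    ```

    Scanning this quantity over the replica temperatures and locating its maxima is how the number of distinct transitions is read off.
    
    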

  13. Dissemination of Earth Remote Sensing Data for Use in the NOAA/NWS Damage Assessment Toolkit

    NASA Technical Reports Server (NTRS)

    Molthan, Andrew; Burks, Jason; Camp, Parks; McGrath, Kevin; Bell, Jordan

    2015-01-01

    The National Weather Service has developed the Damage Assessment Toolkit (DAT), an application for smartphones and tablets that allows for the collection, geolocation, and aggregation of various damage indicators that are collected during storm surveys. The DAT supports the often labor-intensive process where meteorologists venture into the storm-affected area, allowing them to acquire geotagged photos of the observed damage while also assigning estimated EF-scale categories based upon their observations. Once the data are collected, the DAT infrastructure aggregates the observations into a server that allows other meteorologists to perform quality control and other analysis steps before completing their survey and making the resulting data available to the public. In addition to in-person observations, Earth remote sensing from operational, polar-orbiting satellites can support the damage assessment process by identifying portions of damage tracks that may be missed due to road limitations, access to private property, or time constraints. Products resulting from change detection techniques can identify damage to vegetation and the land surface, aiding in the survey process. In addition, higher-resolution commercial imagery can corroborate ground-based surveys. As part of an ongoing collaboration, NASA and NOAA are working to integrate near real-time Earth remote sensing observations into the NOAA/NWS Damage Assessment Toolkit. This presentation will highlight recent developments in a streamlined approach for disseminating Earth remote sensing data via web mapping services and a new menu interface that has been integrated within the DAT. A review of current and future products will be provided, including products derived from MODIS and VIIRS for preliminary track identification, along with conduits for higher-resolution Landsat, ASTER, and commercial imagery as they become available. In addition to tornado damage…
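    A change-detection product of the kind described can be reduced, at its simplest, to thresholding the drop in a vegetation index between pre- and post-event composites. This is an illustrative sketch, not the actual NASA/NOAA product algorithm, and the index values are made up:

    ```python
    # Illustrative change detection: flag pixels whose vegetation index (e.g.
    # NDVI) dropped by more than a threshold between pre- and post-storm scenes.

    def damage_mask(ndvi_before, ndvi_after, threshold=0.2):
        """Return True for pixels with an index drop exceeding the threshold."""
        return [(b - a) > threshold for b, a in zip(ndvi_before, ndvi_after)]

    # Three hypothetical pixels; only the first shows a large vegetation loss.
    mask = damage_mask([0.8, 0.7, 0.6], [0.3, 0.65, 0.55])
    ```

    In a real product the threshold would be tuned per land-cover type, and the mask would then be intersected with surveyed track segments.
    
    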

  14. Developing Mixed Reality Educational Applications: The Virtual Touch Toolkit.

    PubMed

    Mateu, Juan; Lasala, María José; Alamán, Xavier

    2015-08-31

    In this paper, we present Virtual Touch, a toolkit that allows the development of educational activities through a mixed reality environment such that, using various tangible elements, the interconnection of a virtual world with the real world is enabled. The main goal of Virtual Touch is to facilitate the installation, configuration and programming of different types of technologies, abstracting the creator of educational applications from the technical details involving the use of tangible interfaces and virtual worlds. Therefore, it is specially designed to enable teachers to themselves create educational activities for their students in a simple way, taking into account that teachers generally lack advanced knowledge in computer programming and electronics. The toolkit has been used to develop various educational applications that have been tested in two secondary education high schools in Spain.

  15. A toolkit for epithermal neutron beam characterisation in BNCT.

    PubMed

    Auterinen, Iiro; Serén, Tom; Uusi-Simola, Jouni; Kosunen, Antti; Savolainen, Sauli

    2004-01-01

    Methods for dosimetry of epithermal neutron beams used in boron neutron capture therapy (BNCT) have been developed and utilised within the Finnish BNCT project as well as within a European project for a code of practise for the dosimetry of BNCT. One outcome has been a travelling toolkit for BNCT dosimetry. It consists of activation detectors and ionisation chambers. The free-beam neutron spectrum is measured with a set of activation foils of different isotopes irradiated both in a Cd-capsule and without it. Neutron flux (thermal and epithermal) distribution in phantoms is measured using activation of Mn and Au foils, and Cu wire. Ionisation chamber (IC) measurements are performed both in-free-beam and in-phantom for determination of the neutron and gamma dose components. This toolkit has also been used at other BNCT facilities in Europe, the USA, Argentina and Japan.
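    The bare versus Cd-covered foil irradiations described above support the classic cadmium-difference method: the bare foil responds to thermal plus epithermal neutrons, the Cd-covered foil to the epithermal component only, so the thermal activation rate is approximately their difference. A minimal sketch with illustrative numbers (a real evaluation involves cross sections, self-shielding, and a Cd correction factor):

    ```python
    # Illustrative cadmium-difference calculation: the thermal component of the
    # activation rate is the bare-foil rate minus the Cd-covered (epithermal)
    # rate, optionally scaled by a Cd correction factor near 1.

    def thermal_component(bare_rate, cd_covered_rate, cd_correction=1.0):
        """Thermal activation rate (same units as the inputs)."""
        return bare_rate - cd_correction * cd_covered_rate

    # Hypothetical saturated activities in arbitrary units.
    thermal = thermal_component(bare_rate=1200.0, cd_covered_rate=300.0)
    ```

    The ratio of bare to Cd-covered rates (the cadmium ratio, 4.0 here) is itself a common beam-quality indicator for epithermal beams.
    
    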

  16. Developing Mixed Reality Educational Applications: The Virtual Touch Toolkit

    PubMed Central

    Mateu, Juan; Lasala, María José; Alamán, Xavier

    2015-01-01

    In this paper, we present Virtual Touch, a toolkit that allows the development of educational activities through a mixed reality environment such that, using various tangible elements, the interconnection of a virtual world with the real world is enabled. The main goal of Virtual Touch is to facilitate the installation, configuration and programming of different types of technologies, abstracting the creator of educational applications from the technical details involving the use of tangible interfaces and virtual worlds. Therefore, it is specially designed to enable teachers to themselves create educational activities for their students in a simple way, taking into account that teachers generally lack advanced knowledge in computer programming and electronics. The toolkit has been used to develop various educational applications that have been tested in two secondary education high schools in Spain. PMID:26334275

  17. Pervasive Collaborative Computing Environment Jabber Toolkit

    2004-05-15

    PCCE Project background: Our experience in building distributed collaboratories has shown us that there is a growing need for simple, non-intrusive, and flexible ways to stay in touch and work together. Towards this goal we are developing a Pervasive Collaborative Computing Environment (PCCE) within which participants can rendezvous and interact with each other. The PCCE aims to support continuous or ad hoc collaboration, target daily tasks and base connectivity, be easy to use and install across multiple platforms, leverage off of existing components when possible, use standards-based components, and leverage off of Grid services (e.g., security and directory services). A key concept for this work is "incremental trust", which allows the system's "trust" of a given user to change dynamically. PCCE Jabber client software: This leverages Jabber, an open Instant Messaging (IM) protocol, and the related Internet Engineering Task Force (IETF) standards "XMPP" and "XMPP-IM" to allow collaborating parties to chat either one-on-one or in "chat rooms". Standard Jabber clients will work within this framework, but the software will also include extensions to a (multi-platform) GUI client (Gaim) for X.509-based security, search, and incremental trust. This software also includes Web interfaces for managing user registration to a Jabber server. PCCE Jabber server software: Extensions to the code, database, and configuration files for the dominant open-source Jabber server, "jabberd". Extensions for search, X.509 security, and incremental trust. Note that the jabberd software is not included as part of this software.

  18. Needs assessment: blueprint for a nurse graduate orientation employer toolkit.

    PubMed

    Cylke, Katherine

    2012-01-01

    Southern Nevada nurse employers are resistant to hiring new graduate nurses (NGNs) because of their difficulties in making the transition into the workplace. At the same time, employers consider nurse residencies cost-prohibitive. Therefore, an alternative strategy was developed to assist employers with increasing the effectiveness of existing NGN orientation programs. A needs assessment of NGNs, employers, and nursing educators was completed, and the results were used to develop a toolkit for employers.

  19. A medical imaging and visualization toolkit in Java.

    PubMed

    Huang, Su; Baimouratov, Rafail; Xiao, Pengdong; Ananthasubramaniam, Anand; Nowinski, Wieslaw L

    2006-03-01

    Medical imaging research and clinical applications usually require combination and integration of various techniques ranging from image processing and analysis to realistic visualization to user-friendly interaction. Researchers with different backgrounds coming from diverse areas have been using numerous types of hardware, software, and environments to obtain their results. We also observe that students often build their tools from scratch resulting in redundant work. A generic and flexible medical imaging and visualization toolkit would be helpful in medical research and educational institutes to reduce redundant development work and hence increase research efficiency. This paper presents our experience in developing a Medical Imaging and Visualization Toolkit (BIL-kit) that is a set of comprehensive libraries as well as a number of interactive tools. The BIL-kit covers a wide range of fundamental functions from image conversion and transformation, image segmentation, and analysis to geometric model generation and manipulation, all the way up to 3D visualization and interactive simulation. The toolkit design and implementation emphasize the reusability and flexibility. BIL-kit is implemented in the Java language so that it works in hybrid and dynamic research and educational environments. This also allows the toolkit to extend its usage for the development of Web-based applications. Several BIL-kit-based tools and applications are presented including image converter, image processor, general anatomy model simulator, vascular modeling environment, and volume viewer. BIL-kit is a suitable platform for researchers and students to develop visualization and simulation prototypes, and it can also be used for the development of clinical applications.

  20. Needs assessment: blueprint for a nurse graduate orientation employer toolkit.

    PubMed

    Cylke, Katherine

    2012-01-01

    Southern Nevada nurse employers are resistant to hiring new graduate nurses (NGNs) because of their difficulties in making the transition into the workplace. At the same time, employers consider nurse residencies cost-prohibitive. Therefore, an alternative strategy was developed to assist employers with increasing the effectiveness of existing NGN orientation programs. A needs assessment of NGNs, employers, and nursing educators was completed, and the results were used to develop a toolkit for employers. PMID:22449877

  1. Risk of Resource Failure and Toolkit Variation in Small-Scale Farmers and Herders

    PubMed Central

    Collard, Mark; Ruttle, April; Buchanan, Briggs; O’Brien, Michael J.

    2012-01-01

    Recent work suggests that global variation in toolkit structure among hunter-gatherers is driven by risk of resource failure such that as risk of resource failure increases, toolkits become more diverse and complex. Here we report a study in which we investigated whether the toolkits of small-scale farmers and herders are influenced by risk of resource failure in the same way. In the study, we applied simple linear and multiple regression analysis to data from 45 small-scale food-producing groups to test the risk hypothesis. Our results were not consistent with the hypothesis; none of the risk variables we examined had a significant impact on toolkit diversity or on toolkit complexity. It appears, therefore, that the drivers of toolkit structure differ between hunter-gatherers and small-scale food-producers. PMID:22844421

  2. Guide to Using the WIND Toolkit Validation Code

    SciTech Connect

    Lieberman-Cribbin, W.; Draxl, C.; Clifton, A.

    2014-12-01

    In response to the U.S. Department of Energy's goal of using 20% wind energy by 2030, the Wind Integration National Dataset (WIND) Toolkit was created to provide information on wind speed, wind direction, temperature, surface air pressure, and air density at more than 126,000 locations across the United States from 2007 to 2013. The numerical weather prediction model output, gridded at a 2-km spatial and 5-minute temporal resolution, was further converted to detail the wind power production time series of existing and potential wind facility sites. For users of the dataset it is important that the information presented in the WIND Toolkit is accurate and that errors are known, so that corrective steps can be taken. Therefore, we provide validation code written in R that will be made public to provide users with tools to validate data for their own locations. Validation is based on statistical analyses of wind speed, using error metrics such as bias, root-mean-square error, centered root-mean-square error, mean absolute error, and percent error. Plots of diurnal cycles, annual cycles, wind roses, histograms of wind speed, and quantile-quantile plots are created to visualize how well observational data compare with model data. Ideally, validation will confirm beneficial locations for utilizing wind energy and encourage regional wind integration studies using the WIND Toolkit.
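The error metrics named in this abstract are standard statistics and straightforward to reproduce. The following is a minimal illustrative sketch in Python (the toolkit's own validation code is written in R, and the function and variable names here are assumptions, not the toolkit's API):

```python
import math

def validation_metrics(obs, mod):
    """Wind-speed error metrics: bias, RMSE, centered RMSE, MAE, and
    percent error. obs and mod are equal-length sequences of observed
    and modeled wind speeds."""
    n = len(obs)
    err = [m - o for o, m in zip(obs, mod)]
    bias = sum(err) / n                                        # mean error
    rmse = math.sqrt(sum(e * e for e in err) / n)              # root-mean-square error
    crmse = math.sqrt(sum((e - bias) ** 2 for e in err) / n)   # RMSE with bias removed
    mae = sum(abs(e) for e in err) / n                         # mean absolute error
    pct = 100.0 * bias / (sum(obs) / n)                        # percent error vs. observed mean
    return {"bias": bias, "rmse": rmse, "crmse": crmse, "mae": mae, "percent": pct}
```

The centered RMSE subtracts the bias before squaring, so it isolates random scatter from any systematic offset between model and observations.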

  3. Matlab based Toolkits used to Interface with Optical Design Software for NASA's James Webb Space Telescope

    NASA Technical Reports Server (NTRS)

    Howard, Joseph

    2007-01-01

    The viewgraph presentation provides an introduction to the James Webb Space Telescope (JWST). The first part provides a brief overview of Matlab toolkits including CodeV, OSLO, and Zemax Toolkits. The toolkit overview examines purpose, layout, how Matlab gets data from CodeV, function layout, and using cvHELP. The second part provides examples of use with JWST, including wavefront sensitivities and alignment simulations.

  4. The GBIF Integrated Publishing Toolkit: Facilitating the Efficient Publishing of Biodiversity Data on the Internet

    PubMed Central

    Robertson, Tim; Döring, Markus; Guralnick, Robert; Bloom, David; Wieczorek, John; Braak, Kyle; Otegui, Javier; Russell, Laura; Desmet, Peter

    2014-01-01

    The planet is experiencing an ongoing global biodiversity crisis. Measuring the magnitude and rate of change more effectively requires access to organized, easily discoverable, and digitally-formatted biodiversity data, both legacy and new, from across the globe. Assembling this coherent digital representation of biodiversity requires the integration of data that have historically been analog, dispersed, and heterogeneous. The Integrated Publishing Toolkit (IPT) is a software package developed to support biodiversity dataset publication in a common format. The IPT’s two primary functions are to 1) encode existing species occurrence datasets and checklists, such as records from natural history collections or observations, in the Darwin Core standard to enhance interoperability of data, and 2) publish and archive data and metadata for broad use in a Darwin Core Archive, a set of files following a standard format. Here we discuss the key need for the IPT, how it has developed in response to community input, and how it continues to evolve to streamline and enhance the interoperability, discoverability, and mobilization of new data types beyond basic Darwin Core records. We close with a discussion of how the IPT has impacted the biodiversity research community, how it enhances data publishing in more traditional journal venues, new features implemented in the latest version of the IPT, and future plans for more enhancements. PMID:25099149

  5. PAT: a protein analysis toolkit for integrated biocomputing on the web

    PubMed Central

    Gracy, Jérôme; Chiche, Laurent

    2005-01-01

    PAT, for Protein Analysis Toolkit, is an integrated biocomputing server. The main goal of its design was to facilitate the combination of different processing tools for complex protein analyses and to simplify the automation of repetitive tasks. The PAT server provides a standardized web interface to a wide range of protein analysis tools. It is designed as a streamlined analysis environment that implements many features which strongly simplify studies dealing with protein sequences and structures and improve productivity. PAT is able to read and write data in many bioinformatics formats and to create any desired pipeline by seamlessly sending the output of a tool to the input of another tool. PAT can retrieve protein entries from identifier-based queries by using pre-computed database indexes. Users can easily formulate complex queries combining different analysis tools with few mouse clicks, or via a dedicated macro language, and a web session manager provides direct access to any temporary file generated during the user session. PAT is freely accessible on the Internet at . PMID:15980554

  6. The GBIF integrated publishing toolkit: facilitating the efficient publishing of biodiversity data on the internet.

    PubMed

    Robertson, Tim; Döring, Markus; Guralnick, Robert; Bloom, David; Wieczorek, John; Braak, Kyle; Otegui, Javier; Russell, Laura; Desmet, Peter

    2014-01-01

    The planet is experiencing an ongoing global biodiversity crisis. Measuring the magnitude and rate of change more effectively requires access to organized, easily discoverable, and digitally-formatted biodiversity data, both legacy and new, from across the globe. Assembling this coherent digital representation of biodiversity requires the integration of data that have historically been analog, dispersed, and heterogeneous. The Integrated Publishing Toolkit (IPT) is a software package developed to support biodiversity dataset publication in a common format. The IPT's two primary functions are to 1) encode existing species occurrence datasets and checklists, such as records from natural history collections or observations, in the Darwin Core standard to enhance interoperability of data, and 2) publish and archive data and metadata for broad use in a Darwin Core Archive, a set of files following a standard format. Here we discuss the key need for the IPT, how it has developed in response to community input, and how it continues to evolve to streamline and enhance the interoperability, discoverability, and mobilization of new data types beyond basic Darwin Core records. We close with a discussion of how the IPT has impacted the biodiversity research community, how it enhances data publishing in more traditional journal venues, new features implemented in the latest version of the IPT, and future plans for more enhancements.

  8. molSimplify: A toolkit for automating discovery in inorganic chemistry.

    PubMed

    Ioannidis, Efthymios I; Gani, Terry Z H; Kulik, Heather J

    2016-08-15

    We present an automated, open source toolkit for the first-principles screening and discovery of new inorganic molecules and intermolecular complexes. Challenges remain in the automatic generation of candidate inorganic molecule structures due to the high variability in coordination and bonding, which we overcome through a divide-and-conquer tactic that flexibly combines force-field preoptimization of organic fragments with alignment to first-principles-trained metal-ligand distances. Exploration of chemical space is enabled through random generation of ligands and intermolecular complexes from large chemical databases. We validate the generated structures with the root mean squared (RMS) gradients evaluated from density functional theory (DFT), which are around 0.02 Ha/au across a large 150 molecule test set. Comparison of molSimplify results to full optimization with the universal force field reveals that RMS DFT gradients are improved by 40%. Seamless generation of input files, preparation and execution of electronic structure calculations, and post-processing for each generated structure aids interpretation of underlying chemical and energetic trends. © 2016 Wiley Periodicals, Inc. PMID:27364957

  9. FASTAptamer: A Bioinformatic Toolkit for High-throughput Sequence Analysis of Combinatorial Selections

    PubMed Central

    Alam, Khalid K; Chang, Jonathan L; Burke, Donald H

    2015-01-01

    High-throughput sequence (HTS) analysis of combinatorial selection populations accelerates lead discovery and optimization and offers dynamic insight into selection processes. An underlying principle is that selection enriches high-fitness sequences as a fraction of the population, whereas low-fitness sequences are depleted. HTS analysis readily provides the requisite numerical information by tracking the evolutionary trajectory of individual sequences in response to selection pressures. Unlike genomic data, for which a number of software solutions exist, user-friendly tools are not readily available for the combinatorial selections field, leading many users to create custom software. FASTAptamer was designed to address the sequence-level analysis needs of the field. The open source FASTAptamer toolkit counts, normalizes and ranks read counts in a FASTQ file, compares populations for sequence distribution, generates clusters of sequence families, calculates fold-enrichment of sequences throughout the course of a selection and searches for degenerate sequence motifs. While originally designed for aptamer selections, FASTAptamer can be applied to any selection strategy that can utilize next-generation DNA sequencing, such as ribozyme or deoxyribozyme selections, in vivo mutagenesis and various surface display technologies (peptide, antibody fragment, mRNA, etc.). FASTAptamer software, sample data and a user's guide are available for download at http://burkelab.missouri.edu/fastaptamer.html. PMID:25734917
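The count/normalize/rank bookkeeping described in this abstract can be sketched in a few lines. The following is an illustrative Python sketch under the usual FASTQ convention (the sequence is line 2 of each 4-line record); it is not FASTAptamer's own code, and its function names and output layout are assumptions:

```python
from collections import Counter

def count_normalize_rank(fastq_text):
    """Count unique sequence reads in FASTQ text, normalize counts to
    reads per million (RPM), and rank sequences by abundance.
    Returns a list of (rank, sequence, count, rpm) tuples."""
    lines = fastq_text.strip().splitlines()
    seqs = [lines[i] for i in range(1, len(lines), 4)]  # line 2 of each record
    counts = Counter(seqs)
    total = sum(counts.values())
    return [
        (rank, seq, n, 1e6 * n / total)
        for rank, (seq, n) in enumerate(counts.most_common(), start=1)
    ]

def fold_enrichment(rpm_after, rpm_before):
    """Fold enrichment of a sequence between two selection rounds,
    computed on RPM-normalized abundances."""
    return rpm_after / rpm_before
```

Normalizing to RPM before computing fold enrichment is what makes counts comparable across selection rounds that were sequenced to different depths.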

  10. ApiNATOMY: a novel toolkit for visualizing multiscale anatomy schematics with phenotype-related information.

    PubMed

    de Bono, Bernard; Grenon, Pierre; Sammut, Stephen John

    2012-05-01

    A significant proportion of biomedical resources carries information that cross references to anatomical structures across multiple scales. To improve the visualization of such resources in their anatomical context, we developed an automated methodology that produces anatomy schematics in a consistent manner, and provides for the overlay of anatomy-related resource information onto the same diagram. This methodology, called ApiNATOMY, draws upon the topology of ontology graphs to automatically lay out treemaps representing body parts as well as semantic metadata linking to such ontologies. More generally, ApiNATOMY treemaps provide an efficient and manageable way to visualize large biomedical ontologies in a meaningful and consistent manner. In the anatomy domain, such treemaps will allow epidemiologists, clinicians, and biomedical scientists to review, and interact with, anatomically aggregated heterogeneous data and model resources. Such an approach supports the visual identification of functional relations between anatomically colocalized resources that may not be immediately amenable to automation by ontology-based inferencing. We also describe the application of ApiNATOMY schematics to integrate, and add value to, human phenotype-related information; results are found at http://apinatomy.org. The long-term goal for the ApiNATOMY toolkit is to support clinical and scientific graphical user interfaces and dashboards for biomedical resource management and data analytics.

  11. Aggregations in Flatworms.

    ERIC Educational Resources Information Center

    Liffen, C. L.; Hunter, M.

    1980-01-01

    Described is a school project to investigate aggregations in flatworms which may be influenced by light intensity, temperature, and some form of chemical stimulus released by already aggregating flatworms. Such investigations could be adapted to suit many educational levels of science laboratory activities. (DS)

  12. Using the PhenX Toolkit to Add Standard Measures to a Study.

    PubMed

    Hendershot, Tabitha; Pan, Huaqin; Haines, Jonathan; Harlan, William R; Marazita, Mary L; McCarty, Catherine A; Ramos, Erin M; Hamilton, Carol M

    2015-07-01

    The PhenX (consensus measures for Phenotypes and eXposures) Toolkit (https://www.phenxtoolkit.org/) offers high-quality, well-established measures of phenotypes and exposures for use by the scientific community. The goal is to promote the use of standard measures, enhance data interoperability, and help investigators identify opportunities for collaborative and translational research. The Toolkit contains 395 measures drawn from 22 research domains (fields of research), along with additional collections of measures for Substance Abuse and Addiction (SAA) research, Mental Health Research (MHR), and Tobacco Regulatory Research (TRR). Additional measures for TRR that are expected to be released in 2015 include Obesity, Eating Disorders, and Sickle Cell Disease. Measures are selected by working groups of domain experts using a consensus process that includes input from the scientific community. The Toolkit provides a description of each PhenX measure, the rationale for including it in the Toolkit, protocol(s) for collecting the measure, and supporting documentation. Users can browse measures in the Toolkit or can search the Toolkit using the Smart Query Tool or a full text search. PhenX Toolkit users select measures of interest to add to their Toolkit. Registered Toolkit users can save their Toolkit and return to it later to revise or complete. They then have options to download a customized Data Collection Worksheet that specifies the data to be collected, and a Data Dictionary that describes each variable included in the Data Collection Worksheet. The Toolkit also has a Register Your Study feature that facilitates cross-study collaboration by allowing users to find other investigators using the same PhenX measures.

  13. Adoption of Test Driven Development and Continuous Integration for the Development of the Trick Simulation Toolkit

    NASA Technical Reports Server (NTRS)

    Penn, John M.

    2013-01-01

    This paper describes the adoption of a Test Driven Development approach and a Continuous Integration System in the development of the Trick Simulation Toolkit, a generic simulation development environment for creating high fidelity training and engineering simulations at the NASA/Johnson Space Center and many other NASA facilities. It describes what was learned and the significant benefits seen, such as fast, thorough, and clear test feedback every time code is checked in to the code repository. It also describes a system that encourages development of code that is much more flexible, maintainable, and reliable. The Trick Simulation Toolkit development environment provides a common architecture for user-defined simulations. Trick builds executable simulations using user-supplied simulation-definition files (S_define) and user-supplied "model code". For each Trick-based simulation, Trick automatically provides job scheduling, checkpoint/restore, data recording, interactive variable manipulation (variable server), and an input processor. Also included are tools for plotting recorded data and various other supporting tools and libraries. Trick is written in C/C++ and Java and supports both Linux and MacOSX. Prior to adopting this new development approach, Trick testing consisted primarily of running a few large simulations, with the hope that their complexity and scale would exercise most of Trick's code and expose any recently introduced bugs. Unsurprisingly, this approach yielded inconsistent results. It was obvious that a more systematic, thorough approach was required. After seeing examples of some Java-based projects that used the JUnit test framework, similar test frameworks for C and C++ were sought. Several were found, all clearly inspired by JUnit. Googletest, a freely available open-source testing framework, was selected as the most appropriate and capable.
The new approach was implemented while rewriting the Trick memory management component, to eliminate a

  14. Developing Climate Resilience Toolkit Decision Support Training Section

    NASA Astrophysics Data System (ADS)

    Livezey, M. M.; Herring, D.; Keck, J.; Meyers, J. C.

    2014-12-01

    The Climate Resilience Toolkit (CRT) is a Federal government effort to address the U.S. President's Climate Action Plan and Executive Order for Climate Preparedness. The toolkit will provide access to tools and products useful for climate-sensitive decision making. To optimize the user experience, the toolkit will also provide access to training materials. The National Oceanic and Atmospheric Administration (NOAA) has been building a climate training capability for 15 years. The target audience for the training has historically been mainly NOAA staff, with some modified training programs for external users and stakeholders. NOAA is now using this climate training capacity for the CRT. To organize the CRT training section, we collaborated with the Association of Climate Change Officers to determine the best strategy and identified four additional complementary skills needed for successful decision making: climate literacy, environmental literacy, risk assessment and management, and strategic execution and monitoring. Developing the climate literacy skills requires knowledge of climate variability and change, as well as an introduction to the suite of available products and services. For the development of an environmental literacy category, specific topics needed include knowledge of climate impacts on specific environmental systems. Climate risk assessment and management introduces a process for decision making and provides knowledge on communication of climate information and integration of climate information in planning processes. The strategic execution and monitoring category provides information on use of NOAA climate products, services, and partnership opportunities for decision making. In order to use the existing training modules, it was necessary to assess their level of complexity, catalog them, and develop guidance for users on a curriculum to take advantage of the training resources to enhance their learning experience.
With the development of this CRT

  15. The advanced computational testing and simulation toolkit (ACTS)

    SciTech Connect

    Drummond, L.A.; Marques, O.

    2002-05-21

    During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Distinctively, a number of these are important scientific problems ranging in scale from the atomic to the cosmic. For example, ionization is a phenomenon as ubiquitous in modern society as the glow of fluorescent lights and the etching on silicon computer chips; but it was not until 1999 that researchers finally achieved a complete numerical solution to the simplest example of ionization, the collision of a hydrogen atom with an electron. On the opposite scale, cosmologists have long wondered whether the expansion of the Universe, which began with the Big Bang, would ever reverse itself, ending the Universe in a Big Crunch. In 2000, analysis of new measurements of the cosmic microwave background radiation showed that the geometry of the Universe is flat, and thus the Universe will continue expanding forever. Both of these discoveries depended on high performance computer simulations that utilized computational tools included in the Advanced Computational Testing and Simulation (ACTS) Toolkit. The ACTS Toolkit is an umbrella project that brought together a number of general purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools, which have been developed independently, mainly at DOE laboratories, make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. 
The ACTS Toolkit Project enables the use of these tools by a much wider community of computational scientists, and promotes code portability, reusability, reduction of duplicate efforts

  16. The Customer Flow Toolkit: A Framework for Designing High Quality Customer Services.

    ERIC Educational Resources Information Center

    New York Association of Training and Employment Professionals, Albany.

    This document presents a toolkit to assist staff involved in the design and development of New York's one-stop system. Section 1 describes the preplanning issues to be addressed and the intended outcomes that serve as the framework for creation of the customer flow toolkit. Section 2 outlines the following strategies to assist in designing local…

  17. The Development of a Curriculum Toolkit with American Indian and Alaska Native Communities

    ERIC Educational Resources Information Center

    Thompson, Nicole L.; Hare, Dwight; Sempier, Tracie T.; Grace, Cathy

    2008-01-01

    This article explains the creation of the "Growing and Learning with Young Native Children" curriculum toolkit. The curriculum toolkit was designed to give American Indian and Alaska Native early childhood educators who work in a variety of settings the framework for developing a research-based, developmentally appropriate, tribally specific…

  18. Building Emergency Contraception Awareness among Adolescents. A Toolkit for Schools and Community-Based Organizations.

    ERIC Educational Resources Information Center

    Simkin, Linda; Radosh, Alice; Nelsesteun, Kari; Silverstein, Stacy

    This toolkit presents emergency contraception (EC) as a method to help adolescent women avoid pregnancy and abortion after unprotected sexual intercourse. The sections of this toolkit are designed to help increase your knowledge of EC and stay up to date. They provide suggestions for increasing EC awareness in the workplace, whether it is a school…

  19. Language Access Toolkit: An Organizing and Advocacy Resource for Community-Based Youth Programs

    ERIC Educational Resources Information Center

    Beyersdorf, Mark Ro

    2013-01-01

    Asian American Legal Defense and Education Fund (AALDEF) developed this language access toolkit to share the expertise and experiences of National Asian American Education Advocates Network (NAAEA) member organizations with other community organizations interested in developing language access campaigns. This toolkit includes an overview of…

  20. Charon Toolkit for Parallel, Implicit Structured-Grid Computations: Functional Design

    NASA Technical Reports Server (NTRS)

    VanderWijngaart, Rob F.; Kutler, Paul (Technical Monitor)

    1997-01-01

    Charon is a software toolkit that enables engineers to develop high-performing message-passing programs in a convenient and piecemeal fashion. Emphasis is on rapid program development and prototyping. In this report a detailed description of the functional design of the toolkit is presented. It is illustrated by the stepwise parallelization of two representative code examples.

  1. Toolkit of Available EPA Green Infrastructure Modeling Software. National Stormwater Calculator

    EPA Science Inventory

    This webinar will present a toolkit consisting of five EPA green infrastructure models and tools, along with communication material. This toolkit can be used as a teaching and quick reference resource for use by planners and developers when making green infrastructure implementat...

  2. A Data Audit and Analysis Toolkit To Support Assessment of the First College Year.

    ERIC Educational Resources Information Center

    Paulson, Karen

    This "toolkit" provides a process by which institutions can identify and use information resources to enhance the experiences and outcomes of first-year students. The toolkit contains a "Technical Manual" designed for use by the technical personnel who will be conducting the data audit and associated analyses. Administrators who want more…

  3. Toolkit for Evaluating Alignment of Instructional and Assessment Materials to the Common Core State Standards

    ERIC Educational Resources Information Center

    Achieve, Inc., 2014

    2014-01-01

    In joint partnership, Achieve, The Council of Chief State School Officers, and Student Achievement Partners have developed a Toolkit for Evaluating the Alignment of Instructional and Assessment Materials to the Common Core State Standards. The Toolkit is a set of interrelated, freely available instruments for evaluating alignment to the CCSS; each…

  4. Toolkit for Evaluating Alignment of Instructional and Assessment Materials to the Common Core State Standards

    ERIC Educational Resources Information Center

    Achieve, Inc., 2014

    2014-01-01

    In joint partnership, Achieve, The Council of Chief State School Officers, and Student Achievement Partners have developed a Toolkit for Evaluating the Alignment of Instructional and Assessment Materials to the Common Core State Standards (CCSS). The Toolkit is a set of interrelated, freely available instruments for evaluating alignment to the…

  5. Growing and Sustaining Parent Engagement: A Toolkit for Parents and Community Partners

    ERIC Educational Resources Information Center

    Center for the Study of Social Policy, 2010

    2010-01-01

    The Toolkit is a quick and easy guide to help support and sustain parent engagement. It provides how to's for implementing three powerful strategies communities can use to maintain and grow parent engagement work that is already underway: Creating a Parent Engagement 1) Roadmap, 2) Checklist and 3) Support Network. This toolkit includes…

  6. Toolkit for a Workshop on Building a Culture of Data Use. REL 2015-063

    ERIC Educational Resources Information Center

    Gerzon, Nancy; Guckenburg, Sarah

    2015-01-01

    The Culture of Data Use Workshop Toolkit helps school and district teams apply research to practice as they establish and support a culture of data use in their educational setting. The field-tested workshop toolkit guides teams through a set of structured activities to develop an understanding of data-use research in schools and to analyze…

  7. How to develop a second victim support program: a toolkit for health care organizations.

    PubMed

    Pratt, Stephen; Kenney, Linda; Scott, Susan D; Wu, Albert W

    2012-05-01

    A toolkit was developed to help health care organizations implement support programs for clinicians suffering from the emotional impact of errors and adverse events. Based on the best available evidence related to the second victim experience, the toolkit consists of 10 modules, each with a series of specific action steps, references, and exemplars.

  8. Practitioner Data Use in Schools: Workshop Toolkit. REL 2015-043

    ERIC Educational Resources Information Center

    Bocala, Candice; Henry, Susan F.; Mundry, Susan; Morgan, Claire

    2014-01-01

    The "Practitioner Data Use in Schools: Workshop Toolkit" is designed to help practitioners systematically and accurately use data to inform their teaching practice. The toolkit includes an agenda, slide deck, participant workbook, and facilitator's guide and covers the following topics: developing data literacy, engaging in a cycle…

  9. Quality Management in Local Authority Educational Psychology Services 2: Self-Evaluation Toolkit

    ERIC Educational Resources Information Center

    Her Majesty's Inspectorate of Education, 2007

    2007-01-01

    The toolkit has been developed by the profession and the two training universities of Dundee and Strathclyde to support self-evaluation. Educational psychologists from across Scotland, representing all professional levels, have been directly involved in the consultation and development of this document. The toolkit has been designed to provide a…

  10. A Parallel Program Analysis Framework for the ACTS Toolkit

    SciTech Connect

    Allen D. Malony

    2002-06-21

    OAK 270 - The final report summarizes the technical progress achieved during the project, A Parallel Program Analysis Framework for the ACTS Toolkit, referred to as the TAU project. Described are the results in four work areas: (1) creation of a performance system for integrated instrumentation, measurement, analysis and visualization; (2) development of a performance measurement system for parallel profiling and tracing; (3) development of an advanced program analysis system to enable creation of source-based performance and programming tools; and (4) development of parallel program interaction technology for accessing performance information and application data during execution.

  11. The Wind Integration National Dataset (WIND) toolkit (Presentation)

    SciTech Connect

    Caroline Draxl: NREL

    2014-01-01

    Regional wind integration studies require detailed wind power output data at many locations to perform simulations of how the power system will operate under high penetration scenarios. The wind datasets that serve as inputs into the study must realistically reflect the ramping characteristics, spatial and temporal correlations, and capacity factors of the simulated wind plants, as well as be time synchronized with available load profiles. As described in this presentation, the WIND Toolkit fulfills these requirements by providing a state-of-the-art national (US) wind resource, power production and forecast dataset.

  12. Tips from the toolkit: 2--assessing organisational strengths.

    PubMed

    Steer, Neville

    2010-03-01

    'SWOT' is a familiar term used in the development of business strategy. It is based on the identification of strengths, weaknesses, opportunities and threats as part of a strategic analysis approach. While there are a range of more sophisticated models for analysing and developing business strategy, it is a useful model for general practice as it is less time consuming than other approaches. The following article discusses some ways to apply this framework to assess organisational strengths (and weaknesses). It is based on The Royal Australian College of General Practitioners' "General practice management toolkit".

  13. GENFIT — a Generic Track-Fitting Toolkit

    NASA Astrophysics Data System (ADS)

    Rauch, Johannes; Schlüter, Tobias

    2015-05-01

    GENFIT is an experiment-independent track-fitting toolkit that combines fitting algorithms, track representations, and measurement geometries into a modular framework. We report on a significantly improved version of GENFIT, based on experience gained in the Belle II, PANDA, and FOPI experiments. Improvements concern the implementation of additional track-fitting algorithms, enhanced implementations of Kalman fitters, enhanced visualization capabilities, and additional implementations of measurement types suited for various kinds of tracking detectors. The data model has been revised, allowing for efficient track merging, smoothing, residual calculation, alignment, and storage.
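The Kalman fitters mentioned above can be illustrated with a minimal one-parameter sketch. This is plain Python, not the GENFIT API; the measurement values are invented, and a real track fitter propagates a multi-parameter state along the detector geometry rather than a single scalar.

```python
# Minimal Kalman-filter update loop for one track parameter x with
# variance P, refined by measurements m each with variance R.
def kalman_fit(measurements, R=1.0, x0=0.0, P0=1e6):
    x, P = x0, P0                # start with a vague prior (large P0)
    for m in measurements:
        K = P / (P + R)          # Kalman gain: how much to trust m
        x = x + K * (m - x)      # pull the state toward the measurement
        P = (1.0 - K) * P        # variance shrinks with each update
    return x, P

# Four noisy "hits"; the fit converges toward their mean.
x, P = kalman_fit([1.2, 0.9, 1.1, 1.0])
```

With a diffuse prior, each update weights the new measurement against the accumulated estimate, so the result approaches the sample mean while the variance steadily shrinks.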

  14. TECA: A Parallel Toolkit for Extreme Climate Analysis

    SciTech Connect

    Prabhat, Mr; Ruebel, Oliver; Byna, Surendra; Wu, Kesheng; Li, Fuyu; Wehner, Michael; Bethel, E. Wes

    2012-03-12

    We present TECA, a parallel toolkit for detecting extreme events in large climate datasets. Modern climate datasets expose parallelism across a number of dimensions: spatial locations, timesteps and ensemble members. We design TECA to exploit these modes of parallelism and demonstrate a prototype implementation for detecting and tracking three classes of extreme events: tropical cyclones, extra-tropical cyclones and atmospheric rivers. We process a modern TB-sized CAM5 simulation dataset with TECA, and demonstrate good runtime performance for the three case studies.

  15. The RAVE/VERTIGO vertex reconstruction toolkit and framework

    NASA Astrophysics Data System (ADS)

    Waltenberger, W.; Mitaroff, W.; Moser, F.; Pflugfelder, B.; Riedel, H. V.

    2008-07-01

    A detector-independent toolkit for vertex reconstruction (RAVE) is being developed, along with a standalone framework (VERTIGO) for testing, analyzing and debugging. The core algorithms represent state-of-the-art for geometric vertex finding and fitting by both linear (Kalman filter) and robust estimation methods. Main design goals are ease of use, flexibility for embedding into existing software frameworks, extensibility, and openness. The implementation is based on modern object-oriented techniques, is coded in C++ with interfaces for Java and Python, and follows an open-source approach. A beta release is available.

  16. A unified toolkit for information and scientific visualization

    NASA Astrophysics Data System (ADS)

    Wylie, Brian; Baumes, Jeffrey

    2009-01-01

    We present an expansion of the popular open source Visualization Toolkit (VTK) to support the ingestion, processing, and display of informatics data. The result is a flexible, component-based pipeline framework for the integration and deployment of algorithms in the scientific and informatics fields. This project, code named "Titan", is one of the first efforts to address the unification of information and scientific visualization in a systematic fashion. The result includes a wide range of informatics-oriented functionality: database access, graph algorithms, graph layouts, views, charts, UI components and more. Further, the data distribution, parallel processing and client/server capabilities of VTK provide an excellent platform for scalable analysis.

  17. A flexible open-source toolkit for lava flow simulations

    NASA Astrophysics Data System (ADS)

    Mossoux, Sophie; Feltz, Adelin; Poppe, Sam; Canters, Frank; Kervyn, Matthieu

    2014-05-01

    Lava flow hazard modeling is a useful tool for scientists and stakeholders confronted with imminent or long-term hazard from basaltic volcanoes. It can improve their understanding of the spatial distribution of volcanic hazard, influence their land use decisions and improve city evacuation during a volcanic crisis. Although a range of empirical, stochastic and physically-based lava flow models exists, these models are rarely available or require a large number of physical constraints. We present a GIS toolkit which models lava flow propagation from one or multiple eruptive vents, defined interactively on a Digital Elevation Model (DEM). It combines existing probabilistic (VORIS) and deterministic (FLOWGO) models in order to improve the simulation of lava flow spatial spread and terminal length. Not only is this toolkit open-source, running in Python, which allows users to adapt the code to their needs, but it also allows users to combine the included models in different ways. The lava flow paths are determined based on the probabilistic steepest slope (VORIS model - Felpeto et al., 2001), which can be constrained in order to favour concentrated or dispersed flow fields. Moreover, the toolkit allows the inclusion of a corrective factor so that the lava can overcome small topographical obstacles or pits. The lava flow terminal length can be constrained using a fixed length value, a Gaussian probability density function, or can be calculated based on the thermo-rheological properties of the open-channel lava flow (FLOWGO model - Harris and Rowland, 2001). These slope-constrained properties allow estimating the velocity of the flow and its heat losses. The lava flow stops when its velocity is zero or the lava temperature reaches the solidus. Recent lava flows of Karthala volcano (Comoros islands) are used here to demonstrate the quality of lava flow simulations with the toolkit, using a quantitative assessment of the match of the simulation with the real lava flows.
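The probabilistic steepest-slope routing described above can be sketched in a few lines. This is a toy illustration, not the toolkit's code: the 5x5 DEM, step limit, and seed are invented, and a downhill neighbour is chosen with probability proportional to the height drop.

```python
import random

# Route a flow path over a DEM grid: from each cell, move to a lower
# neighbour chosen with probability proportional to the height drop;
# stop in a local pit (no lower neighbour) or after max_steps.
def simulate_path(dem, start, max_steps=50, rng=random.Random(0)):
    rows, cols = len(dem), len(dem[0])
    r, c = start
    path = [start]
    for _ in range(max_steps):
        h = dem[r][c]
        drops = []                       # (height drop, row, col) per downhill neighbour
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                nr, nc = r + dr, c + dc
                if (dr or dc) and 0 <= nr < rows and 0 <= nc < cols:
                    if dem[nr][nc] < h:
                        drops.append((h - dem[nr][nc], nr, nc))
        if not drops:                    # local pit: the flow stops
            break
        x = rng.uniform(0, sum(d for d, _, _ in drops))
        for d, nr, nc in drops:          # weighted random choice of neighbour
            x -= d
            if x <= 0:
                r, c = nr, nc
                break
        path.append((r, c))
    return path

# Invented DEM sloping toward the lower-right corner.
dem = [[9, 8, 7, 6, 5],
       [8, 7, 6, 5, 4],
       [7, 6, 5, 4, 3],
       [6, 5, 4, 3, 2],
       [5, 4, 3, 2, 1]]
path = simulate_path(dem, (0, 0))
```

Running many such stochastic paths from the same vent, and counting how often each cell is visited, yields the kind of probabilistic inundation map the VORIS approach produces.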

  18. Aggregate and the environment

    USGS Publications Warehouse

    Langer, William H.; Drew, Lawrence J.; Sachs, J.S.

    2004-01-01

    This book is designed to help you understand our aggregate resources: their importance, where they come from, how they are processed for our use, the environmental concerns related to their mining and processing, how those concerns are addressed, and the policies and regulations designed to safeguard workers, neighbors, and the environment from the negative impacts of aggregate mining. We hope this understanding will help prepare you to be involved in decisions that need to be made, individually and as a society, to be good stewards of our aggregate resources and our living planet.

  19. A case control study to improve accuracy of an electronic fall prevention toolkit.

    PubMed

    Dykes, Patricia C; I-Ching, Evita Hou; Soukup, Jane R; Chang, Frank; Lipsitz, Stuart

    2012-01-01

    Patient falls are a serious and commonly reported adverse event in hospitals. In 2009, our team conducted the first randomized controlled trial of a health information technology-based intervention that significantly reduced falls in acute care hospitals. However, some patients on intervention units with access to the electronic toolkit fell. The purpose of this case control study was to use data mining and modeling techniques to identify the factors associated with falls in hospitalized patients when the toolkit was in place. Our ultimate aim was to apply our findings to improve the toolkit logic and to generate practice recommendations. The results of our evaluation suggest that the fall prevention toolkit logic is accurate but strategies are needed to improve adherence with the fall prevention intervention recommendations generated by the electronic toolkit.

  20. Census of Population and Housing, 1980: Summary Tape File 1F, School Districts. Technical Documentation.

    ERIC Educational Resources Information Center

    Bureau of the Census (DOC), Washington, DC. Data User Services Div.

    This report provides technical documentation associated with a 1980 Census of Population and Housing Summary Tape File 1F--the School Districts File. The file contains complete-count data of population and housing aggregated by school district. Population items tabulated include age, race (provisional data), sex, marital status, Spanish origin…
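The kind of aggregation such a file embodies, complete-count records rolled up by school district, can be sketched in miniature. The records below are invented for illustration; STF 1F itself ships pre-tabulated counts rather than individual microdata.

```python
from collections import Counter, defaultdict

# Hypothetical person records to be tabulated by school district.
records = [
    {"district": "D01", "sex": "F", "age": 12},
    {"district": "D01", "sex": "M", "age": 45},
    {"district": "D02", "sex": "F", "age": 8},
    {"district": "D01", "sex": "F", "age": 30},
]

# Roll up a total count and a count per sex for each district.
by_district = defaultdict(Counter)
for rec in records:
    by_district[rec["district"]]["total"] += 1
    by_district[rec["district"]][rec["sex"]] += 1
```

The same pattern extends to any tabulated item (age brackets, race, marital status): accumulate one counter per district, then write the counters out as the summary file.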

  1. Protein Colloidal Aggregation Project

    NASA Technical Reports Server (NTRS)

    Oliva-Buisson, Yvette J. (Compiler)

    2014-01-01

    To investigate the pathways and kinetics of protein aggregation to allow accurate predictive modeling of the process and evaluation of potential inhibitors to prevalent diseases including cataract formation, chronic traumatic encephalopathy, Alzheimer's Disease, Parkinson's Disease and others.

  2. Cell aggregation and sedimentation.

    PubMed

    Davis, R H

    1995-01-01

    The aggregation of cells into clumps or flocs has been exploited for decades in such applications as biological wastewater treatment, beer brewing, antibiotic fermentation, and enhanced sedimentation to aid in cell recovery or retention. More recent research has included the use of cell aggregation and sedimentation to selectively separate subpopulations of cells. Potential biotechnological applications include overcoming contamination, maintaining plasmid-bearing cells in continuous fermentors, and selectively removing nonviable hybridoma cells from perfusion cultures.

  3. Developing an evidence-based, multimedia group counseling curriculum toolkit

    PubMed Central

    Brooks, Adam C.; DiGuiseppi, Graham; Laudet, Alexandre; Rosenwasser, Beth; Knoblach, Dan; Carpenedo, Carolyn M.; Carise, Deni; Kirby, Kimberly C.

    2013-01-01

    Training community-based addiction counselors in empirically supported treatments (ESTs) far exceeds the ever-decreasing resources of publicly funded treatment agencies. This feasibility study describes the development and pilot testing of a group counseling toolkit (an approach adapted from the education field) focused on relapse prevention (RP). When counselors (N = 17) used the RP toolkit after 3 hours of training, their content adherence scores on “coping with craving” and “drug refusal skills” showed significant improvement, as indicated by very large effect sizes (Cohen’s d = 1.49 and 1.34, respectively). Counselor skillfulness, in the “adequate-to-average” range at baseline, did not change. Although this feasibility study indicates some benefit to counselor EST acquisition, it is important to note that the impact of the curriculum on client outcomes is unknown. Because a majority of addiction treatment is delivered in group format, a multimedia curriculum approach may assist counselors in applying ESTs in the context of actual service delivery. PMID:22301082

  4. Regulatory and Permitting Information Desktop (RAPID) Toolkit (Poster)

    SciTech Connect

    Young, K. R.; Levine, A.

    2014-09-01

    The Regulatory and Permitting Information Desktop (RAPID) Toolkit combines the former Geothermal Regulatory Roadmap, National Environmental Policy Act (NEPA) Database, and other resources into a Web-based tool that gives the regulatory and utility-scale geothermal developer communities rapid and easy access to permitting information. RAPID currently comprises five tools - Permitting Atlas, Regulatory Roadmap, Resource Library, NEPA Database, and Best Practices. A beta release of an additional tool, the Permitting Wizard, is scheduled for late 2014. Because of the huge amount of information involved, RAPID was developed in a wiki platform to allow industry and regulatory agencies to maintain the content in the future so that it continues to provide relevant and accurate information to users. In 2014, the content was expanded to include regulatory requirements for utility-scale solar and bulk transmission development projects. Going forward, development of the RAPID Toolkit will focus on expanding the capabilities of current tools, developing additional tools, including additional technologies, and continuing to increase stakeholder involvement.

  5. The Bioperl toolkit: Perl modules for the life sciences.

    PubMed

    Stajich, Jason E; Block, David; Boulez, Kris; Brenner, Steven E; Chervitz, Stephen A; Dagdigian, Chris; Fuellen, Georg; Gilbert, James G R; Korf, Ian; Lapp, Hilmar; Lehväslaiho, Heikki; Matsalla, Chad; Mungall, Chris J; Osborne, Brian I; Pocock, Matthew R; Schattner, Peter; Senger, Martin; Stein, Lincoln D; Stupka, Elia; Wilkinson, Mark D; Birney, Ewan

    2002-10-01

    The Bioperl project is an international open-source collaboration of biologists, bioinformaticians, and computer scientists that has evolved over the past 7 yr into the most comprehensive library of Perl modules available for managing and manipulating life-science information. Bioperl provides an easy-to-use, stable, and consistent programming interface for bioinformatics application programmers. The Bioperl modules have been successfully and repeatedly used to reduce otherwise complex tasks to only a few lines of code. The Bioperl object model has been proven to be flexible enough to support enterprise-level applications such as EnsEMBL, while maintaining an easy learning curve for novice Perl programmers. Bioperl is capable of executing analyses and processing results from programs such as BLAST, ClustalW, or the EMBOSS suite. Interoperation with modules written in Python and Java is supported through the evolving BioCORBA bridge. Bioperl provides access to data stores such as GenBank and SwissProt via a flexible series of sequence input/output modules, and to the emerging common sequence data storage format of the Open Bioinformatics Database Access project. This study describes the overall architecture of the toolkit, the problem domains that it addresses, and gives specific examples of how the toolkit can be used to solve common life-sciences problems. We conclude with a discussion of how the open-source nature of the project has contributed to the development effort. PMID:12368254

  6. Clinical Trial of a Home Safety Toolkit for Alzheimer's Disease

    PubMed Central

    Trudeau, Scott A.; Rudolph, James L.; Trudeau, Paulette A.; Duffy, Mary E.; Berlowitz, Dan

    2013-01-01

    This randomized clinical trial tested a new self-directed educational intervention to improve caregiver competence to create a safer home environment for persons with dementia living in the community. The sample included 108 patient/caregiver dyads: the intervention group (n = 60) received the Home Safety Toolkit (HST), including a new booklet based on health literacy principles, and sample safety items to enhance self-efficacy to make home safety modifications. The control group (n = 48) received customary care. Participants completed measures at baseline and at twelve-week follow-up. Multivariate Analysis of Covariance (MANCOVA) was used to test for significant group differences. All caregiver outcome variables improved in the intervention group more than in the control. Home safety was significant at P ≤ 0.001, caregiver strain at P ≤ 0.001, and caregiver self-efficacy at P = 0.002. Similarly, the care receiver outcome of risky behaviors and accidents was lower in the intervention group (P ≤ 0.001). The self-directed use of this Home Safety Toolkit activated the primary family caregiver to make the home safer for the person with dementia of Alzheimer's type (DAT) or related disorder. Improving the competence of informal caregivers is especially important for patients with DAT in light of all stakeholders' reliance on their unpaid care. PMID:24195007

  7. Fibronectin Aggregation and Assembly

    PubMed Central

    Ohashi, Tomoo; Erickson, Harold P.

    2011-01-01

    The mechanism of fibronectin (FN) assembly and the self-association sites are still unclear and contradictory, although the N-terminal 70-kDa region (I1–9) is commonly accepted as one of the assembly sites. We previously found that I1–9 binds to superfibronectin, which is an artificial FN aggregate induced by anastellin. In the present study, we found that I1–9 bound to the aggregate formed by anastellin and a small FN fragment, III1–2. An engineered disulfide bond in III2, which stabilizes folding, inhibited aggregation, but a disulfide bond in III1 did not. A gelatin precipitation assay showed that I1–9 did not interact with anastellin, III1, III2, III1–2, or several III1–2 mutants including III1–2KADA. (In contrast to previous studies, we found that the III1–2KADA mutant was identical in conformation to wild-type III1–2.) Because I1–9 only bound to the aggregate and the unfolding of III2 played a role in aggregation, we generated a III2 domain that was destabilized by deletion of the G strand. This mutant bound I1–9 as shown by the gelatin precipitation assay and fluorescence resonance energy transfer analysis, and it inhibited FN matrix assembly when added to cell culture. Next, we introduced disulfide mutations into full-length FN. Three disulfide locks in III2, III3, and III11 were required to dramatically reduce anastellin-induced aggregation. When we tested the disulfide mutants in cell culture, only the disulfide bond in III2 reduced the FN matrix. These results suggest that the unfolding of III2 is one of the key factors for FN aggregation and assembly. PMID:21949131

  8. A prototype forensic toolkit for industrial-control-systems incident response

    NASA Astrophysics Data System (ADS)

    Carr, Nickolas B.; Rowe, Neil C.

    2015-05-01

    Industrial control systems (ICSs) are an important part of critical infrastructure in cyberspace. They are especially vulnerable to cyber-attacks because of their legacy hardware and software and the difficulty of changing it. We first survey the history of intrusions into ICSs, the more serious of which involved a continuing adversary presence on an ICS network. We discuss some common vulnerabilities and the categories of possible attacks, noting the frequent use of software written a long time ago. We propose a framework for designing ICS incident response under the constraints that no new software must be required and that interventions cannot impede the continuous processing that is the norm for such systems. We then discuss a prototype toolkit we built using the Windows Management Instrumentation Command-Line tool for host-based analysis and the Bro intrusion-detection software for network-based analysis. Particularly useful techniques we used were learning the historical range of parameters of numeric quantities so as to recognize anomalies, learning the usual addresses of connections to a node, observing Internet addresses (usually rare), observing anomalous network protocols such as unencrypted data transfers, observing unusual scheduled tasks, and comparing key files through registry entries and hash values to find malicious modifications. We tested our methods on actual data from ICSs including publicly-available data, voluntarily-submitted data, and researcher-provided "advanced persistent threat" data. We found instances of interesting behavior in our experiments. Intrusions were generally easy to see because of the repetitive nature of most processing on ICSs, but operators need to be motivated to look.
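One of the techniques the abstract names, learning the historical range of each numeric parameter so as to recognize anomalies, can be sketched as follows. The parameter names, readings, and tolerance margin are invented for illustration; the paper's toolkit works over WMI and Bro data rather than simple tuples.

```python
# Learn the observed (min, max) range of each named parameter.
def learn_ranges(history):
    ranges = {}
    for name, value in history:
        lo, hi = ranges.get(name, (value, value))
        ranges[name] = (min(lo, value), max(hi, value))
    return ranges

# Flag readings that fall outside the learned range plus a small margin.
def anomalies(ranges, readings, margin=0.1):
    out = []
    for name, value in readings:
        lo, hi = ranges[name]
        slack = (hi - lo) * margin      # tolerate small excursions
        if value < lo - slack or value > hi + slack:
            out.append((name, value))
    return out

history = [("pump_rpm", 1480), ("pump_rpm", 1520),
           ("line_psi", 60), ("line_psi", 64)]
ranges = learn_ranges(history)
alerts = anomalies(ranges, [("pump_rpm", 1500), ("line_psi", 95)])
```

Because ICS processing is highly repetitive, even this simple range model separates routine readings from genuinely unusual ones, which matches the paper's observation that intrusions stand out against the regular baseline.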

  9. A lightweight, flow-based toolkit for parallel and distributed bioinformatics pipelines

    PubMed Central

    2011-01-01

    Background Bioinformatic analyses typically proceed as chains of data-processing tasks. A pipeline, or 'workflow', is a well-defined protocol, with a specific structure defined by the topology of data-flow interdependencies, and a particular functionality arising from the data transformations applied at each step. In computer science, the dataflow programming (DFP) paradigm defines software systems constructed in this manner, as networks of message-passing components. Thus, bioinformatic workflows can be naturally mapped onto DFP concepts. Results To enable the flexible creation and execution of bioinformatics dataflows, we have written a modular framework for parallel pipelines in Python ('PaPy'). A PaPy workflow is created from re-usable components connected by data-pipes into a directed acyclic graph, which together define nested higher-order map functions. The successive functional transformations of input data are evaluated on flexibly pooled compute resources, either local or remote. Input items are processed in batches of adjustable size, allowing one to tune the trade-off between parallelism and lazy-evaluation (memory consumption). An add-on module ('NuBio') facilitates the creation of bioinformatics workflows by providing domain specific data-containers (e.g., for biomolecular sequences, alignments, structures) and functionality (e.g., to parse/write standard file formats). Conclusions PaPy offers a modular framework for the creation and deployment of parallel and distributed data-processing workflows. Pipelines derive their functionality from user-written, data-coupled components, so PaPy also can be viewed as a lightweight toolkit for extensible, flow-based bioinformatics data-processing. The simplicity and flexibility of distributed PaPy pipelines may help users bridge the gap between traditional desktop/workstation and grid computing. PaPy is freely distributed as open-source Python code at http://muralab.org/PaPy, and includes extensive
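The dataflow style described above can be illustrated with plain Python generators. This is not the PaPy API; the stage names and data are invented. Each stage consumes the previous one lazily, so items stream through in adjustable batches instead of materialising intermediate lists.

```python
# Stage 1: source component that emits items one at a time.
def read_items(items):
    for item in items:
        yield item

# Stage 2: apply a user-supplied transformation to each item.
def transform(stream, fn):
    for item in stream:
        yield fn(item)

# Stage 3: group the stream into batches of a given size
# (the parallelism / memory trade-off knob from the abstract).
def batched(stream, size):
    batch = []
    for item in stream:
        batch.append(item)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:
        yield batch

# Compose a tiny workflow: uppercase hypothetical sequences, in batches of 2.
pipeline = batched(transform(read_items(["acgt", "ttga", "ccag"]), str.upper), 2)
result = list(pipeline)
```

In a framework like PaPy the same composition is expressed as data-pipes between worker components on a directed acyclic graph, with batches dispatched to local or remote worker pools rather than evaluated in one process.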

  10. Use of Remote Sensing Data to Enhance NWS Storm Damage Toolkit

    NASA Technical Reports Server (NTRS)

    Jedlove, Gary J.; Molthan, Andrew L.; White, Kris; Burks, Jason; Stellman, Keith; Smith, Mathew

    2012-01-01

    In the wake of a natural disaster such as a tornado, the National Weather Service (NWS) is required to provide a very detailed and timely storm damage assessment to local, state and federal homeland security officials. The Post-Storm Data Acquisition (PSDA) procedure involves the acquisition and assembly of highly perishable data necessary for accurate post-event analysis and potential integration into a geographic information system (GIS) available to its end users and associated decision makers. Information gained from the process also enables the NWS to increase its knowledge of extreme events, learn how to better use existing equipment, improve NWS warning programs, and provide accurate storm intensity and damage information to the news media and academia. To help collect and manage all of this information, forecasters in NWS Southern Region are currently developing a Storm Damage Assessment Toolkit (SDAT), which incorporates GIS-capable phones and laptops into the PSDA process by tagging damage photography, location, and storm damage details with GPS coordinates for aggregation within the GIS database. However, this tool alone does not fully integrate radar and ground based storm damage reports nor does it help to identify undetected storm damage regions. In many cases, information on storm damage location (beginning and ending points, swath width, etc.) from ground surveys is incomplete or difficult to obtain. Geographic factors (terrain and limited roads in rural areas), manpower limitations, and other logistical constraints often prevent the gathering of a comprehensive picture of tornado or hail damage, and may allow damage regions to go undetected. Molthan et al. (2011) have shown that high resolution satellite data can provide additional valuable information on storm damage tracks to augment this database. This paper presents initial development to integrate satellite-derived damage track information into the SDAT for near real-time use by forecasters

  11. Use of Remote Sensing Data to Enhance NWS Storm Damage Toolkit

    NASA Astrophysics Data System (ADS)

    Jedlovec, G.; Molthan, A.; White, K.; Burks, J.; Stellman, K.; Smith, M. R.

    2012-12-01

    In the wake of a natural disaster such as a tornado, the National Weather Service (NWS) is required to provide a very detailed and timely storm damage assessment to local, state and federal homeland security officials. The Post-Storm Data Acquisition (PSDA) procedure involves the acquisition and assembly of highly perishable data necessary for accurate post-event analysis and potential integration into a geographic information system (GIS) available to its end users and associated decision makers. Information gained from the process also enables the NWS to increase its knowledge of extreme events, learn how to better use existing equipment, improve NWS warning programs, and provide accurate storm intensity and damage information to the news media and academia. To help collect and manage all of this information, forecasters in NWS Southern Region are currently developing a Storm Damage Assessment Toolkit (SDAT), which incorporates GIS-capable phones and laptops into the PSDA process by tagging damage photography, location, and storm damage details with GPS coordinates for aggregation within the GIS database. However, this tool alone does not fully integrate radar and ground based storm damage reports nor does it help to identify undetected storm damage regions. In many cases, information on storm damage location (beginning and ending points, swath width, etc.) from ground surveys is incomplete or difficult to obtain. Geographic factors (terrain and limited roads in rural areas), manpower limitations, and other logistical constraints often prevent the gathering of a comprehensive picture of tornado or hail damage, and may allow damage regions to go undetected. Molthan et al. (2011) have shown that high resolution satellite data can provide additional valuable information on storm damage tracks to augment this database. This paper presents initial development to integrate satellite-derived damage track information into the SDAT for near real-time use by forecasters

  12. Integration of Earth Remote Sensing into the NOAA/NWS Damage Assessment Toolkit

    NASA Technical Reports Server (NTRS)

    Molthan, Andrew; Burks, Jason; Camp, Parks; McGrath, Kevin; Bell, Jordan

    2014-01-01

    Following the occurrence of severe weather, NOAA/NWS meteorologists are tasked with performing a storm damage survey to assess the type and severity of the weather event, primarily focused on the confirmation and assessment of tornadoes. This labor-intensive process requires meteorologists to venture into the affected area, acquire damage indicators through photos, eyewitness accounts, and other documentation, then aggregate the data in order to make a final determination of the tornado path length, width, maximum intensity, and other characteristics. Earth remote sensing from operational, polar-orbiting satellites can support the damage assessment process by applying change detection techniques to help identify portions of damage tracks that are difficult to access due to road limitations or time constraints. In addition, higher-resolution commercial imagery can corroborate ground-based surveys. As part of an ongoing collaboration, NASA and NOAA are working to integrate near real-time Earth remote sensing observations into the NOAA/NWS Damage Assessment Toolkit, a handheld application used by meteorologists in the survey process. The team has recently developed a more streamlined approach for delivering data via a web mapping service and menu interface, allowing for caching of imagery before field deployment. Near real-time products have been developed using MODIS and VIIRS imagery and change detection for preliminary track identification, along with conduits for higher-resolution Landsat, ASTER, and commercial imagery as they become available. In addition to tornado damage assessments, the team is also investigating the use of near real-time imagery for identifying hail damage to vegetation, which also results in large swaths of damage, particularly in the central United States during the peak growing season months of June, July, and August.
This presentation gives an overview of recent activities
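The change-detection idea underlying the damage-track products can be sketched as a pixelwise comparison of pre-event and post-event imagery. The toy vegetation-index grids and drop threshold below are invented; operational products derive such grids from MODIS or VIIRS reflectances and apply far more filtering.

```python
# Flag pixels whose vegetation index dropped sharply between the
# pre-event and post-event scenes: candidate damage-track cells.
def damage_mask(before, after, drop=0.3):
    mask = []
    for row_b, row_a in zip(before, after):
        mask.append([(b - a) > drop for b, a in zip(row_b, row_a)])
    return mask

# Invented index grids; the middle column loses vegetation post-event.
before = [[0.8, 0.8, 0.7],
          [0.8, 0.9, 0.8]]
after  = [[0.8, 0.3, 0.7],
          [0.7, 0.2, 0.3]]
mask = damage_mask(before, after)
```

Connected runs of flagged pixels approximate a damage swath; overlaying such a mask in a GIS is what lets surveyors target ground visits along a preliminary track.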

  13. Transverse space charge effect calculation in the Synergia accelerator modeling toolkit

    SciTech Connect

    Okonechnikov, Konstantin; Amundson, James; Macridin, Alexandru; /Fermilab

    2009-09-01

    This paper describes a transverse space charge effect calculation algorithm, developed in the context of accelerator modeling toolkit Synergia. The introduction to the space charge problem and the Synergia modeling toolkit short description are given. The developed algorithm is explained and the implementation is described in detail. As a result of this work a new space charge solver was developed and integrated into the Synergia toolkit. The solver showed correct results in comparison to existing Synergia solvers and delivered better performance in the regime where it is applicable.

  14. Overview and Meteorological Validation of the Wind Integration National Dataset toolkit

    SciTech Connect

    Draxl, C.; Hodge, B. M.; Clifton, A.; McCaa, J.

    2015-04-13

    The Wind Integration National Dataset (WIND) Toolkit described in this report fulfills these requirements, and constitutes a state-of-the-art national wind resource data set covering the contiguous United States from 2007 to 2013 for use in a variety of next-generation wind integration analyses and wind power planning. The toolkit is a wind resource data set, wind forecast data set, and wind power production and forecast data set derived from the Weather Research and Forecasting (WRF) numerical weather prediction model. WIND Toolkit data are available online for over 116,000 land-based and 10,000 offshore sites representing existing and potential wind facilities.

  15. Demonstration of the Health Literacy Universal Precautions Toolkit: Lessons for Quality Improvement.

    PubMed

    Mabachi, Natabhona M; Cifuentes, Maribel; Barnard, Juliana; Brega, Angela G; Albright, Karen; Weiss, Barry D; Brach, Cindy; West, David

    2016-01-01

    The Agency for Healthcare Research and Quality Health Literacy Universal Precautions Toolkit was developed to help primary care practices assess and make changes to improve communication with and support for patients. Twelve diverse primary care practices implemented assigned tools over a 6-month period. Qualitative results revealed challenges practices experienced during implementation, including competing demands, bureaucratic hurdles, technological challenges, limited quality improvement experience, and limited leadership support. Practices used the Toolkit flexibly and recognized the efficiencies of implementing tools in tandem and in coordination with other quality improvement initiatives. Practices recommended reducing Toolkit density and making specific refinements.

  16. Technology meets aggregate

    SciTech Connect

    Wilson, C.; Swan, C.

    2007-07-01

    New technology developed at Tufts University and the University of Massachusetts has created synthetic lightweight aggregate from various qualities of fly ash from coal-fired power plants for use in different engineered applications. In pilot-scale manufacturing tests, an 'SLA' containing 80% fly ash and 20% mixed plastic waste from packaging was produced by 'dry blending' mixed plastic with high-carbon fly ash. A trial run was completed to produce concrete masonry unit (CMU) blocks at a full-scale facility. It has been shown that SLA can be used as a partial substitute for traditional stone aggregate in hot asphalt mix. 1 fig., 2 photos.

  17. The interactive learning toolkit: technology and the classroom

    NASA Astrophysics Data System (ADS)

    Lukoff, Brian; Tucker, Laura

    2011-04-01

    Peer Instruction (PI) and Just-in-Time-Teaching (JiTT) have been shown to increase both students' conceptual understanding and problem-solving skills. However, the time investment for the instructor to prepare appropriate conceptual questions and manage student JiTT responses is one of the main implementation hurdles. To overcome this we have developed the Interactive Learning Toolkit (ILT), a course management system specifically designed to support PI and JiTT. We are working to integrate the ILT with a fully interactive classroom system where students can use their laptops and smartphones to respond to ConcepTests in class. The goal is to use technology to engage students in conceptual thinking both in and out of the classroom.

  18. Integrating surgical robots into the next medical toolkit.

    PubMed

    Lai, Fuji; Entin, Eileen

    2006-01-01

    Surgical robots hold much promise for revolutionizing the field of surgery and improving surgical care. However, despite the potential advantages they offer, there are multiple barriers to adoption and integration into practice that may prevent these systems from realizing their full potential benefit. This study elucidated some of the most salient considerations that need to be addressed for integration of new technologies such as robotic systems into the operating room of the future as it evolves into a complex system of systems. We conducted in-depth interviews with operating room team members and other stakeholders to identify potential barriers in areas of workflow, teamwork, training, clinical acceptance, and human-system interaction. The findings of this study will inform an approach for the design and integration of robotics and related computer-assisted technologies into the next medical toolkit for "computer-enhanced surgery" to improve patient safety and healthcare quality.

  19. Migration of 1970s Minicomputer Controls to Modern Toolkit Software

    SciTech Connect

    Juras, R.C.; Meigs, M.J.; Sinclair, J.A.; Tatum, B.A.

    1999-11-13

    Controls for accelerators and associated systems at the Holifield Radioactive Ion Beam Facility (HRIBF) at Oak Ridge National Laboratory have been migrated from 1970s-vintage minicomputers to a modern system based on Vista and EPICS toolkit software. Stability and capabilities of EPICS software have motivated increasing use of EPICS for accelerator controls. In addition, very inexpensive subsystems based on EPICS and the EPICS portable CA server running on Linux PCs have been implemented to control an ion source test facility and to control a building-access badge reader system. A new object-oriented, extensible display manager has been developed for EPICS to facilitate the transition to EPICS and will be used in place of MEDM. EPICS device support has been developed for CAMAC serial highway controls.

  20. A toolkit for MSDs prevention--WHO and IEA context.

    PubMed

    Caple, David C

    2012-01-01

    Many simple MSD risk management tools have been developed by ergonomists for use by workers and employers with little or no training to undertake injury prevention programs in their workplace. However, there is currently no "toolkit" that places such tools within a holistic, participative ergonomics framework and provides guidance on how best to use individual tools. It is proposed that such a holistic approach should entail initial analysis and evaluation of the underlying systems of work and related health and performance indicators, prior to focusing on assessment of MSD risks stemming from particular hazards. Depending on the context, more narrowly focused tools might then be selected to assess risk associated with jobs or tasks identified as problematic. This approach ensures that biomechanical risk factors are considered within a broad context of organizational and psychosocial risk factors, which is consistent with current research evidence on work-related causes of MSDs. PMID:22317323

  1. PARAMESH: A Parallel Adaptive Mesh Refinement Community Toolkit

    NASA Technical Reports Server (NTRS)

    MacNeice, Peter; Olson, Kevin M.; Mobarry, Clark; deFainchtein, Rosalinda; Packer, Charles

    1999-01-01

    In this paper, we describe a community toolkit which is designed to provide parallel support with adaptive mesh capability for a large and important class of computational models, those using structured, logically Cartesian meshes. The package of Fortran 90 subroutines, called PARAMESH, is designed to provide an application developer with an easy route to extend an existing serial code which uses a logically Cartesian structured mesh into a parallel code with adaptive mesh refinement. Alternatively, in its simplest use, and with minimal effort, it can operate as a domain decomposition tool for users who want to parallelize their serial codes, but who do not wish to use adaptivity. The package can provide them with an incremental evolutionary path for their code, converting it first to uniformly refined parallel code, and then later, if they so desire, adding adaptivity.

  2. Parametrization of macrolide antibiotics using the Force Field Toolkit

    PubMed Central

    Pavlova, Anna; Gumbart, James C

    2015-01-01

    Macrolides are an important class of antibiotics that target the bacterial ribosome. Computer simulations of macrolides are limited since specific force field parameters have not been previously developed for them. Here we determine CHARMM-compatible force field parameters for erythromycin, azithromycin and telithromycin, using the Force Field Toolkit plugin in VMD. Because of their large size, novel approaches for parametrizing them had to be developed. Two methods for determining partial atomic charges, from interactions with TIP3P water and from the electrostatic potential, as well as several approaches for fitting the dihedral parameters were tested. The performance of the different parameter sets was evaluated by molecular dynamics simulations of the macrolides in the ribosome, with a distinct improvement in the maintenance of key interactions observed after refinement of the initial parameters. Based on the results of the macrolide tests, recommended procedures for parametrizing very large molecules using ffTK are given. PMID:26280362

  3. HemI: a toolkit for illustrating heatmaps.

    PubMed

    Deng, Wankun; Wang, Yongbo; Liu, Zexian; Cheng, Han; Xue, Yu

    2014-01-01

    Recent high-throughput techniques have generated a flood of biological data in all aspects. The transformation and visualization of multi-dimensional and numerical gene or protein expression data in a single heatmap can provide a concise but comprehensive presentation of molecular dynamics under different conditions. In this work, we developed an easy-to-use tool named HemI (Heat map Illustrator), which can visualize either gene or protein expression data in heatmaps. The heatmaps can be recolored, rescaled, or rotated in a customized manner, and HemI also provides multiple clustering strategies for analyzing the data. Publication-quality figures can be exported directly. We propose that HemI can be a useful toolkit for conveniently visualizing and manipulating heatmaps. The stand-alone packages of HemI were implemented in Java and can be accessed at http://hemi.biocuckoo.org/down.php.
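
    HemI itself is a stand-alone Java application; as a language-neutral sketch of the underlying transformation it performs, rescaling an expression matrix and mapping values onto a color ramp, consider the following. The data and the five-level text ramp are invented for illustration.

```python
# Rescale a small "expression matrix" to [0, 1] per row and render it as a
# text heatmap; a 5-level character ramp stands in for a color map.
RAMP = " .:*#"  # low -> high

def rescale_row(row):
    lo, hi = min(row), max(row)
    span = (hi - lo) or 1.0
    return [(v - lo) / span for v in row]

def heatmap(matrix):
    lines = []
    for row in matrix:
        scaled = rescale_row(row)
        lines.append("".join(RAMP[min(int(v * len(RAMP)), len(RAMP) - 1)]
                             for v in scaled))
    return "\n".join(lines)

expression = [
    [0.1, 0.5, 2.0, 8.0],   # gene A, rising across four conditions
    [4.0, 3.0, 1.0, 0.2],   # gene B, falling
]
print(heatmap(expression))
```

    A real tool maps the same rescaled values to RGB colors and adds row/column clustering before rendering.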

  4. MeSh ToolKit v1.2

    2004-05-15

    MSTK, or Mesh Toolkit, is a mesh framework that allows users to represent, manipulate and query unstructured 3D arbitrary topology meshes in a general manner without the need to code their own data structures. MSTK is a flexible framework in that it allows (or will eventually allow) a wide variety of underlying representations for the mesh while maintaining a common interface. It will allow users to choose from different mesh representations either at initialization or during program execution so that the optimal data structures are used for the particular algorithm. The interaction of users and applications with MSTK is through a functional interface that acts as though the mesh always contains vertices, edges, faces and regions and maintains connectivity between all these entities.
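
    As a rough sketch of what such a functional mesh interface looks like, entities plus connectivity queries that hide the underlying storage, consider the toy below. All names and the data layout are invented; this is not MSTK's C API.

```python
# A toy mesh "framework": faces are stored as vertex tuples, while edge and
# vertex adjacency are answered through query functions, so callers never
# touch the underlying data layout and it could be swapped out freely.

class Mesh:
    def __init__(self):
        self.faces = []                 # each face is a tuple of vertex ids

    def add_face(self, vertices):
        self.faces.append(tuple(vertices))

    def face_edges(self, f):
        """Edges of face f, each as a sorted vertex pair."""
        v = self.faces[f]
        return [tuple(sorted((v[i], v[(i + 1) % len(v)])))
                for i in range(len(v))]

    def vertex_faces(self, vid):
        """Faces incident on vertex vid (an upward-adjacency query)."""
        return [i for i, f in enumerate(self.faces) if vid in f]

m = Mesh()
m.add_face([0, 1, 2])
m.add_face([1, 2, 3])
shared = set(m.face_edges(0)) & set(m.face_edges(1))  # edge shared by both
```

    The point of the pattern is that an application written against the query functions keeps working if the face list is later replaced by, say, a half-edge structure.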

  5. Maintaining A User Community For The Montage Image Mosaic Toolkit

    NASA Astrophysics Data System (ADS)

    Berriman, G. B.

    2014-01-01

    The development of the Montage image mosaic toolkit was funded by NASA between 2002 and 2005. Even though the code has been unfunded for eight years, the user community of astronomers and computer scientists has continued to grow, primarily because the code is portable across Unix platforms, highly scalable, and easy to incorporate into user environments and pipelines. The code is publicly available through a clickwrap license at Caltech, but the license does not permit the user to modify and redistribute the software. This presentation outlines successful strategies for maintaining and upgrading Montage in the face of the licensing restrictions and absence of continuing funding, and outlines cases where the restrictions have limited further development.

  6. The Standard European Vector Architecture (SEVA) plasmid toolkit.

    PubMed

    Durante-Rodríguez, Gonzalo; de Lorenzo, Víctor; Martínez-García, Esteban

    2014-01-01

    The Standard European Vector Architecture (SEVA) toolkit is a simple and powerful resource for constructing optimal plasmid vectors based on a backbone and three interchangeable modules flanked by uncommon restriction sites. Functional modules encode several origins of replication, diverse antibiotic selection markers, and a variety of cargoes with different applications. The backbone and DNA modules have been minimized and edited for flaws in their sequence and/or functionality. A protocol for the utilization of the SEVA platform to construct transcriptional and translational fusions between a promoter under study (the arsenic-responsive Pars of Pseudomonas putida KT2440) and the reporter lacZ gene is described. The resulting plasmid collection was instrumental in measuring and comparing the β-galactosidase activity that reports gene expression (i.e., transcription and translation) in different genetic backgrounds.

  7. PHISICS TOOLKIT: MULTI-REACTOR TRANSMUTATION ANALYSIS UTILITY - MRTAU

    SciTech Connect

    Andrea Alfonsi; Cristian Rabiti; Aaron S. Epiney; Yaqi Wang; Joshua Cogliati

    2012-04-01

    The principal idea of this paper is to present the new capabilities available in the PHISICS toolkit, connected with the implementation of the depletion code MRTAU, a generic depletion/decay/burn-up code developed at the Idaho National Laboratory. It is programmed in a modular structure in modern FORTRAN 95/2003. The code tracks the time evolution of the isotopic concentration of a given material, accounting for nuclear reactions occurring in the presence of a neutron flux as well as natural decay. MRTAU offers two different methods for performing the depletion calculation, letting the user choose the one best suited to his needs. Both methodologies and some significant results are reported in this paper.

  8. Enhancing the Informatics Evaluation Toolkit with Remote Usability Testing

    PubMed Central

    Dixon, Brian E.

    2009-01-01

    Developing functional clinical informatics products that are also usable remains a challenge. Despite evidence that usability testing should be incorporated into the lifecycle of health information technologies, rarely does this occur. Challenges include poor standards, a lack of knowledge around usability practices, and the expense involved in rigorous testing with a large number of users. Remote usability testing may be a solution for many of these challenges. Remotely testing an application can greatly enhance the number of users who can iteratively interact with a product, and it can reduce the costs associated with usability testing. A case study presents the experiences with remote usability testing when evaluating a Web site designed for health informatics knowledge dissemination. The lessons can inform others seeking to enhance their evaluation toolkits for clinical informatics products. PMID:20351839

  10. NASA Space Radiation Program Integrative Risk Model Toolkit

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Hu, Shaowen; Plante, Ianik; Ponomarev, Artem L.; Sandridge, Chris

    2015-01-01

    NASA Space Radiation Program Element scientists have been actively involved in developing an integrative risk model toolkit that includes models for acute radiation risk and organ dose projection (ARRBOD), NASA space radiation cancer risk projection (NSCR), hemocyte dose estimation (HemoDose), the GCR event-based risk model code (GERMcode), relativistic ion tracks (RITRACKS), NASA radiation track image (NASARTI), and the On-Line Tool for the Assessment of Radiation in Space (OLTARIS). This session will introduce the components of the risk toolkit with opportunities for hands-on demonstrations. Brief descriptions of each tool: ARRBOD projects organ dose and calculates acute radiation risk from exposure to solar particle events; NSCR projects cancer risk from exposure to space radiation; HemoDose performs retrospective dose estimation using multi-type blood cell counts; GERMcode computes basic physical and biophysical properties of an ion beam, and biophysical and radiobiological properties of beam transport to the target, in the NASA Space Radiation Laboratory beam line; RITRACKS simulates heavy ion and delta-ray track structure, radiation chemistry, DNA structure and DNA damage at the molecular scale; NASARTI models the effects of space radiation on human cells and tissue by incorporating a physical model of tracks, cell nucleus, and DNA damage foci with image segmentation for automated counting; and OLTARIS is an integrated tool set utilizing HZETRN (High Charge and Energy Transport) intended to help scientists and engineers study the effects of space radiation on shielding materials, electronics, and biological systems.

  11. Simulation Toolkit for Renewable Energy Advanced Materials Modeling

    SciTech Connect

    Sides, Scott; Kemper, Travis; Larsen, Ross; Graf, Peter

    2013-11-13

    STREAMM is a collection of python classes and scripts that enables and eases the setup of input files and configuration files for simulations of advanced energy materials. The core STREAMM python classes provide a general framework for storing, manipulating and analyzing atomic/molecular coordinates to be used in quantum chemistry and classical molecular dynamics simulations of soft materials systems. The design focuses on enabling the interoperability of materials simulation codes such as GROMACS, LAMMPS and Gaussian.
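
    A minimal sketch of the kind of container class the abstract describes, one that stores atomic coordinates and supports simple manipulation and analysis, might look like this. Class and method names are invented for illustration and are not STREAMM's.

```python
import math

# Minimal container for atomic coordinates with one manipulation
# (translate) and two analyses (geometric center, bond length).
class Structure:
    def __init__(self):
        self.symbols = []
        self.coords = []                  # list of [x, y, z]

    def add_atom(self, symbol, x, y, z):
        self.symbols.append(symbol)
        self.coords.append([x, y, z])

    def translate(self, dx, dy, dz):
        for c in self.coords:
            c[0] += dx; c[1] += dy; c[2] += dz

    def center(self):
        """Geometric center (unweighted) of all atoms."""
        n = len(self.coords)
        return [sum(c[k] for c in self.coords) / n for k in range(3)]

    def bond_length(self, i, j):
        return math.dist(self.coords[i], self.coords[j])

water = Structure()
water.add_atom("O", 0.000, 0.000, 0.000)
water.add_atom("H", 0.757, 0.586, 0.000)
water.add_atom("H", -0.757, 0.586, 0.000)
water.translate(1.0, 0.0, 0.0)
```

    A framework like the one described would add writers for each target code (e.g. GROMACS or LAMMPS input files) on top of such a container, which is what makes the simulation codes interoperable.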

  12. Compress Your Files

    ERIC Educational Resources Information Center

    Branzburg, Jeffrey

    2005-01-01

    File compression enables data to be squeezed together, greatly reducing file size. Why would someone want to do this? Reducing file size enables the sending and receiving of files over the Internet more quickly, the ability to store more files on the hard drive, and the ability pack many related files into one archive (for example, all files…
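
    The mechanics the abstract alludes to can be demonstrated with Python's standard-library zlib module: redundant data shrinks substantially, and the round trip is lossless. The payload below is invented; real documents compress less dramatically, but the principle is the same.

```python
import zlib

# Compress a highly repetitive payload and verify the lossless round trip.
data = b"the quick brown fox jumps over the lazy dog " * 200

compressed = zlib.compress(data, level=9)   # level 9 = maximum compression
restored = zlib.decompress(compressed)

ratio = len(compressed) / len(data)
print(f"{len(data)} bytes -> {len(compressed)} bytes ({ratio:.1%})")
```

    Archive formats such as ZIP combine this per-file compression with a directory structure, which is how many related files end up packed into one archive.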

  13. Aggregates, broccoli and cauliflower

    NASA Astrophysics Data System (ADS)

    Grey, Francois; Kjems, Jørgen K.

    1989-09-01

    Naturally grown structures with fractal character, such as broccoli and cauliflower, are discussed and compared with DLA-type aggregates. It is suggested that the branching density can be used to characterize the growth process, and an experimental method to determine this parameter is proposed.

  14. STAND: Surface Tension for Aggregation Number Determination.

    PubMed

    Garrido, Pablo F; Brocos, Pilar; Amigo, Alfredo; García-Río, Luis; Gracia-Fadrique, Jesús; Piñeiro, Ángel

    2016-04-26

    Taking advantage of the extremely high dependence of surface tension on the concentration of amphiphilic molecules in aqueous solution, a new model based on the double equilibrium between free and aggregated molecules in the liquid phase, and between free molecules in the liquid phase and those adsorbed at the air/liquid interface, is presented and validated using literature data and fluorescence measurements. A key point of the model is the use of both the Langmuir isotherm and the Gibbs adsorption equation in terms of free molecules instead of the nominal concentration of the solute. The application of the model should be limited to nonionic compounds since it does not consider the presence of counterions. It requires several coupled nonlinear fittings, for which we developed software that is publicly available on our server as a web application. Using this tool, it is straightforward to get the average aggregation number of an amphiphile, the micellization free energy, the adsorption constant, the maximum surface excess (and so the minimum area per molecule), the distribution of solute in the liquid phase between free and aggregated species, and the surface coverage in only a couple of seconds, just by uploading a text file with surface tension vs. concentration data and the corresponding uncertainties. PMID:27048988
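
    As a hedged sketch of the two coupled relations the abstract names, written in generic notation (the paper's own symbols may differ), with c_1 the free-monomer concentration:

```latex
% Langmuir isotherm written in the free-monomer concentration c_1:
\Gamma = \Gamma_{\max}\,\frac{K c_1}{1 + K c_1}

% Gibbs adsorption equation for a nonionic solute, likewise in c_1:
\mathrm{d}\gamma = -RT\,\Gamma\,\mathrm{d}\ln c_1

% Combining the two gives a Szyszkowski-type expression for the surface
% tension, with c_1 tied to the nominal concentration through the
% aggregation equilibrium  n\,A_1 \rightleftharpoons A_n:
\gamma = \gamma_0 - RT\,\Gamma_{\max}\,\ln\!\left(1 + K c_1\right)
```

    Fitting these coupled relations to measured surface tension vs. concentration data is what yields the aggregation number and adsorption parameters in a single pass.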

  15. Fall TIPS: strategies to promote adoption and use of a fall prevention toolkit.

    PubMed

    Dykes, Patricia C; Carroll, Diane L; Hurley, Ann; Gersh-Zaremski, Ronna; Kennedy, Ann; Kurowski, Jan; Tierney, Kim; Benoit, Angela; Chang, Frank; Lipsitz, Stuart; Pang, Justine; Tsurkova, Ruslana; Zuyov, Lyubov; Middleton, Blackford

    2009-11-14

    Patient falls are a serious problem in hospitals. Risk factors for falls are well understood, and nurses routinely assess fall risk for all hospitalized patients. However, the link from nursing assessment of fall risk to the identification and communication of tailored interventions to prevent falls has yet to be established. The Fall TIPS (Tailoring Interventions for Patient Safety) Toolkit was developed to leverage existing practices and workflows and to employ information technology to improve fall prevention practices. The purpose of this paper is to describe the Fall TIPS Toolkit and to report on strategies used to drive adoption of the Toolkit in four acute care hospitals. Using the IHI "Framework for Spread" as a conceptual model, the research team describes the "spread" of the Fall TIPS Toolkit as a means to integrate effective fall prevention practices into the workflow of interdisciplinary caregivers, patients, and family members.

  16. Challenges and Opportunities in Using Automatic Differentiation with Object-Oriented Toolkits for Scientific Computing

    SciTech Connect

    Hovland, P; Lee, S; McInnes, L; Norris, B; Smith, B

    2001-04-17

    The increased use of object-oriented toolkits in large-scale scientific simulation presents new opportunities and challenges for the use of automatic (or algorithmic) differentiation (AD) techniques, especially in the context of optimization. Because object-oriented toolkits use well-defined interfaces and data structures, there is potential for simplifying the AD process. Furthermore, derivative computation can be improved by exploiting high-level information about numerical and computational abstractions. However, challenges to the successful use of AD with these toolkits also exist. Among the greatest challenges is balancing the desire to limit the scope of the AD process with the desire to minimize the work required of a user. The authors discuss their experiences in integrating AD with the PETSc, PVODE, and TAO toolkits and their plans for future research and development in this area.

  17. Development and Evaluation of a Toolkit to Assess Partnership Readiness for Community-Based Participatory Research

    PubMed Central

    Andrews, Jeannette O.; Cox, Melissa J.; Newman, Susan D.; Meadows, Otha

    2012-01-01

    An earlier investigation by academic and community co-investigators led to the development of the Partnership Readiness for Community-Based Participatory Research (CBPR) Model, which defined major dimensions and key indicators of partnership readiness. As a next step in this process, we used qualitative methods, cognitive pretesting, and expert reviews to develop a working guide, or toolkit, based on the model for academic and community partners to assess and leverage their readiness for CBPR. The 75-page toolkit is designed as a qualitative assessment promoting equal voice and transparent, bi-directional discussions among all the partners. The toolkit is formatted to direct individual partner assessments, followed by team assessments, discussions, and action plans to optimize their goodness of fit, capacity, and operations to conduct CBPR. The toolkit has been piloted with two cohorts in the Medical University of South Carolina’s (MUSC) Community Engaged Scholars (CES) Program with promising results from process and outcome evaluation data. PMID:21623021

  18. Wind Integration National Dataset (WIND) Toolkit; NREL (National Renewable Energy Laboratory)

    SciTech Connect

    Draxl, Caroline; Hodge, Bri-Mathias

    2015-07-14

    A webinar about the Wind Integration National Dataset (WIND) Toolkit was presented by Bri-Mathias Hodge and Caroline Draxl on July 14, 2015. It was hosted by the Southern Alliance for Clean Energy. The toolkit is a grid integration data set that contains meteorological and power data at a 5-minute resolution across the continental United States for 7 years and hourly power forecasts.

  19. The TEVA-SPOT toolkit for drinking water contaminant warning system design.

    SciTech Connect

    Berry, Jonathan W.; Riesen, Lee Ann; Hart, William Eugene; Watson, Jean-Paul; Phillips, Cynthia Ann; Murray, Regan Elizabeth; Boman, Erik Gunnar

    2008-08-01

    We present the TEVA-SPOT Toolkit, a sensor placement optimization tool developed within the USEPA TEVA program. The TEVA-SPOT Toolkit provides a sensor placement framework that facilitates research in sensor placement optimization and enables the practical application of sensor placement solvers to real-world CWS design applications. This paper provides an overview of its key features, and then illustrates how this tool can be flexibly applied to solve a variety of different types of sensor placement problems.
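
    TEVA-SPOT's actual solvers are far more sophisticated, but the flavor of the sensor placement problem can be sketched with a greedy heuristic over contamination scenarios. All scenario data, impact numbers, and the no-detection penalty below are invented for illustration.

```python
# Greedy sensor placement: pick, one node at a time, the sensor that most
# reduces the expected impact over a set of contamination scenarios.
# Each scenario maps candidate nodes to the impact incurred if the first
# detection happens there; undetected scenarios incur a fixed penalty.

def expected_impact(scenarios, placed, penalty=100.0):
    total = 0.0
    for detect in scenarios:
        hits = [detect[n] for n in placed if n in detect]
        total += min(hits) if hits else penalty
    return total / len(scenarios)

def greedy_placement(scenarios, candidates, budget):
    chosen = []
    for _ in range(budget):
        best_node, best_cost = None, float("inf")
        for node in candidates:
            if node in chosen:
                continue
            cost = expected_impact(scenarios, chosen + [node])
            if cost < best_cost:
                best_node, best_cost = node, cost
        chosen.append(best_node)
    return chosen

scenarios = [
    {"A": 5.0, "B": 20.0},     # leak near A: detecting at A is cheapest
    {"B": 3.0, "C": 15.0},
    {"C": 4.0},                # only a sensor at C ever sees this one
]
placement = greedy_placement(scenarios, ["A", "B", "C"], budget=2)
```

    Greedy placement is a standard baseline for this family of problems; exact solvers instead formulate the same objective as an integer program.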

  20. A proposal for a spiritual care assessment toolkit for religious volunteers and volunteer service users.

    PubMed

    Liu, Yi-Jung

    2014-10-01

    Based on the idea that volunteer services in healthcare settings should focus on the service users' best interests and on providing holistic care for the body, mind, and spirit, the aim of this study was to propose an assessment toolkit for assessing the effectiveness of religious volunteers and improving their service. By analyzing and categorizing the results of previous studies, we incorporated effective care goals and methods in the proposed religious and spiritual care assessment toolkit. Two versions of the toolkit were created. The service users' version comprises 10 questions grouped into five dimensions: "physical care," "psychological and emotional support," "social relationships," "religious and spiritual care," and "hope restoration." Each question can be answered "yes" or "no." The volunteers' version contains 14 specific care goals and 31 care methods, in addition to the 10 care dimensions of the service users' version. A small sample of 25 experts was asked to judge the usefulness of each toolkit item for evaluating volunteers' effectiveness. Although some experts questioned volunteers' capacity to provide such care, improving the spiritual care capacity and effectiveness of volunteers is the main purpose of developing this assessment toolkit. The toolkit developed in this study may not be applicable to other countries and addresses only patients' general spiritual needs. Volunteers should receive special training in caring for people with special needs.

  1. Designing a Composable Geometric Toolkit for Versatility in Applications to Simulation Development

    NASA Technical Reports Server (NTRS)

    Reed, Gregory S.; Campbell, Thomas

    2008-01-01

    Conceived and implemented through the development of probabilistic risk assessment simulations for Project Constellation, the Geometric Toolkit allows users to create, analyze, and visualize relationships between geometric shapes in three-space using the MATLAB computing environment. The key output of the toolkit is an analysis of how emanations from one "source" geometry (e.g., a leak in a pipe) will affect another "target" geometry (e.g., another heat-sensitive component). It can import computer-aided design (CAD) depictions of a system to be analyzed, allowing the user to reliably and easily represent components within the design and determine the relationships between them, ultimately supporting more technical or physics-based simulations that use the toolkit. We opted to develop a variety of modular, interconnecting software tools to extend the scope of the toolkit, providing the capability to support a range of applications. This concept of simulation composability allows specially-developed tools to be reused by assembling them in various combinations. As a result, the concepts described here and implemented in this toolkit have a wide range of applications outside the domain of risk assessment. To that end, the Geometric Toolkit has been evaluated for use in other unrelated applications due to the advantages provided by its underlying design.

  2. The MPI bioinformatics Toolkit as an integrative platform for advanced protein sequence and structure analysis.

    PubMed

    Alva, Vikram; Nam, Seung-Zin; Söding, Johannes; Lupas, Andrei N

    2016-07-01

    The MPI Bioinformatics Toolkit (http://toolkit.tuebingen.mpg.de) is an open, interactive web service for comprehensive and collaborative protein bioinformatic analysis. It offers a wide array of interconnected, state-of-the-art bioinformatics tools to experts and non-experts alike, developed both externally (e.g. BLAST+, HMMER3, MUSCLE) and internally (e.g. HHpred, HHblits, PCOILS). While a beta version of the Toolkit was released 10 years ago, the current production-level release has been available since 2008 and has serviced more than 1.6 million external user queries. The usage of the Toolkit has continued to increase linearly over the years, reaching more than 400 000 queries in 2015. In fact, through the breadth of its tools and their tight interconnection, the Toolkit has become an excellent platform for experimental scientists as well as a useful resource for teaching bioinformatic inquiry to students in the life sciences. In this article, we report on the evolution of the Toolkit over the last ten years, focusing on the expansion of the tool repertoire (e.g. CS-BLAST, HHblits) and on infrastructural work needed to remain operative in a changing web environment. PMID:27131380

  4. New Mexico aggregate production sites, 1997-1999

    USGS Publications Warehouse

    Orris, Greta J.

    2000-01-01

    This report presents data, including latitude and longitude, for aggregate sites in New Mexico that were believed to be active in the period 1997-1999. The data are presented in paper form in Part A of this report and as Microsoft Excel 97 and Data Interchange Format (DIF) files in Part B. The work was undertaken as part of the effort to update information for the National Atlas. This compilation includes data from the files of the U.S. Geological Survey (USGS); company contacts; the New Mexico Bureau of Mines and Mineral Resources, the New Mexico Bureau of Mine Inspection, and the Mining and Minerals Division of the New Mexico Energy, Minerals and Natural Resources Department (Hatton and others, 1998); information from the Bureau of Land Management; and direct communications with some of the aggregate operators. Additional information on most of the sites is available in Hatton and others (1998).

  5. Proteins aggregation and human diseases

    NASA Astrophysics Data System (ADS)

    Hu, Chin-Kun

    2015-04-01

    Many human diseases and the death of most supercentenarians are related to protein aggregation. Neurodegenerative diseases include Alzheimer's disease (AD), Huntington's disease (HD), Parkinson's disease (PD), frontotemporal lobar degeneration, etc. Such diseases are due to progressive loss of structure or function of neurons caused by protein aggregation. For example, AD is considered to be related to aggregation of Aβ40 (a peptide with 40 amino acids) and Aβ42 (a peptide with 42 amino acids), and HD is considered to be related to aggregation of polyQ (polyglutamine) peptides. In this paper, we briefly review our recent discovery of key factors for protein aggregation. We used a lattice model to study the aggregation rates of proteins and found that the probability for a protein sequence to appear in the conformation of the aggregated state can be used to determine the temperature at which proteins can aggregate most quickly. We used molecular dynamics and simple models of polymer chains to study relaxation and aggregation of proteins under various conditions and found that when the bending-angle-dependent and torsion-angle-dependent interactions are zero or very small, protein chains tend to aggregate at lower temperatures. All-atom models were used to identify a key peptide chain for the aggregation of insulin chains and to find that two polyQ chains prefer an anti-parallel conformation. It is pointed out that in many cases protein aggregation does not result from protein misfolding. A potential drug from Chinese medicine was found for Alzheimer's disease.
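
    As a toy illustration of the idea that the equilibrium probability of an aggregation-prone conformation can pick out a temperature of fastest aggregation, consider a minimal three-state model. The states, energies, and degeneracies are invented and far simpler than the lattice model in the paper.

```python
import math

# Toy three-state model: a native state, an aggregation-prone state, and a
# high-entropy unfolded ensemble. The Boltzmann probability of the
# aggregation-prone state peaks at an intermediate temperature: too cold
# and the native state dominates, too hot and the unfolded ensemble wins.
STATES = {                       # name: (energy, degeneracy), k_B = 1
    "native": (0.0, 1),
    "aggregation-prone": (1.0, 3),
    "unfolded": (2.0, 50),
}

def p_aggregation_prone(T):
    weights = {name: g * math.exp(-e / T) for name, (e, g) in STATES.items()}
    return weights["aggregation-prone"] / sum(weights.values())

temps = [0.05 * k for k in range(1, 101)]        # scan T from 0.05 to 5.0
t_star = max(temps, key=p_aggregation_prone)     # fastest-aggregation proxy
```

    Using this probability as a proxy for the aggregation rate is the simplification; the paper's lattice model computes the analogous conformational probability for an explicit sequence.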

  6. Dynamics of fire ant aggregations

    NASA Astrophysics Data System (ADS)

    Tennenbaum, Michael; Hu, David; Fernandez-Nieves, Alberto

    Fire ant aggregations are an inherently active system. Each ant harvests its own energy and can convert it into motion. The motion of individual ants contributes non-trivially to the bulk material properties of the aggregation. We have measured some of these properties using plate-plate rheology, where the response to an applied external force or deformation is measured. In this talk, we will present data pertaining to the aggregation behavior in the absence of any external force. We quantify the aggregation dynamics by monitoring the rotation of the top plate and by measuring the normal force. We then compare the results with visualizations of 2D aggregations.

  7. The Insight ToolKit image registration framework

    PubMed Central

    Avants, Brian B.; Tustison, Nicholas J.; Stauffer, Michael; Song, Gang; Wu, Baohua; Gee, James C.

    2014-01-01

    Publicly available scientific resources help establish evaluation standards, provide a platform for teaching and improve reproducibility. Version 4 of the Insight ToolKit (ITK4) seeks to establish new standards in publicly available image registration methodology. ITK4 makes several advances in comparison to previous versions of ITK. ITK4 supports both multivariate images and objective functions; it also unifies high-dimensional (deformation field) and low-dimensional (affine) transformations with metrics that are reusable across transform types and with composite transforms that allow arbitrary series of geometric mappings to be chained together seamlessly. Metrics and optimizers take advantage of multi-core resources, when available. Furthermore, ITK4 reduces the parameter optimization burden via principled heuristics that automatically set scaling across disparate parameter types (rotations vs. translations). A related approach also constrains step sizes for gradient-based optimizers. The result is that tuning for different metrics and/or image pairs is rarely necessary, allowing the researcher to more easily focus on design/comparison of registration strategies. In total, the ITK4 contribution is intended as a structure to support reproducible research practices, to provide a more extensive foundation against which to evaluate new work in image registration, and to offer application-level programmers a broad suite of tools on which to build. Finally, we contextualize this work with a reference registration evaluation study with application to pediatric brain labeling. PMID:24817849
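
    ITK4's composite transforms chain arbitrary geometric mappings so they apply in sequence. As a toy, language-neutral sketch of that idea (plain Python closures, not ITK's C++ API), chaining a scaling and a translation:

```python
def affine(matrix, offset):
    """Return a transform p -> M*p + o for 2-D points (illustrative)."""
    def t(p):
        x, y = p
        return (matrix[0][0] * x + matrix[0][1] * y + offset[0],
                matrix[1][0] * x + matrix[1][1] * y + offset[1])
    return t

def compose(*transforms):
    """Chain transforms so they apply left to right, the way a composite
    transform queues geometric mappings."""
    def t(p):
        for f in transforms:
            p = f(p)
        return p
    return t

scale2 = affine([[2.0, 0.0], [0.0, 2.0]], (0.0, 0.0))
shift = affine([[1.0, 0.0], [0.0, 1.0]], (1.0, -1.0))
both = compose(scale2, shift)
# (1, 1) -> scaled to (2, 2) -> shifted to (3, 1)
```

    In ITK itself the same composition also supports deformation fields alongside affine maps, which is what lets registration pipelines mix low- and high-dimensional transforms freely.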

  8. Machine learning for a Toolkit for Image Mining

    NASA Technical Reports Server (NTRS)

    Delanoy, Richard L.

    1995-01-01

    A prototype user environment is described that enables a user with very limited computer skills to collaborate with a computer algorithm to develop search tools (agents) that can be used for image analysis, creating metadata for tagging images, searching for images in an image database on the basis of image content, or as a component of computer vision algorithms. Agents are learned in an ongoing, two-way dialogue between the user and the algorithm. The user points to mistakes made in classification. The algorithm, in response, attempts to discover which image attributes discriminate between objects of interest and clutter. It then builds a candidate agent and applies it to an input image, producing an 'interest' image highlighting features that are consistent with the set of objects and clutter indicated by the user. The dialogue repeats until the user is satisfied. The prototype environment, called the Toolkit for Image Mining (TIM), is currently capable of learning spectral and textural patterns. Learning exhibits rapid convergence to reasonable levels of performance and, when thoroughly trained, appears to be competitive in discrimination accuracy with other classification techniques.

  9. Fewbody: Numerical toolkit for simulating small-N gravitational dynamics

    NASA Astrophysics Data System (ADS)

    Fregeau, John

    2012-08-01

    Fewbody is a numerical toolkit for simulating small-N gravitational dynamics. It is a general N-body dynamics code, although it was written for the purpose of performing scattering experiments, and therefore has several features that make it well-suited for this purpose. Fewbody uses the 8th-order Runge-Kutta Prince-Dormand integration method with 9th-order error estimate and adaptive timestep to advance the N-body system forward in time. It integrates the usual formulation of the N-body equations in configuration space, but allows for the option of global pairwise Kustaanheimo-Stiefel (K-S) regularization (Heggie 1974; Mikkola 1985). The code uses a binary tree algorithm to classify the N-body system into a set of independently bound hierarchies, and performs collisions between stars in the “sticky star” approximation. Fewbody contains a collection of command line utilities that can be used to perform individual scattering and N-body interactions, but is more generally a library of functions that can be used from within other codes.
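
    Fewbody's actual stepper is the 8th-order Prince-Dormand pair with a 9th-order error estimate; as a minimal, self-contained illustration of the adaptive-timestep idea it relies on, here is a step-doubling RK4 sketch (the function names and the tolerance policy are illustrative, not Fewbody's API):

```python
import math

def rk4_step(f, t, y, h):
    """One classical 4th-order Runge-Kutta step for y' = f(t, y)."""
    k1 = [h * v for v in f(t, y)]
    k2 = [h * v for v in f(t + h/2, [yi + ki/2 for yi, ki in zip(y, k1)])]
    k3 = [h * v for v in f(t + h/2, [yi + ki/2 for yi, ki in zip(y, k2)])]
    k4 = [h * v for v in f(t + h, [yi + ki for yi, ki in zip(y, k3)])]
    return [yi + (a + 2*b + 2*c + d) / 6
            for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]

def integrate(f, t, y, t_end, h=1e-2, tol=1e-8):
    """Advance y to t_end with step-doubling error control: compare one
    step of size h against two steps of size h/2, and shrink or grow h
    so the estimated local error stays near tol."""
    while t < t_end:
        h = min(h, t_end - t)
        y_big = rk4_step(f, t, y, h)
        y_half = rk4_step(f, t + h/2, rk4_step(f, t, y, h/2), h/2)
        err = max(abs(a - b) for a, b in zip(y_big, y_half)) + 1e-300
        if err <= tol:                 # accept the more accurate result
            t, y = t + h, y_half
        h *= min(4.0, max(0.1, 0.9 * (tol / err) ** 0.2))
    return y

# Example: a harmonic oscillator x'' = -x returns to its initial
# state (1, 0) after one period of 2*pi.
y_final = integrate(lambda t, y: [y[1], -y[0]], 0.0, [1.0, 0.0],
                    2 * math.pi)
```

    Real small-N codes add regularization and collision detection on top of the raw stepper, but the accept/reject-and-rescale loop above is the core of any adaptive integrator.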

  10. Using the Model Coupling Toolkit to couple earth system models

    USGS Publications Warehouse

    Warner, J.C.; Perlin, N.; Skyllingstad, E.D.

    2008-01-01

    Continued advances in computational resources are providing the opportunity to operate more sophisticated numerical models. Additionally, there is an increasing demand for multidisciplinary studies that include interactions between different physical processes. Therefore there is a strong desire to develop coupled modeling systems that utilize existing models and allow efficient data exchange and model control. The basic system would entail model "1" running on "M" processors and model "2" running on "N" processors, with efficient exchange of model fields at predetermined synchronization intervals. Here we demonstrate two coupled systems: the coupling of the ocean circulation model Regional Ocean Modeling System (ROMS) to the surface wave model Simulating WAves Nearshore (SWAN), and the coupling of ROMS to the atmospheric model Coupled Ocean Atmosphere Prediction System (COAMPS). Both coupled systems use the Model Coupling Toolkit (MCT) as a mechanism for operation control and inter-model distributed memory transfer of model variables. In this paper we describe requirements and other options for model coupling, explain the MCT library, ROMS, SWAN and COAMPS models, methods for grid decomposition and sparse matrix interpolation, and provide an example from each coupled system. Methods presented in this paper are clearly applicable for coupling of other types of models. © 2008 Elsevier Ltd. All rights reserved.
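
    The synchronization pattern described above, each model stepping independently and exchanging fields at predetermined intervals, can be sketched with a serial toy coupler. All class and function names here are hypothetical, and a real MCT coupler additionally handles grid decomposition, interpolation, and distributed-memory transfer:

```python
class ToyModel:
    """Minimal stand-in for a coupled component: advances an internal
    field and can export/import boundary data (illustrative only)."""
    def __init__(self, name, value, rate):
        self.name, self.value, self.rate = name, value, rate
        self.forcing = 0.0

    def step(self, dt):
        # relax toward the forcing supplied by the other model
        self.value += dt * self.rate * (self.forcing - self.value)

    def export_field(self):
        return self.value

    def import_field(self, field):
        self.forcing = field

def run_coupled(m1, m2, dt, n_steps, sync_every):
    """Advance both models, exchanging fields every `sync_every` steps,
    as a coupler would at predetermined synchronization points."""
    for i in range(n_steps):
        m1.step(dt)
        m2.step(dt)
        if (i + 1) % sync_every == 0:
            f1, f2 = m1.export_field(), m2.export_field()
            m1.import_field(f2)
            m2.import_field(f1)
    return m1.value, m2.value

ocean = ToyModel("ocean", 10.0, 0.5)
atmos = ToyModel("atmos", 0.0, 0.5)
v1, v2 = run_coupled(ocean, atmos, dt=0.1, n_steps=200, sync_every=10)
```

    With both components relaxing toward each other's exported field, the two states converge to a common value, which is the qualitative behaviour one expects of a well-posed two-way coupling.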

  11. Targeting protein function: the expanding toolkit for conditional disruption

    PubMed Central

    Campbell, Amy E.; Bennett, Daimark

    2016-01-01

    A major objective in biological research is to understand spatial and temporal requirements for any given gene, especially in dynamic processes acting over short periods, such as catalytically driven reactions, subcellular transport, cell division, cell rearrangement and cell migration. The interrogation of such processes requires the use of rapid and flexible methods of interfering with gene function. However, many of the most widely used interventional approaches, such as RNAi or CRISPR (clustered regularly interspaced short palindromic repeats)-Cas9 (CRISPR-associated 9), operate at the level of the gene or its transcripts, meaning that the effects of gene perturbation are exhibited over longer time frames than the process under investigation. There has been much activity over the last few years to address this fundamental problem. In the present review, we describe recent advances in disruption technologies acting at the level of the expressed protein, involving inducible methods of protein cleavage, (in)activation, protein sequestration or degradation. Drawing on examples from model organisms we illustrate the utility of fast-acting techniques and discuss how different components of the molecular toolkit can be employed to dissect previously intractable biochemical processes and cellular behaviours. PMID:27574023

  12. Targeting protein function: the expanding toolkit for conditional disruption.

    PubMed

    Campbell, Amy E; Bennett, Daimark

    2016-09-01

    A major objective in biological research is to understand spatial and temporal requirements for any given gene, especially in dynamic processes acting over short periods, such as catalytically driven reactions, subcellular transport, cell division, cell rearrangement and cell migration. The interrogation of such processes requires the use of rapid and flexible methods of interfering with gene function. However, many of the most widely used interventional approaches, such as RNAi or CRISPR (clustered regularly interspaced short palindromic repeats)-Cas9 (CRISPR-associated 9), operate at the level of the gene or its transcripts, meaning that the effects of gene perturbation are exhibited over longer time frames than the process under investigation. There has been much activity over the last few years to address this fundamental problem. In the present review, we describe recent advances in disruption technologies acting at the level of the expressed protein, involving inducible methods of protein cleavage, (in)activation, protein sequestration or degradation. Drawing on examples from model organisms we illustrate the utility of fast-acting techniques and discuss how different components of the molecular toolkit can be employed to dissect previously intractable biochemical processes and cellular behaviours.

  13. A Highly Characterized Yeast Toolkit for Modular, Multipart Assembly.

    PubMed

    Lee, Michael E; DeLoache, William C; Cervantes, Bernardo; Dueber, John E

    2015-09-18

    Saccharomyces cerevisiae is an increasingly attractive host for synthetic biology because of its long history in industrial fermentations. However, until recently, most synthetic biology systems have focused on bacteria. While there is a wealth of resources and literature about the biology of yeast, it can be daunting to navigate and extract the tools needed for engineering applications. Here we present a versatile engineering platform for yeast, which contains both a rapid, modular assembly method and a basic set of characterized parts. This platform provides a framework in which to create new designs, as well as data on promoters, terminators, degradation tags, and copy number to inform those designs. Additionally, we describe genome-editing tools for making modifications directly to the yeast chromosomes, which we find preferable to plasmids due to reduced variability in expression. With this toolkit, we strive to simplify the process of engineering yeast by standardizing the physical manipulations and suggesting best practices that together will enable more straightforward translation of materials and data from one group to another. Additionally, by relieving researchers of the burden of technical details, the toolkit frees them to focus on higher-level aspects of experimental design.

  14. Toward a VPH/Physiome ToolKit.

    PubMed

    Garny, Alan; Cooper, Jonathan; Hunter, Peter J

    2010-01-01

    The Physiome Project was officially launched in 1997 and has since brought together teams from around the world to work on the development of a computational framework for the modeling of the human body. At the European level, this effort is focused around patient-specific solutions and is known as the Virtual Physiological Human (VPH) Initiative. Such modeling is both multiscale (in space and time) and multiphysics. This, therefore, requires careful interaction and collaboration between the teams involved in the VPH/Physiome effort, if we are to produce computer models that are not only quantitative, but also integrative and predictive. In that context, several technologies and solutions are already available, developed both by groups involved in the VPH/Physiome effort, and by others. They address areas such as data handling/fusion, markup languages, model repositories, ontologies, tools (for simulation, imaging, data fitting, etc.), as well as grid, middleware, and workflow. Here, we provide an overview of resources that should be considered for inclusion in the VPH/Physiome ToolKit (i.e., the set of tools that addresses the needs and requirements of the Physiome Project and VPH Initiative) and discuss some of the challenges that we are still facing. PMID:20836018

  15. Using the Browser for Science: A Collaborative Toolkit for Astronomy

    NASA Astrophysics Data System (ADS)

    Connolly, A. J.; Smith, I.; Krughoff, K. S.; Gibson, R.

    2011-07-01

    Astronomical surveys have yielded hundreds of terabytes of catalogs and images that span many decades of the electromagnetic spectrum. Even when observatories provide user-friendly web interfaces, exploring these data resources remains a complex and daunting task. In contrast, gadgets and widgets have become popular in social networking (e.g. iGoogle, Facebook). They provide a simple way to make complex data easily accessible that can be customized based on the interest of the user. With ASCOT (an AStronomical COllaborative Toolkit) we expand on these concepts to provide a customizable and extensible gadget framework for use in science. Unlike iGoogle, where all of the gadgets are independent, the gadgets we develop communicate and share information, enabling users to visualize and interact with data through multiple, simultaneous views. With this approach, web-based applications for accessing and visualizing data can be generated easily and, by linking these tools together, integrated and powerful data analysis and discovery tools can be constructed.

  16. Space and Medical Applications of the Geant4 Simulation Toolkit

    NASA Astrophysics Data System (ADS)

    Perl, Joseph

    2008-10-01

    Geant4 is a toolkit to simulate the passage of particles through matter. While Geant4 was developed for High Energy Physics (HEP), applications now include Nuclear, Medical and Space Physics. Medical applications have been increasing rapidly due to the overall growth of Monte Carlo in Medical Physics and the unique qualities of Geant4 as an all-particle code able to handle complex geometry, motion and fields with the flexibility of modern programming and an open, free source code. Work has included characterizing beams and sources, treatment planning and imaging. The all-particle nature of Geant4 has made it popular for the newest modes of radiation treatment: Proton and Particle therapy. Geant4 has been used by ESA, NASA and JAXA to study radiation effects on spacecraft and personnel. The flexibility of Geant4 has enabled teams to incorporate it into their own applications (SPENVIS MULASSIS space environment from QinetiQ and ESA, RADSAFE simulation from Vanderbilt University and NASA). We provide an overview of applications and discuss how Geant4 has responded to specific challenges of moving from HEP to Medical and Space Physics, including recent work to extend Geant4's energy range to low dose radiobiology.

  17. Microgrid Design Toolkit (MDT) Technical Documentation and Component Summaries

    SciTech Connect

    Arguello, Bryan; Gearhart, Jared Lee; Jones, Katherine A.; Eddy, John P.

    2015-09-01

    The Microgrid Design Toolkit (MDT) is a decision support software tool for microgrid designers to use during the microgrid design process. The models that support the two main capabilities in MDT are described. The first capability, the Microgrid Sizing Capability (MSC), is used to determine the size and composition of a new microgrid in the early stages of the design process. MSC is a mixed-integer linear program that is focused on developing a microgrid that is economically viable when connected to the grid. The second capability is focused on refining a microgrid design for operation in islanded mode. This second capability relies on two models: the Technology Management Optimization (TMO) model and Performance Reliability Model (PRM). TMO uses a genetic algorithm to create and refine a collection of candidate microgrid designs. It uses PRM, a simulation based reliability model, to assess the performance of these designs. TMO produces a collection of microgrid designs that perform well with respect to one or more performance metrics.

  18. Scientific Visualization Using the Flow Analysis Software Toolkit (FAST)

    NASA Technical Reports Server (NTRS)

    Bancroft, Gordon V.; Kelaita, Paul G.; Mccabe, R. Kevin; Merritt, Fergus J.; Plessel, Todd C.; Sandstrom, Timothy A.; West, John T.

    1993-01-01

    Over the past few years the Flow Analysis Software Toolkit (FAST) has matured into a useful tool for visualizing and analyzing scientific data on high-performance graphics workstations. Originally designed for visualizing the results of fluid dynamics research, FAST has demonstrated its flexibility by being used in several other areas of scientific research. These research areas include earth and space sciences, acid rain and ozone modelling, and automotive design, just to name a few. This paper describes the current status of FAST, including the basic concepts, architecture, existing functionality and features, and some of the known applications for which FAST is being used. A few of the applications, by both NASA and non-NASA agencies, are outlined in more detail. Each outline describes the goals of the visualization project, the techniques or 'tricks' used to produce the desired results, and any custom modifications made to FAST to further enhance the analysis. Some of the future directions for FAST are also described.

  19. Common File Formats.

    PubMed

    Mills, Lauren

    2014-03-21

    An overview of the many file formats commonly used in bioinformatics and genome sequence analysis is presented, including various data file formats, alignment file formats, and annotation file formats. Example workflows illustrate how some of the different file types are typically used.
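
    As a concrete example of the simplest of these formats, here is a minimal FASTA parser (illustrative only; production workflows would use an established library rather than hand-rolled parsing):

```python
def parse_fasta(text):
    """Parse FASTA-formatted text into {header: sequence}.
    Header lines start with '>'; sequence lines may wrap."""
    records, header, chunks = {}, None, []
    for line in text.splitlines():
        line = line.strip()
        if not line:
            continue
        if line.startswith(">"):
            if header is not None:
                records[header] = "".join(chunks)
            header, chunks = line[1:], []
        else:
            chunks.append(line)
    if header is not None:
        records[header] = "".join(chunks)
    return records

example = """>seq1 demo record
ACGT
ACGT
>seq2
TTGGA
"""
seqs = parse_fasta(example)
# seqs["seq1 demo record"] == "ACGTACGT"
```

    Alignment and annotation formats (SAM/BAM, GFF, VCF, etc.) are tab-delimited and considerably richer, but the same pattern of record delimiters plus per-record fields recurs throughout.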

  20. Tripal v1.1: a standards-based toolkit for construction of online genetic and genomic databases.

    PubMed

    Sanderson, Lacey-Anne; Ficklin, Stephen P; Cheng, Chun-Huai; Jung, Sook; Feltus, Frank A; Bett, Kirstin E; Main, Dorrie

    2013-01-01

    Tripal is an open-source freely available toolkit for construction of online genomic and genetic databases. It aims to facilitate development of community-driven biological websites by integrating the GMOD Chado database schema with Drupal, a popular website creation and content management software. Tripal provides a suite of tools for interaction with a Chado database and display of content therein. The tools are designed to be generic to support the various ways in which data may be stored in Chado. Previous releases of Tripal have supported organisms, genomic libraries, biological stocks, stock collections and genomic features, their alignments and annotations. Also, Tripal and its extension modules provided loaders for commonly used file formats such as FASTA, GFF, OBO, GAF, BLAST XML, KEGG heir files and InterProScan XML. Default generic templates were provided for common views of biological data, which could be customized using an open Application Programming Interface to change the way data are displayed. Here, we report additional tools and functionality that are part of release v1.1 of Tripal. These include (i) a new bulk loader that allows a site curator to import data stored in a custom tab delimited format; (ii) full support of every Chado table for Drupal Views (a powerful tool allowing site developers to construct novel displays and search pages); (iii) new modules including 'Feature Map', 'Genetic', 'Publication', 'Project', 'Contact' and the 'Natural Diversity' modules. Tutorials, mailing lists, download and set-up instructions, extension modules and other documentation can be found at the Tripal website located at http://tripal.info. DATABASE URL: http://tripal.info/. PMID:24163125

  1. Tripal v1.1: a standards-based toolkit for construction of online genetic and genomic databases

    PubMed Central

    Sanderson, Lacey-Anne; Ficklin, Stephen P.; Cheng, Chun-Huai; Jung, Sook; Feltus, Frank A.; Bett, Kirstin E.; Main, Dorrie

    2013-01-01

    Tripal is an open-source freely available toolkit for construction of online genomic and genetic databases. It aims to facilitate development of community-driven biological websites by integrating the GMOD Chado database schema with Drupal, a popular website creation and content management software. Tripal provides a suite of tools for interaction with a Chado database and display of content therein. The tools are designed to be generic to support the various ways in which data may be stored in Chado. Previous releases of Tripal have supported organisms, genomic libraries, biological stocks, stock collections and genomic features, their alignments and annotations. Also, Tripal and its extension modules provided loaders for commonly used file formats such as FASTA, GFF, OBO, GAF, BLAST XML, KEGG heir files and InterProScan XML. Default generic templates were provided for common views of biological data, which could be customized using an open Application Programming Interface to change the way data are displayed. Here, we report additional tools and functionality that are part of release v1.1 of Tripal. These include (i) a new bulk loader that allows a site curator to import data stored in a custom tab delimited format; (ii) full support of every Chado table for Drupal Views (a powerful tool allowing site developers to construct novel displays and search pages); (iii) new modules including ‘Feature Map’, ‘Genetic’, ‘Publication’, ‘Project’, ‘Contact’ and the ‘Natural Diversity’ modules. Tutorials, mailing lists, download and set-up instructions, extension modules and other documentation can be found at the Tripal website located at http://tripal.info. Database URL: http://tripal.info/ PMID:24163125

  2. Kinetics of protein aggregation

    NASA Astrophysics Data System (ADS)

    Knowles, Tuomas

    2015-03-01

    Aggregation into linear nanostructures, notably amyloid and amyloid-like fibrils, is a common form of behaviour exhibited by a range of peptides and proteins. This process was initially discovered in the context of the aetiology of a range of neurodegenerative diseases, but has recently been recognised to be of general significance and has been found at the origin of a number of beneficial functional roles in nature, including as catalytic scaffolds and functional components in biofilms. This talk discusses our ongoing efforts to study the kinetics of linear protein self-assembly by using master equation approaches combined with global analysis of experimental data.

  3. Structure of Viral Aggregates

    NASA Astrophysics Data System (ADS)

    Barr, Stephen; Luijten, Erik

    2010-03-01

    The aggregation of virus particles is a particular form of colloidal self-assembly, since viruses of a given type are monodisperse and have identical, anisotropic surface charge distributions. In small-angle X-ray scattering experiments, the Qbeta virus was found to organize in different crystal structures in the presence of divalent salt and non-adsorbing polymer. Since a simple isotropic potential cannot explain the occurrence of all observed phases, we employ computer simulations to investigate how the surface charge distribution affects the virus interactions. Using a detailed model of the virus particle, we find an asymmetric ion distribution around the virus, which gives rise to the different phases observed.

  4. Case File: The Spazzoids

    MedlinePlus


  5. Using the PhenX Toolkit to Add Standard Measures to a Study.

    PubMed

    Hendershot, Tabitha; Pan, Huaqin; Haines, Jonathan; Harlan, William R; Junkins, Heather A; Ramos, Erin M; Hamilton, Carol M

    2011-10-01

    The PhenX (consensus measures for Phenotypes and eXposures) Toolkit (https://www.phenxtoolkit.org/) offers high-quality, well-established measures of phenotypes and exposures for use by the scientific community. The Toolkit contains 295 measures drawn from 21 research domains (fields of research). The measures were selected by Working Groups of domain experts using a consensus process that included input from the scientific community. The Toolkit provides a description of each PhenX measure, the rationale for including it in the Toolkit, protocol(s) for collecting the measure, and supporting documentation. Users can browse by measures, domains, or collections, or can search the Toolkit using the Smart Query Tool. Once users have selected some measures, they can download a customized Data Collection Worksheet that specifies what information needs to be collected, and a Data Dictionary that describes each variable included in their Data Collection Worksheet. To help researchers find studies with comparable data, PhenX measures and variables are being mapped to studies in the database of Genotypes and Phenotypes (dbGaP).

  6. Educational RIS/PACS simulator integrated with the HIPAA compliant auditing (HCA) toolkit

    NASA Astrophysics Data System (ADS)

    Zhou, Zheng; Liu, Brent J.; Huang, H. K.; Zhang, J.

    2005-04-01

    Health Insurance Portability and Accountability Act (HIPAA), a guideline for healthcare privacy and security, has been officially instituted recently. HIPAA mandates healthcare providers to follow its privacy and security rules, one of which is to have the ability to generate audit trails on the data access for any specific patient on demand. Although most current medical imaging systems such as PACS utilize logs to record their activities, there is a lack of formal methodology to interpret these large volumes of log data and generate HIPAA compliant auditing trails. In this paper, we present a HIPAA compliant auditing (HCA) toolkit for auditing the image data flow of PACS. The toolkit can extract pertinent auditing information from the logs of various PACS components and store the information in a centralized auditing database. The HIPAA compliant audit trails can be generated based on the database, which can also be utilized for data analysis to facilitate the dynamic monitoring of the data flow of PACS. In order to demonstrate the HCA toolkit in a PACS environment, it was integrated with the PACS Simulator, which was presented as an educational tool at SPIE in 2003 and 2004. With the integration of the HCA toolkit with the PACS Simulator, users can learn HIPAA audit concepts and how to generate audit trails of image data access in PACS, as well as trace the image data flow of the PACS Simulator through the toolkit.

  7. The medical exploration toolkit: an efficient support for visual computing in surgical planning and training.

    PubMed

    Mühler, Konrad; Tietjen, Christian; Ritter, Felix; Preim, Bernhard

    2010-01-01

    Application development is often guided by the usage of software libraries and toolkits. For medical applications, the toolkits currently available focus on image analysis and volume rendering; advanced interactive visualizations and user-interface issues are not adequately supported. Hence, we present a toolkit for application development in the field of medical intervention planning, training, and presentation--the MedicalExplorationToolkit (METK). The METK is based on the rapid prototyping platform MeVisLab and offers a large variety of facilities for an easy and efficient application development process. We present dedicated techniques for advanced medical visualizations, exploration, standardized documentation, and interface widgets for common tasks. These include, e.g., advanced animation facilities, viewpoint selection, several illustrative rendering techniques, and new techniques for object selection in 3D surface models. No extended programming skills are needed for application building, since a graphical programming approach can be used. The toolkit is freely available and well documented to facilitate its use and extension. PMID:19910667

  8. PsyToolkit: a software package for programming psychological experiments using Linux.

    PubMed

    Stoet, Gijsbert

    2010-11-01

    PsyToolkit is a set of software tools for programming psychological experiments on Linux computers. Given that PsyToolkit is freely available under the Gnu Public License, open source, and designed such that it can easily be modified and extended for individual needs, it is suitable not only for technically oriented Linux users, but also for students, researchers on small budgets, and universities in developing countries. The software includes a high-level scripting language, a library for the programming language C, and a questionnaire presenter. The software easily integrates with other open source tools, such as the statistical software package R. PsyToolkit is designed to work with external hardware (including IoLab and Cedrus response keyboards and two common digital input/output boards) and to support millisecond timing precision. Four in-depth examples explain the basic functionality of PsyToolkit. Example 1 demonstrates a stimulus-response compatibility experiment. Example 2 demonstrates a novel mouse-controlled visual search experiment. Example 3 shows how to control light emitting diodes using PsyToolkit, and Example 4 shows how to build a light-detection sensor. The last two examples explain the electronic hardware setup such that they can even be used with other software packages. PMID:21139177

  9. Pcetk: A pDynamo-based Toolkit for Protonation State Calculations in Proteins.

    PubMed

    Feliks, Mikolaj; Field, Martin J

    2015-10-26

    Pcetk (a pDynamo-based continuum electrostatic toolkit) is an open-source, object-oriented toolkit for the calculation of proton binding energetics in proteins. The toolkit is a module of the pDynamo software library, combining the versatility of the Python scripting language and the efficiency of the compiled languages, C and Cython. In the toolkit, we have connected pDynamo to the external Poisson-Boltzmann solver, extended-MEAD. Our goal was to provide a modern and extensible environment for the calculation of protonation states, electrostatic energies, titration curves, and other electrostatic-dependent properties of proteins. Pcetk is freely available under the CeCILL license, which is compatible with the GNU General Public License. The toolkit can be found on the Web at the address http://github.com/mfx9/pcetk. The calculation of protonation states in proteins requires a knowledge of the pKa values of protonatable groups in aqueous solution. However, for some groups, such as protonatable ligands bound to a protein, the aqueous pKa values are often difficult to obtain from experiment. As a complement to Pcetk, we revisit an earlier computational method for the estimation of aqueous pKa values that has an accuracy of ±0.5 pKa units or better. Finally, we verify the Pcetk module and the method for estimating aqueous pKa values with different model cases.
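
    Underlying any protonation-state calculation is the relation between a group's pKa and the ambient pH. As a back-of-envelope illustration (not Pcetk's API, which solves the Poisson-Boltzmann equation for many interacting sites), the single-site Henderson-Hasselbalch curve:

```python
def protonated_fraction(pka, ph):
    """Equilibrium probability that a single titratable group is
    protonated, from the Henderson-Hasselbalch relation:
    fraction = 1 / (1 + 10**(pH - pKa))."""
    return 1.0 / (1.0 + 10.0 ** (ph - pka))

def titration_curve(pka, ph_values):
    """Tabulate (pH, protonated fraction) pairs for one site."""
    return [(ph, protonated_fraction(pka, ph)) for ph in ph_values]

# A site with pKa 4.0 (an aspartate-like value, chosen for
# illustration) is half protonated at pH 4 and mostly
# deprotonated at physiological pH.
curve = titration_curve(4.0, [2.0, 4.0, 7.0])
```

    Coupled sites in a protein deviate from this ideal sigmoid, which is precisely why continuum-electrostatics machinery like Pcetk is needed.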

  10. PsyToolkit: a software package for programming psychological experiments using Linux.

    PubMed

    Stoet, Gijsbert

    2010-11-01

    PsyToolkit is a set of software tools for programming psychological experiments on Linux computers. Given that PsyToolkit is freely available under the Gnu Public License, open source, and designed such that it can easily be modified and extended for individual needs, it is suitable not only for technically oriented Linux users, but also for students, researchers on small budgets, and universities in developing countries. The software includes a high-level scripting language, a library for the programming language C, and a questionnaire presenter. The software easily integrates with other open source tools, such as the statistical software package R. PsyToolkit is designed to work with external hardware (including IoLab and Cedrus response keyboards and two common digital input/output boards) and to support millisecond timing precision. Four in-depth examples explain the basic functionality of PsyToolkit. Example 1 demonstrates a stimulus-response compatibility experiment. Example 2 demonstrates a novel mouse-controlled visual search experiment. Example 3 shows how to control light emitting diodes using PsyToolkit, and Example 4 shows how to build a light-detection sensor. The last two examples explain the electronic hardware setup such that they can even be used with other software packages.

  11. Taurine and platelet aggregation

    SciTech Connect

    Nauss-Karol, C.; VanderWende, C.; Gaut, Z.N.

    1986-03-01

    Taurine is a putative neurotransmitter or neuromodulator. The endogenous taurine concentration in human platelets, determined by amino acid analysis, is 15 µM/g. In spite of this high level, taurine is actively accumulated. Uptake is saturable, Na⁺ and temperature dependent, and suppressed by metabolic inhibitors, structural analogues, and several classes of centrally active substances. High, medium and low affinity transport processes have been characterized, and the platelet may represent a model system for taurine transport in the CNS. When platelets were incubated with ¹⁴C-taurine for 30 minutes, then resuspended in fresh medium and reincubated for one hour, essentially all of the taurine was retained within the cells. Taurine, at concentrations ranging from 10-1000 µM, had no effect on platelet aggregation induced by ADP or epinephrine. However, taurine may have a role in platelet aggregation since 35-39% of the taurine taken up by human platelets appears to be secreted during the release reaction induced by low concentrations of either epinephrine or ADP, respectively. This release phenomenon would imply that part of the taurine taken up is stored directly in the dense bodies of the platelet.

  12. Parameter Sweep and Optimization of Loosely Coupled Simulations Using the DAKOTA Toolkit

    SciTech Connect

    Elwasif, Wael R; Bernholdt, David E; Pannala, Sreekanth; Allu, Srikanth; Foley, Samantha S

    2012-01-01

    The increasing availability of large scale computing capabilities has accelerated the development of high-fidelity coupled simulations. Such simulations typically involve the integration of models that implement various aspects of the complex phenomena under investigation. Coupled simulations are playing an integral role in fields such as climate modeling, earth systems modeling, rocket simulations, computational chemistry, fusion research, and many other computational fields. Model coupling provides scientists with systematic ways to virtually explore the physical, mathematical, and computational aspects of the problem. Such exploration is rarely done using a single execution of a simulation, but rather by aggregating the results from many simulation runs that, together, serve to bring to light novel knowledge about the system under investigation. Furthermore, it is often the case (particularly in engineering disciplines) that the study of the underlying system takes the form of an optimization regime, where the control parameter space is explored to optimize an objective function that captures system realizability, cost, performance, or a combination thereof. Novel and flexible frameworks that facilitate the integration of the disparate models into a holistic simulation are used to perform this research, while making efficient use of the available computational resources. In this paper, we describe the integration of the DAKOTA optimization and parameter sweep toolkit with the Integrated Plasma Simulator (IPS), a component-based framework for loosely coupled simulations. The integration allows DAKOTA to exploit the internal task and resource management of the IPS to dynamically instantiate simulation instances within a single IPS instance, allowing for greater control over the trade-off between efficiency of resource utilization and time to completion.
We present a case study showing the use of the combined DAKOTA-IPS system to aid in the design of a lithium ion
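    The core pattern the abstract describes, exploring a control parameter space and aggregating results from many runs to optimize an objective, can be sketched as a minimal grid sweep. This is a generic illustration under our own naming, not DAKOTA or IPS code; a real DAKOTA study would dispatch the evaluations as managed simulation runs rather than direct function calls:

```python
import itertools

def parameter_sweep(objective, grid):
    """Evaluate `objective` over the Cartesian product of the parameter
    values in `grid` and return (best_score, best_params), minimizing."""
    names = sorted(grid)
    best = None
    for values in itertools.product(*(grid[n] for n in names)):
        params = dict(zip(names, values))
        score = objective(**params)
        if best is None or score < best[0]:
            best = (score, params)
    return best

# Usage: minimize (x - 1)^2 + (y + 2)^2 over a coarse grid.
best = parameter_sweep(lambda x, y: (x - 1) ** 2 + (y + 2) ** 2,
                       {"x": [0, 1, 2], "y": [-3, -2, -1]})
```

    The trade-off the paper discusses is exactly where this naive loop breaks down at scale: each `objective` call is an expensive coupled simulation, so the framework must schedule many instances concurrently against finite resources.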

  13. Lawsuit filed.

    PubMed

    1996-04-19

    Social service providers at a Boston center for Haitian immigrants filed a suit in a U.S. District Court against the Federal government. The suit claims that auditors from the Inspector General's Office in the U.S. Department of Health and Human Services demanded the names of clients with AIDS. The auditors explained that this process, conducted to determine whether agencies are properly spending Ryan White CARE Act funds, is appropriate and routine. Attorneys for the social service providers contend that the actions of the auditors breached client confidentiality and privacy.

  14. BioWarehouse: a bioinformatics database warehouse toolkit

    PubMed Central

    Lee, Thomas J; Pouliot, Yannick; Wagner, Valerie; Gupta, Priyanka; Stringer-Calvert, David WJ; Tenenbaum, Jessica D; Karp, Peter D

    2006-01-01

    Background: This article addresses the problem of interoperation of heterogeneous bioinformatics databases. Results: We introduce BioWarehouse, an open source toolkit for constructing bioinformatics database warehouses using the MySQL and Oracle relational database managers. BioWarehouse integrates its component databases into a common representational framework within a single database management system, thus enabling multi-database queries using the Structured Query Language (SQL) while also facilitating a variety of database integration tasks such as comparative analysis and data mining. BioWarehouse currently supports the integration of a pathway-centric set of databases including ENZYME, KEGG, and BioCyc, and in addition the UniProt, GenBank, NCBI Taxonomy, and CMR databases, and the Gene Ontology. Loader tools, written in the C and Java languages, parse and load these databases into a relational database schema. The loaders also apply a degree of semantic normalization to their respective source data, decreasing semantic heterogeneity. The schema supports the following bioinformatics datatypes: chemical compounds, biochemical reactions, metabolic pathways, proteins, genes, nucleic acid sequences, features on protein and nucleic-acid sequences, organisms, organism taxonomies, and controlled vocabularies. As an application example, we applied BioWarehouse to determine the fraction of biochemically characterized enzyme activities for which no sequences exist in the public sequence databases. The answer is that no sequence exists for 36% of enzyme activities for which EC numbers have been assigned. These gaps in sequence data significantly limit the accuracy of genome annotation and metabolic pathway prediction, and are a barrier for metabolic engineering. Complex queries of this type provide examples of the value of the data warehousing approach to bioinformatics research.
Conclusion: BioWarehouse embodies significant progress on the database integration problem for
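    The application example, counting enzyme activities with no sequence in the warehouse, is a typical cross-table SQL query of the kind BioWarehouse enables. The sketch below illustrates the query shape only: the two-table schema is invented for the example, and it uses SQLite in place of the MySQL or Oracle backends the toolkit actually targets:

```python
import sqlite3

# Toy warehouse-style schema (invented for illustration): enzyme activities
# keyed by EC number, and sequence records linked to an EC number.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE enzyme (ec TEXT PRIMARY KEY);
CREATE TABLE sequence (ec TEXT, seq TEXT);
INSERT INTO enzyme VALUES ('1.1.1.1'), ('2.7.7.7'), ('9.9.9.9');
INSERT INTO sequence VALUES ('1.1.1.1', 'MKV'), ('2.7.7.7', 'MTD');
""")

# Count enzyme activities with no associated sequence record.
(missing,) = db.execute("""
    SELECT COUNT(*) FROM enzyme e
    WHERE NOT EXISTS (SELECT 1 FROM sequence s WHERE s.ec = e.ec)
""").fetchone()
```

    Because all component databases share one relational schema, a question spanning several sources reduces to a single SQL statement like this, rather than ad hoc joins across incompatible file formats.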

  15. Mission Analysis, Operations, and Navigation Toolkit Environment (Monte) Version 040

    NASA Technical Reports Server (NTRS)

    Sunseri, Richard F.; Wu, Hsi-Cheng; Evans, Scott E.; Evans, James R.; Drain, Theodore R.; Guevara, Michelle M.

    2012-01-01

    Monte is a software set designed for use in mission design and spacecraft navigation operations. The system can process measurement data, design optimal trajectories and maneuvers, and do orbit determination, all in one application. For the first time, a single software set can be used for mission design and navigation operations. This eliminates problems due to different models and fidelities used in legacy mission design and navigation software. The unique features of Monte 040 include a blowdown thruster model for GRAIL (Gravity Recovery and Interior Laboratory) with associated pressure models, as well as an updated optimal-search capability (COSMIC) that facilitated mission design for ARTEMIS. Existing legacy software lacked the capabilities necessary for these two missions. There is also a mean orbital element propagator and an osculating to mean element converter that allows long-term orbital stability analysis for the first time in compiled code. The optimized trajectory search tool COSMIC allows users to place constraints and controls on their searches without any restrictions. Constraints may be user-defined and depend on trajectory information either forward or backwards in time. In addition, a long-term orbit stability analysis tool (morbiter) existed previously as a set of scripts on top of Monte. Monte is becoming the primary tool for navigation operations, a core competency at JPL. The mission design capabilities in Monte are becoming mature enough for use in project proposals as well as post-phase A mission design. Monte has three distinct advantages over existing software. First, it is being developed in a modern paradigm: object-oriented C++ and Python. Second, the software has been developed as a toolkit, which allows users to customize their own applications and allows the development team to implement requirements quickly, efficiently, and with minimal bugs.
Finally, the software is managed in accordance with the CMMI (Capability Maturity Model

  16. Holographic characterization of protein aggregates

    NASA Astrophysics Data System (ADS)

    Wang, Chen; Zhong, Xiao; Ruffner, David; Stutt, Alexandra; Philips, Laura; Ward, Michael; Grier, David

    Holographic characterization directly measures the size distribution of subvisible protein aggregates in suspension and offers insights into their morphology. Based on holographic video microscopy, this analytical technique records and interprets holograms of individual aggregates in protein solutions as they flow down a microfluidic channel, without requiring labeling or other exceptional sample preparation. The hologram of an individual protein aggregate is analyzed in real time with the Lorenz-Mie theory of light scattering to measure that aggregate's size and optical properties. Detecting, counting and characterizing subvisible aggregates proceeds fast enough for time-resolved studies, and lends itself to tracking trends in protein aggregation arising from changing environmental factors. No other analytical technique provides such a wealth of particle-resolved characterization data in situ. Holographic characterization promises accelerated development of therapeutic protein formulations, improved process control during manufacturing, and streamlined quality assurance during storage and at the point of use. MRSEC and MRI programs of the NSF; Spheryx Inc.

  17. 43 CFR 4.1352 - Who may file; where to file; when to file.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 43 Public Lands: Interior 1 2010-10-01 2010-10-01 false Who may file; where to file; when to file... Indian Lands) § 4.1352 Who may file; where to file; when to file. (a) The applicant or operator may file... to file a timely request constitutes a waiver of the opportunity for a hearing before OSM makes...

  18. SatelliteDL - An IDL Toolkit for the Analysis of Satellite Earth Observations - GOES, MODIS, VIIRS and CERES

    NASA Astrophysics Data System (ADS)

    Fillmore, D. W.; Galloy, M. D.; Kindig, D.

    2013-12-01

    SatelliteDL is an IDL toolkit for the analysis of satellite Earth observations from a diverse set of platforms and sensors. The design features an abstraction layer that allows for easy inclusion of new datasets in a modular way. The core function of the toolkit is the spatial and temporal alignment of satellite swath and geostationary data. IDL has a powerful suite of statistical and visualization tools that can be used in conjunction with SatelliteDL. Our overarching objective is to create utilities that automate the mundane aspects of satellite data analysis, are extensible and maintainable, and do not place limitations on the analysis itself. Toward this end we have constructed SatelliteDL to include (1) HTML and LaTeX API document generation, (2) a unit test framework, (3) automatic message and error logs, (4) HTML and LaTeX plot and table generation, and (5) several real world examples with bundled datasets available for download. For ease of use, datasets, variables and optional workflows may be specified in a flexible format configuration file. Configuration statements may specify, for example, a region and date range, and the creation of images, plots and statistical summary tables for a long list of variables. SatelliteDL enforces data provenance; all data should be traceable and reproducible. The output NetCDF file metadata holds a complete history of the original datasets and their transformations, and a method exists to reconstruct a configuration file from this information. Release 0.1.0 of SatelliteDL is anticipated for the 2013 Fall AGU conference. It will distribute with ingest methods for GOES, MODIS, VIIRS and CERES radiance data (L1) as well as select 2D atmosphere products (L2) such as aerosol and cloud (MODIS and VIIRS) and radiant flux (CERES). 
Future releases will provide ingest methods for ocean and land surface products, gridded and time averaged datasets (L3 Daily, Monthly and Yearly), and support for 3D products such as temperature and
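    The abstract notes that datasets, variables, and workflows are specified in a flexible configuration file. A minimal sketch of that style of configuration, using Python's standard configparser, is shown below; the section and key names are invented for illustration and are not SatelliteDL's actual schema:

```python
import configparser

# Hypothetical analysis configuration in the spirit of SatelliteDL's
# config files (keys are illustrative, not the toolkit's real schema).
text = """
[analysis]
region = -130, 20, -60, 50
start = 2013-01-01
end = 2013-01-31
variables = aerosol_optical_depth, cloud_fraction
"""

config = configparser.ConfigParser()
config.read_string(text)
region = [float(v) for v in config["analysis"]["region"].split(",")]
variables = [v.strip() for v in config["analysis"]["variables"].split(",")]
```

    Driving the analysis from a declarative file like this is what makes the provenance goal tractable: the output metadata can embed the configuration, and the configuration can be reconstructed from the output.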

  19. Instructional Improvement Cycle: A Teacher's Toolkit for Collecting and Analyzing Data on Instructional Strategies. REL 2015-080

    ERIC Educational Resources Information Center

    Cherasaro, Trudy L.; Reale, Marianne L.; Haystead, Mark; Marzano, Robert J.

    2015-01-01

    This toolkit, developed by Regional Educational Laboratory (REL) Central in collaboration with York Public Schools in Nebraska, provides a process and tools to help teachers use data from their classroom assessments to evaluate promising practices. The toolkit provides teachers with guidance on how to deliberately apply and study one classroom…

  20. Implementing Project Based Survey Research Skills to Grade Six ELP Students with "The Survey Toolkit" and "TinkerPlots"[R]

    ERIC Educational Resources Information Center

    Walsh, Thomas, Jr.

    2011-01-01

    "Survey Toolkit Collecting Information, Analyzing Data and Writing Reports" (Walsh, 2009a) is discussed as a survey research curriculum used by the author's sixth grade students. The report describes the implementation of "The Survey Toolkit" curriculum and "TinkerPlots"[R] software to provide instruction to students learning a project based…

  1. Toolkit of Resources for Engaging Parents and Community as Partners in Education. Part 2: Building a Cultural Bridge

    ERIC Educational Resources Information Center

    Regional Educational Laboratory Pacific, 2015

    2015-01-01

    This toolkit is designed to guide school staff in strengthening partnerships with families and community members to support student learning. This toolkit offers an integrated approach to family and community engagement, bringing together research, promising practices, and a wide range of useful tools and resources with explanations and directions…

  2. Toolkit of Resources for Engaging Parents and Community as Partners in Education. Part 4: Engaging All in Data Conversations

    ERIC Educational Resources Information Center

    Regional Educational Laboratory Pacific, 2015

    2015-01-01

    This toolkit is designed to guide school staff in strengthening partnerships with families and community members to support student learning. This toolkit offers an integrated approach to family and community engagement, bringing together research, promising practices, and a wide range of useful tools and resources with explanations and directions…

  3. WATERSHED HEALTH ASSESSMENT TOOLS-INVESTIGATING FISHERIES (WHAT-IF): A MODELING TOOLKIT FOR WATERSHED AND FISHERIES MANAGEMENT

    EPA Science Inventory

    The Watershed Health Assessment Tools-Investigating Fisheries (WHAT-IF) is a decision-analysis modeling toolkit for personal computers that supports watershed and fisheries management. The WHAT-IF toolkit includes a relational database, help-system functions and documentation, a...

  4. The GeoViz Toolkit: Using component-oriented coordination methods for geographic visualization and analysis

    PubMed Central

    Hardisty, Frank; Robinson, Anthony C.

    2010-01-01

    In this paper we present the GeoViz Toolkit, an open-source, internet-delivered program for geographic visualization and analysis that features a diverse set of software components which can be flexibly combined by users who do not have programming expertise. The design and architecture of the GeoViz Toolkit allows us to address three key research challenges in geovisualization: allowing end users to create their own geovisualization and analysis component set on-the-fly, integrating geovisualization methods with spatial analysis methods, and making geovisualization applications sharable between users. Each of these tasks necessitates a robust yet flexible approach to inter-tool coordination. The coordination strategy we developed for the GeoViz Toolkit, called Introspective Observer Coordination, leverages and combines key advances in software engineering from the last decade: automatic introspection of objects, software design patterns, and reflective invocation of methods. PMID:21731423
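    The coordination strategy named above, Introspective Observer Coordination, can be approximated in a few lines. The sketch below is our own illustration, not the GeoViz Toolkit's implementation (which is Java-based): a coordinator introspects each registered component for `set_<event>` methods and reflectively subscribes them as observers of that event:

```python
class Coordinator:
    """Introspection-based coordination sketch: any component exposing a
    set_<event> method is automatically subscribed to that event."""
    def __init__(self):
        self.listeners = {}

    def register(self, component):
        # Reflectively discover observer methods by naming convention.
        for name in dir(component):
            if name.startswith("set_") and callable(getattr(component, name)):
                self.listeners.setdefault(name[4:], []).append(
                    getattr(component, name))

    def broadcast(self, event, value):
        for fn in self.listeners.get(event, []):
            fn(value)

class MapView:
    """A hypothetical geovisualization component."""
    def __init__(self):
        self.selection = None
    def set_selection(self, ids):
        self.selection = ids

coordinator = Coordinator()
view = MapView()
coordinator.register(view)
coordinator.broadcast("selection", [3, 7])  # view.selection is now [3, 7]
```

    The appeal for non-programmer end users is that components assembled on-the-fly wire themselves together by convention, with no explicit glue code per tool pairing.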

  5. Kekule.js: An Open Source JavaScript Chemoinformatics Toolkit.

    PubMed

    Jiang, Chen; Jin, Xi; Dong, Ying; Chen, Ming

    2016-06-27

    Kekule.js is an open-source, object-oriented JavaScript toolkit for chemoinformatics. It provides methods for many common tasks in molecular informatics, including chemical data input/output (I/O), two- and three-dimensional (2D/3D) rendering of chemical structure, stereo identification, ring perception, structure comparison, and substructure search. Encapsulated widgets to display and edit chemical structures directly in web context are also supplied. Developed with web standards, the toolkit is ideal for building chemoinformatics applications over the Internet. Moreover, it is highly platform-independent and can also be used in desktop or mobile environments. Some initial applications, such as plugins for inputting chemical structures on the web and uses in chemistry education, have been developed based on the toolkit. PMID:27243272

  6. MSAT—A new toolkit for the analysis of elastic and seismic anisotropy

    NASA Astrophysics Data System (ADS)

    Walker, Andrew M.; Wookey, James

    2012-12-01

    The design and content of MSAT, a new Matlab toolkit for the study and analysis of seismic and elastic anisotropy, is described. Along with a brief introduction to the basic theory of anisotropic elasticity and a guide to the functions provided by the toolkit, three example applications are discussed. First, the toolkit is used to analyse the effect of pressure on the elasticity of the monoclinic upper mantle mineral diopside. Second, the degree to which a model of elasticity in the lowermost mantle can be approximated by transverse isotropy is examined. Finally backazimuthal variation in the effective shear wave splitting caused by two anisotropic layers where the lower layer is dipping is calculated. MSAT can be freely reused for any purpose and the implementation of these and other examples are distributed with the source code.

  8. The Visualization Toolkit (VTK): Rewriting the rendering code for modern graphics cards

    NASA Astrophysics Data System (ADS)

    Hanwell, Marcus D.; Martin, Kenneth M.; Chaudhary, Aashish; Avila, Lisa S.

    2015-09-01

    The Visualization Toolkit (VTK) is an open source, permissively licensed, cross-platform toolkit for scientific data processing, visualization, and data analysis. It is over two decades old, originally developed for a very different graphics card architecture. Modern graphics cards feature fully programmable, highly parallelized architectures with large core counts. VTK's rendering code was rewritten to take advantage of modern graphics cards, maintaining most of the toolkit's programming interfaces. This offers the opportunity to compare the performance of old and new rendering code on the same systems/cards. Significant improvements in rendering speeds and memory footprints mean that scientific data can be visualized in greater detail than ever before. The widespread use of VTK means that these improvements will reap significant benefits.

  9. Census of Population and Housing, 1980: Summary Tape File 3F. Technical Documentation.

    ERIC Educational Resources Information Center

    Bureau of the Census (DOC), Washington, DC. Data User Services Div.

    This report provides technical documentation associated with a 1980 Census of Population and Housing Summary Tape File (STF) 3F--which contains responses to the extended questionnaire summarized in STF 3, aggregated by school district. The file contains sample data inflated to represent the total population, 100% counts, and unweighted sample…

  10. 77 FR 43897 - Self-Regulatory Organizations; Chicago Board Options Exchange, Incorporated; Notice of Filing of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-26

    ... COMMISSION Release No. 34-67478; File No. SR-CBOE-2012-066] Self-Regulatory Organizations; Chicago Board... Exchange's requirement that TPHs file reports with the Exchange for any customer who held aggregate large... ( http://www.sec.gov/rules/sro.shtml ); or Send an email to rule-comments@sec.gov . Please include...

  11. Modifications to the accuracy assessment analysis routine MLTCRP to produce an output file

    NASA Technical Reports Server (NTRS)

    Carnes, J. G.

    1978-01-01

    Modifications are described that were made to the analysis program MLTCRP in the accuracy assessment software system to produce a disk output file. The output files produced by this modified program are used to aggregate data for regions greater than a single segment.

  12. Using stakeholder perspectives to develop an ePrescribing toolkit for NHS Hospitals: a questionnaire study

    PubMed Central

    Cresswell, Kathrin; Slee, Ann; Slight, Sarah P; Coleman, Jamie; Sheikh, Aziz

    2014-01-01

    Objective: To evaluate how an online toolkit may support ePrescribing deployments in National Health Service hospitals, by assessing the type of knowledge-based resources currently sought by key stakeholders. Design: Questionnaire-based survey of attendees at a national ePrescribing symposium. Setting: 2013 National ePrescribing Symposium in London, UK. Participants: Eighty-four delegates were eligible for inclusion in the survey, of whom 70 completed and returned the questionnaire. Main outcome measures: Estimate of the usefulness and type of content to be included in an ePrescribing toolkit. Results: Interest in a toolkit designed to support the implementation and use of ePrescribing systems was high (n = 64; 91.4%). As could be expected given the current dearth of such a resource, few respondents (n = 2; 2.9%) had access to or used an ePrescribing toolkit at the time of the survey. Anticipated users for the toolkit included implementation (n = 62; 88.6%) and information technology (n = 61; 87.1%) teams, pharmacists (n = 61; 87.1%), doctors (n = 58; 82.9%) and nurses (n = 56; 80.0%). Summary guidance for every stage of the implementation (n = 48; 68.6%), planning and monitoring tools (n = 47; 67.1%) and case studies of hospitals’ experiences (n = 45; 64.3%) were considered the most useful types of content. Conclusions: There is a clear need for reliable and up-to-date knowledge to support ePrescribing system deployments and longer term use. The findings highlight how a toolkit may become a useful instrument for the management of knowledge in the field, not least by allowing the exchange of ideas and shared learning. PMID:25383199

  13. Improving the Effectiveness of Medication Review: Guidance from the Health Literacy Universal Precautions Toolkit

    PubMed Central

    Weiss, Barry D.; Brega, Angela G.; LeBlanc, William G.; Mabachi, Natabhona M.; Barnard, Juliana; Albright, Karen; Cifuentes, Maribel; Brach, Cindy; West, David R.

    2016-01-01

    Background: Although routine medication reviews in primary care practice are recommended to identify drug therapy problems, it is often difficult to get patients to bring all their medications to office visits. The objective of this study was to determine whether the medication review tool in the Agency for Healthcare Research and Quality Health Literacy Universal Precautions Toolkit can help to improve medication reviews in primary care practices. Methods: The toolkit's “Brown Bag Medication Review” was implemented in a rural private practice in Missouri and an urban teaching practice in California. Practices recorded outcomes of medication reviews with 45 patients before toolkit implementation and then changed their medication review processes based on guidance in the toolkit. Six months later we conducted interviews with practice staff to identify changes made as a result of implementing the tool, and practices recorded outcomes of medication reviews with 41 additional patients. Data analyses compared differences in whether all medications were brought to visits, the number of medications reviewed, drug therapy problems identified, and changes in medication regimens before and after implementation. Results: Interviews revealed that practices made the changes recommended in the toolkit to encourage patients to bring medications to office visits. Evaluation before and after implementation revealed a 3-fold increase in the percentage of patients who brought all their prescription medications and a 6-fold increase in the number of prescription medications brought to office visits. The percentage of reviews in which drug therapy problems were identified doubled, as did the percentage of medication regimens revised. Conclusions: Use of the Health Literacy Universal Precautions Toolkit can help to identify drug therapy problems. PMID:26769873

  14. Do-it-yourself guide: How to use the modern single molecule toolkit

    PubMed Central

    Walter, Nils G.; Huang, Cheng-Yen; Manzo, Anthony J.; Sobhy, Mohamed A.

    2008-01-01

    Single molecule microscopy has evolved into the ultimate-sensitivity toolkit to study systems from small molecules to living cells, with the prospect of revolutionizing the modern biosciences. Here we survey the current state-of-the-art in single molecule tools including fluorescence spectroscopy, tethered particle microscopy, optical and magnetic tweezers, and atomic force microscopy. Our review seeks to guide the biological scientist in choosing the right approach from the available single molecule toolkit for applications ranging as far as structural biology, enzymology, nanotechnology, and systems biology. PMID:18511916

  15. A methodological toolkit for field assessments of artisanally mined alluvial diamond deposits

    USGS Publications Warehouse

    Chirico, Peter G.; Malpeli, Katherine C.

    2014-01-01

    This toolkit provides a standardized checklist of critical issues relevant to artisanal mining-related field research. An integrated sociophysical geographic approach to collecting data at artisanal mine sites is outlined. The implementation and results of a multistakeholder approach to data collection, carried out in the assessment of Guinea’s artisanally mined diamond deposits, also are summarized. This toolkit, based on recent and successful field campaigns in West Africa, has been developed as a reference document to assist other government agencies or organizations in collecting the data necessary for artisanal diamond mining or similar natural resource assessments.

  16. A Platform to Build Mobile Health Apps: The Personal Health Intervention Toolkit (PHIT).

    PubMed

    Eckhoff, Randall Peter; Kizakevich, Paul Nicholas; Bakalov, Vesselina; Zhang, Yuying; Bryant, Stephanie Patrice; Hobbs, Maria Ann

    2015-01-01

    Personal Health Intervention Toolkit (PHIT) is an advanced cross-platform software framework targeted at personal self-help research on mobile devices. Following the subjective and objective measurement, assessment, and plan methodology for health assessment and intervention recommendations, the PHIT platform lets researchers quickly build mobile health research Android and iOS apps. They can (1) create complex data-collection instruments using a simple extensible markup language (XML) schema; (2) use Bluetooth wireless sensors; (3) create targeted self-help interventions based on collected data via XML-coded logic; (4) facilitate cross-study reuse from the library of existing instruments and interventions such as stress, anxiety, sleep quality, and substance abuse; and (5) monitor longitudinal intervention studies via daily upload to a Web-based dashboard portal. For physiological data, Bluetooth sensors collect real-time data with on-device processing. For example, using the BinarHeartSensor, the PHIT platform processes the heart rate data into heart rate variability measures, and plots these data as time-series waveforms. Subjective data instruments are user data-entry screens, comprising a series of forms with validation and processing logic. The PHIT instrument library consists of over 70 reusable instruments for various domains including cognitive, environmental, psychiatric, psychosocial, and substance abuse. Many are standardized instruments, such as the Alcohol Use Disorder Identification Test, Patient Health Questionnaire-8, and Post-Traumatic Stress Disorder Checklist. Autonomous instruments such as battery and global positioning system location support continuous background data collection. All data are acquired using a schedule appropriate to the app's deployment. The PHIT intelligent virtual advisor (iVA) is an expert system logic layer, which analyzes the data in real time on the device. 
This data analysis results in a tailored app of interventions
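    PHIT's data-collection instruments are defined in XML, as the abstract describes. The sketch below shows how such an XML-defined instrument might be parsed with Python's standard library; the element and attribute names are invented for the example and are not PHIT's actual schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical instrument definition in the spirit of PHIT's XML-coded
# data-collection instruments (element names invented for this sketch).
xml_text = """
<instrument id="phq8">
  <question id="q1" type="likert" min="0" max="3">Little interest in doing things</question>
  <question id="q2" type="likert" min="0" max="3">Feeling down or hopeless</question>
</instrument>
"""

root = ET.fromstring(xml_text)
questions = [(q.get("id"), q.text) for q in root.findall("question")]
```

    Defining instruments declaratively like this is what enables the cross-study reuse the platform emphasizes: a validated questionnaire becomes a data file rather than app code.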

  17. A Platform to Build Mobile Health Apps: The Personal Health Intervention Toolkit (PHIT)

    PubMed Central

    2015-01-01

    Personal Health Intervention Toolkit (PHIT) is an advanced cross-platform software framework targeted at personal self-help research on mobile devices. Following the subjective and objective measurement, assessment, and plan methodology for health assessment and intervention recommendations, the PHIT platform lets researchers quickly build mobile health research Android and iOS apps. They can (1) create complex data-collection instruments using a simple extensible markup language (XML) schema; (2) use Bluetooth wireless sensors; (3) create targeted self-help interventions based on collected data via XML-coded logic; (4) facilitate cross-study reuse from the library of existing instruments and interventions such as stress, anxiety, sleep quality, and substance abuse; and (5) monitor longitudinal intervention studies via daily upload to a Web-based dashboard portal. For physiological data, Bluetooth sensors collect real-time data with on-device processing. For example, using the BinarHeartSensor, the PHIT platform processes the heart rate data into heart rate variability measures, and plots these data as time-series waveforms. Subjective data instruments are user data-entry screens, comprising a series of forms with validation and processing logic. The PHIT instrument library consists of over 70 reusable instruments for various domains including cognitive, environmental, psychiatric, psychosocial, and substance abuse. Many are standardized instruments, such as the Alcohol Use Disorder Identification Test, Patient Health Questionnaire-8, and Post-Traumatic Stress Disorder Checklist. Autonomous instruments such as battery and global positioning system location support continuous background data collection. All data are acquired using a schedule appropriate to the app’s deployment. The PHIT intelligent virtual advisor (iVA) is an expert system logic layer, which analyzes the data in real time on the device. 
This data analysis results in a tailored set of interventions.
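The PHIT abstract notes that the platform processes raw heart rate data into heart rate variability (HRV) measures on the device. The BinarHeartSensor pipeline itself is not described here, but one standard time-domain HRV measure, RMSSD, can be sketched from RR intervals; this is an illustration only, not PHIT's actual implementation:

```python
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences (RMSSD),
    a standard time-domain heart rate variability measure,
    computed from a sequence of RR intervals in milliseconds."""
    if len(rr_intervals_ms) < 2:
        raise ValueError("need at least two RR intervals")
    # Differences between consecutive beat-to-beat intervals
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Example RR intervals (ms) from a hypothetical resting trace
print(round(rmssd([800, 810, 790, 805, 815]), 2))
```

A sensor pipeline like the one described would compute such measures over a sliding window of recent beats before plotting them as time-series waveforms.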

  18. Report filing in histopathology.

    PubMed Central

    Blenkinsopp, W K

    1977-01-01

    An assessment of alternative methods of filing histopathology report forms in alphabetical order showed that orthodox card index filing is satisfactory up to about 100000 reports but, because of the need for long-term retrieval, when the reports filed exceed this number they should be copied on jacketed microfilm and a new card index file begun. PMID:591645

  19. Common file formats.

    PubMed

    Leonard, Shonda A; Littlejohn, Timothy G; Baxevanis, Andreas D

    2007-01-01

    This appendix discusses a few of the file formats frequently encountered in bioinformatics. Specifically, it reviews the rules for generating FASTA files and provides guidance for interpreting NCBI descriptor lines, commonly found in FASTA files. In addition, it reviews the construction of GenBank, Phylip, MSF and Nexus files. PMID:18428774
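The FASTA rules the appendix reviews are simple: each record begins with a `>` descriptor line, and the following lines up to the next `>` are the sequence. A minimal parser sketch (the sample records are illustrative):

```python
def parse_fasta(text):
    """Parse FASTA-formatted text into (descriptor, sequence) pairs.

    Each record starts with a '>' descriptor line; subsequent lines
    up to the next '>' are concatenated into one sequence string.
    """
    records = []
    descriptor, chunks = None, []
    for line in text.splitlines():
        line = line.strip()
        if line.startswith(">"):
            if descriptor is not None:
                records.append((descriptor, "".join(chunks)))
            descriptor, chunks = line[1:], []
        elif line:
            chunks.append(line)
    if descriptor is not None:
        records.append((descriptor, "".join(chunks)))
    return records

sample = """>sp|P69905|HBA_HUMAN Hemoglobin subunit alpha
MVLSPADKTNVKAAWGKVGAHAGEYGAEALERMF
LSFPTTKTYFPHF
>seq2 a second record
ACGTACGT"""

for desc, seq in parse_fasta(sample):
    # The first whitespace-delimited token of the descriptor is the identifier
    print(desc.split()[0], len(seq))
```

NCBI descriptor lines pack database, accession, and name into the token before the first space (here `sp|P69905|HBA_HUMAN`), which is why splitting on whitespace recovers the identifier.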

  20. Peptide aggregation in neurodegenerative disease.

    PubMed

    Murphy, Regina M

    2002-01-01

    In the not-so-distant past, insoluble aggregated protein was considered as uninteresting and bothersome as yesterday's trash. More recently, protein aggregates have enjoyed considerable scientific interest, as it has become clear that these aggregates play key roles in many diseases. In this review, we focus attention on three polypeptides: beta-amyloid, prion, and huntingtin, which are linked to three feared neurodegenerative diseases: Alzheimer's, "mad cow," and Huntington's disease, respectively. These proteins lack any significant primary sequence homology, yet their aggregates possess very similar features, specifically, high beta-sheet content, fibrillar morphology, relative insolubility, and protease resistance. Because the aggregates are noncrystalline, secrets of their structure at nanometer resolution are only slowly yielding to X-ray diffraction, solid-state NMR, and other techniques. Besides structure, the aggregates may possess similar pathways of assembly. Two alternative assembly pathways have been proposed: the nucleation-elongation and the template-assisted mode. These two modes may be complementary, not mutually exclusive. Strategies for interfering with aggregation, which may provide novel therapeutic approaches, are under development. The structural similarities between protein aggregates of dissimilar origin suggest that therapeutic strategies successful against one disease may have broad utility in others. PMID:12117755

  1. Topics in Probabilistic Judgment Aggregation

    ERIC Educational Resources Information Center

    Wang, Guanchun

    2011-01-01

This dissertation is a compilation of several studies that are united by their relevance to probabilistic judgment aggregation. In the face of complex and uncertain events, panels of judges are frequently consulted to provide probabilistic forecasts, and aggregation of such estimates in groups often yields better results than could have been made…

  2. Mineral of the month: aggregates

    USGS Publications Warehouse

    Tepordei, Valentin V.

    2005-01-01

    Natural aggregates, consisting of crushed stone, and sand and gravel, are a major contributor to economic health, and have an amazing variety of uses. Aggregates are among the most abundant mineral resources and are major basic raw materials used by construction, agriculture and other industries that employ complex chemical and metallurgical processes.

  3. Molecular aggregation of humic substances

    USGS Publications Warehouse

    Wershaw, R. L.

    1999-01-01

Humic substances (HS) form molecular aggregates in solution and on mineral surfaces. Elucidation of the mechanism of formation of these aggregates is important for an understanding of the interactions of HS in soils and natural waters. The HS are formed mainly by enzymatic depolymerization and oxidation of plant biopolymers. These reactions transform the aromatic and lipid plant components into amphiphilic molecules, that is, molecules that consist of separate hydrophobic (nonpolar) and hydrophilic (polar) parts. The nonpolar parts of the molecules are composed of relatively unaltered segments of plant polymers and the polar parts of carboxylic acid groups. These amphiphiles form membrane-like aggregates on mineral surfaces and micelle-like aggregates in solution. The exterior surfaces of these aggregates are hydrophilic, and the interiors constitute separate hydrophobic liquid-like phases.

  4. Nanoparticle aggregation: principles and modeling.

    PubMed

    Zhang, Wen

    2014-01-01

The high surface area to volume ratio of nanoparticles usually results in high reactivity and colloidal instability compared to their bulk counterparts. Aggregation, as well as many other transformations (e.g., dissolution) in the environment, may alter the physiochemical properties, reactivity, fate, transport, and biological interactions (e.g., bioavailability and uptake) of nanoparticles. The unique properties pertinent to nanoparticles, such as shape, size, surface characteristics, composition, and electronic structures, greatly challenge the ability of colloid science to understand nanoparticle aggregation and its environmental impacts. This review briefly introduces fundamentals of aggregation, fractal dimensions, classic and extended Derjaguin-Landau-Verwey-Overbeek (DLVO) theories, aggregation kinetic modeling, and experimental measurements, followed by detailed discussion of the major factors affecting aggregation and the subsequent effects on nanomaterial transport and reactivity.
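The classic DLVO theory the review introduces sums a van der Waals attraction and an electrical double-layer repulsion between particle pairs; an energy barrier between the two terms is what stabilizes a colloid against aggregation. A minimal sketch for two equal spheres in the Derjaguin / weak-overlap approximation follows; all parameter values are illustrative, not drawn from the review, and prefactor conventions vary between texts:

```python
import math

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def dlvo_energy(h, radius, hamaker, psi0, kappa, eps_r=78.5):
    """Classic DLVO pair energy (J) for two equal spheres at surface
    separation h (m), in the Derjaguin / weak-overlap approximation:

      V_vdW = -A * a / (12 * h)                    van der Waals attraction
      V_edl = 2*pi*eps*a*psi0**2 * exp(-kappa*h)   double-layer repulsion

    a = particle radius, A = Hamaker constant, psi0 = surface potential,
    kappa = inverse Debye length. One common form; conventions differ.
    """
    v_vdw = -hamaker * radius / (12.0 * h)
    v_edl = (2.0 * math.pi * eps_r * EPS0 * radius
             * psi0 ** 2 * math.exp(-kappa * h))
    return v_vdw + v_edl

# Illustrative numbers: 50 nm radius, Hamaker constant 1e-20 J,
# 25 mV surface potential, ~3 nm Debye length (roughly 10 mM 1:1 salt).
kT = 1.381e-23 * 298.0
barrier_kT = max(
    dlvo_energy(h_nm * 1e-9, 50e-9, 1e-20, 0.025, 1.0 / 3e-9) / kT
    for h_nm in (0.5 + 0.1 * i for i in range(100))
)
print(f"peak energy barrier ~ {barrier_kT:.1f} kT")
```

Scanning separation for the maximum total energy gives the barrier height in units of kT; barriers of tens of kT imply slow (reaction-limited) aggregation, while a vanishing barrier gives fast, diffusion-limited aggregation.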

  5. Immunogenicity of Therapeutic Protein Aggregates.

    PubMed

    Moussa, Ehab M; Panchal, Jainik P; Moorthy, Balakrishnan S; Blum, Janice S; Joubert, Marisa K; Narhi, Linda O; Topp, Elizabeth M

    2016-02-01

    Therapeutic proteins have a propensity for aggregation during manufacturing, shipping, and storage. The presence of aggregates in protein drug products can induce adverse immune responses in patients that may affect safety and efficacy, and so it is of concern to both manufacturers and regulatory agencies. In this vein, there is a lack of understanding of the physicochemical determinants of immunological responses and a lack of standardized analytical methods to survey the molecular properties of aggregates associated with immune activation. In this review, we provide an overview of the basic immune mechanisms in the context of interactions with protein aggregates. We then critically examine the literature with emphasis on the underlying immune mechanisms as they relate to aggregate properties. Finally, we highlight the gaps in our current understanding of this issue and offer recommendations for future research. PMID:26869409

  6. Mechanics of fire ant aggregations

    NASA Astrophysics Data System (ADS)

    Tennenbaum, Michael; Liu, Zhongyang; Hu, David; Fernandez-Nieves, Alberto

    2016-01-01

    Fire ants link their bodies to form aggregations; these can adopt a variety of structures, they can drip and spread, or withstand applied loads. Here, by using oscillatory rheology, we show that fire ant aggregations are viscoelastic. We find that, at the lowest ant densities probed and in the linear regime, the elastic and viscous moduli are essentially identical over the spanned frequency range, which highlights the absence of a dominant mode of structural relaxation. As ant density increases, the elastic modulus rises, which we interpret by alluding to ant crowding and subsequent jamming. When deformed beyond the linear regime, the aggregation flows, exhibiting shear-thinning behaviour with a stress load that is comparable to the maximum load the aggregation can withstand before individual ants are torn apart. Our findings illustrate the rich, collective mechanical behaviour that can arise in aggregations of active, interacting building blocks.

  7. Mechanics of fire ant aggregations.

    PubMed

    Tennenbaum, Michael; Liu, Zhongyang; Hu, David; Fernandez-Nieves, Alberto

    2016-01-01

    Fire ants link their bodies to form aggregations; these can adopt a variety of structures, they can drip and spread, or withstand applied loads. Here, by using oscillatory rheology, we show that fire ant aggregations are viscoelastic. We find that, at the lowest ant densities probed and in the linear regime, the elastic and viscous moduli are essentially identical over the spanned frequency range, which highlights the absence of a dominant mode of structural relaxation. As ant density increases, the elastic modulus rises, which we interpret by alluding to ant crowding and subsequent jamming. When deformed beyond the linear regime, the aggregation flows, exhibiting shear-thinning behaviour with a stress load that is comparable to the maximum load the aggregation can withstand before individual ants are torn apart. Our findings illustrate the rich, collective mechanical behaviour that can arise in aggregations of active, interacting building blocks. PMID:26501413

  8. Adsorption-induced colloidal aggregation

    NASA Astrophysics Data System (ADS)

    Law, B. M.; Petit, J.-M.; Beysens, D.

    1998-03-01

Reversible colloidal aggregation in binary liquid mixtures has been studied for a number of years. As the phase separation temperature of the liquid mixture is approached, the thickness of an adsorption layer around the colloidal particles increases. Beysens and coworkers have demonstrated experimentally that this adsorption layer is intimately connected with the aggregation of the colloidal particles; however, no definitive theory has been available that can explain all of the experimental observations. In this contribution we describe an extension of the Derjaguin, Landau, Verwey, and Overbeek theory of colloidal aggregation which takes into account the presence of the adsorption layer and which more realistically models the attractive dispersion interactions. This modified theory can quantitatively account for many of the observed experimental features, such as the characteristics of the aggregated state, the general shape of the aggregation line, and the temperature dependence of the second virial coefficient for a lutidine-water mixture containing a small volume fraction of silica colloidal particles.

  9. 26 CFR 1.1502-75 - Filing of consolidated returns.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 26 Internal Revenue 12 2013-04-01 2013-04-01 false Filing of consolidated returns. 1.1502-75... of the acquisition an aggregate of at least 25 percent of the fair market value of the outstanding... at least 25 percent of the fair market value of T's stock for 5 years, then for purposes of...

  10. 26 CFR 1.1502-75 - Filing of consolidated returns.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 26 Internal Revenue 12 2012-04-01 2012-04-01 false Filing of consolidated returns. 1.1502-75... of the acquisition an aggregate of at least 25 percent of the fair market value of the outstanding... at least 25 percent of the fair market value of T's stock for 5 years, then for purposes of...

  11. Excellence in Teaching End-of-Life Care. A New Multimedia Toolkit for Nurse Educators.

    ERIC Educational Resources Information Center

    Wilkie, Diana J.; Judge, Kay M.; Wells, Marjorie J.; Berkley, Ila Meredith

    2001-01-01

    Describes a multimedia toolkit for teaching palliative care in nursing, which contains modules on end-of-life topics: comfort, connections, ethics, grief, impact, and well-being. Other contents include myths, definitions, pre- and postassessments, teaching materials, case studies, learning activities, and resources. (SK)

  12. School Turnaround Teachers: Selection Toolkit. Part of the School Turnaround Collection from Public Impact

    ERIC Educational Resources Information Center

    Public Impact, 2008

    2008-01-01

    This toolkit includes these separate sections: (1) Selection Preparation Guide; (2) Day-of-Interview Tools; (3) Candidate Rating Tools; and (4) Candidate Comparison and Decision Tools. Each of the sections is designed to be used at different stages of the selection process. The first section provides turnaround teacher competencies that are the…

  13. School Turnaround Leaders: Selection Toolkit. Part of the School Turnaround Collection from Public Impact

    ERIC Educational Resources Information Center

    Public Impact, 2008

    2008-01-01

    This toolkit includes the following separate sections: (1) Selection Preparation Guide; (2) Day-of-Interview Tools; (3) Candidate Rating Tools; and (4) Candidate Comparison and Decision Tools. Each of the sections is designed to be used at different stages of the selection process. The first section provides a list of competencies that would…

  14. Tailored prevention of inpatient falls: development and usability testing of the fall TIPS toolkit.

    PubMed

    Zuyev, Lyubov; Benoit, Angela N; Chang, Frank Y; Dykes, Patricia C

    2011-02-01

Patient falls and fall-related injuries are serious problems in hospitals. The Fall TIPS application aims to prevent patient falls by translating routine nursing fall risk assessment into a decision support intervention that communicates fall risk status and creates a tailored evidence-based plan of care that is accessible to the care team, patients, and family members. In our design and implementation of the Fall TIPS toolkit, we used the Spiral Software Development Life Cycle model. The toolkit can generate three output tools: a bed poster, a plan of care, and a patient education handout. A preliminary design of the application was based on initial requirements defined by project leaders and informed by focus groups with end users. The preliminary design partially simulated the paper version of the Morse Fall Scale currently used in the hospitals involved in the research study. Strengths and weaknesses of the first prototype were identified by heuristic evaluation. Usability testing was performed at the sites where the research study is implemented. Suggestions from end users participating in the usability studies were either incorporated directly into the toolkit and output tools, incorporated in slightly modified form, or will be addressed during training. The next step is implementation of the fall prevention toolkit on the pilot testing units.

  15. Building Management Information Systems to Coordinate Citywide Afterschool Programs: A Toolkit for Cities. Executive Summary

    ERIC Educational Resources Information Center

    Kingsley, Chris

    2012-01-01

    This executive summary describes highlights from the report, "Building Management Information Systems to Coordinate Citywide Afterschool Programs: A Toolkit for Cities." City-led efforts to build coordinated systems of afterschool programming are an important strategy for improving the health, safety and academic preparedness of children and…

  16. Development and Evaluation of an Integrated Pest Management Toolkit for Child Care Providers

    ERIC Educational Resources Information Center

    Alkon, Abbey; Kalmar, Evie; Leonard, Victoria; Flint, Mary Louise; Kuo, Devina; Davidson, Nita; Bradman, Asa

    2012-01-01

    Young children and early care and education (ECE) staff are exposed to pesticides used to manage pests in ECE facilities in the United States and elsewhere. The objective of this pilot study was to encourage child care programs to reduce pesticide use and child exposures by developing and evaluating an Integrated Pest Management (IPM) Toolkit for…

  17. Kelley Direct (KD) Toolkit: Toward the Development of Innovative Pedagogical Tools for Business Education

    ERIC Educational Resources Information Center

    Magjuka, Richard J.; Liu, Xiaojing; Lee, Seung-Hee

    2006-01-01

    KD Toolkit shows a representative synthesis of the best practices learned by world-renowned instructors in a top ranked online MBA program in the United States. This article will share and discuss the pedagogical implications of this learning technology and the leadership and innovative effort of the program that afforded the development of KD…

  18. Early Language & Literacy Classroom Observation (ELLCO) Toolkit, Research Edition [with] User's Guide.

    ERIC Educational Resources Information Center

    Smith, Miriam W.; Dickinson, David K.

The document comprises the Early Language and Literacy Classroom Observation (ELLCO) Toolkit, a field-tested observation packet to examine literacy and language practices and materials in prekindergarten through third grade classrooms, and a user's guide providing technical information and instructions for administration and scoring the…

  19. After-School Toolkit: Tips, Techniques and Templates for Improving Program Quality

    ERIC Educational Resources Information Center

    Gutierrez, Nora; Bradshaw, Molly; Furano, Kathryn

    2008-01-01

    This toolkit offers program managers a hands-on guide for implementing quality programming in the after-school hours. The kit includes tools and techniques that increased the quality of literacy programming and helped improve student reading gains in the Communities Organizing Resources to Advance Learning (CORAL) initiative of The James Irvine…

  20. EMERGO: A Methodology and Toolkit for Developing Serious Games in Higher Education

    ERIC Educational Resources Information Center

    Nadolski, Rob J.; Hummel, Hans G. K.; van den Brink, Henk J.; Hoefakker, Ruud E.; Slootmaker, Aad; Kurvers, Hub J.; Storm, Jeroen

    2008-01-01

    Societal changes demand educators to apply new pedagogical approaches. Many educational stakeholders feel that serious games could play a key role in fulfilling this demand, and they lick their chops when looking at the booming industry of leisure games. However, current toolkits for developing leisure games show severe shortcomings when applied…

  1. 78 FR 14774 - U.S. Environmental Solutions Toolkit-Universal Waste

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-07

    ... Toolkit will refer users in foreign markets to U.S. approaches to solving environmental problems and to U... technologies that will outline U.S. approaches to a series of environmental problems and highlight...-482-5665; email: todd.delelle@trade.gov ). Catherine Vial, Team Leader, Environmental and...

  2. 78 FR 14773 - U.S. Environmental Solutions Toolkit-Medical Waste

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-07

    ... Toolkit will refer users in foreign markets to U.S. approaches to solving environmental problems and to U... technologies that will outline U.S. approaches to a series of environmental problems and highlight...-482-5665; email: todd.delelle@trade.gov ). Catherine Vial, Team Leader, Environmental and...

  3. Understanding Disabilities in American Indian and Alaska Native Communities. Toolkit Guide.

    ERIC Educational Resources Information Center

    National Council on Disability, Washington, DC.

    This "toolkit" document is intended to provide a culturally appropriate set of resources to address the unique political and legal concerns of people with disabilities in American Indian/Alaska Native (AI/AN) communities. It provides information on education, health, vocational rehabilitation (VR), independent living, model approaches, and…

  4. Distribution of a Generic Mission Planning and Scheduling Toolkit for Astronomical Spacecraft

    NASA Technical Reports Server (NTRS)

    Kleiner, Steven C.; West, Donald K. (Technical Monitor)

    2001-01-01

This contract provided for the packaging and distribution of the planning and scheduling toolkit developed for the SWAS (Submillimeter Wave Astronomy Satellite). This work is complete and the planning and scheduling software is available for browsing and download at its own web site.

  5. Making Schools the Model for Healthier Environments Toolkit: What It Is

    ERIC Educational Resources Information Center

    Robert Wood Johnson Foundation, 2012

    2012-01-01

    Healthy students perform better. Poor nutrition and inadequate physical activity can affect not only academic achievement, but also other factors such as absenteeism, classroom behavior, ability to concentrate, self-esteem, cognitive performance, and test scores. This toolkit provides information to help make schools the model for healthier…

  6. Midwives in Medical Student and Resident Education and the Development of the Medical Education Caucus Toolkit.

    PubMed

    2015-01-01

In the article, “Midwives in Medical Student and Resident Education and the Development of the Medical Education Caucus Toolkit,” published in the May/June 2015 issue of the Journal of Midwifery & Women's Health (60[3]:304-312), there was an error in the author byline. The correct name of the second author is Amy Nacht, CNM, MSN.

  7. The Student Writing Toolkit: Enhancing Undergraduate Teaching of Scientific Writing in the Biological Sciences

    ERIC Educational Resources Information Center

    Dirrigl, Frank J., Jr.; Noe, Mark

    2014-01-01

    Teaching scientific writing in biology classes is challenging for both students and instructors. This article offers and reviews several useful "toolkit" items that improve student writing. These include sentence and paper-length templates, funnelling and compartmentalisation, and preparing compendiums of corrections. In addition,…

  8. Development of an Online Toolkit for Measuring Commercial Building Energy Efficiency Performance -- Scoping Study

    SciTech Connect

    Wang, Na

    2013-03-13

    This study analyzes the market needs for building performance evaluation tools. It identifies the existing gaps and provides a roadmap for the U.S. Department of Energy (DOE) to develop a toolkit with which to optimize energy performance of a commercial building over its life cycle.

  9. Logic Models for Program Design, Implementation, and Evaluation: Workshop Toolkit. REL 2015-057

    ERIC Educational Resources Information Center

    Shakman, Karen; Rodriguez, Sheila M.

    2015-01-01

    The Logic Model Workshop Toolkit is designed to help practitioners learn the purpose of logic models, the different elements of a logic model, and the appropriate steps for developing and using a logic model for program evaluation. Topics covered in the sessions include an overview of logic models, the elements of a logic model, an introduction to…

  10. College Access and Success for Students Experiencing Homelessness: A Toolkit for Educators and Service Providers

    ERIC Educational Resources Information Center

    Dukes, Christina

    2013-01-01

    This toolkit serves as a comprehensive resource on the issue of higher education access and success for homeless students, including information on understanding homeless students, assisting homeless students in choosing a school, helping homeless students pay for application-related expenses, assisting homeless students in finding financial aid…

  11. Testing Video and Social Media for Engaging Users of the U.S. Climate Resilience Toolkit

    NASA Astrophysics Data System (ADS)

    Green, C. J.; Gardiner, N.; Niepold, F., III; Esposito, C.

    2015-12-01

We developed a custom video production style and a method for analyzing social media behavior so that we may deliberately build and track audience growth for decision-support tools and case studies within the U.S. Climate Resilience Toolkit. The new style of video focuses quickly on decision processes; its 30s format is well-suited for deployment through social media. We measured both traffic and engagement with video using Google Analytics. Each video included an embedded tag, allowing us to measure viewers' behavior: whether or not they entered the toolkit website; the duration of their session on the website; and the number of pages they visited in that session. Results showed that video promotion was more effective on Facebook than Twitter. Facebook links generated twice the number of visits to the toolkit. Videos also increased Facebook interaction overall. Because most Facebook users are return visitors, this campaign did not substantially draw new site visitors. We continue to research and apply these methods in a targeted engagement and outreach campaign that utilizes the theory of social diffusion and social influence strategies to grow our audience of "influential" decision-makers and people within their social networks. Our goal is to increase access to and use of the U.S. Climate Resilience Toolkit.

  12. University of Central Florida and the American Association of State Colleges and Universities: Blended Learning Toolkit

    ERIC Educational Resources Information Center

    EDUCAUSE, 2014

    2014-01-01

    The Blended Learning Toolkit supports the course redesign approach, and interest in its openly available clearinghouse of online tools, strategies, curricula, and other materials to support the adoption of blended learning continues to grow. When the resource originally launched in July 2011, 20 AASCU [American Association of State Colleges and…

  13. A Generic Expert Scheduling System Architecture and Toolkit: GUESS (Generically Used Expert Scheduling System)

    NASA Technical Reports Server (NTRS)

    Liebowitz, Jay; Krishnamurthy, Vijaya; Rodens, Ira; Houston, Chapman; Liebowitz, Alisa; Baek, Seung; Radko, Joe; Zeide, Janet

    1996-01-01

    Scheduling has become an increasingly important element in today's society and workplace. Within the NASA environment, scheduling is one of the most frequently performed and challenging functions. Towards meeting NASA's scheduling needs, a research version of a generic expert scheduling system architecture and toolkit has been developed. This final report describes the development and testing of GUESS (Generically Used Expert Scheduling System).

  14. Field trials of a novel toolkit for evaluating 'intangible' values-related dimensions of projects.

    PubMed

    Burford, Gemma; Velasco, Ismael; Janoušková, Svatava; Zahradnik, Martin; Hak, Tomas; Podger, Dimity; Piggot, Georgia; Harder, Marie K

    2013-02-01

A novel toolkit has been developed, using an original approach to develop its components, for the purpose of evaluating 'soft' outcomes and processes that have previously been generally considered 'intangible': those which are specifically values based. This represents a step-wise, significant change in provision for the assessment of values-based achievements that are of absolutely key importance to most civil society organisations (CSOs) and values-based businesses, and fills a known gap in evaluation practice. In this paper, we demonstrate the significance and rigour of the toolkit by presenting an evaluation of it in three diverse scenarios where different CSOs use it to co-evaluate locally relevant outcomes and processes to obtain results which are both meaningful to them and potentially comparable across organisations. A key strength of the toolkit is its original use of a prior generated, peer-elicited 'menu' of values-based indicators which provides a framework for user CSOs to localise. Principles of participatory, process-based and utilisation-focused evaluation are embedded in this toolkit and shown to be critical to its success, achieving high face-validity and wide applicability. The emerging contribution of this next-generation evaluation tool to other fields, such as environmental values, development and environmental sustainable development, shared values, business, education and organisational change is outlined. PMID:22621861

  15. Student-Centred Learning: Toolkit for Students, Staff and Higher Education Institutions

    ERIC Educational Resources Information Center

    Attard, Angele; Di Iorio, Emma; Geven, Koen; Santa, Robert

    2010-01-01

    This Toolkit forms part of the project entitled "Time for a New Paradigm in Education: Student-Centred Learning" (T4SCL), jointly led by the European Students' Union (ESU) and Education International (EI). This is an EU-funded project under the Lifelong Learning Programme (LLP) administered by the Education, Audiovisual and Culture Executive…

  16. Virtual venue management users manual : access grid toolkit documentation, version 2.3.

    SciTech Connect

    Judson, I. R.; Lefvert, S.; Olson, E.; Uram, T. D.; Mathematics and Computer Science

    2007-10-24

    An Access Grid Venue Server provides access to individual Virtual Venues, virtual spaces where users can collaborate using the Access Grid Venue Client software. This manual describes the Venue Server component of the Access Grid Toolkit, version 2.3. Covered here are the basic operations of starting a venue server, modifying its configuration, and modifying the configuration of the individual venues.

  17. Can a workbook work? Examining whether a practitioner evaluation toolkit can promote instrumental use.

    PubMed

    Campbell, Rebecca; Townsend, Stephanie M; Shaw, Jessica; Karim, Nidal; Markowitz, Jenifer

    2015-10-01

    In large-scale, multi-site contexts, developing and disseminating practitioner-oriented evaluation toolkits are an increasingly common strategy for building evaluation capacity. Toolkits explain the evaluation process, present evaluation design choices, and offer step-by-step guidance to practitioners. To date, there has been limited research on whether such resources truly foster the successful design, implementation, and use of evaluation findings. In this paper, we describe a multi-site project in which we developed a practitioner evaluation toolkit and then studied the extent to which the toolkit and accompanying technical assistance was effective in promoting successful completion of local-level evaluations and fostering instrumental use of the findings (i.e., whether programs directly used their findings to improve practice, see Patton, 2008). Forensic nurse practitioners from six geographically dispersed service programs completed methodologically rigorous evaluations; furthermore, all six programs used the findings to create programmatic and community-level changes to improve local practice. Implications for evaluation capacity building are discussed.

  18. Toolkit Approach to Integrating Library Resources into the Learning Management System

    ERIC Educational Resources Information Center

    Black, Elizabeth L.

    2008-01-01

    As use of learning management systems (LMS) increases, it is essential that librarians are there. Ohio State University Libraries took a toolkit approach to integrate library content in the LMS to facilitate creative and flexible interactions between librarians, students and faculty in Ohio State University's large and decentralized academic…

  19. Use of Remote Sensing Data to Enhance the National Weather Service (NWS) Storm Damage Toolkit

    NASA Technical Reports Server (NTRS)

    Jedlovec, Gary; Molthan, Andrew; White, Kris; Burks, Jason; Stellman, Keith; Smith, Matthew

    2012-01-01

SPoRT is improving the use of near real-time satellite data in response to severe weather events and other disasters, supported through NASA's Applied Sciences Program. Planned interagency collaboration will support NOAA's Damage Assessment Toolkit, with spinoff opportunities to support other entities such as USGS and FEMA.

  20. The Automatic Parallelisation of Scientific Application Codes Using a Computer Aided Parallelisation Toolkit

    NASA Technical Reports Server (NTRS)

    Ierotheou, C.; Johnson, S.; Leggett, P.; Cross, M.; Evans, E.; Jin, Hao-Qiang; Frumkin, M.; Yan, J.; Biegel, Bryan (Technical Monitor)

    2001-01-01

The shared-memory programming model is a very effective way to achieve parallelism on shared memory parallel computers. Historically, the lack of a programming standard for using directives and the rather limited performance due to scalability have affected the take-up of this programming model approach. Significant progress has been made in hardware and software technologies; as a result, the performance of parallel programs with compiler directives has also improved. The introduction of an industrial standard for shared-memory programming with directives, OpenMP, has also addressed the issue of portability. In this study, we have extended the computer aided parallelization toolkit (developed at the University of Greenwich) to automatically generate OpenMP based parallel programs with nominal user assistance. We outline the way in which loop types are categorized and how efficient OpenMP directives can be defined and placed using the in-depth interprocedural analysis that is carried out by the toolkit. We also discuss the application of the toolkit to the NAS Parallel Benchmarks and a number of real-world application codes. This work not only demonstrates the great potential of using the toolkit to quickly parallelize serial programs but also the good performance achievable on up to 300 processors for hybrid message passing and directive-based parallelizations.

  1. Development of the Mississippi communities for healthy living nutrition education toolkit

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The objective of our study was to develop a nutrition education toolkit for communities in the Lower Mississippi Delta (LMD) with content that is current, evidence-based, culturally relevant, and user friendly. The Mississippi Communities for Healthy Living (MCHL), an evidence-based nutrition educa...

  2. Redesigning Schools to Reach Every Student with Excellent Teachers: Teacher & Staff Selection, Development, & Evaluation Toolkit

    ERIC Educational Resources Information Center

    Public Impact, 2012

    2012-01-01

    This toolkit is a companion to the school models provided on OpportunityCulture.org. The school models use job redesign and technology to extend the reach of excellent teachers to more students, for more pay, within budget. Most of these school models create new roles and collaborative teams, enabling all teachers and staff to develop and…

  3. UNESCO-APQN Toolkit: Regulating the Quality of Cross-Border Education

    ERIC Educational Resources Information Center

    Online Submission, 2006

    2006-01-01

    As a result of increasing mobility of students and knowledge, cross border education, especially in higher education, is receiving general attention in the Asia and Pacific region. The toolkit serves as a reference tool to assist local policymakers in the formulation of a regulatory framework for cross-border education that is growing rapidly in…

  4. Assessing the effectiveness of the Pesticides and Farmworker Health Toolkit: a curriculum for enhancing farmworkers' understanding of pesticide safety concepts.

    PubMed

    LePrevost, Catherine E; Storm, Julia F; Asuaje, Cesar R; Arellano, Consuelo; Cope, W Gregory

    2014-01-01

    Among agricultural workers, migrant and seasonal farmworkers have been recognized as a special risk population because these laborers encounter cultural challenges and linguistic barriers while attempting to maintain their safety and health within their working environments. The crop-specific Pesticides and Farmworker Health Toolkit (Toolkit) is a pesticide safety and health curriculum designed to communicate to farmworkers pesticide hazards commonly found in their working environments and to address Worker Protection Standard (WPS) pesticide training criteria for agricultural workers. The goal of this preliminary study was to test evaluation items for measuring knowledge increases among farmworkers and to assess the effectiveness of the Toolkit in improving farmworkers' knowledge of key WPS and risk communication concepts when the Toolkit lesson was delivered by trained trainers in the field. After receiving training on the curriculum, four participating trainers provided lessons using the Toolkit as part of their regular training responsibilities and orally administered a pre- and post-lesson evaluation instrument to 20 farmworker volunteers who were generally representative of the national farmworker population. Farmworker knowledge of pesticide safety messages significantly (P<.05) increased after participation in the lesson. Further, items with visual alternatives were found to be most useful in discriminating between more and less knowledgeable farmworkers. The pilot study suggests that the Pesticides and Farmworker Health Toolkit is an effective, research-based pesticide safety and health intervention for the at-risk farmworker population and identifies a testing format appropriate for evaluating the Toolkit and other similar interventions for farmworkers in the field.

  5. Fractal aggregates in Titan's atmosphere

    NASA Astrophysics Data System (ADS)

    Cabane, M.; Rannou, P.; Chassefiere, E.; Israel, G.

    1993-04-01

    The cluster structure of Titan's atmosphere was modeled by using an Eulerian microphysical model with the specific formulation of microphysical laws applying to fractal particles. The growth of aggregates in the settling phase was treated by introducing the fractal dimension as a parameter of the model. The model was used to obtain a vertical distribution of size and number density of the aggregates for different production altitudes. Results confirm previous estimates of the formation altitude of photochemical aerosols. The vertical profile of the effective radius of aggregates was calculated as a function of the visible optical depth.
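    The abstract introduces the fractal dimension as a model parameter. As a hedged aside, the textbook scaling relation for fractal aggregates (standard in the aerosol literature, not stated in this abstract) links monomer count, aggregate size, and fractal dimension:

```python
def monomer_count(radius, monomer_radius, fractal_dim):
    """Approximate number of monomers in a fractal aggregate:
    N ~ (R / a)**Df, with the structure prefactor taken as 1."""
    return (radius / monomer_radius) ** fractal_dim

# A compact cluster (Df = 3) packs far more mass at a given size than an
# open, diffusion-limited aggregate (Df ~ 2).
print(monomer_count(1.0, 0.01, 3.0) / monomer_count(1.0, 0.01, 2.0))  # factor of 100
```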

  6. Information from Searching Content with an Ontology-Utilizing Toolkit (iSCOUT).

    PubMed

    Lacson, Ronilda; Andriole, Katherine P; Prevedello, Luciano M; Khorasani, Ramin

    2012-08-01

    Radiology reports are permanent legal documents that serve as official interpretation of imaging tests. Manual analysis of textual information contained in these reports requires significant time and effort. This study describes the development and initial evaluation of a toolkit that enables automated identification of relevant information from within these largely unstructured text reports. We developed and made publicly available a natural language processing toolkit, Information from Searching Content with an Ontology-Utilizing Toolkit (iSCOUT). Core functions are included in the following modules: the Data Loader, Header Extractor, Terminology Interface, Reviewer, and Analyzer. The toolkit enables search for specific terms and retrieval of (radiology) reports containing exact term matches as well as similar or synonymous term matches within the text of the report. The Terminology Interface is the main component of the toolkit. It allows query expansion based on synonyms from a controlled terminology (e.g., RadLex or National Cancer Institute Thesaurus [NCIT]). We evaluated iSCOUT document retrieval of radiology reports that contained liver cysts, and compared precision and recall with and without using NCIT synonyms for query expansion. iSCOUT retrieved radiology reports with documented liver cysts with a precision of 0.92 and recall of 0.96, utilizing NCIT. This recall (i.e., utilizing the Terminology Interface) is significantly better than using each of two search terms alone (0.72, p=0.03 for liver cyst and 0.52, p=0.0002 for hepatic cyst). iSCOUT reliably assembled relevant radiology reports for a cohort of patients with liver cysts with significant improvement in document retrieval when utilizing controlled lexicons.
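    The Terminology Interface's query expansion can be illustrated with a small sketch; this is not the actual iSCOUT API, and the SYNONYMS table and function names are hypothetical stand-ins for a controlled terminology such as NCIT:

```python
# Hypothetical synonym table standing in for a controlled terminology.
SYNONYMS = {"liver cyst": ["liver cyst", "hepatic cyst"]}

def expand_query(term):
    """Return the search term plus any synonyms from the terminology."""
    return SYNONYMS.get(term.lower(), [term])

def retrieve(reports, term):
    """Return indices of reports whose text contains the term or a synonym."""
    variants = expand_query(term)
    return [i for i, text in enumerate(reports)
            if any(v in text.lower() for v in variants)]

reports = [
    "Impression: simple hepatic cyst, no follow-up needed.",
    "Normal study. No focal lesion.",
    "A 2 cm liver cyst is again demonstrated.",
]
print(retrieve(reports, "liver cyst"))  # both cyst reports match: [0, 2]
```

Searching for either surface form alone would miss one of the two matching reports, which mirrors the recall gain the study reports for terminology-based expansion.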

  7. Development and formative evaluation of the e-Health Implementation Toolkit (e-HIT)

    PubMed Central

    2010-01-01

    Background The use of Information and Communication Technology (ICT) or e-Health is seen as essential for a modern, cost-effective health service. However, there are well documented problems with implementation of e-Health initiatives, despite the existence of a great deal of research into how best to implement e-Health (an example of the gap between research and practice). This paper reports on the development and formative evaluation of an e-Health Implementation Toolkit (e-HIT) which aims to summarise and synthesise new and existing research on implementation of e-Health initiatives, and present it to senior managers in a user-friendly format. Results The content of the e-HIT was derived by combining data from a systematic review of reviews of barriers and facilitators to implementation of e-Health initiatives with qualitative data derived from interviews of "implementers", that is people who had been charged with implementing an e-Health initiative. These data were summarised, synthesised and combined with the constructs from the Normalisation Process Model. The software for the toolkit was developed by a commercial company (RocketScience). Formative evaluation was undertaken by obtaining user feedback. There are three components to the toolkit - a section on background and instructions for use aimed at novice users; the toolkit itself; and the report generated by completing the toolkit. It is available to download from http://www.ucl.ac.uk/pcph/research/ehealth/documents/e-HIT.xls Conclusions The e-HIT shows potential as a tool for enhancing future e-Health implementations. Further work is needed to make it fully web-enabled, and to determine its predictive potential for future implementations. PMID:20955594

  8. Margins of safety provided by COSHH Essentials and the ILO Chemical Control Toolkit.

    PubMed

    Jones, Rachael M; Nicas, Mark

    2006-03-01

    COSHH Essentials, developed by the UK Health and Safety Executive, and the Chemical Control Toolkit (Toolkit) proposed by the International Labor Organization, are 'control banding' approaches to workplace risk management intended for use by proprietors of small and medium-sized businesses. Both systems group chemical substances into hazard bands based on toxicological endpoint and potency. COSHH Essentials uses the European Union's Risk-phrases (R-phrases), whereas the Toolkit uses R-phrases and the Globally Harmonized System (GHS) of Classification and Labeling of Chemicals. Each hazard band is associated with a range of airborne concentrations, termed exposure bands, which are to be attained by the implementation of recommended control technologies. Here we analyze the margin of safety afforded by the systems and, for each hazard band, define the minimal margin as the ratio of the minimum airborne concentration that produced the toxicological endpoint of interest in experimental animals to the maximum concentration in workplace air permitted by the exposure band. We found that the minimal margins were always <100, with some ranging to <1, and inversely related to molecular weight. The Toolkit-GHS system generally produced margins equal to or larger than COSHH Essentials, suggesting that the Toolkit-GHS system is more protective of worker health. Although these systems predict exposures comparable with current occupational exposure limits, we argue that the minimal margins are better indicators of health protection. Further, given the small margins observed, we feel it is important that revisions of these systems provide the exposure bands to users, so as to permit evaluation of control technology capture efficiency. PMID:16172140
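    The minimal-margin ratio defined in the abstract can be sketched in a few lines of Python; the concentrations below are made-up, illustrative values, not figures from the study:

```python
def minimal_margin(min_effect_conc, band_max_conc):
    """Minimal margin of safety: the lowest airborne concentration that
    produced the toxic endpoint in experimental animals, divided by the
    highest concentration the exposure band permits (same units, e.g. mg/m^3)."""
    return min_effect_conc / band_max_conc

# Illustrative numbers: endpoint observed at 50 mg/m^3, exposure band
# allows up to 1 mg/m^3 -> margin of 50, below the conventional
# 100-fold safety factor.
print(minimal_margin(50.0, 1.0))
```

A margin below 100 in this sketch corresponds to the paper's finding that the banding systems leave less headroom than the customary 100-fold animal-to-human safety factor.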

  9. Phase 1 Development Report for the SESSA Toolkit.

    SciTech Connect

    Knowlton, Robert G.; Melton, Brad J; Anderson, Robert J.

    2014-09-01

    operation of the SESSA toolkit in order to give the user enough information to start using the toolkit. SESSA is currently a prototype system and this documentation covers the initial release of the toolkit. Funding for SESSA was provided by the Department of Defense (DoD), Assistant Secretary of Defense for Research and Engineering (ASD(R&E)) Rapid Fielding (RF) organization. The project was managed by the Defense Forensic Science Center (DFSC), formerly known as the U.S. Army Criminal Investigation Laboratory (USACIL). ACKNOWLEDGEMENTS The authors wish to acknowledge the funding support for the development of the Site Exploitation System for Situational Awareness (SESSA) toolkit from the Department of Defense (DoD), Assistant Secretary of Defense for Research and Engineering (ASD(R&E)) Rapid Fielding (RF) organization. The project was managed by the Defense Forensic Science Center (DFSC), formerly known as the U.S. Army Criminal Investigation Laboratory (USACIL). Special thanks to Mr. Garold Warner, of DFSC, who served as the Project Manager. Individuals that worked on the design, functional attributes, algorithm development, system architecture, and software programming include: Robert Knowlton, Brad Melton, Robert Anderson, and Wendy Amai.

  10. Managing Water Resources Using WebGIS: Development and Application of an ArcGIS Explorer Toolkit - uWATER

    NASA Astrophysics Data System (ADS)

    Lin, Y. F.; Yang, Y. E.

    2010-12-01

    Integration of GIS with decision support systems (DSS) for environmental resources management has been a popular issue for more than a decade. A GIS-based DSS permits visualization of considerations pertinent to the decision-making process and allows informed participation by the non-technical decision maker. However, the information and knowledge contained in GIS datasets cannot be fairly and efficiently shared unless all interested parties have access to GIS software. This problem has been addressed with the recent release by the Environmental Systems Research Institute, Inc. (ESRI) of a free, lightweight GIS browser, ArcGIS Explorer, a development with significant promise for community participation in planning via a GIS-based DSS. The ubiquitous WebGIS Analysis Toolkit for Extensive Resources (uWATER) is an ArcGIS Explorer plug-in package developed by the Illinois State Water Survey for visualizing and analyzing decision support variables such as groundwater modeling results. The uWATER package was coded in Microsoft Visual Basic 2008 and designed to utilize the computing capacity of a single laptop computer. This presentation will demonstrate an application using uWATER for groundwater resource management including: 1) visualizing heads computed by a numerical model; 2) computation using analytical approaches; 3) identifying hydrologic assets (such as fish habitats, streams, and wetlands) threatened by pumping. uWATER is free and can be downloaded from http://www.sws.uiuc.edu/gws/sware/uwater/. The download package includes the plug-in program, user's guide, and sample files.

  11. Surface fractals in liposome aggregation.

    PubMed

    Roldán-Vargas, Sándalo; Barnadas-Rodríguez, Ramon; Quesada-Pérez, Manuel; Estelrich, Joan; Callejas-Fernández, José

    2009-01-01

    In this work, the aggregation of charged liposomes induced by magnesium is investigated. Static and dynamic light scattering, Fourier-transform infrared spectroscopy, and cryotransmission electron microscopy are used as experimental techniques. In particular, multiple intracluster scattering is reduced to a negligible amount using a cross-correlation light scattering scheme. The analysis of the cluster structure, probed by means of static light scattering, reveals an evolution from surface fractals to mass fractals with increasing magnesium concentration. Cryotransmission electron microscopy micrographs of the aggregates are consistent with this interpretation. In addition, a comparative analysis of these results with those previously reported in the presence of calcium suggests that the different hydration energy between lipid vesicles when these divalent cations are present plays a fundamental role in the cluster morphology. This suggestion is also supported by infrared spectroscopy data. The kinetics of the aggregation processes is also analyzed through the time evolution of the mean diffusion coefficient of the aggregates. PMID:19257067

  13. Electromagnetic charges in aggregation phenomena.

    NASA Astrophysics Data System (ADS)

    Rioux, Claude; Slobodrian, R. J.

    Introduction The mechanism of fine particles aggregation is of great importance in many areas of research, in particular environment sciences, where the state of aggregation defines the removal speed of dust from the atmosphere. The study of this mechanism is also important to understand the first stage of planet formation from the solar nebula. The aggregates formed are generally fractals and, as mentioned in the literature [1], the fractal dimensions and the site growth probability measures of the resulting fractal structures strongly depend on the properties of the forces that cause the aggregation. Theory and experimental apparatus We began this study with the aggregation between two charged particles and are now considering the aggregation between two magnetized particles. The aggregations are produced in a gas at a pressure between 10 and 1000 mbar and, by using the applicable simplifications, we find that the distance (r) between the particles as a function of time (t) is given by the following equations: r = C_e(t_f - t)^(1/3) for the electrical attraction, and r = C_m(t_f - t)^(1/5) for magnetic dipoles aligned in an external magnetic field. The apparatus built for these measurements consists of an experimental cell from which two perpendicular views are combined via an optical system into one image recorded by a video camera. From the video, we can then measure the distance between the particles as a function of time and reconstruct the trajectories in 3-D. The horizontal and vertical resolutions are respectively 0.86 and 0.92 microns per pixel. With a depth of field of 250 microns, the usable volume for 3-D observation is then 250 microns x 250 microns x 443 microns. Results and discussion A first version of the apparatus was tested on an electrical force aggregation and the results [2] show that the corresponding equation is a good representation of the phenomenon. Preliminary results, from an experiment using iron particles, show that the magnetic force can be seen in
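    A minimal numerical sketch of the two power laws above; the constants (C and the contact time t_f) are hypothetical fit parameters, not values from the experiment:

```python
def separation(t, tf, C, exponent):
    """Inter-particle distance r(t) = C * (tf - t)**exponent for t < tf,
    where tf is the time at which the particles make contact."""
    return C * (tf - t) ** exponent

# electrical attraction uses exponent 1/3; aligned magnetic dipoles, 1/5
r_electric = separation(t=0.0, tf=8.0, C=2.0, exponent=1 / 3)   # ~4.0
r_magnetic = separation(t=0.0, tf=32.0, C=2.0, exponent=1 / 5)  # ~4.0
print(r_electric, r_magnetic)
```

Fitting measured r(t) curves to these forms (e.g. by regressing log r against log(t_f - t)) is one way to check which force law governs an observed aggregation event.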

  14. Aggregate breakdown of nanoparticulate titania

    NASA Astrophysics Data System (ADS)

    Venugopal, Navin

    Six nanosized titanium dioxide powders synthesized from a sulfate process were investigated. The targeted end-use of this powder was for a de-NOx catalyst honeycomb monolith. Alteration of synthesis parameters had resulted principally in differences in soluble ion level and specific surface area of the powders. The goal of this investigation was to understand the role of synthesis parameters in the aggregation behavior of these powders. Investigation via scanning electron microscopy of the powders revealed three different aggregation iterations at specific length scales. Secondary and higher order aggregate strength was investigated via oscillatory stress rheometry as a means of simulating shear conditions encountered during extrusion. G' and G'' were measured as a function of the applied oscillatory stress. Oscillatory rheometry indicated a strong variation as a function of the sulfate level of the particles in the viscoelastic yield strengths. Powder yield stresses ranged from 3.0 Pa to 24.0 Pa of oscillatory stress. Compaction curves to 750 MPa found strong similarities in extrapolated yield point of stage I and II compaction for each of the powders (at approximately 500 MPa) suggesting that the variation in sulfate was greatest above the primary aggregate level. Scanning electron microscopy of samples at different states of shear in oscillatory rheometry confirmed the variation in the linear elastic region and the viscous flow regime. A technique of this investigation was to approach aggregation via a novel perspective: aggregates are distinguished as being loose open structures that are highly disordered and stochastic in nature. The methodology used was to investigate the shear stresses required to rupture the various aggregation stages encountered and investigate the attempt to realign the now free-flowing constituents comprising the aggregate into a denser configuration. Mercury porosimetry was utilized to measure the pore size of the compact resulting from

  15. Glycation precedes lens crystallin aggregation

    SciTech Connect

    Swamy, M.S.; Perry, R.E.; Abraham, E.C.

    1987-05-01

    Non-enzymatic glycosylation (glycation) seems to have the potential to alter the structure of crystallins and make them susceptible to thiol oxidation leading to disulfide-linked high molecular weight (HMW) aggregate formation. They used streptozotocin diabetic rats during precataract and cataract stages and long-term cell-free glycation of bovine lens crystallins to study the relationship between glycation and lens crystallin aggregation. HMW aggregates and other protein components of the water-soluble (WS) and urea-soluble (US) fractions were separated by molecular sieve high performance liquid chromatography. Glycation was estimated by both (³H)NaBH₄ reduction and phenylboronate agarose affinity chromatography. Levels of total glycated protein (GP) in the US fractions were about 2-fold higher than in the WS fractions and there was a linear increase in GP in both WS and US fractions. This increase was paralleled by a corresponding increase in HMW aggregates. Total GP extracted by the affinity method from the US fraction showed a predominance of HMW aggregates and vice versa. Cell-free glycation studies with bovine crystallins confirmed the results of the animal studies. Increasing glycation caused a corresponding increase in protein insolubilization and the insoluble fraction thus formed also contained more glycated protein. It appears that lens protein glycation, HMW aggregate formation, and protein insolubilization are interrelated.

  16. Ash Aggregates in Proximal Settings

    NASA Astrophysics Data System (ADS)

    Porritt, L. A.; Russell, K.

    2012-12-01

    Ash aggregates are thought to have formed within and been deposited by the eruption column and plume and dilute density currents and their associated ash clouds. Moist, turbulent ash clouds are considered critical to ash aggregate formation by facilitating both collision and adhesion of particles. Consequently, they are most commonly found in distal deposits. Proximal deposits containing ash aggregates are less commonly observed but do occur. Here we describe two occurrences of vent proximal ash aggregate-rich deposits; the first within a kimberlite pipe where coated ash pellets and accretionary lapilli are found within the intra-vent sequence; and the second in a glaciovolcanic setting where cored pellets (armoured lapilli) occur within <1 km of the vent. The deposits within the A418 pipe, Diavik Diamond Mine, Canada, are the residual deposits within the conduit and vent of the volcano and are characterised by an abundance of ash aggregates. Coated ash pellets are dominant but are followed in abundance by ash pellets, accretionary lapilli and rare cored pellets. The coated ash pellets typically range from 1 - 5 mm in diameter and have core to rim ratios of approximately 10:1. The formation and preservation of these aggregates elucidates the style and nature of the explosive phase of kimberlite eruption at A418 (and other pipes?). First, these pyroclasts dictate the intensity of the kimberlite eruption; it must be energetic enough to cause intense fragmentation of the kimberlite to produce a substantial volume of very fine ash (<62 μm). Secondly, the ash aggregates indicate the involvement of moisture coupled with the presence of dilute expanded eruption clouds. The structure and distribution of these deposits throughout the kimberlite conduit demand that aggregation and deposition operate entirely within the confines of the vent; this indicates that aggregation is a rapid process. Ash aggregates within glaciovolcanic sequences are also rarely documented. The

  17. High-Performance, Multi-Node File Copies and Checksums for Clustered File Systems

    NASA Technical Reports Server (NTRS)

    Kolano, Paul Z.; Ciotti, Robert B.

    2012-01-01

    Modern parallel file systems achieve high performance using a variety of techniques, such as striping files across multiple disks to increase aggregate I/O bandwidth and spreading disks across multiple servers to increase aggregate interconnect bandwidth. To achieve peak performance from such systems, it is typically necessary to utilize multiple concurrent readers/writers from multiple systems to overcome various single-system limitations, such as number of processors and network bandwidth. The standard cp and md5sum tools of GNU coreutils found on every modern Unix/Linux system, however, utilize a single execution thread on a single CPU core of a single system, and hence cannot take full advantage of the increased performance of clustered file systems. Mcp and msum are drop-in replacements for the standard cp and md5sum programs that utilize multiple types of parallelism and other optimizations to achieve maximum copy and checksum performance on clustered file systems. Multi-threading is used to ensure that nodes are kept as busy as possible. Read/write parallelism allows individual operations of a single copy to be overlapped using asynchronous I/O. Multi-node cooperation allows different nodes to take part in the same copy/checksum. Split-file processing allows multiple threads to operate concurrently on the same file. Finally, hash trees allow inherently serial checksums to be performed in parallel. Mcp and msum provide significant performance improvements over standard cp and md5sum using multiple types of parallelism and other optimizations. The total speed-ups from all improvements are significant. Mcp improves cp performance over 27x, msum improves md5sum performance almost 19x, and the combination of mcp and msum improves verified copies via cp and md5sum by almost 22x. These improvements come in the form of drop-in replacements for cp and md5sum, so are easily used and are available for download as open source software at http://mutil.sourceforge.net.
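    The split-file and hash-tree ideas can be sketched as follows. This is a toy Python analogue, not the actual mcp/msum implementation, and its root hash intentionally differs from a plain md5sum of the whole file:

```python
import hashlib
from concurrent.futures import ThreadPoolExecutor

CHUNK = 1 << 20  # hash 1 MiB chunks independently (split-file processing)

def chunk_digests(data, chunk=CHUNK):
    """MD5 each fixed-size chunk; chunks are independent, so they can be
    hashed by different threads (or, in msum, different nodes)."""
    pieces = [data[i:i + chunk] for i in range(0, len(data), chunk)]
    with ThreadPoolExecutor() as pool:
        return list(pool.map(lambda p: hashlib.md5(p).hexdigest(), pieces))

def root_digest(data):
    """Combine per-chunk digests into one root hash (a one-level hash tree).
    Note: this does NOT equal the md5sum of the whole file."""
    return hashlib.md5("".join(chunk_digests(data)).encode()).hexdigest()

data = b"x" * (3 * CHUNK + 17)
assert root_digest(data) == root_digest(data)  # deterministic across runs
```

The key design point is that a serial hash forces byte order, while hashing fixed chunks and then hashing the concatenated digests removes that serial dependency at the cost of a different (but still deterministic) final value.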

  18. Exploiting Lustre File Joining for Effective Collective IO

    SciTech Connect

    Yu, Weikuan; Vetter, Jeffrey S; Canon, Richard Shane; Jiang, Song

    2007-01-01

    Lustre is a parallel file system that presents high aggregated IO bandwidth by striping file extents across many storage devices. However, our experiments indicate excessively wide striping can cause performance degradation. Lustre supports an innovative file joining feature that joins files in place. To mitigate striping overhead and benefit collective IO, we propose two techniques: split writing and hierarchical striping. In split writing, a file is created as separate subfiles, each of which is striped to only a few storage devices. They are joined as a single file at the file close time. Hierarchical striping builds on top of split writing and orchestrates the span of subfiles in a hierarchical manner to avoid overlapping and achieve the appropriate coverage of storage devices. Together, these techniques can avoid the overhead associated with large stripe width, while still being able to combine bandwidth available from many storage devices. We have prototyped these techniques in the ROMIO implementation of MPI-IO. Experimental results indicate that split writing and hierarchical striping can significantly improve the performance of Lustre collective IO in terms of both data transfer and management operations. On a Lustre file system configured with 46 object storage targets, our implementation improves collective write performance of a 16-process job by as much as 220%.
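    Split writing followed by an in-place join can be sketched with ordinary files standing in for narrowly striped Lustre subfiles (a toy analogue; names and the copy-based join are illustrative, since Lustre's file joining avoids data movement):

```python
import os
import tempfile

def split_write(dest, parts):
    """Write each part to its own subfile (each of which could be striped
    to only a few storage targets), then join them into one file at close."""
    workdir = tempfile.mkdtemp()
    subfiles = []
    for i, part in enumerate(parts):
        sub = os.path.join(workdir, "sub%d" % i)
        with open(sub, "wb") as f:
            f.write(part)
        subfiles.append(sub)
    with open(dest, "wb") as out:  # the join step at file close time
        for sub in subfiles:
            with open(sub, "rb") as f:
                out.write(f.read())

dest = os.path.join(tempfile.mkdtemp(), "joined.dat")
split_write(dest, [b"alpha", b"beta", b"gamma"])
print(open(dest, "rb").read())  # b'alphabetagamma'
```

Hierarchical striping, as described above, would additionally arrange which storage targets each subfile covers so that their stripes do not overlap.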

  19. Crystal aggregation in kidney stones; a polymer aggregation problem?

    NASA Astrophysics Data System (ADS)

    Wesson, J.; Beshensky, A.; Viswanathan, P.; Zachowicz, W.; Kleinman, J.

    2008-03-01

    Kidney stones most frequently form as aggregates of calcium oxalate monohydrate (COM) crystals with organic layers between them, and the organic layers contain principally proteins. The pathway leading to the formation of these crystal aggregates in affected people has not been identified, but stone forming patients are thought to have a defect in the structure or distribution of urinary proteins, which normally protect against stone formation. We have developed two polyelectrolyte models that will induce COM crystal aggregation in vitro, and both are consistent with possible urinary protein compositions. The first model was based on mixing polyanionic and polycationic proteins, in portions such that the combined protein charge is near zero. The second model was based on reducing the charge density on partially charged polyanionic proteins, specifically Tamm-Horsfall protein, the second most abundant protein in urine. Both models demonstrated polymer phase separation at solution conditions where COM crystal aggregation was observed. Correlation with data from other bulk crystallization measurements suggest that the anionic side chains form critical binding interactions with COM surfaces that are necessary along with the phase separation process to induce COM crystal aggregation.

  20. Text File Display Program

    NASA Technical Reports Server (NTRS)

    Vavrus, J. L.

    1986-01-01

    LOOK program permits user to examine text file in pseudorandom access manner. Program provides user with way of rapidly examining contents of ASCII text file. LOOK opens text file for input only and accesses it in blockwise fashion. Handles text formatting and displays text lines on screen. User moves forward or backward in file by any number of lines or blocks. Provides ability to "scroll" text at various speeds in forward or backward directions.
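    LOOK's blockwise access pattern can be sketched with ordinary seeks; the block size and function name here are illustrative, not taken from the program:

```python
import tempfile

BLOCK = 4096  # fixed block size for pseudorandom access

def read_block(path, n, block=BLOCK):
    """Return the text of the n-th block, seeking directly to it so the
    rest of the file is never read (forward/backward movement is just
    a change of n)."""
    with open(path, "rb") as f:
        f.seek(n * block)
        return f.read(block).decode("ascii", errors="replace")

# demo: a file slightly longer than one block
tmp = tempfile.NamedTemporaryFile(mode="w", suffix=".txt", delete=False)
tmp.write("A" * BLOCK + "B" * 10)
tmp.close()
print(read_block(tmp.name, 1))  # the ten "B" characters past block 0
```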

  1. Standard interface file handbook

    SciTech Connect

    Shapiro, A.; Huria, H.C.

    1992-10-01

    This handbook documents many of the standard interface file formats that have been adopted by the US Department of Energy to facilitate communications between, and portability of, various large reactor physics and radiation transport software packages. The emphasis is on those files needed for use of the VENTURE/PC diffusion-depletion code system. File structures, contents and some practical advice on use of the various files are provided.

  2. Text File Comparator

    NASA Technical Reports Server (NTRS)

    Kotler, R. S.

    1983-01-01

    File Comparator program IFCOMP is a text file comparator for IBM OS/VS-compatible systems. IFCOMP accepts as input two text files and produces a listing of differences in pseudo-update form. IFCOMP is very useful in monitoring changes made to software at the source code level.
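    A rough analogue of IFCOMP's difference listing, built on Python's standard difflib (illustrative only; the real utility's pseudo-update format is IBM-specific):

```python
import difflib

def pseudo_update(old_lines, new_lines):
    """List removed (-) and added (+) lines between two text files,
    a rough stand-in for a pseudo-update difference listing."""
    return [line for line in
            difflib.unified_diff(old_lines, new_lines, lineterm="")
            if line[:1] in "+-" and not line.startswith(("+++", "---"))]

old = ["alpha", "beta", "gamma"]
new = ["alpha", "BETA", "gamma"]
print(pseudo_update(old, new))  # ['-beta', '+BETA']
```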

  3. The Vertical File.

    ERIC Educational Resources Information Center

    Czopek, Vanessa

    The process of establishing the vertical file for a new branch library is traced; suggestions for making the vertical file a better resource are offered; and guidelines covering the general objective, responsibility for selection and maintenance, principles of selection, and scope of the collection for vertical files are presented. A four-item…

  4. MDAnalysis: a toolkit for the analysis of molecular dynamics simulations.

    PubMed

    Michaud-Agrawal, Naveen; Denning, Elizabeth J; Woolf, Thomas B; Beckstein, Oliver

    2011-07-30

    MDAnalysis is an object-oriented library for structural and temporal analysis of molecular dynamics (MD) simulation trajectories and individual protein structures. It is written in the Python language with some performance-critical code in C. It uses the powerful NumPy package to expose trajectory data as fast and efficient NumPy arrays. It has been tested on systems of millions of particles. Many common file formats of simulation packages including CHARMM, Gromacs, Amber, and NAMD and the Protein Data Bank format can be read and written. Atoms can be selected with a syntax similar to CHARMM's powerful selection commands. MDAnalysis enables both novice and experienced programmers to rapidly write their own analytical tools and access data stored in trajectories in an easily accessible manner that facilitates interactive explorative analysis. MDAnalysis has been tested on and works for most Unix-based platforms such as Linux and Mac OS X. It is freely available under the GNU General Public License from http://mdanalysis.googlecode.com. PMID:21500218

  6. An XML Driven Graphical User Interface and Application Management Toolkit

    SciTech Connect

    White, Greg R

    2002-01-18

    In the past, the features of a user interface were limited by those available in the existing graphical widgets it used. Now, improvements in processor speed have fostered the emergence of interpreted languages, in which the appropriate method to render a given data object can be loaded at runtime. XML can be used to precisely describe the association of data types with their graphical handling (beans), and Java provides an especially rich environment for programming the graphics. We present a graphical user interface builder based on Java Beans and XML, in which the graphical screens are described textually (in files or a database) in terms of their screen components. Each component may be a simple text readback, or a complex plot. The programming model provides for dynamic data pertaining to a component to be forwarded synchronously or asynchronously, to the appropriate handler, which may be a built-in method, or a complex applet. This work was initially motivated by the need to move the legacy VMS display interface of the SLAC Control Program to another platform while preserving all of its existing functionality. However, the model gives us a powerful and generic system for adding new kinds of graphics (such as Matlab), data sources (such as EPICS), middleware (such as AIDA [1]), and transport (such as XML and SOAP). The system will also include a management console, which will be able to report on the present usage of the system, for instance who is running it where and connected to which channels.
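A toy example of such a textual screen description, parsed with Python's standard xml.etree module; the element names, attributes, and data sources below are hypothetical, not the toolkit's actual schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical screen description: element and attribute names are
# illustrative, not the toolkit's actual schema.
screen_xml = """
<screen name="klystron-status">
  <component id="beamEnergy" type="textReadback" source="EPICS:BEAM:E"/>
  <component id="orbitPlot"  type="plot"         source="AIDA:orbit"/>
</screen>
"""

root = ET.fromstring(screen_xml)

# Associate each component type with a graphical handler, then build
# the screen's widgets from the description.
handlers = {
    "textReadback": lambda src: f"label<{src}>",
    "plot": lambda src: f"chart<{src}>",
}
widgets = [handlers[c.get("type")](c.get("source")) for c in root.findall("component")]
```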

  7. 43 CFR 4.1381 - Who may file; when to file; where to file.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... may file; when to file; where to file. (a) Any person who receives a written decision issued by OSM under 30 CFR 773.28 on a challenge to an ownership or control listing or finding may file a request for... 43 Public Lands: Interior 1 2010-10-01 2010-10-01 false Who may file; when to file; where to...

  8. MuCor: mutation aggregation and correlation

    PubMed Central

    Kroll, Karl W.; Eisfeld, Ann-Katherin; Lozanski, Gerard; Bloomfield, Clara D.; Byrd, John C.; Blachly, James S.

    2016-01-01

    Motivation: There are many tools for variant calling and effect prediction, but little to tie together large sample groups. Aggregating, sorting and summarizing variants and effects across a cohort is often done with ad hoc scripts that must be re-written for every new project. In response, we have written MuCor, a tool to gather variants from a variety of input formats (including multiple files per sample), perform database lookups and frequency calculations, and write many types of reports. In addition to use in large studies with numerous samples, MuCor can also be employed to directly compare variant calls from the same sample across two or more platforms, parameters or pipelines. A companion utility, DepthGauge, measures coverage at regions of interest to increase confidence in calls. Availability and implementation: Source code is freely available at https://github.com/blachlylab/mucor and a Docker image is available at https://hub.docker.com/r/blachlylab/mucor/. Contact: james.blachly@osumc.edu. Supplementary data: Supplementary data are available at Bioinformatics online. PMID:26803155
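The aggregation step such a tool performs can be sketched in a few lines of stdlib Python; the sample names and variants below are invented, and this is an illustration of the idea, not MuCor's code:

```python
from collections import defaultdict

# Per-sample variant calls (chrom, pos, ref, alt); values are invented.
calls = {
    "sample1": [("chr1", 12345, "A", "G"), ("chr2", 67890, "C", "T")],
    "sample2": [("chr1", 12345, "A", "G")],
    "sample3": [("chr2", 67890, "C", "T"), ("chr3", 111, "G", "A")],
}

# Aggregate across the cohort: which samples carry each variant?
by_variant = defaultdict(set)
for sample, variants in calls.items():
    for v in variants:
        by_variant[v].add(sample)

# Cohort frequency of each variant.
freq = {v: len(s) / len(calls) for v, s in by_variant.items()}
```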

  9. Fractal Aggregates in Tennis Ball Systems

    ERIC Educational Resources Information Center

    Sabin, J.; Bandin, M.; Prieto, G.; Sarmiento, F.

    2009-01-01

    We present a new practical exercise to explain the mechanisms of aggregation of some colloids which are otherwise not easy to understand. We have used tennis balls to simulate, in a visual way, the aggregation of colloids under reaction-limited colloid aggregation (RLCA) and diffusion-limited colloid aggregation (DLCA) regimes. We have used the…

  10. Aggregated Recommendation through Random Forests

    PubMed Central

    2014-01-01

    Aggregated recommendation refers to the process of suggesting one kind of items to a group of users. Compared to user-oriented or item-oriented approaches, it is more general and, therefore, more appropriate for cold-start recommendation. In this paper, we propose a random forest approach to create aggregated recommender systems. The approach is used to predict the rating of a group of users to a kind of items. In the preprocessing stage, we merge user, item, and rating information to construct an aggregated decision table, where rating information serves as the decision attribute. We also model the data conversion process corresponding to the new user, new item, and both new problems. In the training stage, a forest is built for the aggregated training set, where each leaf is assigned a distribution of discrete ratings. In the testing stage, we present four predicting approaches to compute evaluation values based on the distribution of each tree. Experimental results on the well-known MovieLens dataset show that the aggregated approach maintains an acceptable level of accuracy. PMID:25180204
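The preprocessing step described above, merging user, item, and rating information into a single aggregated decision table with the rating as decision attribute, can be sketched as follows (field names and values are illustrative):

```python
# Toy user, item, and rating data (values are invented).
users = {1: {"age": "young"}, 2: {"age": "old"}}
items = {10: {"genre": "sci-fi"}, 11: {"genre": "drama"}}
ratings = [(1, 10, 5), (2, 11, 3)]  # (user_id, item_id, rating)

# Merge into an aggregated decision table: one row per rating, with
# user and item attributes as conditions and "rating" as the decision.
table = [{**users[u], **items[i], "rating": r} for u, i, r in ratings]
```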

  11. Novel insights into amylin aggregation

    PubMed Central

    Pillay, Karen; Govender, Patrick

    2014-01-01

    Amylin is a peptide that aggregates into species that are toxic to pancreatic beta cells, leading to type II diabetes. This study has for the first time quantified amylin association and dissociation kinetics (association constant (ka) = 28.7 ± 5.1 L mol⁻¹ s⁻¹ and dissociation constant (kd) = (2.8 ± 0.6) × 10⁻⁴ s⁻¹) using surface plasmon resonance (SPR). Thus far, techniques used for the sizing of amylin aggregates do not cater for the real-time monitoring of unconstrained amylin in solution. In this regard we evaluated the recently developed nanoparticle tracking analysis (NTA). In addition, both SPR and NTA were used to study the effect of previously synthesized amylin derivatives on amylin aggregation and to evaluate their potential as a cell-free system for screening potential inhibitors of amylin-mediated cytotoxicity. Results obtained from NTA highlighted a predominance of 100–300 nm amylin aggregates and correlation to previously published cytotoxicity results suggests the toxic species of amylin to be 200–300 nm in size. The results seem to indicate that NTA has potential as a new technique to monitor the aggregation potential of amyloid peptides in solution and also to screen potential inhibitors of amylin-mediated cytotoxicity. PMID:26019498

  12. Toward an Efficient Icing CFD Process Using an Interactive Software Toolkit: SmaggIce 2D

    NASA Technical Reports Server (NTRS)

    Vickerman, Mary B.; Choo, Yung K.; Schilling, Herbert W.; Baez, Marivell; Braun, Donald C.; Cotton, Barbara J.

    2001-01-01

    Two-dimensional CFD analysis for iced airfoils can be a labor-intensive task. The software toolkit SmaggIce 2D is being developed to help streamline the CFD process and provide the unique features needed for icing. When complete, it will include a combination of partially automated and fully interactive tools for all aspects of the tasks leading up to the flow analysis: geometry preparation, domain decomposition, block boundary demarcation, gridding, and linking with a flow solver. It also includes tools to perform ice shape characterization, an important aid in determining the relationship between ice characteristics and their effects on aerodynamic performance. Completed tools, work-in-progress, and planned features of the software toolkit are presented here.

  13. Measuring the Environmental Dimensions of Human Migration: The Demographer’s Toolkit

    PubMed Central

    Hunter, Lori M.; Gray, Clark L.

    2014-01-01

    In recent years, the empirical literature linking environmental factors and human migration has grown rapidly and gained increasing visibility among scholars and the policy community. Still, this body of research uses a wide range of methodological approaches for assessing environment-migration relationships. Without comparable data and measures across a range of contexts, it is impossible to make generalizations that would facilitate the development of future migration scenarios. Demographic researchers have a large methodological toolkit for measuring migration as well as modeling its drivers. This toolkit includes population censuses, household surveys, survival analysis and multi-level modeling. This paper’s purpose is to introduce climate change researchers to demographic data and methods and to review exemplary studies of the environmental dimensions of human migration. Our intention is to foster interdisciplinary understanding and scholarship, and to promote high quality research on environment and migration that will lead toward broader knowledge of this association. PMID:25177108

  14. The MIGenAS integrated bioinformatics toolkit for web-based sequence analysis

    PubMed Central

    Rampp, Markus; Soddemann, Thomas; Lederer, Hermann

    2006-01-01

    We describe a versatile and extensible integrated bioinformatics toolkit for the analysis of biological sequences over the Internet. The web portal offers convenient interactive access to a growing pool of chainable bioinformatics software tools and databases that are centrally installed and maintained by the RZG. Currently, supported tasks comprise sequence similarity searches in public or user-supplied databases, computation and validation of multiple sequence alignments, phylogenetic analysis and protein–structure prediction. Individual tools can be seamlessly chained into pipelines allowing the user to conveniently process complex workflows without the necessity to take care of any format conversions or tedious parsing of intermediate results. The toolkit is part of the Max-Planck Integrated Gene Analysis System (MIGenAS) of the Max Planck Society available at (click ‘Start Toolkit’). PMID:16844980

  15. Efficient Genome Editing in Caenorhabditis elegans with a Toolkit of Dual-Marker Selection Cassettes.

    PubMed

    Norris, Adam D; Kim, Hyun-Min; Colaiácovo, Mónica P; Calarco, John A

    2015-10-01

    Use of the CRISPR/Cas9 RNA-guided endonuclease complex has recently enabled the generation of double-strand breaks virtually anywhere in the C. elegans genome. Here, we present an improved strategy that makes all steps in the genome editing process more efficient. We have created a toolkit of template-mediated repair cassettes that contain an antibiotic resistance gene to select for worms carrying the repair template and a fluorescent visual marker that facilitates identification of bona fide recombinant animals. Homozygous animals can be identified as early as 4-5 days post-injection, and minimal genotyping by PCR is required. We demonstrate that our toolkit of dual-marker vectors can generate targeted disruptions, deletions, and endogenous tagging with fluorescent proteins and epitopes. This strategy should be useful for a wide variety of additional applications and will provide researchers with increased flexibility when designing genome editing experiments.

  16. The 2006 Earnings Public-Use Microdata File: an introduction.

    PubMed

    Compson, Michael

    2011-01-01

    This article introduces the 2006 Earnings Public-Use File (EPUF) and provides important background information on the file's data fields. The EPUF contains selected demographic and earnings information for 4.3 million individuals drawn from a 1-percent sample of all Social Security numbers issued before January 2007. The data file provides aggregate earnings for 1937 to 1950 and annual earnings data for 1951 to 2006. The article focuses on four key items: (1) the Social Security Administration's experiences collecting earnings data over the years and their effect on the data fields included in EPUF; (2) the steps taken to "clean" the underlying administrative data and to minimize the risk of personal data disclosure; (3) the potential limitations of using EPUF data to estimate Social Security benefits for some individuals; and (4) frequency distributions and statistical tabulations of the data in the file, to provide a point of reference for EPUF users.

  17. A Molecular Toolkit to Visualize Native Protein Assemblies in the Context of Human Disease.

    PubMed

    Gilmore, Brian L; Winton, Carly E; Demmert, Andrew C; Tanner, Justin R; Bowman, Sam; Karageorge, Vasilea; Patel, Kaya; Sheng, Zhi; Kelly, Deborah F

    2015-01-01

    We present a new molecular toolkit to investigate protein assemblies natively formed in the context of human disease. The system employs tunable microchips that can be decorated with switchable adaptor molecules to select for target proteins of interest and analyze them using molecular microscopy. Implementing our new streamlined microchip approach, we could directly visualize BRCA1 gene regulatory complexes from patient-derived cancer cells for the first time. PMID:26395823

  18. The Trick Simulation Toolkit: A NASA/Open source Framework for Running Time Based Physics Models

    NASA Technical Reports Server (NTRS)

    Penn, John M.; Lin, Alexander S.

    2016-01-01

    This paper describes the design and use of the Trick Simulation Toolkit, a simulation development environment for creating high fidelity training and engineering simulations at the NASA Johnson Space Center and many other NASA facilities. It describes Trick's design goals and how the development environment attempts to achieve those goals. It describes how Trick is used in some of the many training and engineering simulations at NASA. Finally, it describes the Trick NASA/Open source project on GitHub.

  19. CDC's Health Equity Resource Toolkit: disseminating guidance for state practitioners to address obesity disparities.

    PubMed

    Payne, Gayle Holmes; James, Stephen D; Hawley, Lisa; Corrigan, Bethany; Kramer, Rachel E; Overton, Samantha N; Farris, Rosanne P; Wasilewski, Yvonne

    2015-01-01

    Obesity has been on the rise in the United States over the past three decades, and prevalence remains high. In addition to population-wide trends, it is clear that obesity affects some groups more than others and can be associated with age, income, education, gender, race and ethnicity, and geographic region. To reverse the obesity epidemic, the Centers for Disease Control and Prevention (CDC) promotes evidence-based and practice-informed strategies to address nutrition and physical activity environments and behaviors. These public health strategies require translation into actionable approaches that can be implemented by state and local entities to address disparities. The CDC used findings from an expert panel meeting to guide the development and dissemination of the Health Equity Resource Toolkit for State Practitioners Addressing Obesity Disparities (available at http://www.cdc.gov/obesity/health_equity/toolkit.html). The Toolkit helps public health practitioners take a systematic approach to program planning using a health equity lens. The Toolkit provides a six-step process for planning, implementing, and evaluating strategies to address obesity disparities. Each section contains (a) a basic description of the steps of the process and suggested evidence-informed actions to help address obesity disparities, (b) practical tools for carrying out activities to help reduce obesity disparities, and (c) a "real-world" case study of a successful state-level effort to address obesity with a focus on health equity that is particularly relevant to the content in that section. Hyperlinks to additional resources are included throughout.

  20. Use of OHRA Toolkit in the QRA work in Norway and UK

    SciTech Connect

    Skramstad, E.; Hundseid, H.

    1995-12-31

    The Offshore Hazard and Risk Analysis (OHRA) Toolkit is a comprehensive computer program which has been developed for quantitative risk analyses of offshore installations. Use of OHRAT in QRA (Quantitative Risk Analysis) work in Norway and the UK is increasing rapidly. Being a flexible tool, OHRAT imposes no fixed approach for how to do a QRA. The purpose of this paper is to present typical approaches and experience from its use by operators in the Norwegian part of the North Sea.

  1. Distribution of a Generic Mission Planning and Scheduling Toolkit for Astronomical Spacecraft

    NASA Technical Reports Server (NTRS)

    Kleiner, Steven C.

    1998-01-01

    This 2-year report describes the progress made to date on the project to package and distribute the planning and scheduling toolkit for the SWAS astronomical spacecraft. SWAS was scheduled to be launched on a Pegasus XL vehicle in fall 1995. Three separate failures in the launch vehicle have delayed the SWAS launch. The researchers have used this time to continue developing scheduling algorithms and GUI design. SWAS is expected to be launched this year.

  2. The SpeX Prism Library Analysis Toolkit: Design Considerations and First Results

    NASA Astrophysics Data System (ADS)

    Burgasser, Adam J.; Aganze, Christian; Escala, Ivana; Lopez, Mike; Choban, Caleb; Jin, Yuhui; Iyer, Aishwarya; Tallis, Melisa; Suarez, Adrian; Sahi, Maitrayee

    2016-01-01

    Various observational and theoretical spectral libraries now exist for galaxies, stars, planets and other objects, which have proven useful for classification, interpretation, simulation and model development. Effective use of these libraries relies on analysis tools, which are often left to users to develop. In this poster, we describe a program to develop a combined spectral data repository and Python-based analysis toolkit for low-resolution spectra of very low mass dwarfs (late M, L and T dwarfs), which enables visualization, spectral index analysis, classification, atmosphere model comparison, and binary modeling for nearly 2000 library spectra and user-submitted data. The SpeX Prism Library Analysis Toolkit (SPLAT) is being constructed as a collaborative, student-centered, learning-through-research model with high school, undergraduate and graduate students and regional science teachers, who populate the database and build the analysis tools through quarterly challenge exercises and summer research projects. In this poster, I describe the design considerations of the toolkit, its current status and development plan, and report the first published results led by undergraduate students. The combined data and analysis tools are ideal for characterizing cool stellar and exoplanetary atmospheres (including direct exoplanetary spectra observations by Gemini/GPI, VLT/SPHERE, and JWST), and the toolkit design can be readily adapted for other spectral datasets as well. This material is based upon work supported by the National Aeronautics and Space Administration under Grant No. NNX15AI75G. SPLAT code can be found at https://github.com/aburgasser/splat.

  3. The doctor-patient relationship as a toolkit for uncertain clinical decisions.

    PubMed

    Diamond-Brown, Lauren

    2016-06-01

    Medical uncertainty is a well-recognized problem in healthcare, yet how doctors make decisions in the face of uncertainty remains to be understood. This article draws on interdisciplinary literature on uncertainty and physician decision-making to examine a specific physician response to uncertainty: using the doctor-patient relationship as a toolkit. Additionally, I ask what happens to this process when the doctor-patient relationship becomes fragmented. I answer these questions by examining obstetrician-gynecologists' narratives regarding how they make decisions when faced with uncertainty in childbirth. Between 2013 and 2014, I performed 21 semi-structured interviews with obstetricians in the United States. Obstetricians were selected to maximize variation in relevant physician, hospital, and practice characteristics. I began with grounded theory and moved to analytical coding of themes in relation to relevant literature. My analysis renders it evident that some physicians use the doctor-patient relationship as a toolkit for dealing with uncertainty. I analyze how this process varies for physicians in different models of care by comparing doctors' experiences in models with continuous versus fragmented doctor-patient relationships. My key findings are that obstetricians in both models appealed to the ideal of patient-centered decision-making to cope with uncertain decisions, but in practice physicians in fragmented care faced a number of challenges to using the doctor-patient relationship as a toolkit for decision-making. These challenges led to additional uncertainties and in some cases to poor outcomes for doctors and/or patients; they also raised concerns about the reproduction of inequality. Thus organization of care delivery mitigates the efficacy of doctors' use of the doctor-patient relationship toolkit for uncertain decisions. These findings have implications for theorizing about decision-making under conditions of medical uncertainty, for understanding…

  4. Equilibrium structure of ferrofluid aggregates

    SciTech Connect

    Yoon, Mina; Tomanek, David

    2010-01-01

    We study the equilibrium structure of large but finite aggregates of magnetic dipoles, representing a colloidal suspension of magnetite particles in a ferrofluid. With increasing system size, the structural motif evolves from chains and rings to multi-chain and multi-ring assemblies. Very large systems form single- and multi-wall coils, tubes and scrolls. These structural changes result from a competition between various energy terms, which can be approximated analytically within a continuum model. We also study the effect of external parameters such as magnetic field on the relative stability of these structures. Our results may give insight into experimental data obtained during solidification of ferrofluid aggregates at temperatures where thermal fluctuations become negligible in comparison to inter-particle interactions. These data may also help to experimentally control the aggregation of magnetic particles.

  5. Customer Aggregation: An Opportunity for Green Power?

    SciTech Connect

    Holt, E.; Bird, L.

    2001-02-26

    We undertook research into the experience of aggregation groups to determine whether customer aggregation offers an opportunity to bring green power choices to more customers. The objectives of this report, therefore, are to (1) identify the different types of aggregation that are occurring today, (2) learn whether aggregation offers an opportunity to advance sales of green power, and (3) share these concepts and approaches with potential aggregators and green power advocates.

  6. Fractal aggregates in tennis ball systems

    NASA Astrophysics Data System (ADS)

    Sabin, J.; Bandín, M.; Prieto, G.; Sarmiento, F.

    2009-09-01

    We present a new practical exercise to explain the mechanisms of aggregation of some colloids which are otherwise not easy to understand. We have used tennis balls to simulate, in a visual way, the aggregation of colloids under reaction-limited colloid aggregation (RLCA) and diffusion-limited colloid aggregation (DLCA) regimes. We have used the images of the cluster of balls, following Forrest and Witten's pioneering studies on the aggregation of smoke particles, to estimate their fractal dimension.
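The fractal-dimension estimate mentioned above is typically a box-counting calculation; here is a minimal stdlib sketch on an artificial point set (a straight line, whose dimension should come out near 1), not the authors' actual procedure:

```python
import math

# Artificial point set: a diagonal line segment in the unit square.
points = [(i / 1000.0, i / 1000.0) for i in range(1001)]

def count_boxes(points, eps):
    """Count the grid boxes of side eps occupied by the point set."""
    return len({(int(x / eps), int(y / eps)) for x, y in points})

# Box-counting estimate: D ~ log(N2/N1) / log(eps1/eps2).
eps1, eps2 = 0.1, 0.01
n1, n2 = count_boxes(points, eps1), count_boxes(points, eps2)
dim = math.log(n2 / n1) / math.log(eps1 / eps2)
```

For a genuinely fractal cluster image one would threshold the pixels to a point set and fit the slope over several box sizes rather than just two.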

  7. Distribution of a Generic Mission Planning and Scheduling Toolkit for Astronomical Spacecraft

    NASA Technical Reports Server (NTRS)

    Kleiner, Steven C.

    1996-01-01

    Work is progressing as outlined in the proposal for this contract. A working planning and scheduling system has been documented and packaged and made available to the WIRE Small Explorer group at JPL, the FUSE group at JHU, the NASA/GSFC Laboratory for Astronomy and Solar Physics and the Advanced Planning and Scheduling Branch at STScI. The package is running successfully on the WIRE computer system. It is expected that the WIRE will reuse significant portions of the SWAS code in its system. This scheduling system itself was tested successfully against the spacecraft hardware in December 1995. A fully automatic scheduling module has been developed and is being added to the toolkit. In order to maximize reuse, the code is being reorganized during the current build into object-oriented class libraries. A paper describing the toolkit has been written and is included in the software distribution. We have experienced interference between the export and production versions of the toolkit. We will be requesting permission to reprogram funds in order to purchase a standalone PC onto which to offload the export version.

  8. Midwives in medical student and resident education and the development of the medical education caucus toolkit.

    PubMed

    Radoff, Kari; Nacht, Amy; McConaughey, Edie; Salstrom, Jan; Schelling, Karen; Seger, Suzanne

    2015-01-01

    Midwives have been involved formally and informally in the training of medical students and residents for many years. Recent reductions in resident work hours, emphasis on collaborative practice, and a focus on midwives as key members of the maternity care model have increased the involvement of midwives in medical education. Midwives work in academic settings as educators to teach the midwifery model of care, collaboration, teamwork, and professionalism to medical students and residents. In 2009, members of the American College of Nurse-Midwives formed the Medical Education Caucus (MECA) to discuss the needs of midwives teaching medical students and residents; the group has held a workshop annually over the last 4 years. In 2014, MECA workshop facilitators developed a toolkit to support and formalize the role of midwives involved in medical student and resident education. The MECA toolkit provides a roadmap for midwives beginning involvement and continuing or expanding the role of midwives in medical education. This article describes the history of midwives in medical education, the development and growth of MECA, and the resulting toolkit created to support and formalize the role of midwives as educators in medical student and resident education, as well as common challenges for the midwife in academic medicine. This article is part of a special series of articles that address midwifery innovations in clinical practice, education, interprofessional collaboration, health policy, and global health.

  9. Environmentalism and natural aggregate mining

    USGS Publications Warehouse

    Drew, L.J.; Langer, W.H.; Sachs, J.S.

    2002-01-01

    Sustaining a developed economy and expanding a developing one require the use of large volumes of natural aggregate. Almost all human activity (commercial, recreational, or leisure) is transacted in or on facilities constructed from natural aggregate. In our urban and suburban worlds, we are almost totally dependent on supplies of water collected behind dams and transported through aqueducts made from concrete. Natural aggregate is essential to the facilities that produce energy: hydroelectric dams and coal-fired power plants. Ironically, the utility created for mankind by the use of natural aggregate is rarely compared favorably with the environmental impacts of mining it. Instead, the empty quarries and pits are seen as large negative environmental consequences. At the root of this disassociation is the philosophy of environmentalism, which flavors our perceptions of the excavation, processing, and distribution of natural aggregate. The two end-member ideas in this philosophy are ecocentrism and anthropocentrism. Ecocentrism takes the position that the natural world is an organism whose arteries are the rivers; their flow must not be altered. The soil is another vital organ and must not be covered with concrete and asphalt. The motto of the ecocentrist is "man must live more lightly on the land." The anthropocentrist wants clean water and air and an uncluttered landscape for human use. Mining is allowed and even encouraged, but dust and noise from quarry and pit operations must be minimized. The large volume of truck traffic is viewed as a real menace to human life and should be regulated and isolated. The environmental problems that the producers of natural aggregate (crushed stone and sand and gravel) face today are mostly difficult social and political concerns associated with the large holes dug in the ground and the large volume of heavy truck traffic associated with quarry and pit operations. These concerns have increased in recent years as society's demand for…

  10. Studies on recycled aggregates-based concrete.

    PubMed

    Rakshvir, Major; Barai, Sudhirkumar V

    2006-06-01

    Reduced extraction of raw materials, reduced transportation cost, improved profits, reduced environmental impact and fast-depleting reserves of conventional natural aggregates have necessitated the use of recycling in order to conserve conventional natural aggregates. In this study various physical and mechanical properties of recycled concrete aggregates were examined. Recycled concrete aggregates are different from natural aggregates and concrete made from them has specific properties. The percentages of recycled concrete aggregates were varied and it was observed that properties such as compressive strength showed a decrease of up to 10% as the percentage of recycled concrete aggregates increased. Water absorption of recycled aggregates was found to be greater than that of natural aggregates, and this needs to be compensated for during mix design.

  11. RAGG - R EPISODIC AGGREGATION PACKAGE

    EPA Science Inventory

    The RAGG package is an R implementation of the CMAQ episodic model aggregation method developed by Constella Group and the Environmental Protection Agency. RAGG is a tool to provide climatological seasonal and annual deposition of sulphur and nitrogen for multimedia management. ...

  12. Cyclosporine A enhances platelet aggregation.

    PubMed

    Grace, A A; Barradas, M A; Mikhailidis, D P; Jeremy, J Y; Moorhead, J F; Sweny, P; Dandona, P

    1987-12-01

    In view of the reported increase in thromboembolic episodes following cyclosporine A (CyA) therapy, the effect of this drug on platelet aggregation and thromboxane A2 release was investigated. The addition of CyA at therapeutic concentrations to platelet-rich plasma from normal subjects in vitro was found to increase aggregation in response to adrenaline, collagen and ADP. Ingestion of CyA by healthy volunteers was also associated with enhanced platelet aggregation. The CyA-mediated enhancement of aggregation was further enhanced by the addition in vitro of therapeutic concentrations of heparin. Platelets from renal allograft recipients treated with CyA also showed hyperaggregability and increased thromboxane A2 release, which were most marked at "peak" plasma CyA concentration and less so at "trough" concentrations. Platelet hyperaggregability in renal allograft patients on long-term CyA therapy tended to revert towards normal following the replacement of CyA with azathioprine. Hypertensive patients with renal allografts on nifedipine therapy had normal platelet function and thromboxane release in spite of CyA therapy. These observations suggest that CyA-mediated platelet activation may contribute to the pathogenesis of the thromboembolic phenomena associated with the use of this drug. The increased release of thromboxane A2 (a vasoconstrictor) may also play a role in mediating CyA-related nephrotoxicity.

  13. Sequence-dependent Internalization of Aggregating Peptides*

    PubMed Central

    Couceiro, José R.; Gallardo, Rodrigo; De Smet, Frederik; De Baets, Greet; Baatsen, Pieter; Annaert, Wim; Roose, Kenny; Saelens, Xavier; Schymkowitz, Joost; Rousseau, Frederic

    2015-01-01

    Recently, a number of aggregation disease polypeptides have been shown to spread from cell to cell, thereby displaying prionoid behavior. Studying aggregate internalization, however, is often hampered by the complex kinetics of the aggregation process, resulting in the concomitant uptake of aggregates of different sizes by competing mechanisms, which makes it difficult to isolate pathway-specific responses to aggregates. We designed synthetic aggregating peptides bearing different aggregation propensities with the aim of producing modes of uptake that are sufficiently distinct to differentially analyze the cellular response to internalization. We found that small acidic aggregates (≤500 nm in diameter) were taken up by nonspecific endocytosis as part of the fluid phase and traveled through the endosomal compartment to lysosomes. By contrast, bigger basic aggregates (>1 μm) were taken up through a mechanism dependent on cytoskeletal reorganization and membrane remodeling with the morphological hallmarks of phagocytosis. Importantly, the properties of these aggregates determined not only the mechanism of internalization but also the involvement of the proteostatic machinery (the assembly of interconnected networks that control the biogenesis, folding, trafficking, and degradation of proteins) in the process; whereas the internalization of small acidic aggregates is HSF1-independent, the uptake of larger basic aggregates was HSF1-dependent, requiring Hsp70. Our results show that the biophysical properties of aggregates determine both their mechanism of internalization and proteostatic response. It remains to be seen whether these differences in cellular response contribute to the particular role of specific aggregated proteins in disease. PMID:25391649

  14. Verification & Validation Toolkit to Assess Codes: Is it Theory Limitation, Numerical Method Inadequacy, Bug in the Code or a Serious Flaw?

    NASA Astrophysics Data System (ADS)

    Bombardelli, F. A.; Zamani, K.

    2014-12-01

    We introduce and discuss an open-source, user-friendly, numerical post-processing software package for assessing the reliability of modeling results from environmental fluid mechanics codes. Verification and Validation, Uncertainty Quantification (VAVUQ) is a toolkit developed in Matlab© for general V&V purposes. In this work, the VAVUQ implementation of V&V techniques and its user interfaces are discussed. VAVUQ is able to read Excel, Matlab, ASCII, and binary files, and it produces a log of the results in txt format. Next, each capability of the code is discussed through an example: the first example is the code verification of a sediment transport code, developed with the Finite Volume Method, with MES. The second example is a solution verification of a code for groundwater flow, developed with the Boundary Element Method, via MES. The third example is a solution verification of a mixed-order, Compact Difference Method code of heat transfer via MMS. The fourth example is a solution verification of a 2-D, Finite Difference Method code for floodplain analysis via Complete Richardson Extrapolation. In turn, the application of VAVUQ in quantitative model skill assessment studies (validation) of environmental codes is illustrated through two examples: validation of a two-phase flow computational model of air entrainment in a free-surface flow against lab measurements, and heat transfer modeling at the earth's surface against field measurements. At the end, we discuss practical considerations and common pitfalls in the interpretation of V&V results.
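The abstract's fourth example relies on Complete Richardson Extrapolation for solution verification. A minimal sketch of that idea, assuming a hypothetical scalar quantity computed on three systematically refined grids with a constant refinement ratio (the numbers below are illustrative, not VAVUQ output):

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r):
    """Estimate the observed order of accuracy p from solutions on three
    systematically refined grids with constant refinement ratio r."""
    return math.log(abs((f_coarse - f_medium) / (f_medium - f_fine))) / math.log(r)

def richardson_extrapolate(f_medium, f_fine, r, p):
    """Richardson extrapolation: estimate the grid-converged value
    from the two finest solutions and the observed order p."""
    return f_fine + (f_fine - f_medium) / (r**p - 1)

# Hypothetical output of a second-order solver on grids h, h/2, h/4:
f1, f2, f3 = 1.40, 1.10, 1.025   # coarse, medium, fine
p = observed_order(f1, f2, f3, r=2)          # -> 2.0 for these values
f_exact = richardson_extrapolate(f2, f3, r=2, p=p)
```

If the observed order p matches the scheme's formal order, the extrapolated value also provides a discretization-error estimate for the fine-grid solution.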

  15. Network file-storage system

    SciTech Connect

    Collins, W.W.; Devaney, M.J.; Willbanks, E.W.

    1982-01-01

    The Common File System (CFS) is a file management and mass storage system for the Los Alamos National Laboratory's computer network. The CFS is organized as a hierarchical storage system: active files are stored on fast-access storage devices; larger, less active files are stored on slower, less expensive devices; and archival files are stored offline. Files are automatically moved between the various classes of storage by a file migration program that analyzes file activity, file size and storage device capabilities. This has resulted in a cost-effective system that provides both fast access and large data storage capability (over five trillion bits currently stored).
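The migration program described above decides a file's storage class from its activity and size. A toy sketch of such a policy, with made-up tier names and thresholds (the actual CFS rules and device classes are not given in the abstract):

```python
from dataclasses import dataclass

# Hypothetical storage tiers, fastest first (illustrative names only,
# not the actual CFS device classes).
TIERS = ["disk", "tape_online", "tape_offline"]

@dataclass
class FileRecord:
    name: str
    size_mb: float
    days_idle: int   # days since last access

def choose_tier(f: FileRecord) -> str:
    """Toy migration rule in the spirit of CFS: active files stay on fast
    storage; large, inactive files sink toward archival storage."""
    if f.days_idle < 7:
        return "disk"
    if f.days_idle < 90 or f.size_mb < 10:
        return "tape_online"
    return "tape_offline"

files = [FileRecord("run.log", 2, 1),
         FileRecord("results.dat", 5000, 200)]
placement = {f.name: choose_tier(f) for f in files}
```

A real migration daemon would run such a rule periodically over the file catalog and queue moves between device classes.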

  16. An Aggregation Advisor for Ligand Discovery.

    PubMed

    Irwin, John J; Duan, Da; Torosyan, Hayarpi; Doak, Allison K; Ziebart, Kristin T; Sterling, Teague; Tumanian, Gurgen; Shoichet, Brian K

    2015-09-10

    Colloidal aggregation of organic molecules is the dominant mechanism for artifactual inhibition of proteins, and controls against it are widely deployed. Notwithstanding an increasingly detailed understanding of this phenomenon, a method to reliably predict aggregation has remained elusive. Correspondingly, active molecules that act via aggregation continue to be found in early discovery campaigns and remain common in the literature. Over the past decade, over 12 thousand aggregating organic molecules have been identified, potentially enabling a precedent-based approach to match known aggregators with new molecules that may be expected to aggregate and lead to artifacts. We investigate an approach that uses lipophilicity, affinity, and similarity to known aggregators to advise on the likelihood that a candidate compound is an aggregator. In prospective experimental testing, five of seven new molecules with Tanimoto coefficients (Tc's) between 0.95 and 0.99 to known aggregators aggregated at relevant concentrations. Ten of 19 with Tc's between 0.94 and 0.90 and three of seven with Tc's between 0.89 and 0.85 also aggregated. Another three of the predicted compounds aggregated at higher concentrations. This method finds that 61 827 or 5.1% of the ligands acting in the 0.1 to 10 μM range in the medicinal chemistry literature are at least 85% similar to a known aggregator with these physical properties and may aggregate at relevant concentrations. Intriguingly, only 0.73% of all drug-like commercially available compounds resemble the known aggregators, suggesting that colloidal aggregators are enriched in the literature. As a percentage of the literature, aggregator-like compounds have increased 9-fold since 1995, partly reflecting the advent of high-throughput and virtual screens against molecular targets. Emerging from this study is an aggregator advisor database and tool ( http://advisor.bkslab.org ), free to the community, that may help distinguish between
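The advisor matches candidate compounds to known aggregators by Tanimoto similarity over chemical fingerprints. A minimal sketch of that coefficient, assuming hypothetical fingerprint bit sets (real tools derive these from molecular structure, e.g. via cheminformatics libraries):

```python
def tanimoto(fp_a: set, fp_b: set) -> float:
    """Tanimoto coefficient between two molecules represented as sets of
    'on' fingerprint bits: |A intersect B| / |A union B|."""
    return len(fp_a & fp_b) / len(fp_a | fp_b)

# Hypothetical fingerprint bit sets for a query compound and a known aggregator.
query      = {1, 4, 7, 9, 12, 15}
aggregator = {1, 4, 7, 9, 12, 20}

tc = tanimoto(query, aggregator)   # 5 shared bits / 7 total bits
flagged = tc >= 0.85               # the similarity cutoff used in the abstract
```

In the study's prospective tests, compounds at Tc ≥ 0.85 to a known aggregator (combined with lipophilicity and affinity filters) frequently aggregated at relevant concentrations.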

  17. Nebular history of amoeboid olivine aggregates

    NASA Astrophysics Data System (ADS)

    Sugiura, N.; Petaev, M. I.; Kimura, M.; Miyazaki, A.; Hiyagon, H.

    2009-05-01

    Minor element (Ca, Cr, and Mn) concentrations in amoeboid olivine aggregates (AOAs) from primitive chondrites were measured and compared with those predicted by equilibrium condensation in the solar nebula. CaO concentrations in forsterite are low, particularly in porous aggregates. A plausible explanation is that an equilibrium Ca activity was not maintained during olivine condensation. CaO and MnO in forsterite are negatively correlated, with CaO being higher in compact aggregates. This suggests that the compact aggregates formed either by a prolonged reheating of the porous aggregates or by condensation and aggregation of forsterite during a very slow cooling in the nebula.

  18. Role of streams in myxobacteria aggregate formation

    NASA Astrophysics Data System (ADS)

    Kiskowski, Maria A.; Jiang, Yi; Alber, Mark S.

    2004-10-01

    Cell contact, movement and directionality are important factors in biological development (morphogenesis), and myxobacteria are a model system for studying cell-cell interaction and cell organization preceding differentiation. When starved, thousands of myxobacteria cells align, stream and form aggregates which later develop into round, non-motile spores. Canonically, cell aggregation has been attributed to attractive chemotaxis, a long range interaction, but there is growing evidence that myxobacteria organization depends on contact-mediated cell-cell communication. We present a discrete stochastic model based on contact-mediated signaling that suggests an explanation for the initialization of early aggregates, aggregation dynamics and final aggregate distribution. Our model qualitatively reproduces the unique structures of myxobacteria aggregates and detailed stages which occur during myxobacteria aggregation: first, aggregates initialize in random positions and cells join aggregates by random walk; second, cells redistribute by moving within transient streams connecting aggregates. Streams play a critical role in final aggregate size distribution by redistributing cells among fewer, larger aggregates. The mechanism by which streams redistribute cells depends on aggregate sizes and is enhanced by noise. Our model predicts that with increased internal noise, more streams would form and streams would last longer. Simulation results suggest a series of new experiments.
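The first stage of the model above has cells join randomly initialized aggregates by random walk. A minimal discrete stochastic sketch of that stage on a toy 1-D ring lattice (the published model is 2-D and includes contact-mediated signaling and streams, which this sketch omits):

```python
import random

random.seed(0)
L = 50                      # 1-D ring of lattice sites (toy geometry)
aggregates = {10, 35}       # seed aggregate positions, fixed here for clarity
cells = [random.randrange(L) for _ in range(30)]

def step(pos):
    """Unbiased random-walk step on the ring."""
    return (pos + random.choice((-1, 1))) % L

sizes = {a: 0 for a in aggregates}
for pos in cells:
    # Walk until the cell reaches an aggregate site, then join it.
    while pos not in aggregates:
        pos = step(pos)
    sizes[pos] += 1
```

Even this stripped-down version shows the random partitioning of cells among aggregates; the paper's second stage, stream-mediated redistribution, then biases that partition toward fewer, larger aggregates.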

  19. Medical image file formats.

    PubMed

    Larobina, Michele; Murino, Loredana

    2014-04-01

    Image file format is often a confusing aspect for someone wishing to process medical images. This article presents a demystifying overview of the major file formats currently used in medical imaging: Analyze, Neuroimaging Informatics Technology Initiative (Nifti), Minc, and Digital Imaging and Communications in Medicine (Dicom). Concepts common to all file formats, such as pixel depth, photometric interpretation, metadata, and pixel data, are first presented. Then, the characteristics and strengths of the various formats are discussed. The review concludes with some predictive considerations about the future trends in medical image file formats.

  20. Medical image file formats.

    PubMed

    Larobina, Michele; Murino, Loredana

    2014-04-01

    Image file format is often a confusing aspect for someone wishing to process medical images. This article presents a demystifying overview of the major file formats currently used in medical imaging: Analyze, Neuroimaging Informatics Technology Initiative (Nifti), Minc, and Digital Imaging and Communications in Medicine (Dicom). Concepts common to all file formats, such as pixel depth, photometric interpretation, metadata, and pixel data, are first presented. Then, the characteristics and strengths of the various formats are discussed. The review concludes with some predictive considerations about the future trends in medical image file formats. PMID:24338090

  1. Protein aggregation in salt solutions

    PubMed Central

    Kastelic, Miha; Kalyuzhnyi, Yurij V.; Hribar-Lee, Barbara; Dill, Ken A.; Vlachy, Vojko

    2015-01-01

    Protein aggregation is broadly important in diseases and in formulations of biological drugs. Here, we develop a theoretical model for reversible protein–protein aggregation in salt solutions. We treat proteins as hard spheres having square-well-energy binding sites, using Wertheim’s thermodynamic perturbation theory. The necessary condition required for such modeling to be realistic is that proteins in solution during the experiment remain in their compact form. Within this limitation our model gives accurate liquid–liquid coexistence curves for lysozyme and γ IIIa-crystallin solutions in respective buffers. It provides good fits to the cloud-point curves of lysozyme in buffer–salt mixtures as a function of the type and concentration of salt. It then predicts full coexistence curves, osmotic compressibilities, and second virial coefficients under such conditions. This treatment may also be relevant to protein crystallization. PMID:25964322

  2. Protein aggregation in salt solutions.

    PubMed

    Kastelic, Miha; Kalyuzhnyi, Yurij V; Hribar-Lee, Barbara; Dill, Ken A; Vlachy, Vojko

    2015-05-26

    Protein aggregation is broadly important in diseases and in formulations of biological drugs. Here, we develop a theoretical model for reversible protein-protein aggregation in salt solutions. We treat proteins as hard spheres having square-well-energy binding sites, using Wertheim's thermodynamic perturbation theory. The necessary condition required for such modeling to be realistic is that proteins in solution during the experiment remain in their compact form. Within this limitation our model gives accurate liquid-liquid coexistence curves for lysozyme and γ IIIa-crystallin solutions in respective buffers. It provides good fits to the cloud-point curves of lysozyme in buffer-salt mixtures as a function of the type and concentration of salt. It then predicts full coexistence curves, osmotic compressibilities, and second virial coefficients under such conditions. This treatment may also be relevant to protein crystallization.

  3. Aggregation of Heterogeneously Charged Colloids.

    PubMed

    Dempster, Joshua M; Olvera de la Cruz, Monica

    2016-06-28

    Patchy colloids are attractive as programmable building blocks for metamaterials. Inverse patchy colloids, in which a charged surface is decorated with patches of the opposite charge, are additionally noteworthy as models for heterogeneously charged biological materials such as proteins. We study the phases and aggregation behavior of a single charged patch in an oppositely charged colloid with a single-site model. This single-patch inverse patchy colloid model shows a large number of phases when varying patch size. For large patch sizes we find ferroelectric crystals, while small patch sizes produce cross-linked gels. Intermediate values produce monodisperse clusters and unusual worm structures that preserve finite ratios of area to volume. The polarization observed at large patch sizes is robust under extreme disorder in patch size and shape. We examine phase-temperature dependence and coexistence curves and find that large patch sizes produce polarized liquids, in contrast to mean-field predictions. Finally, we introduce small numbers of unpatched charged colloids. These can either suppress or encourage aggregation depending on their concentration and the size of the patches on the patched colloids. These effects can be exploited to control aggregation and to measure effective patch size.

  4. Register file soft error recovery

    SciTech Connect

    Fleischer, Bruce M.; Fox, Thomas W.; Wait, Charles D.; Muff, Adam J.; Watson, III, Alfred T.

    2013-10-15

    Register file soft error recovery including a system that includes a first register file and a second register file that mirrors the first register file. The system also includes an arithmetic pipeline for receiving data read from the first register file, and error detection circuitry to detect whether the data read from the first register file includes corrupted data. The system further includes error recovery circuitry to insert an error recovery instruction into the arithmetic pipeline in response to detecting the corrupted data. The inserted error recovery instruction replaces the corrupted data in the first register file with a copy of the data from the second register file.
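The recovery scheme in this patent pairs a primary register file with a mirror copy plus error detection on reads. A toy behavioral model of that idea, assuming per-register parity as the detection mechanism (the patent's actual detection circuitry and pipeline insertion are not specified in the abstract):

```python
def parity(word: int) -> int:
    """Even parity bit of a machine word."""
    return bin(word).count("1") % 2

class MirroredRegisterFile:
    """Toy model of the patented scheme: a primary register file shadowed
    by a mirror copy, with per-register parity for soft-error detection."""
    def __init__(self, n=8):
        self.primary = [0] * n
        self.mirror  = [0] * n
        self.par     = [0] * n

    def write(self, i, value):
        self.primary[i] = self.mirror[i] = value
        self.par[i] = parity(value)

    def read(self, i):
        value = self.primary[i]
        if parity(value) != self.par[i]:               # soft error detected
            value = self.primary[i] = self.mirror[i]   # recover from mirror
        return value

rf = MirroredRegisterFile()
rf.write(3, 0b1011)
rf.primary[3] ^= 0b0100    # inject a single-bit soft error
recovered = rf.read(3)     # detection + recovery on the read path
```

In hardware, the recovery step is an inserted error-recovery instruction in the arithmetic pipeline rather than an inline copy, but the data flow (detect on read, restore from mirror) is the same.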

  5. Tau Phosphorylation, Aggregation, and Cell Toxicity

    PubMed Central

    Avila, J.; Santa-María, I.; Pérez, M.; Hernández, F.; Moreno, F.

    2006-01-01

    Protein aggregation takes place in many neurodegenerative disorders. However, there is a controversy about the possible toxicity of these protein aggregates. In this review, this controversy is discussed, focussing on the tau aggregation that takes place in those disorders known as tauopathies. PMID:17047313

  6. Mineral resource of the month: aggregates

    USGS Publications Warehouse

    Willett, Jason C.

    2012-01-01

    Crushed stone and construction sand and gravel, the two major types of natural aggregates, are among the most abundant and accessible natural resources on the planet. The earliest civilizations used aggregates for various purposes, mainly construction. Today aggregates provide the basic raw materials for the foundation of modern society.

  7. A Grassroots Remote Sensing Toolkit Using Live Coding, Smartphones, Kites and Lightweight Drones.

    PubMed

    Anderson, K; Griffiths, D; DeBell, L; Hancock, S; Duffy, J P; Shutler, J D; Reinhardt, W J; Griffiths, A

    2016-01-01

    This manuscript describes the development of an Android-based smartphone application for capturing aerial photographs and spatial metadata automatically, for use in grassroots mapping applications. The aim of the project was to exploit the plethora of on-board sensors within modern smartphones (accelerometer, GPS, compass, camera) to generate ready-to-use spatial data from lightweight aerial platforms such as drones or kites. A visual coding 'scheme blocks' framework was used to build the application ('app'), so that users could customise their own data capture tools in the field. The paper reports on the coding framework, then shows the results of test flights from kites and lightweight drones and finally shows how open-source geospatial toolkits were used to generate geographical information system (GIS)-ready GeoTIFF images from the metadata stored by the app. Two Android smartphones were used in testing: a high-specification OnePlus One handset and a lower-cost Acer Liquid Z3 handset, to test the operational limits of the app on phones with different sensor sets. We demonstrate that best results were obtained when the phone was attached to a stable single-line kite or to a gliding drone. Results show that engine or motor vibrations from powered aircraft required dampening to ensure capture of high-quality images. We demonstrate how the products generated from the open-source processing workflow are easily used in GIS. The app can be downloaded freely from the Google store by searching for 'UAV toolkit' (UAV toolkit 2016), and used wherever an Android smartphone and aerial platform are available to deliver rapid spatial data (e.g. in supporting decision-making in humanitarian disaster-relief zones, in teaching or for grassroots remote sensing and democratic mapping).

  8. A versatile valving toolkit for automating fluidic operations in paper microfluidic devices

    PubMed Central

    Toley, Bhushan J.; Wang, Jessica A.; Gupta, Mayuri; Buser, Joshua R.; Lafleur, Lisa K.; Lutz, Barry R.; Fu, Elain; Yager, Paul

    2015-01-01

    Failure to utilize valving and automation techniques has restricted the complexity of fluidic operations that can be performed in paper microfluidic devices. We developed a toolkit of paper microfluidic valves and methods for automatic valve actuation using movable paper strips and fluid-triggered expanding elements. To the best of our knowledge, this is the first functional demonstration of this valving strategy in paper microfluidics. After introduction of fluids on devices, valves can actuate automatically a) after a certain period of time, or b) after the passage of a certain volume of fluid. Timing of valve actuation can be tuned with greater than 8.5% accuracy by changing lengths of timing wicks, and we present timed on-valves, off-valves, and diversion (channel-switching) valves. The actuators require ~30 μl fluid to actuate and the time required to switch from one state to another ranges from ~5 s for short to ~50 s for longer wicks. For volume-metered actuation, the size of a metering pad can be adjusted to tune actuation volume, and we present two methods; both methods can achieve greater than 9% accuracy. Finally, we demonstrate the use of these valves in a device that conducts a multi-step assay for the detection of the malaria protein PfHRP2. Although slightly more complex than devices that do not have moving parts, this valving and automation toolkit considerably expands the capabilities of paper microfluidic devices. Components of this toolkit can be used to conduct arbitrarily complex, multi-step fluidic operations on paper-based devices, as demonstrated in the malaria assay device. PMID:25606810

  9. Conservation and modification of genetic and physiological toolkits underpinning diapause in bumble bee queens.

    PubMed

    Amsalem, Etya; Galbraith, David A; Cnaani, Jonathan; Teal, Peter E A; Grozinger, Christina M

    2015-11-01

    Diapause is the key adaptation allowing insects to survive unfavourable conditions and inhabit an array of environments. Physiological changes during diapause are largely conserved across species and are hypothesized to be regulated by a conserved suite of genes (a 'toolkit'). Furthermore, it is hypothesized that in social insects, this toolkit was co-opted to mediate caste differentiation between long-lived, reproductive, diapause-capable queens and short-lived, sterile workers. Using Bombus terrestris queens, we examined the physiological and transcriptomic changes associated with diapause and CO2 treatment, which causes queens to bypass diapause. We performed comparative analyses with genes previously identified to be associated with diapause in the Dipteran Sarcophaga crassipalpis and with caste differentiation in bumble bees. As in Diptera, diapause in bumble bees is associated with physiological and transcriptional changes related to nutrient storage, stress resistance and core metabolic pathways. There is a significant overlap, both at the level of transcript and gene ontology, between the genetic mechanisms mediating diapause in B. terrestris and S. crassipalpis, reaffirming the existence of a conserved insect diapause genetic toolkit. However, a substantial proportion (10%) of the differentially regulated transcripts in diapausing queens have no clear orthologs in other species, and key players regulating diapause in Diptera (juvenile hormone and vitellogenin) appear to have distinct functions in bumble bees. We also found a substantial overlap between genes related to caste determination and diapause in bumble bees. Thus, our studies demonstrate an intriguing interplay between pathways underpinning adaptation to environmental extremes and the evolution of sociality in insects.

  10. The genome of Romanomermis culicivorax: revealing fundamental changes in the core developmental genetic toolkit in Nematoda

    PubMed Central

    2013-01-01

    Background The genetics of development in the nematode Caenorhabditis elegans has been described in exquisite detail. The phylum Nematoda has two classes: Chromadorea (which includes C. elegans) and the Enoplea. While the development of many chromadorean species resembles closely that of C. elegans, enoplean nematodes show markedly different patterns of early cell division and cell fate assignment. Embryogenesis of the enoplean Romanomermis culicivorax has been studied in detail, but the genetic circuitry underpinning development in this species has not been explored. Results We generated a draft genome for R. culicivorax and compared its gene content with that of C. elegans, a second enoplean, the vertebrate parasite Trichinella spiralis, and a representative arthropod, Tribolium castaneum. This comparison revealed that R. culicivorax has retained components of the conserved ecdysozoan developmental gene toolkit lost in C. elegans. T. spiralis has independently lost even more of this toolkit than has C. elegans. However, the C. elegans toolkit is not simply depauperate, as many novel genes essential for embryogenesis in C. elegans are not found in, or have only extremely divergent homologues in R. culicivorax and T. spiralis. Our data imply fundamental differences in the genetic programmes not only for early cell specification but also others such as vulva formation and sex determination. Conclusions Despite the apparent morphological conservatism, major differences in the molecular logic of development have evolved within the phylum Nematoda. R. culicivorax serves as a tractable system to contrast C. elegans and understand how divergent genomic and thus regulatory backgrounds nevertheless generate a conserved phenotype. The R. culicivorax draft genome will promote use of this species as a research model. PMID:24373391

  11. Evolving the US Climate Resilience Toolkit to Support a Climate-Smart Nation

    NASA Astrophysics Data System (ADS)

    Tilmes, C.; Niepold, F., III; Fox, J. F.; Herring, D.; Dahlman, L. E.; Hall, N.; Gardiner, N.

    2015-12-01

    Communities, businesses, resource managers, and decision-makers at all levels of government need information to understand and ameliorate climate-related risks. Likewise, climate information can expose latent opportunities. Moving from climate science to social and economic decisions raises complex questions about how to communicate the causes and impacts of climate variability and change; how to characterize and quantify vulnerabilities, risks, and opportunities faced by communities and businesses; and how to make and implement "win-win" adaptation plans at local, regional, and national scales. A broad coalition of federal agencies launched the U.S. Climate Resilience Toolkit (toolkit.climate.gov) in November 2014 to help our nation build resilience to climate-related extreme events. The site's primary audience is planners and decision makers in business, resource management, and government (at all levels) who seek science-based climate information and tools to help them in their near- and long-term planning. The Executive Office of the President assembled a task force of dozens of subject experts from across the 13 agencies of the U.S. Global Change Research Program to guide the site's development. The site's ongoing evolution is driven by feedback from the target audience. For example, based on feedback, climate projections will soon play a more prominent role in the site's "Climate Explorer" tool and case studies. The site's five-step adaptation planning process is being improved to better facilitate people getting started and to provide clear benchmarks for evaluating progress along the way. In this session, we will share lessons learned from a series of user engagements around the nation and evidence that the Toolkit couples climate information with actionable decision-making processes in ways that are helping Americans build resilience to climate-related stressors.

  12. A versatile valving toolkit for automating fluidic operations in paper microfluidic devices.

    PubMed

    Toley, Bhushan J; Wang, Jessica A; Gupta, Mayuri; Buser, Joshua R; Lafleur, Lisa K; Lutz, Barry R; Fu, Elain; Yager, Paul

    2015-03-21

    Failure to utilize valving and automation techniques has restricted the complexity of fluidic operations that can be performed in paper microfluidic devices. We developed a toolkit of paper microfluidic valves and methods for automatic valve actuation using movable paper strips and fluid-triggered expanding elements. To the best of our knowledge, this is the first functional demonstration of this valving strategy in paper microfluidics. After introduction of fluids on devices, valves can actuate automatically after a) a certain period of time, or b) the passage of a certain volume of fluid. Timing of valve actuation can be tuned with greater than 8.5% accuracy by changing lengths of timing wicks, and we present timed on-valves, off-valves, and diversion (channel-switching) valves. The actuators require ~30 μl fluid to actuate and the time required to switch from one state to another ranges from ~5 s for short to ~50 s for longer wicks. For volume-metered actuation, the size of a metering pad can be adjusted to tune actuation volume, and we present two methods - both methods can achieve greater than 9% accuracy. Finally, we demonstrate the use of these valves in a device that conducts a multi-step assay for the detection of the malaria protein PfHRP2. Although slightly more complex than devices that do not have moving parts, this valving and automation toolkit considerably expands the capabilities of paper microfluidic devices. Components of this toolkit can be used to conduct arbitrarily complex, multi-step fluidic operations on paper-based devices, as demonstrated in the malaria assay device.

  13. A Grassroots Remote Sensing Toolkit Using Live Coding, Smartphones, Kites and Lightweight Drones.

    PubMed

    Anderson, K; Griffiths, D; DeBell, L; Hancock, S; Duffy, J P; Shutler, J D; Reinhardt, W J; Griffiths, A

    2016-01-01

    This manuscript describes the development of an Android-based smartphone application for capturing aerial photographs and spatial metadata automatically, for use in grassroots mapping applications. The aim of the project was to exploit the plethora of on-board sensors within modern smartphones (accelerometer, GPS, compass, camera) to generate ready-to-use spatial data from lightweight aerial platforms such as drones or kites. A visual coding 'scheme blocks' framework was used to build the application ('app'), so that users could customise their own data capture tools in the field. The paper reports on the coding framework, then shows the results of test flights from kites and lightweight drones and finally shows how open-source geospatial toolkits were used to generate geographical information system (GIS)-ready GeoTIFF images from the metadata stored by the app. Two Android smartphones were used in testing: a high-specification OnePlus One handset and a lower-cost Acer Liquid Z3 handset, to test the operational limits of the app on phones with different sensor sets. We demonstrate that best results were obtained when the phone was attached to a stable single-line kite or to a gliding drone. Results show that engine or motor vibrations from powered aircraft required dampening to ensure capture of high-quality images. We demonstrate how the products generated from the open-source processing workflow are easily used in GIS. The app can be downloaded freely from the Google store by searching for 'UAV toolkit' (UAV toolkit 2016), and used wherever an Android smartphone and aerial platform are available to deliver rapid spatial data (e.g. in supporting decision-making in humanitarian disaster-relief zones, in teaching or for grassroots remote sensing and democratic mapping). PMID:27144310

  14. JENDL Dosimetry File 99.

    2001-01-22

    Version 00 JENDL/D-99 contains information for 47 nuclides and 67 reactions in the SAND-II group structure (although it was observed by RSICC that not all of the processed files are in the SAND-II group structure) and as 0K preprocessed pointwise files.

  15. Hopper File Management Tool

    SciTech Connect

    Long, J W; O'Neill, N J; Smith, N G; Springmeyer, R R; Remmele, S; Richards, D A; Southon, J

    2004-11-15

    Hopper is a powerful interactive tool that allows users to transfer and manipulate files and directories by means of a graphical user interface. Users can connect to and manage resources using the major file transfer protocols. Implemented in Java, Hopper can be run almost anywhere: from an individual's desktop machine to large production machines. In a high-performance computing environment, managing files can become a difficult and time-consuming task that distracts from scientific work. Users must deal with multiple file transfer protocols, transferring enormous amounts of files between computer platforms, repeated authentication, organizing massive amounts of data, and other detailed but necessary tasks. This is often accomplished with a set of several different tools, each with its own interface and idiosyncrasies. Our goal is to develop tools for a more automated approach to file management that substantially improves users' ability to transfer, organize, search, and operate on collections of files. This paper describes the Hopper tool for advanced file management, including the software architecture, the functionality, and the user interface.

  16. Toolkit of Resources for Engaging Parents and Community as Partners in Education. Part 3: Building Trusting Relationships with Families & Community through Effective Communication

    ERIC Educational Resources Information Center

    Regional Educational Laboratory Pacific, 2015

    2015-01-01

    This toolkit is designed to guide school staff in strengthening partnerships with families and community members to support student learning. This toolkit offers an integrated approach to family and community engagement, bringing together research, promising practices, and a wide range of useful tools and resources with explanations and directions…

  17. Toolkit of Resources for Engaging Parents and Community as Partners in Education. Part I: Building an Understanding of Family and Community Engagement

    ERIC Educational Resources Information Center

    Regional Educational Laboratory Pacific, 2014

    2014-01-01

    This toolkit is designed to guide school staff in strengthening partnerships with families and community members to support student learning. This toolkit offers an integrated approach to family and community engagement, bringing together research, promising practices, and a wide range of useful tools and resources with explanations and directions…

  18. Toolkit of Resources for Engaging Families and the Community as Partners in Education: Part 2: Building a Cultural Bridge. REL 2016-151

    ERIC Educational Resources Information Center

    Garcia, Maria Elena; Frunzi, Kay; Dean, Ceri B.; Flores, Nieves; Miller, Kirsten B.

    2016-01-01

    The Toolkit of Resources for Engaging Families and the Community as Partners in Education is a four-part resource that brings together research, promising practices, and useful tools and resources to guide educators in strengthening partnerships with families and community members to support student learning. The toolkit defines family and…

  19. Toolkit of Resources for Engaging Families and the Community as Partners in Education: Part 4: Engaging All in Data Conversations. REL 2016-153

    ERIC Educational Resources Information Center

    Garcia, Maria Elena; Frunzi, Kay; Dean, Ceri B.; Flores, Nieves; Miller, Kirsten B.

    2016-01-01

    The Toolkit of Resources for Engaging Families and the Community as Partners in Education is a four-part resource that brings together research, promising practices, and useful tools and resources to guide educators in strengthening partnerships with families and community members to support student learning. This toolkit defines family and…

  20. Toolkit of Resources for Engaging Families and the Community as Partners in Education. Part 1: Building an Understanding of Family and Community Engagement. REL 2016-148

    ERIC Educational Resources Information Center

    Garcia, Maria Elena; Frunzi, Kay; Dean, Ceri B.; Flores, Nieves; Miller, Kirsten B.

    2016-01-01

    The Toolkit of Resources for Engaging Families and the Community as Partners in Education is a four-part resource that brings together research, promising practices, and useful tools and resources to guide educators in strengthening partnerships with families and community members to support student learning. The toolkit defines family and…

  1. A MultiSite Gateway Toolkit for Rapid Cloning of Vertebrate Expression Constructs with Diverse Research Applications

    PubMed Central

    Fowler, Daniel K.; Stewart, Scott; Seredick, Steve; Eisen, Judith S.

    2016-01-01

    Recombination-based cloning is a quick and efficient way to generate expression vectors. Recent advancements have provided powerful recombinant DNA methods for molecular manipulations. Here, we describe a novel collection of three-fragment MultiSite Gateway cloning system-compatible vectors providing expanded molecular tools for vertebrate research. The components of this toolkit encompass a broad range of uses such as fluorescent imaging, dual gene expression, RNA interference, tandem affinity purification, chemically-inducible dimerization and lentiviral production. We demonstrate examples highlighting the utility of this toolkit for producing multi-component vertebrate expression vectors with diverse primary research applications. The vectors presented here are compatible with other Gateway toolkits and collections, facilitating the rapid generation of a broad range of innovative DNA constructs for biological research. PMID:27500400

  2. Fluorescent Bisphosphonate and Carboxyphosphonate Probes: A Versatile Imaging Toolkit for Applications in Bone Biology and Biomedicine.

    PubMed

    Sun, Shuting; Błażewska, Katarzyna M; Kadina, Anastasia P; Kashemirov, Boris A; Duan, Xuchen; Triffitt, James T; Dunford, James E; Russell, R Graham G; Ebetino, Frank H; Roelofs, Anke J; Coxon, Fraser P; Lundy, Mark W; McKenna, Charles E

    2016-02-17

    A bone imaging toolkit of 21 fluorescent probes with variable spectroscopic properties, bone mineral binding affinities, and antiprenylation activities has been created, including a novel linking strategy. The linking chemistry allows attachment of a diverse selection of dyes fluorescent in the visible to near-infrared range to any of the three clinically important heterocyclic bisphosphonate bone drugs (risedronate, zoledronate, and minodronate or their analogues). The resultant suite of conjugates offers multiple options to "mix and match" parent drug structure, fluorescence emission wavelength, relative bone affinity, and presence or absence of antiprenylation activity, for bone-related imaging applications.

  4. The development of a standard training toolkit for research studies that recruit pregnant women in labour

    PubMed Central

    2013-01-01

    Recruitment of pregnant women in labour to clinical trials poses particular challenges. Interpretation of regulation lacks consistency or clarity and variation occurs as to the training required by clinicians to safely contribute to the conduct of intrapartum studies. The Royal College of Obstetricians and Gynaecologists Intrapartum Clinical Study Group initiated the development of a pragmatic, proportionate and standardised toolkit for training clinical staff that complies with both regulatory and clinician requirements and has been peer-reviewed. This approach may be useful to researchers in acute care settings that necessitate the integration of research, routine clinical practice and compliance with regulation. PMID:24171801

  5. Advancements in Wind Integration Study Input Data Modeling: The Wind Integration National Dataset (WIND) Toolkit

    NASA Astrophysics Data System (ADS)

    Hodge, B.; Orwig, K.; McCaa, J. R.; Harrold, S.; Draxl, C.; Jones, W.; Searight, K.; Getman, D.

    2013-12-01

    projects to develop updated datasets: the Wind Integration National Dataset (WIND) Toolkit and the Solar Integration National Dataset (SIND) Toolkit. The WIND Toolkit spans 2007-2013 using advanced NWP methods run on a nationwide 2-km grid with 5-minute resolution, and includes over 110,000 onshore and offshore wind power production sites. This paper and presentation will discuss an overview of the WIND Toolkit modeling advancements, site selection, data accessibility, and validation results.

  6. Testability, Test Automation and Test Driven Development for the Trick Simulation Toolkit

    NASA Technical Reports Server (NTRS)

    Penn, John

    2014-01-01

    This paper describes the adoption of a Test Driven Development approach and a Continuous Integration System in the development of the Trick Simulation Toolkit, a generic simulation development environment for creating high fidelity training and engineering simulations at the NASA Johnson Space Center and many other NASA facilities. It describes the approach, and the significant benefits seen, such as fast, thorough and clear test feedback every time code is checked into the code repository. It also describes an approach that encourages development of code that is testable and adaptable.

  7. Performance analysis of the Globus Toolkit Monitoring and Discovery Service, MDS2.

    SciTech Connect

    Zhang, X.; Schopf, J. M.; Mathematics and Computer Science; Univ. of Chicago

    2004-01-01

    Monitoring and information services form a key component of a distributed system, or grid. A quantitative study of such services can aid in understanding the performance limitations, advise in the deployment of the monitoring system, and help evaluate future development work. To this end, we examined the performance of the Globus Toolkit/spl reg/ Monitoring and Discovery Service (MDS2) by instrumenting its main services using NetLogger. Our study shows a strong advantage to caching or prefetching the data, as well as the need to have primary components at well-connected sites.
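The advantage of caching or prefetching that the MDS2 study reports can be illustrated with a toy time-to-live cache. The class below is a hedged sketch of the general idea, not MDS2 code:

```python
import time

class TTLCache:
    """Minimal time-to-live cache, illustrating why caching helps a
    monitoring/discovery service: repeated queries for the same resource
    are answered from the cache instead of re-querying the (slow)
    information provider."""

    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self._store = {}   # key -> (expiry_time, value)
        self.misses = 0    # number of calls that hit the backend

    def get(self, key, fetch):
        """Return the cached value for key, calling fetch() only on a
        miss or after the entry has expired."""
        now = time.monotonic()
        entry = self._store.get(key)
        if entry is not None and entry[0] > now:
            return entry[1]            # fresh cache hit
        self.misses += 1
        value = fetch()                # expensive backend query
        self._store[key] = (now + self.ttl, value)
        return value
```

With a suitable TTL, N identical resource queries cost one backend round trip rather than N, which is the effect the instrumented measurements in the study quantify.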

  8. Background simulation of the X-ray detectors using Geant4 toolkit

    NASA Astrophysics Data System (ADS)

    Sarkar, R.; Mandal, S.; Nandi, A.; Debnath, D.; Chakrabarti, S. K.; Rao, A. R.

    We have studied the background noise of X-ray detectors using the Geant4 simulation toolkit. The main source of background noise for X-ray detectors in low Earth orbit is diffuse cosmic background photons. We have calculated the background spectrum for the CZT detector of ASTROSAT as well as the phoswich detector of RT-2, and have also studied the importance of the veto detector in reducing Compton-induced background photons. In this simulation we have also optimized the passive shielding to minimize the detector weight within the allowed limit of background counts.

  9. The MicroAnalysis Toolkit: X-ray Fluorescence Image Processing Software

    SciTech Connect

    Webb, S. M.

    2011-09-09

    The MicroAnalysis Toolkit is an analysis suite designed for the processing of x-ray fluorescence microprobe data. The program contains a wide variety of analysis tools, including image maps, correlation plots, simple image math, image filtering, multiple energy image fitting, semi-quantitative elemental analysis, x-ray fluorescence spectrum analysis, principal component analysis, and tomographic reconstructions. To be as widely useful as possible, data formats from many synchrotron sources can be read by the program with more formats available by request. An overview of the most common features will be presented.
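Principal component analysis of a stack of elemental maps, one of the analyses listed above, can be sketched with an SVD: each pixel is treated as a sample and each element channel as a variable. This is an illustrative sketch, not the toolkit's implementation:

```python
import numpy as np

def pca_maps(maps):
    """Principal component analysis of a stack of elemental maps.

    maps: array of shape (n_elements, height, width). Returns
    (components, scores), where scores are the principal-component
    images with shape (n_elements, height, width).
    """
    n, h, w = maps.shape
    X = maps.reshape(n, h * w).T        # rows = pixels, cols = elements
    X = X - X.mean(axis=0)              # centre each element channel
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    scores = (U * S).T.reshape(n, h, w) # project pixels onto components
    return Vt, scores
```

Correlated element channels collapse onto the leading components, which is why PC images are useful for spotting co-located elements in a fluorescence map.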

  10. Atlas Toolkit: Fast registration of 3D morphological datasets in the absence of landmarks

    PubMed Central

    Grocott, Timothy; Thomas, Paul; Münsterberg, Andrea E.

    2016-01-01

    Image registration is a gateway technology for Developmental Systems Biology, enabling computational analysis of related datasets within a shared coordinate system. Many registration tools rely on landmarks to ensure that datasets are correctly aligned; yet suitable landmarks are not present in many datasets. Atlas Toolkit is a Fiji/ImageJ plugin collection offering elastic group-wise registration of 3D morphological datasets, guided by segmentation of the interesting morphology. We demonstrate the method by combinatorial mapping of cell signalling events in the developing eyes of chick embryos, and use the integrated datasets to predictively enumerate Gene Regulatory Network states. PMID:26864723

  11. Diabetes and Healthy Eyes Toolkit: a community health worker program to prevent vision loss and blindness among people with diabetes.

    PubMed

    Ammary-Risch, Neyal J; Aguilar, Marcela; Goodman, Laura Saunders; Quiroz, Leslie

    2012-01-01

    Diabetic eye disease is a leading cause of vision loss and blindness in the United States and disproportionately affects Hispanics/Latinos. This article provides an overview of the Diabetes and Healthy Eyes Toolkit, a culturally and linguistically appropriate resource designed for community health workers to educate people with diabetes about eye health complications. The toolkit provides science-based, easy-to-understand information that can be used to conduct interactive, educational sessions about diabetes and eye health, the importance of comprehensive dilated eye examinations at least once a year for people with diabetes, and other ways to prevent vision loss and blindness.

  12. NeuroPigPen: A Scalable Toolkit for Processing Electrophysiological Signal Data in Neuroscience Applications Using Apache Pig.

    PubMed

    Sahoo, Satya S; Wei, Annan; Valdez, Joshua; Wang, Li; Zonjy, Bilal; Tatsuoka, Curtis; Loparo, Kenneth A; Lhatoo, Samden D

    2016-01-01

    The recent advances in neurological imaging and sensing technologies have led to rapid increase in the volume, rate of data generation, and variety of neuroscience data. This "neuroscience Big data" represents a significant opportunity for the biomedical research community to design experiments using data with greater timescale, large number of attributes, and statistically significant data size. The results from these new data-driven research techniques can advance our understanding of complex neurological disorders, help model long-term effects of brain injuries, and provide new insights into dynamics of brain networks. However, many existing neuroinformatics data processing and analysis tools were not built to manage large volume of data, which makes it difficult for researchers to effectively leverage this available data to advance their research. We introduce a new toolkit called NeuroPigPen that was developed using Apache Hadoop and Pig data flow language to address the challenges posed by large-scale electrophysiological signal data. NeuroPigPen is a modular toolkit that can process large volumes of electrophysiological signal data, such as Electroencephalogram (EEG), Electrocardiogram (ECG), and blood oxygen levels (SpO2), using a new distributed storage model called Cloudwave Signal Format (CSF) that supports easy partitioning and storage of signal data on commodity hardware. NeuroPigPen was developed with three design principles: (a) Scalability-the ability to efficiently process increasing volumes of data; (b) Adaptability-the toolkit can be deployed across different computing configurations; and (c) Ease of programming-the toolkit can be easily used to compose multi-step data processing pipelines using high-level programming constructs. The NeuroPigPen toolkit was evaluated using 750 GB of electrophysiological signal data over a variety of Hadoop cluster configurations ranging from 3 to 30 Data nodes. 
The evaluation results demonstrate that the toolkit
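The partitioning idea behind a format such as CSF (fixed-duration signal segments that can be stored and processed independently across cluster nodes) can be sketched as follows; the field names are illustrative, not the actual CSF layout:

```python
def partition_signal(samples, rate_hz, segment_seconds):
    """Split a 1-D signal into fixed-duration segments with metadata.

    Each segment carries its own time offset and length, so it can be
    stored and processed independently, in the spirit of a partitionable
    distributed storage format for electrophysiological data.
    """
    seg_len = int(rate_hz * segment_seconds)
    segments = []
    for i in range(0, len(samples), seg_len):
        chunk = samples[i:i + seg_len]
        segments.append({
            "start_s": i / rate_hz,    # offset of segment within recording
            "n_samples": len(chunk),
            "data": chunk,
        })
    return segments
```

Because each segment is self-describing, a data-flow engine such as Pig can assign segments to mappers without consulting a central index, which is what makes the storage model scale with the cluster size.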

  15. Microwave extinction characteristics of nanoparticle aggregates

    NASA Astrophysics Data System (ADS)

    Wu, Y. P.; Cheng, J. X.; Liu, X. X.; Wang, H. X.; Zhao, F. T.; Wen, W. W.

    2016-07-01

    The structure of nanoparticle aggregates plays an important role in microwave extinction capacity. The diffusion-limited aggregation (DLA) model for fractal growth is used to explore the possible structures of nanoparticle aggregates by computer simulation. Based on the discrete dipole approximation (DDA) method, the microwave extinction performance of different nano-carborundum aggregates is numerically analyzed. The effects of particle quantity, original diameter, fractal structure, and orientation on microwave extinction are investigated, and the extinction characteristics of aggregates are compared with those of a spherical nanoparticle of the same volume. The numerical results show that proper aggregation of nanoparticles is beneficial to microwave extinction capacity, and that the microwave extinction cross section of aggregated granules exceeds that of a solid sphere of the same volume.
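The DLA growth process referenced above can be sketched as a lattice random walk: particles are launched from outside the cluster and stick on first contact. This is a minimal illustrative implementation, not the authors' simulation code:

```python
import random

def dla_aggregate(n_particles, seed=0):
    """Grow a 2-D diffusion-limited aggregate on a square lattice.

    Walkers are launched from a square shell just outside the cluster,
    random-walk on the lattice, and stick when they touch an occupied
    site. Returns the set of occupied (x, y) sites.
    """
    rng = random.Random(seed)
    cluster = {(0, 0)}                      # seed particle at the origin
    radius = 1                              # current cluster extent
    steps = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    while len(cluster) < n_particles:
        r = radius + 2                      # launch shell outside cluster
        x, y = rng.choice([(rng.randint(-r, r), r * rng.choice((-1, 1))),
                           (r * rng.choice((-1, 1)), rng.randint(-r, r))])
        while True:
            dx, dy = rng.choice(steps)
            x, y = x + dx, y + dy
            if abs(x) > r + 10 or abs(y) > r + 10:
                break                       # wandered too far: relaunch
            if any((x + sx, y + sy) in cluster for sx, sy in steps):
                cluster.add((x, y))         # stick next to the cluster
                radius = max(radius, abs(x), abs(y))
                break
    return cluster
```

The branched, dendritic shapes this process produces are what the DDA calculation then takes as input geometries for the extinction analysis.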

  16. What favors convective aggregation and why?

    NASA Astrophysics Data System (ADS)

    Muller, Caroline; Bony, Sandrine

    2015-07-01

    The organization of convection is ubiquitous, but its physical understanding remains limited. One particular type of organization is the spatial self-aggregation of convection, taking the form of cloud clusters, or tropical cyclones in the presence of rotation. We show that several physical processes can give rise to self-aggregation and highlight the key features responsible for it, using idealized simulations. Longwave radiative feedbacks yield a "radiative aggregation." In that case, sufficient spatial variability of radiative cooling rates yields a low-level circulation, which induces the upgradient energy transport and radiative-convective instability. Not only do vertically integrated radiative budgets matter but the vertical profile of cooling is also crucial. Convective aggregation is facilitated when downdrafts below clouds are weak ("moisture-memory aggregation"), and this is sufficient to trigger aggregation in the absence of longwave radiative feedbacks. These results shed some light on the sensitivity of self-aggregation to various parameters, including resolution or domain size.

  17. Simulation of J-aggregate microcavity photoluminescence

    NASA Astrophysics Data System (ADS)

    Michetti, Paolo; La Rocca, Giuseppe C.

    2008-05-01

    We have developed a model in order to account for the photoexcitation dynamics of J-aggregate films and strongly coupled J-aggregate microcavities. The J aggregates are described as a disordered Frenkel exciton system in which relaxation occurs due to the presence of a thermal bath of molecular vibrations. The correspondence between the photophysics in J-aggregate films and that in J-aggregate microcavities is obtained by introducing a model polariton wave function mixing cavity photon modes and J-aggregate super-radiant excitons. With the same description of the material properties, we have calculated both absorption and luminescence spectra for the J-aggregate film and the photoluminescence of strongly coupled organic microcavities. The model is able to account for the fast relaxation dynamics in organic microcavities following nonresonant pumping and explains the temperature dependence of the ratio between the upper polariton and the lower polariton luminescence.

  18. Changes in fractal dimension during aggregation.

    PubMed

    Chakraborti, Rajat K; Gardner, Kevin H; Atkinson, Joseph F; Van Benschoten, John E

    2003-02-01

    Experiments were performed to evaluate temporal changes in the fractal dimension of aggregates formed during flocculation of an initially monodisperse suspension of latex microspheres. Particle size distributions and aggregate geometrical information at different mixing times were obtained using a non-intrusive optical sampling and digital image analysis technique, under variable conditions of mixing speed, coagulant (alum) dose and particle concentration. Pixel resolution required to determine aggregate size and geometric measures including the fractal dimension is discussed and a quantitative measure of accuracy is developed. The two-dimensional fractal dimension was found to range from 1.94 to 1.48, corresponding to aggregates that are either relatively compact or loosely structured, respectively. Changes in fractal dimension are explained using a conceptual model, which describes changes in fractal dimension associated with aggregate growth and changes in aggregate structure. For aggregation of an initially monodisperse suspension, the fractal dimension was found to decrease over time in the initial stages of floc formation.
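A two-dimensional fractal dimension such as the one measured above is commonly estimated by box counting: cover the aggregate with boxes of decreasing size and fit the slope of log N(s) versus log(1/s). A minimal sketch (illustrative, not the authors' image-analysis pipeline):

```python
import math

def box_counting_dimension(points, sizes=(1, 2, 4, 8, 16)):
    """Estimate the box-counting dimension of a set of integer lattice
    points: the least-squares slope of log N(s) against log(1/s),
    where N(s) is the number of occupied boxes of side s (in pixels)."""
    logs = []
    for s in sizes:
        boxes = {(x // s, y // s) for x, y in points}
        logs.append((math.log(1.0 / s), math.log(len(boxes))))
    # ordinary least-squares slope
    n = len(logs)
    mx = sum(x for x, _ in logs) / n
    my = sum(y for _, y in logs) / n
    return (sum((x - mx) * (y - my) for x, y in logs) /
            sum((x - mx) ** 2 for x, _ in logs))
```

A compact aggregate fills the plane and gives a value near 2, while a loose, stringy one approaches 1, matching the 1.94 to 1.48 range reported in the study.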

  19. Inhomogeneous diffusion-limited aggregation

    NASA Technical Reports Server (NTRS)

    Selinger, Robin Blumberg; Nittmann, Johann; Stanley, H. E.

    1989-01-01

    It is demonstrated here that an inhomogeneous diffusion-limited aggregation (DLA) model can be used to simulate viscous fingering in a medium with inhomogeneous permeability and homogeneous porosity. The medium consists of a pipe-pore square-lattice network in which all pores have equal volume and the pipes have negligible volume. It is shown that fluctuations in a DLA-based growth process may be tuned by noise reduction, and that fluctuations in the velocity of the moving interface are multiplicative in form.

  20. Familial Aggregation of Absolute Pitch

    PubMed Central

    Baharloo, Siamak; Service, Susan K.; Risch, Neil; Gitschier, Jane; Freimer, Nelson B.

    2000-01-01

    Absolute pitch (AP) is a behavioral trait that is defined as the ability to identify the pitch of tones in the absence of a reference pitch. AP is an ideal phenotype for investigation of gene and environment interactions in the development of complex human behaviors. Individuals who score exceptionally well on formalized auditory tests of pitch perception are designated as “AP-1.” As described in this report, auditory testing of siblings of AP-1 probands and of a control sample indicates that AP-1 aggregates in families. The implications of this finding for the mapping of loci for AP-1 predisposition are discussed. PMID:10924408

  1. VisDock: A Toolkit for Cross-Cutting Interactions in Visualization.

    PubMed

    Choi, Jungu; Park, Deok Gun; Wong, Yuet Ling; Fisher, Eli; Elmqvist, Niklas

    2015-09-01

    Standard user applications provide a range of cross-cutting interaction techniques that are common to virtually all such tools: selection, filtering, navigation, layer management, and cut-and-paste. We present VisDock, a JavaScript mixin library that provides a core set of these cross-cutting interaction techniques for visualization, including selection (lasso, paths, shape selection, etc), layer management (visibility, transparency, set operations, etc), navigation (pan, zoom, overview, magnifying lenses, etc), and annotation (point-based, region-based, data-space based, etc). To showcase the utility of the library, we have released it as Open Source and integrated it with a large number of existing web-based visualizations. Furthermore, we have evaluated VisDock using qualitative studies with both developers utilizing the toolkit to build new web-based visualizations, as well as with end-users utilizing it to explore movie ratings data. Results from these studies highlight the usability and effectiveness of the toolkit from both developer and end-user perspectives.

  2. Backtracking behaviour in lost ants: an additional strategy in their navigational toolkit

    PubMed Central

    Wystrach, Antoine; Schwarz, Sebastian; Baniel, Alice; Cheng, Ken

    2013-01-01

    Ants use multiple sources of information to navigate, but do not integrate all this information into a unified representation of the world. Rather, the available information appears to serve three distinct main navigational systems: path integration, systematic search and the use of learnt information—mainly via vision. Here, we report on an additional behaviour that suggests a supplemental system in the ant's navigational toolkit: ‘backtracking’. Homing ants, having almost reached their nest but, suddenly displaced to unfamiliar areas, did not show the characteristic undirected headings of systematic searches. Instead, these ants backtracked in the compass direction opposite to the path that they had just travelled. The ecological function of this behaviour is clear as we show it increases the chances of returning to familiar terrain. Importantly, the mechanistic implications of this behaviour stress an extra level of cognitive complexity in ant navigation. Our results imply: (i) the presence of a type of ‘memory of the current trip’ allowing lost ants to take into account the familiar view recently experienced, and (ii) direct sharing of information across different navigational systems. We propose a revised architecture of the ant's navigational toolkit illustrating how the different systems may interact to produce adaptive behaviours. PMID:23966644

  3. Iterative user centered design for development of a patient-centered fall prevention toolkit.

    PubMed

    Katsulis, Zachary; Ergai, Awatef; Leung, Wai Yin; Schenkel, Laura; Rai, Amisha; Adelman, Jason; Benneyan, James; Bates, David W; Dykes, Patricia C

    2016-09-01

    Due to the large number of falls that occur in hospital settings, inpatient fall prevention is a topic of great interest to patients and health care providers. The use of electronic decision support that tailors fall prevention strategy to patient-specific risk factors, known as Fall T.I.P.S (Tailoring Interventions for Patient Safety), has proven to be an effective approach for decreasing hospital falls. A paper version of the Fall T.I.P.S toolkit was developed primarily for hospitals that do not have the resources to implement the electronic solution; however, more work is needed to optimize the effectiveness of the paper version of this tool. We examined the use of human factors techniques in the redesign of the existing paper fall prevention tool with the goal of increasing ease of use and decreasing inpatient falls. Patients and clinical staff were included in the redesign of the existing tool to increase adoption of the tool and of fall prevention best practices. The redesigned paper Fall T.I.P.S toolkit featured a built-in clinical decision support system and increased ease of use over the existing version.

  5. An expanded framework for the advanced computational testing and simulation toolkit

    SciTech Connect

    Marques, Osni A.; Drummond, Leroy A.

    2003-11-09

    The Advanced Computational Testing and Simulation (ACTS) Toolkit is a set of computational tools developed primarily at DOE laboratories and is aimed at simplifying the solution of common and important computational problems. The use of the tools reduces the development time for new codes, and the tools provide functionality that might not otherwise be available. This document outlines an agenda for expanding the scope of the ACTS Project based on lessons learned from current activities. Highlights of this agenda include peer-reviewed certification of new tools; finding tools to solve problems that are not currently addressed by the Toolkit; working in collaboration with other software initiatives and DOE computer facilities; expanding outreach efforts; promoting interoperability and further development of the tools; and improving the functionality of the ACTS Information Center, among other tasks. The ultimate goal is to make the ACTS tools more widely used and more effective in solving DOE's and the nation's scientific problems through the creation of a reliable software infrastructure for scientific computing.

  6. Backtracking behaviour in lost ants: an additional strategy in their navigational toolkit.

    PubMed

    Wystrach, Antoine; Schwarz, Sebastian; Baniel, Alice; Cheng, Ken

    2013-10-22

    Ants use multiple sources of information to navigate, but do not integrate all this information into a unified representation of the world. Rather, the available information appears to serve three distinct main navigational systems: path integration, systematic search and the use of learnt information, mainly via vision. Here, we report on an additional behaviour that suggests a supplemental system in the ant's navigational toolkit: 'backtracking'. Homing ants that had almost reached their nest but were suddenly displaced to unfamiliar areas did not show the characteristic undirected headings of systematic searches. Instead, these ants backtracked in the compass direction opposite to the path that they had just travelled. The ecological function of this behaviour is clear, as we show that it increases the chances of returning to familiar terrain. Importantly, the mechanistic implications of this behaviour stress an extra level of cognitive complexity in ant navigation. Our results imply: (i) the presence of a type of 'memory of the current trip' allowing lost ants to take into account the familiar view recently experienced, and (ii) direct sharing of information across different navigational systems. We propose a revised architecture of the ant's navigational toolkit illustrating how the different systems may interact to produce adaptive behaviours.

  7. Modeling of a Flooding Induced Station Blackout for a Pressurized Water Reactor Using the RISMC Toolkit

    SciTech Connect

    Mandelli, Diego; Prescott, Steven R; Smith, Curtis L; Alfonsi, Andrea; Rabiti, Cristian; Cogliati, Joshua J; Kinoshita, Robert A

    2011-07-01

    In the Risk Informed Safety Margin Characterization (RISMC) approach we want to understand not just the frequency of an event like core damage, but how close we are (or are not) to key safety-related events and how we might increase our safety margins. The RISMC Pathway uses the probabilistic margin approach to quantify impacts to reliability and safety by coupling both probabilistic (via stochastic simulation) and mechanistic (via physics models) approaches. This coupling takes place through the interchange of physical parameters and operational or accident scenarios. In this paper we apply the RISMC approach to evaluate the impact of a power uprate on a pressurized water reactor (PWR) for a tsunami-induced flooding test case. This analysis is performed using the RISMC toolkit: the RELAP-7 and RAVEN codes. RELAP-7 is the new generation of system analysis codes responsible for simulating the thermal-hydraulic dynamics of PWR and boiling water reactor systems. RAVEN has two capabilities: to act as a controller of the RELAP-7 simulation (e.g., system activation) and to perform statistical analyses (e.g., run multiple RELAP-7 simulations where the sequencing/timing of events has been changed according to a set of stochastic distributions). By using the RISMC toolkit, we can evaluate how a power uprate affects the system recovery measures needed to avoid core damage after the PWR has lost all available AC power to tsunami-induced flooding. The simulation of the actual flooding is performed using a smoothed-particle hydrodynamics code, NEUTRINO.
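
The probabilistic/mechanistic coupling described above can be sketched in a toy form. This is not the actual RAVEN/RELAP-7 interface; the recovery-time distribution, battery life, and damage criterion below are invented purely to illustrate how a stochastic driver aggregates outcomes over many mechanistic runs:

```python
import random


def run_scenario(recovery_time_h, battery_life_h=8.0):
    """Toy stand-in for a mechanistic (RELAP-7-style) run: core damage
    occurs if AC power is not restored before station batteries deplete."""
    return recovery_time_h > battery_life_h


def core_damage_probability(n_samples=10_000, seed=1):
    """RAVEN-style driver: sample the timing of the recovery event
    stochastically and aggregate the mechanistic outcome over many runs."""
    rng = random.Random(seed)
    damage = sum(
        run_scenario(rng.expovariate(1 / 6.0))  # assumed mean 6 h to restore AC
        for _ in range(n_samples)
    )
    return damage / n_samples


print(f"P(core damage) = {core_damage_probability():.3f}")
```

A power uprate would enter such a study by shortening the effective time window (here, `battery_life_h`), and the driver would quantify how much the damage probability grows.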

  8. Iterative user centered design for development of a patient-centered fall prevention toolkit.

    PubMed

    Katsulis, Zachary; Ergai, Awatef; Leung, Wai Yin; Schenkel, Laura; Rai, Amisha; Adelman, Jason; Benneyan, James; Bates, David W; Dykes, Patricia C

    2016-09-01

    Due to the large number of falls that occur in hospital settings, inpatient fall prevention is a topic of great interest to patients and health care providers. Electronic decision support that tailors fall prevention strategy to patient-specific risk factors, known as Fall T.I.P.S (Tailoring Interventions for Patient Safety), has proven effective at decreasing hospital falls. A paper version of the Fall T.I.P.S toolkit was developed primarily for hospitals that lack the resources to implement the electronic solution; however, more work is needed to optimize the effectiveness of the paper version of this tool. We examined the use of human factors techniques in the redesign of the existing paper fall prevention tool, with the goals of increasing ease of use and decreasing inpatient falls. Patients and clinical staff were included in the redesign to increase adoption of the tool and of fall prevention best practices. The redesigned paper Fall T.I.P.S toolkit incorporated built-in clinical decision support and was easier to use than the existing version. PMID:27184319

  9. Modeling the tagged-neutron UXO identification technique using the Geant4 toolkit

    SciTech Connect

    Zhou, Y.; Mitra, S.; Zhu, X.; Wang, Y.

    2011-10-16

    It is proposed to use 14 MeV neutrons tagged by the associated particle neutron time-of-flight technique (APnTOF) to identify the fillers of unexploded ordnance (UXO) by characterizing their carbon, nitrogen and oxygen contents. To facilitate the design and construction of a prototype system, a preliminary simulation model was developed using the Geant4 toolkit. This work established the toolkit environment for (a) generating tagged neutrons, (b) their transport and interactions within a sample to induce emission and detection of characteristic gamma-rays, and (c) 2D and 3D image reconstruction of the interrogated object using the neutron and gamma-ray time-of-flight information. Using this model, the article demonstrates the novelty of the tagged-neutron approach for extracting useful signals, with high signal-to-background discrimination of an object of interest from its environment. Simulations indicated that a UXO filled with the RDX explosive hexogen (C3H6O6N6) can be identified to a depth of 20 cm when buried in soil.
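
The time-of-flight geometry behind the depth reconstruction can be sketched as follows. The relation is generic kinematics, not the authors' code: a tagged neutron travels a distance d to the interaction point at speed v_n, and the prompt gamma ray returns the same distance at c, so the measured time is t = d/v_n + d/c and d = t / (1/v_n + 1/c). The neutron speed used below (about 5.1 cm/ns for 14 MeV) is a standard relativistic estimate:

```python
C = 29.98   # speed of light, cm/ns
V_N = 5.12  # approximate speed of a 14 MeV neutron, cm/ns


def interaction_depth(t_ns):
    """Depth (cm) along the tagged-neutron direction, recovered from the
    total neutron-out + gamma-back time of flight."""
    return t_ns / (1 / V_N + 1 / C)


def round_trip_time(d_cm):
    """Inverse relation: expected TOF for an interaction at depth d."""
    return d_cm / V_N + d_cm / C


# A voxel 20 cm deep (the burial depth quoted above) corresponds to a
# round-trip time of a few nanoseconds:
print(f"TOF for 20 cm: {round_trip_time(20.0):.2f} ns")
```

This is why nanosecond-scale timing resolution translates directly into centimetre-scale depth resolution in the reconstructed image.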

  10. A toolkit for integrated deterministic and probabilistic assessment for hydrogen infrastructure.

    SciTech Connect

    Groth, Katrina M.; Tchouvelev, Andrei V.

    2014-03-01

    There has been increasing interest in using Quantitative Risk Assessment (QRA) to help improve the safety of hydrogen infrastructure and applications. Hydrogen infrastructure for transportation (e.g., fueling fuel cell vehicles) or stationary (e.g., back-up power) applications is a relatively new area for the application of QRA compared with traditional industrial production and use; as a result, there are few tools designed to enable QRA for this emerging sector, and few existing QRA tools contain models that have been developed and validated for small-scale hydrogen applications. However, in the past several years there has been significant progress in developing and validating deterministic physical and engineering models for hydrogen dispersion, ignition, and flame behavior. In parallel, there has been progress in developing defensible probabilistic models for the occurrence of events such as hydrogen release and ignition. While models and data are available, using this information is difficult due to a lack of readily available tools for integrating the deterministic and probabilistic components into a single analysis framework. This paper discusses the first steps in building an integrated toolkit for performing QRA on hydrogen transportation technologies and suggests directions for extending the toolkit.
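
The integration the abstract calls for amounts to chaining probabilistic event models with a deterministic consequence model. The sketch below shows only the shape of that chain; every number in it is invented for illustration and is not validated hydrogen data:

```python
# Hypothetical inputs purely for illustration -- not validated hydrogen data.
LEAK_FREQUENCY = 1e-3  # leaks per dispenser-year (probabilistic model output)
P_IGNITION = 0.05      # probability a given leak ignites (probabilistic input)


def harm_probability(distance_m):
    """Deterministic stand-in for a physics model: probability of harm to a
    person at a given distance from an ignited release (toy linear falloff)."""
    return max(0.0, 1.0 - distance_m / 10.0)


def individual_risk(distance_m):
    """QRA chains the components: frequency x ignition x consequence."""
    return LEAK_FREQUENCY * P_IGNITION * harm_probability(distance_m)


for d in (2.0, 5.0, 12.0):
    print(f"risk at {d:4.1f} m: {individual_risk(d):.2e} /yr")
```

An integrated toolkit replaces each toy component with a validated model (dispersion and flame physics for the consequence term, leak-frequency data for the probabilistic terms) while keeping this overall structure.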

  11. Introducing GHOST: The Geospace/Heliosphere Observation & Simulation Tool-kit

    NASA Astrophysics Data System (ADS)

    Murphy, J. J.; Elkington, S. R.; Schmitt, P.; Wiltberger, M. J.; Baker, D. N.

    2013-12-01

    Simulation models of the heliospheric and geospace environments can provide key insights into the geoeffective potential of solar disturbances such as coronal mass ejections and high-speed solar wind streams. Advanced post-processing of the results of these simulations greatly enhances the utility of these models for scientists and other researchers. Currently, no supported centralized tool exists for performing these processing tasks. With GHOST, we introduce a toolkit for the ParaView visualization environment that provides a centralized suite of tools suited to space physics post-processing. Building on the work of the Center for Integrated Space Weather Modeling (CISM) Knowledge Transfer group, GHOST is an open-source tool suite for ParaView. The plugin currently provides tools for reading LFM and Enlil data sets, along with automated tools for data comparison against NASA's CDAWeb database. As work progresses, many additional tools will be added, and through open-source collaboration we hope to add readers for additional model types, as well as any additional tools deemed necessary by the scientific community. The ultimate goal of this work is to provide a complete Sun-to-Earth model analysis toolset.

  12. Using the Health Literacy Universal Precautions Toolkit to Improve the Quality of Patient Materials.

    PubMed

    Brega, Angela G; Freedman, Megan A G; LeBlanc, William G; Barnard, Juliana; Mabachi, Natabhona M; Cifuentes, Maribel; Albright, Karen; Weiss, Barry D; Brach, Cindy; West, David R

    2015-01-01

    Patient materials are often written above the reading level of most adults. Tool 11 of the Health Literacy Universal Precautions Toolkit ("Design Easy-to-Read Material") provides guidance on ensuring that written patient materials are easy to understand. As part of a pragmatic demonstration of the Toolkit, we examined how four primary care practices implemented Tool 11 and whether written materials improved as a result. We conducted interviews to learn about practices' implementation activities and assessed the readability, understandability, and actionability of patient education materials collected during pre- and postimplementation site visits. Interview data indicated that practices followed many action steps recommended in Tool 11, including training staff, assessing readability, and developing or revising materials, typically focusing on brief documents such as patient letters and information sheets. Many of the revised and newly developed documents had reading levels appropriate for most patients and, in the case of revised documents, better readability than the original materials. In contrast, the readability, understandability, and actionability of lengthier patient education materials were poor and did not improve over the 6-month implementation period. Findings guided revisions to Tool 11 and highlighted the importance of engaging multiple stakeholders in improving the quality of patient materials. PMID:26513033
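
Readability assessments like the ones the practices performed are typically grade-level formulas. A minimal sketch of one common choice, the Flesch-Kincaid grade level, is below; the syllable counter is a crude vowel-run heuristic, and the two sample sentences are invented, so treat the absolute numbers as approximate:

```python
import re


def count_syllables(word):
    """Crude syllable estimate: runs of vowels, at least one per word."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))


def fk_grade(text):
    """Flesch-Kincaid grade level:
    0.39*(words/sentences) + 11.8*(syllables/words) - 15.59"""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * len(words) / sentences + 11.8 * syllables / len(words) - 15.59


simple = "Take your pills with food. Call us if you feel sick."
dense = ("Patients experiencing persistent gastrointestinal discomfort "
         "should discontinue administration immediately.")
print(f"simple: grade {fk_grade(simple):.1f}   dense: grade {fk_grade(dense):.1f}")
```

Short sentences and short words drive the grade level down, which is exactly the revision strategy Tool 11 recommends for patient letters and information sheets.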

  13. LZIFU: an emission-line fitting toolkit for integral field spectroscopy data

    NASA Astrophysics Data System (ADS)

    Ho, I.-Ting; Medling, Anne M.; Groves, Brent; Rich, Jeffrey A.; Rupke, David S. N.; Hampton, Elise; Kewley, Lisa J.; Bland-Hawthorn, Joss; Croom, Scott M.; Richards, Samuel; Schaefer, Adam L.; Sharp, Rob; Sweet, Sarah M.

    2016-09-01

    We present lzifu (LaZy-IFU), an idl toolkit for fitting multiple emission lines simultaneously in integral field spectroscopy (IFS) data. lzifu is useful for the investigation of the dynamical, physical and chemical properties of gas in galaxies. lzifu has already been applied to many world-class IFS instruments and large IFS surveys, including the Wide Field Spectrograph, the new Multi Unit Spectroscopic Explorer (MUSE), the Calar Alto Legacy Integral Field Area (CALIFA) survey, and the Sydney-Australian-Astronomical-Observatory Multi-object Integral-field spectrograph (SAMI) Galaxy Survey. Here we describe in detail the structure of the toolkit, and how the line fluxes and flux uncertainties are determined, including the possibility of having multiple distinct kinematic components. We quantify the performance of lzifu, demonstrating its accuracy and robustness. We also show examples of applying lzifu to CALIFA and SAMI data to construct emission-line and kinematic maps, and investigate complex, skewed line profiles present in IFS data. The code is made available to the astronomy community through GitHub. lzifu will be extended over time to support other IFS instruments, and to provide even more accurate line and uncertainty estimates.
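
To make concrete what an emission-line measurement yields, here is a minimal numpy sketch on a synthetic spectrum. This is not lzifu's fitting machinery (lzifu fits Gaussian components in IDL); it uses simple moments of the continuum-subtracted profile to recover the same basic quantities: integrated flux, centroid (velocity) and width (dispersion):

```python
import numpy as np

# Synthetic spectrum: one Gaussian emission line on a flat continuum.
wave = np.linspace(6540.0, 6590.0, 500)  # wavelength grid (Angstrom)
true_center, true_sigma, true_flux = 6563.0, 3.0, 50.0
flux = (true_flux / (true_sigma * np.sqrt(2 * np.pi))
        * np.exp(-0.5 * ((wave - true_center) / true_sigma) ** 2)) + 1.0


def measure_line(wave, flux, continuum=1.0):
    """Moment-based line measurement from the continuum-subtracted profile:
    0th moment = line flux, 1st = centroid, 2nd = line width."""
    f = flux - continuum
    dw = np.gradient(wave)
    total = np.sum(f * dw)
    center = np.sum(wave * f * dw) / total
    sigma = np.sqrt(np.sum((wave - center) ** 2 * f * dw) / total)
    return total, center, sigma


total, center, sigma = measure_line(wave, flux)
print(f"flux={total:.1f}  center={center:.1f} A  sigma={sigma:.2f} A")
```

Maps of these three quantities per spaxel are precisely the emission-line and kinematic maps referred to above; a full fitter additionally propagates flux uncertainties and separates multiple kinematic components.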

  14. eVITAL: A Preliminary Taxonomy and Electronic Toolkit of Health-Related Habits and Lifestyle

    PubMed Central

    Salvador-Carulla, Luis; Olson Walsh, Carolyn; Alonso, Federico; Gómez, Rafael; de Teresa, Carlos; Cabo-Soler, José Ricardo; Cano, Antonio; Ruiz, Mencía

    2012-01-01

    Objectives. To create a preliminary taxonomy and related toolkit of health-related habits (HrH) following a person-centered approach with a focus on primary care. Methods. From 2003–2009, a working group (n = 6 physicians) defined the knowledge base, created a framing document, and selected evaluation tools using an iterative process. Multidisciplinary focus groups (n = 29 health professionals) revised the document and evaluation protocol and participated in a feasibility study and review of the model based on a demonstration study with 11 adult volunteers in Antequera, Spain. Results. The preliminary taxonomy contains 6 domains of HrH and 1 domain of additional health descriptors, 3 subdomains, 43 dimensions, and 141 subdimensions. The evaluation tool was completed by the 11 volunteers. The eVITAL toolkit contains history and examination items for 4 levels of engagement: self-assessment, basic primary care, extended primary care, and specialty care. There was positive feedback from the volunteers and experts, but concern about the length of the evaluation. Conclusions. We present the first taxonomy of HrH, which may aid the development of the new models of care such as the personal contextual factors of the International Classification of Functioning (ICF) and the positive and negative components of the multilevel person-centered integrative diagnosis model. PMID:22545016

  15. DDG4 A Simulation Framework based on the DD4hep Detector Description Toolkit

    NASA Astrophysics Data System (ADS)

    Frank, M.; Gaede, F.; Nikiforou, N.; Petric, M.; Sailer, A.

    2015-12-01

    The detector description is an essential component used to analyse and simulate data resulting from particle collisions in high energy physics experiments. Based on the DD4hep detector description toolkit, a flexible and data-driven simulation framework was designed using the Geant4 toolkit. We present this framework and describe the guiding requirements and the architectural design, which was strongly driven by ease of use. The goal was, given an existing detector description, to simulate the detector response to particle collisions in high energy physics experiments with minimal effort, without imposing restrictions that would prevent enhanced or improved behaviour. Starting from the ROOT-based geometry implementation used by DD4hep, an automatic conversion mechanism to Geant4 was developed. The physics response and the mechanism to input particle data from generators were highly formalized and can be instantiated on demand using known factory patterns. A palette of components to model the detector response is provided by default, but improved or more sophisticated components may easily be added using the factory pattern. Only the final configuration of the instantiated components has to be provided by end users, using either C++, Python scripting or an XML-based description.
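
The factory pattern referred to above can be sketched as a registry that instantiates components by name from end-user configuration. The component names and parameters below are hypothetical illustrations, not the actual DDG4 API:

```python
# Minimal registry-based factory: components register themselves under a
# name and are instantiated on demand from user-supplied configuration.
_FACTORY = {}


def register(name):
    """Decorator: make a component class creatable by name."""
    def wrap(cls):
        _FACTORY[name] = cls
        return cls
    return wrap


def create(name, **config):
    """Instantiate a registered component from end-user configuration."""
    return _FACTORY[name](**config)


@register("TrackerAction")  # hypothetical component name
class TrackerAction:
    def __init__(self, hit_threshold_keV=1.0):
        self.hit_threshold_keV = hit_threshold_keV


@register("CalorimeterAction")  # hypothetical component name
class CalorimeterAction:
    def __init__(self, birks_constant=0.126):
        self.birks_constant = birks_constant


# End users only supply names and parameters (e.g. parsed from XML or a
# Python script); the framework picks and builds the component.
action = create("TrackerAction", hit_threshold_keV=5.0)
print(type(action).__name__, action.hit_threshold_keV)
```

The appeal of this design is that an improved component only needs to register itself under a new name; neither the framework nor existing configurations change.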

  16. Construction of file database management

    SciTech Connect

    Merrill, Kyle J.

    2000-03-01

    This work created a database for tracking data analysis files, from multiple lab techniques and pieces of equipment, stored on a central file server. Experimental details appropriate to each file type are pulled from the file header and stored in a searchable database. The database also stores the specific location and subdirectory structure for each data file. Queries can be run on the database by file type, sample type or other experimental parameters. The database was constructed in Microsoft Access, and Visual Basic was used to extract information from the file headers.
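
The header-to-database workflow generalizes readily. The sketch below reimplements the idea in Python with sqlite3 (the original used Access and Visual Basic); the "key: value" header layout and the `.xrd` file type are invented stand-ins for whatever each instrument actually writes:

```python
import pathlib
import sqlite3
import tempfile


def parse_header(path):
    """Pull experimental details from a simple 'key: value' header block."""
    meta = {}
    for line in path.read_text().splitlines():
        if ":" not in line:
            break  # header ends at the first non 'key: value' line
        key, _, value = line.partition(":")
        meta[key.strip().lower()] = value.strip()
    return meta


db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE datafile
              (path TEXT, filetype TEXT, sample TEXT, operator TEXT)""")

# Two fake instrument files standing in for data on the central file server.
root = pathlib.Path(tempfile.mkdtemp())
(root / "run1.xrd").write_text("type: XRD\nsample: quartz\noperator: km\n1.2 3.4\n")
(root / "run2.xrd").write_text("type: XRD\nsample: calcite\noperator: jd\n5.6 7.8\n")

for f in sorted(root.glob("*.xrd")):  # index every data file found
    m = parse_header(f)
    db.execute("INSERT INTO datafile VALUES (?, ?, ?, ?)",
               (str(f), m.get("type"), m.get("sample"), m.get("operator")))

# Query by experimental parameter, e.g. all files for one sample type:
rows = db.execute("SELECT path FROM datafile WHERE sample = 'quartz'").fetchall()
print(rows)
```

The database stores only metadata and paths, so queries stay fast while the bulky data files remain on the file server.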

  17. Detergent-mediated protein aggregation.

    PubMed

    Neale, Chris; Ghanei, Hamed; Holyoake, John; Bishop, Russell E; Privé, Gilbert G; Pomès, Régis

    2013-04-01

    Because detergents are commonly used to solvate membrane proteins for structural evaluation, much attention has been devoted to assessing the conformational bias imparted by detergent micelles in comparison to the native environment of the lipid bilayer. Here, we conduct six 500-ns simulations of a system with >600,000 atoms to investigate the spontaneous self-assembly of dodecylphosphocholine detergent around multiple molecules of the integral membrane protein PagP. This detergent formed equatorial micelles in which acyl chains surround the protein's hydrophobic belt, confirming existing models of the detergent solvation of membrane proteins. Unexpectedly, the extracellular and periplasmic apical surfaces of PagP interacted with the headgroups of detergents in other micelles 85 and 60% of the time, respectively, forming complexes that were stable for hundreds of nanoseconds. In some cases, an apical surface of one molecule of PagP interacted with an equatorial micelle surrounding another molecule of PagP. In other cases, the apical surfaces of two molecules of PagP simultaneously bound a neat detergent micelle. In these ways, detergents mediated the non-specific aggregation of folded PagP. These simulation results are consistent with dynamic light scattering experiments, which show that, at detergent concentrations ≥600 mM, PagP induces the formation of large scattering species that are likely to contain many copies of the PagP protein. Together, these simulation and experimental results point to a potentially generic mechanism of detergent-mediated protein aggregation.

  18. Attracted diffusion-limited aggregation.

    PubMed

    Rahbari, S H Ebrahimnazhad; Saberi, A A

    2012-07-01

    In this paper we present results of extensive Monte Carlo simulations of diffusion-limited aggregation (DLA) with a seed placed on an attractive plane as a simple model in connection with the electrical double layers. We compute the fractal dimension of the aggregated patterns as a function of the attraction strength α. For the patterns grown in both two and three dimensions, the fractal dimension shows a significant dependence on the attraction strength for small values of α and approaches that of the ordinary two-dimensional (2D) DLA in the limit of large α. For the nonattracting case with α = 1, our results in three dimensions reproduce the patterns of 3D ordinary DLA, while in two dimensions our model leads to the formation of a compact cluster with dimension 2. For intermediate α, the 3D clusters have a quasi-2D structure with a fractal dimension very close to that of the ordinary 2D DLA. This allows one to control the morphology of a growing cluster by tuning a single external parameter α. PMID:23005417
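
The model described above can be sketched as a toy on-lattice simulation: walkers diffuse with a bias of strength alpha toward the plane y = 0 and stick when they touch the growing cluster. This is an illustrative miniature, far smaller than the paper's Monte Carlo runs and not the authors' code:

```python
import random


def grow_dla(n_particles=60, alpha=4.0, L=40, seed=2):
    """Toy 2D on-lattice DLA with the seed on an attractive plane y = 0.
    Steps toward the plane are weighted by alpha (alpha = 1 recovers an
    unbiased walk). Positions are clamped to the box [-L, L] x [0, L]."""
    rng = random.Random(seed)
    cluster = {(0, 0)}  # the seed sits on the attractive plane
    moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    weights = [alpha if dy < 0 else 1.0 for _, dy in moves]
    for _ in range(n_particles):
        while True:
            x, y = rng.randint(-L, L), L  # launch far above the plane
            for _ in range(200_000):
                dx, dy = rng.choices(moves, weights=weights)[0]
                x = max(-L, min(L, x + dx))
                y = max(0, min(L, y + dy))
                # stick as soon as the walker is adjacent to the cluster
                if any((x + mx, y + my) in cluster for mx, my in moves):
                    cluster.add((x, y))
                    break
            else:
                continue  # walker wandered too long: relaunch it
            break
    return cluster


cluster = grow_dla()
height = max(y for _, y in cluster)
width = max(x for x, _ in cluster) - min(x for x, _ in cluster)
print(f"{len(cluster)} sites, height {height}, width {width}")
```

Sweeping `alpha` reproduces the qualitative behaviour discussed above: large alpha flattens the cluster against the plane, while alpha near 1 lets it grow into an open, branched shape.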

  19. Attracted diffusion-limited aggregation

    NASA Astrophysics Data System (ADS)

    Rahbari, S. H. Ebrahimnazhad; Saberi, A. A.

    2012-07-01

    In this paper we present results of extensive Monte Carlo simulations of diffusion-limited aggregation (DLA) with a seed placed on an attractive plane as a simple model in connection with the electrical double layers. We compute the fractal dimension of the aggregated patterns as a function of the attraction strength α. For the patterns grown in both two and three dimensions, the fractal dimension shows a significant dependence on the attraction strength for small values of α and approaches that of the ordinary two-dimensional (2D) DLA in the limit of large α. For the nonattracting case with α=1, our results in three dimensions reproduce the patterns of 3D ordinary DLA, while in two dimensions our model leads to the formation of a compact cluster with dimension 2. For intermediate α, the 3D clusters have a quasi-2D structure with a fractal dimension very close to that of the ordinary 2D DLA. This allows one to control the morphology of a growing cluster by tuning a single external parameter α.

  20. Sectoral shifts and aggregate unemployment

    SciTech Connect

    Loungani, P.

    1986-01-01

    Some recent research has taken the view that sectoral or industry-specific shocks significantly affect aggregate unemployment by increasing the amount of inter-industry labor reallocation required. The empirical evidence for this view rests on the finding that during the 1950s - and again during the 1970s - there was a positive correlation between aggregate unemployment and the dispersion of employment growth rates. This thesis demonstrates that this correlation arises largely because oil price shocks affect both unemployment and the dispersion of employment growth. Once the dispersion due to oil shocks is accounted for, the residual dispersion in employment has very low explanatory power for unemployment. Since the dispersion index does not measure pure sectoral shifts, an alternate measure of dispersion is developed that serves as a better proxy for the amount of inter-industry labor reallocation required each period. Estimates using this measure suggest that, during the 1950s, temporary increases in the relative price of oil were responsible for generating the observed correlation. On the other hand, sectoral shifts were important during the 1970s; in particular, the 1973 oil price increase has had significant reallocative effects on the economy. This contention is subjected to further tests by looking at the time-series behavior of employment in durable-goods industries and also by following the inter-industry movements of workers over time through the use of panel data.
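
The dispersion of employment growth rates discussed above is commonly measured by a Lilien-style index: the square root of the employment-share-weighted variance of sectoral growth rates around aggregate growth. A minimal sketch follows; the shares and growth rates are made-up numbers purely for illustration:

```python
import math


def lilien_dispersion(shares, growth):
    """Employment-share-weighted dispersion of sectoral growth rates
    around aggregate employment growth (Lilien-style index)."""
    g_bar = sum(s * g for s, g in zip(shares, growth))  # aggregate growth
    return math.sqrt(sum(s * (g - g_bar) ** 2 for s, g in zip(shares, growth)))


shares = [0.30, 0.25, 0.25, 0.20]  # sector employment shares (sum to 1)
calm   = [0.02, 0.02, 0.01, 0.02]  # similar growth in every sector
shock  = [-0.08, 0.05, 0.04, 0.03]  # e.g. an oil shock hits one sector hard

print(f"calm period:  {lilien_dispersion(shares, calm):.4f}")
print(f"shock period: {lilien_dispersion(shares, shock):.4f}")
```

The thesis's point is visible in the index's construction: a common shock such as an oil price increase moves all sectors together only unevenly, so it raises measured dispersion even without any pure sectoral shift, which is why the raw index must be purged of oil-shock effects before it can proxy for labor reallocation.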