Science.gov

Sample records for files aggregation toolkit

  1. The ALFA (Activity Log Files Aggregation) Toolkit: A Method for Precise Observation of the Consultation

    PubMed Central

    2008-01-01

Background There is a lack of tools to evaluate and compare electronic patient record (EPR) systems to inform a rational choice or development agenda. Objective To develop a toolkit to measure the impact of different EPR system features on the consultation. Methods We first developed a specification to overcome the limitations of existing methods. We divided this into work packages: (1) developing a method to display multichannel video of the consultation; (2) coding and measuring activities, including computer use and verbal interactions; (3) automating the capture of nonverbal interactions; (4) aggregating multiple observations into a single navigable output; and (5) producing an output interpretable by software developers. We piloted this method by filming live consultations (n = 22) by 4 general practitioners (GPs) using different EPR systems. We compared the time taken and variations during coded data entry, prescribing, and blood pressure (BP) recording. We used nonparametric tests to make statistical comparisons. We contrasted methods of BP recording using Unified Modeling Language (UML) sequence diagrams. Results We found that 4 channels of video were optimal. We identified an existing application for manual coding of video output. We developed in-house tools for capturing use of keyboard and mouse and for time-stamping speech; transcripts are then typed within these time stamps. Although we managed to capture body language using pattern recognition software, we were unable to use these data quantitatively. We loaded these observational outputs into our aggregation tool, which allows simultaneous navigation and viewing of multiple files. This also creates a single exportable file in XML format, which we used to develop UML sequence diagrams. In our pilot, the GP using the EMIS LV (Egton Medical Information Systems Limited, Leeds, UK) system took the longest time to code data (mean 11.5 s, 95% CI 8.7-14.2). 
Nonparametric comparison of EMIS LV with the other systems showed
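
    A minimal sketch of the aggregation step described above (merging per-channel, time-stamped observations into a single navigable XML timeline). The channel names, timings, and element layout are hypothetical illustrations, not the ALFA tool's actual schema:

```python
import xml.etree.ElementTree as ET

def aggregate_channels(channels):
    """Merge per-channel event lists into one time-ordered XML document.

    `channels` maps a channel name (e.g. "keyboard", "speech") to a list of
    (timestamp_seconds, description) tuples.
    """
    events = [
        (ts, name, desc)
        for name, items in channels.items()
        for ts, desc in items
    ]
    root = ET.Element("consultation")
    for ts, name, desc in sorted(events):  # one navigable timeline across channels
        ev = ET.SubElement(root, "event", channel=name, t=f"{ts:.1f}")
        ev.text = desc
    return ET.tostring(root, encoding="unicode")

xml_out = aggregate_channels({
    "keyboard": [(2.0, "BP code entered")],
    "speech":   [(1.5, "GP asks about symptoms")],
})
```

    The single exportable XML file is what downstream tooling (here, UML sequence-diagram generation) would consume.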

  2. Small file aggregation in a parallel computing system

    DOEpatents

    Faibish, Sorin; Bent, John M.; Tzelnic, Percy; Grider, Gary; Zhang, Jingwang

    2014-09-02

    Techniques are provided for small file aggregation in a parallel computing system. An exemplary method for storing a plurality of files generated by a plurality of processes in a parallel computing system comprises aggregating the plurality of files into a single aggregated file; and generating metadata for the single aggregated file. The metadata comprises an offset and a length of each of the plurality of files in the single aggregated file. The metadata can be used to unpack one or more of the files from the single aggregated file.
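
    The claimed scheme (one aggregated file plus per-file offset/length metadata) can be sketched as follows. This is a minimal illustration of the idea, not the patented implementation:

```python
import io

def aggregate(files):
    """Pack {name: bytes} into one blob plus per-file (offset, length) metadata."""
    blob = io.BytesIO()
    metadata = {}
    for name, data in files.items():
        metadata[name] = (blob.tell(), len(data))  # offset and length, as in the claim
        blob.write(data)
    return blob.getvalue(), metadata

def unpack(blob, metadata, name):
    """Recover a single file from the aggregated blob using its metadata."""
    offset, length = metadata[name]
    return blob[offset:offset + length]

blob, meta = aggregate({"a.dat": b"alpha", "b.dat": b"bravo"})
```

    In a parallel file system this matters because many small writes collapse into one large sequential write, while the metadata still allows any individual file to be read back without scanning the blob.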

  3. Recent College Graduates Study, 1987 (RCGS-1987). Combined File of Survey and Aggregate Transcript Data [machine-readable data file].

    ERIC Educational Resources Information Center

    National Center for Education Statistics (ED), Washington, DC.

    The 1987 Recent College Graduates Study (RCGS) machine-readable data file, RECENT.GRADS.COMBINED.A8586, is the third of three data files produced from the study and contains information about 1985-86 bachelor's degree graduates for whom both questionnaire and transcript data were collected. The combined survey and aggregate transcript data file…

  4. Basic Internet Software Toolkit.

    ERIC Educational Resources Information Center

    Buchanan, Larry

    1998-01-01

    Once schools are connected to the Internet, the next step is getting network workstations configured for Internet access. This article describes a basic toolkit comprising software currently available on the Internet for free or modest cost. Lists URLs for Web browser, Telnet, FTP, file decompression, portable document format (PDF) reader,…

  5. Geospatial Toolkit

    Energy Science and Technology Software Center (ESTSC)

    2010-10-14

The Geospatial Toolkit is an NREL-developed map-based software application that integrates resource data and other geographic information systems (GIS) data for integrated resource assessment. The non-resource, country-specific data for each toolkit comes from a variety of agencies within each country as well as from global datasets. Originally developed in 2005, the Geospatial Toolkit was completely redesigned and re-released in November 2010 to provide a more modern, easier-to-use interface with considerably faster analytical querying capabilities. The revised version of the Geospatial Toolkit has been released for all original toolkit countries/regions and each software package is made available on NREL's website.

  6. Geospatial Toolkit

    SciTech Connect

    2010-10-14

The Geospatial Toolkit is an NREL-developed map-based software application that integrates resource data and other geographic information systems (GIS) data for integrated resource assessment. The non-resource, country-specific data for each toolkit comes from a variety of agencies within each country as well as from global datasets. Originally developed in 2005, the Geospatial Toolkit was completely redesigned and re-released in November 2010 to provide a more modern, easier-to-use interface with considerably faster analytical querying capabilities. The revised version of the Geospatial Toolkit has been released for all original toolkit countries/regions and each software package is made available on NREL's website.

  7. Molecular techniques in ecohealth research toolkit: facilitating estimation of aggregate gastroenteritis burden in an irrigated periurban landscape.

    PubMed

    Tserendorj, Ariuntuya; Anceno, Alfredo J; Houpt, Eric R; Icenhour, Crystal R; Sethabutr, Orntipa; Mason, Carl S; Shipin, Oleg V

    2011-09-01

Assessment of microbial hazards associated with certain environmental matrices, livelihood strategies, and food handling practices is constrained by time-consuming conventional microbiological techniques that lead to health risk assessments of narrow geographic or time scope, often targeting very few pathogens. Health risk assessment based on one or few indicator organisms underestimates true disease burden due to a number of coexisting causative pathogens. Here, we employed molecular techniques in a survey of Cryptosporidium parvum, Giardia lamblia, Campylobacter jejuni, Escherichia coli O157:H7, Listeria monocytogenes, Salmonella spp., Shigella spp., Vibrio cholerae, and Rotavirus A densities in canal water with respect to seasonality and spatial distribution of point-nonpoint pollution sources. Three irrigational canals stretching across a nearly 150-km(2) periurban landscape, traditionally used for agricultural irrigation but functioning in recent years as a vital part of municipal wastewater stabilization, were investigated. Compiled stochastic data (pathogen concentration, susceptible populations) and literature-obtained deterministic data (pathogen dose-response model parameter values) were used in estimating waterborne gastroenteritis burden. Exposure scenarios include swimming or fishing, consuming canal water-irrigated vegetables, and ingesting or inhaling water aerosols while working in canal water-irrigated fields. Estimated annual gastroenteritis burden due to individual pathogens among the sampling points was -10.6log(10) to -2.2log(10) DALYs. Aggregated annual gastroenteritis burden due to all the target pathogens per sampling point was -3.1log(10) to -1.9log(10) DALYs, far exceeding the WHO acceptable limit of -6.0log(10) DALYs. The present approach will facilitate the comprehensive collection of surface water microbiological baseline data and setting of benchmarks for interventions aimed at reducing microbial hazards in similar landscapes worldwide. PMID:22146856
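
    Aggregating burdens that are reported on a log10 DALY scale requires converting back to linear DALYs before summing. A sketch with hypothetical per-pathogen values (not figures from the study):

```python
import math

def aggregate_log10_dalys(log10_burdens):
    """Sum per-pathogen disease burdens given as log10(DALYs); return log10 of the total."""
    total = sum(10 ** b for b in log10_burdens)  # back to linear DALYs before summing
    return math.log10(total)

# Hypothetical per-pathogen burdens at one sampling point, in log10 DALYs
combined = aggregate_log10_dalys([-4.0, -4.0, -5.0])
```

    Because the sum happens in linear space, the aggregate always exceeds the largest single-pathogen burden, which is why the per-point aggregate range (-3.1 to -1.9) sits above the individual-pathogen range.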

  8. Literacy Toolkit

    ERIC Educational Resources Information Center

    Center for Best Practices in Early Childhood Education, 2005

    2005-01-01

The toolkit contains print and electronic resources, including (1) "eMERGing Literacy and Technology: Working Together", a 492-page curriculum guide; (2) "LitTECH Interactive Presents: The Beginning of Literacy", a DVD that provides an overview linking technology to the concepts of emerging literacy; (3) "Your Preschool Classroom Computer Center:…

  9. Local Toolkit

    Energy Science and Technology Software Center (ESTSC)

    2007-05-31

The LOCAL Toolkit contains tools and libraries developed under the LLNL LOCAL LDRD project for managing and processing large unstructured data sets primarily from parallel numerical simulations, such as triangular, tetrahedral, and hexahedral meshes, point sets, and graphs. The tools have three main functionalities: cache-coherent, linear ordering of multidimensional data; lossy and lossless data compression optimized for different data types; and an out-of-core streaming I/O library with simple processing modules for unstructured data.
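
    The abstract does not name the ordering used, but cache-coherent linear orderings of multidimensional data are commonly built from a space-filling curve such as the Z-order (Morton) curve; a 2-D sketch under that assumption:

```python
def morton2(x, y, bits=16):
    """Interleave the bits of (x, y) into a single Z-order index.

    Nearby (x, y) points tend to get nearby indices, which improves cache
    behavior when mesh or point data is laid out linearly in memory or on disk.
    """
    code = 0
    for i in range(bits):
        code |= ((x >> i) & 1) << (2 * i)       # x bits land in even positions
        code |= ((y >> i) & 1) << (2 * i + 1)   # y bits land in odd positions
    return code

# Lay out a small 4x4 grid of points in Z-order
order = sorted((morton2(x, y), (x, y)) for x in range(4) for y in range(4))
```

    Sorting points by this index groups spatial neighbors into the same cache lines and disk blocks, which is the property the streaming I/O library would exploit.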

  10. Tracker Toolkit

    NASA Technical Reports Server (NTRS)

    Lewis, Steven J.; Palacios, David M.

    2013-01-01

This software can track multiple moving objects within a video stream simultaneously, use visual features to aid in the tracking, and initiate tracks based on object detection in a subregion. A simple programmatic interface allows plugging into larger image chain modeling suites. It includes sub-functionality for extracting unique visual features about each object identified within an image frame, for use in tracking and later analysis. Tracker Toolkit utilizes a feature extraction algorithm to tag each object with metadata features about its size, shape, color, and movement. Its functionality is independent of the scale of objects within a scene. The only assumption made about the tracked objects is that they move; there are no constraints on size within the scene, shape, or type of movement. The Tracker Toolkit is also capable of following an arbitrary number of objects in the same scene, identifying and propagating the track of each object from frame to frame. Target objects may be specified for tracking beforehand, or may be dynamically discovered within a tripwire region. Initialization of the Tracker Toolkit algorithm includes two steps: initializing the data structures for tracked target objects, including targets preselected for tracking, and initializing the tripwire region. If no tripwire region is desired, this step is skipped. The tripwire region is an area within the frames that is always checked for new objects, and all new objects discovered within the region will be tracked until lost (by leaving the frame, stopping, or blending into the background).
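
    The tripwire initiation logic might be sketched as follows; the axis-aligned box representation and track bookkeeping here are assumptions for illustration, not the toolkit's actual interface:

```python
def initiate_tracks(detections, tripwire, existing):
    """Start a track for each new detection inside the tripwire region.

    `tripwire` is an axis-aligned box (x0, y0, x1, y1); `detections` are
    (x, y) centroids; `existing` maps track id -> last known position.
    """
    x0, y0, x1, y1 = tripwire
    next_id = max(existing, default=-1) + 1
    for (x, y) in detections:
        inside = x0 <= x <= x1 and y0 <= y <= y1
        if inside and (x, y) not in existing.values():
            existing[next_id] = (x, y)   # only tripwire hits spawn new tracks
            next_id += 1
    return existing

tracks = initiate_tracks([(5, 5), (50, 50)], tripwire=(0, 0, 10, 10), existing={})
```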

  11. Teacher Quality Toolkit

    ERIC Educational Resources Information Center

    Lauer, Patricia A.; Dean, Ceri B.

    2004-01-01

This Teacher Quality Toolkit aims to support the continuum of teacher learning by providing tools that institutions of higher education, districts, and schools can use to improve both preservice and inservice teacher education. The toolkit incorporates McREL's accumulated knowledge and experience related to teacher quality and standards-based…

  12. Community Schools Evaluation Toolkit

    ERIC Educational Resources Information Center

    Shah, Shital C.; Brink, Katrina; London, Rebecca; Masur, Shelly; Quihuis, Gisell

    2009-01-01

    This toolkit is designed to help community schools evaluate their efforts so that they learn from their successes, identify current challenges, and plan future efforts. It provides a step-by-step process for planning and conducting an evaluation at your community school site(s). The toolkit is a practical, hands-on guide that makes it possible for…

  13. Student Success Center Toolkit

    ERIC Educational Resources Information Center

    Jobs For the Future, 2014

    2014-01-01

    "Student Success Center Toolkit" is a compilation of materials organized to assist Student Success Center directors as they staff, launch, operate, and sustain Centers. The toolkit features materials created and used by existing Centers, such as staffing and budgeting templates, launch materials, sample meeting agendas, and fundraising…

  14. A new open-source Python-based Space Weather data access, visualization, and analysis toolkit

    NASA Astrophysics Data System (ADS)

    de Larquier, S.; Ribeiro, A.; Frissell, N. A.; Spaleta, J.; Kunduri, B.; Thomas, E. G.; Ruohoniemi, J.; Baker, J. B.

    2013-12-01

Space weather research relies heavily on combining and comparing data from multiple observational platforms. Current frameworks exist to aggregate some of the data sources, most of them based on file downloads via web or FTP interfaces. Empirical models are mostly Fortran-based and lack interfaces with more useful scripting languages. In an effort to improve data and model access, the SuperDARN community has been developing a Python-based Space Science Data Visualization Toolkit (DaViTpy). At the center of this development was a redesign of how our data (from 30 years of SuperDARN radars) are made available. Several access solutions are now wrapped into one convenient Python interface which probes local directories, a new remote NoSQL database, and an FTP server to retrieve the requested data based on availability. Motivated by the efficiency of this interface and the inherent need for data from multiple instruments, we implemented similar modules for other space science datasets (POES, OMNI, Kp, AE...), and also included fundamental empirical models with Python interfaces to enhance data analysis (IRI, HWM, MSIS...). All these modules and more are gathered in a single convenient toolkit, which is collaboratively developed and distributed using Github and continues to grow. While still in its early stages, we expect this toolkit will facilitate multi-instrument space weather research and improve scientific productivity.
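
    The layered lookup described (local directories, then a remote database, then an FTP server) amounts to a fallback chain. A sketch with stand-in sources, not DaViTpy's actual API:

```python
def fetch(key, sources):
    """Try each data source in priority order; return the first hit.

    `sources` is an ordered list of (name, lookup) pairs, where lookup
    returns the data or None -- standing in for a local directory scan,
    a remote database query, and an FTP download.
    """
    for name, lookup in sources:
        data = lookup(key)
        if data is not None:
            return name, data
    raise LookupError(f"{key!r} not found in any source")

local = {"2013-11-01": b"radar-day-file"}
result = fetch("2013-11-01", [
    ("local", local.get),
    ("database", lambda k: None),   # stand-in for the NoSQL query
    ("ftp", lambda k: None),        # stand-in for the FTP fallback
])
```

    The caller sees one interface regardless of where the data actually came from, which is the property the abstract highlights.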

  15. TOOLKIT, Version 2. 0

    SciTech Connect

    Schroeder, E.; Bagot, B.; McNeill, R.L.

    1990-05-09

The purpose of this User's Guide is to show by example many of the features of Toolkit II. Some examples will be copies of screens as they appear while running the Toolkit. Other examples will show what the user should enter in various situations; in these instances, what the computer asserts will be in boldface and what the user responds will be in regular type. The User's Guide is divided into four sections. The first section, "FOCUS Databases", will give a broad overview of the Focus administrative databases that are available on the VAX; easy-to-use reports are available for most of them in the Toolkit. The second section, "Getting Started", will cover the steps necessary to log onto the Computer Center VAX cluster and how to start Focus and the Toolkit. The third section, "Using the Toolkit", will discuss some of the features in the Toolkit -- the available reports and how to access them, as well as some utilities. The fourth section, "Helpful Hints", will cover some useful facts about the VAX and Focus as well as some of the more common problems that can occur. The Toolkit is not set in concrete but is continually being revised and improved. If you have any opinions as to changes that you would like to see made to the Toolkit or new features that you would like included, please let us know. Since we do try to respond to the needs of the user and make periodic improvements to the Toolkit, this User's Guide may not correspond exactly to what is available in the computer. In general, changes are made to provide new options or features; rarely is an existing feature deleted.

  16. JAVA Stereo Display Toolkit

    NASA Technical Reports Server (NTRS)

    Edmonds, Karina

    2008-01-01

This toolkit provides a common interface for displaying graphical user interface (GUI) components in stereo using either specialized stereo display hardware (e.g., liquid crystal shutter or polarized glasses) or anaglyph display (red/blue glasses) on standard workstation displays. An application using this toolkit will work without modification in either environment, allowing stereo software to reach a wider audience without sacrificing high-quality display on dedicated hardware. The toolkit is written in Java for use with the Swing GUI Toolkit and has cross-platform compatibility. It hooks into the graphics system, allowing any standard Swing component to be displayed in stereo. It uses the OpenGL graphics library to control the stereo hardware and to perform the rendering. It also supports anaglyph and special stereo hardware using the same API (application-program interface), and has the ability to simulate color stereo in anaglyph mode by combining the red band of the left image with the green/blue bands of the right image. This is a low-level toolkit that accomplishes only the display of components (including the JadeDisplay image display component). It does not include higher-level functions such as disparity adjustment, 3D cursor, or overlays, all of which can be built using this toolkit.
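
    The simulated color-stereo anaglyph described (red band of the left image combined with the green/blue bands of the right) can be sketched per pixel; plain nested lists stand in here for real image buffers:

```python
def anaglyph(left, right):
    """Combine the red band of the left image with the green/blue bands of
    the right image, per pixel. Images are lists of rows of (r, g, b) tuples."""
    return [
        [(lp[0], rp[1], rp[2]) for lp, rp in zip(lrow, rrow)]
        for lrow, rrow in zip(left, right)
    ]

left  = [[(200, 10, 10)]]   # one-pixel left-eye image
right = [[(10, 150, 250)]]  # one-pixel right-eye image
out = anaglyph(left, right)
```

    Viewed through red/blue glasses, each eye then receives only its intended image, approximating color stereo on an ordinary display.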

  17. System Design Toolkit for Integrated Modular Avionics for Space

    NASA Astrophysics Data System (ADS)

    Hann, Mark; Balbastre Betoret, Patricia; Simo Ten, Jose Enrique; De Ferluc, Regis; Ramachandran, Jinesh

    2015-09-01

The IMA-SP development process identified tools needed to perform the activities of: i) Partitioning and Resource Allocation and ii) System Feasibility Assessment. This paper describes the definition, design, implementation and test of the tool support required to perform the IMA-SP development process activities. This includes the definition of a data model, with associated files and file formats, describing the complete setup of a partitioned system and allowing system feasibility assessment; the development of a prototype of the tool set, called the IMA-SP System Design Toolkit (SDT); and the demonstration of the toolkit on a case study.

  18. The Einstein Toolkit

    NASA Astrophysics Data System (ADS)

    Löffler, Frank

    2012-03-01

The Einstein Toolkit Consortium is developing and supporting open software for relativistic astrophysics. Its aim is to provide the core computational tools that can enable new science, broaden our community, facilitate interdisciplinary research and take advantage of petascale computers and advanced cyberinfrastructure. The Einstein Toolkit currently consists of an open set of over 100 modules for the Cactus framework, primarily for computational relativity along with associated tools for simulation management and visualization. The toolkit includes solvers for vacuum spacetimes as well as relativistic magneto-hydrodynamics, along with modules for initial data, analysis and computational infrastructure. These modules have been developed and improved over many years by many different researchers. The Einstein Toolkit is supported by a distributed model, combining core support of software, tools, and documentation in its own repositories and through partnerships with other developers who contribute open software and coordinate together on development. As of January 2012 it has 68 registered members from 30 research groups world-wide. This talk will present the current capabilities of the Einstein Toolkit and will point to information on how to leverage it for future research.

  19. Knowledge information management toolkit and method

    DOEpatents

    Hempstead, Antoinette R.; Brown, Kenneth L.

    2006-08-15

    A system is provided for managing user entry and/or modification of knowledge information into a knowledge base file having an integrator support component and a data source access support component. The system includes processing circuitry, memory, a user interface, and a knowledge base toolkit. The memory communicates with the processing circuitry and is configured to store at least one knowledge base. The user interface communicates with the processing circuitry and is configured for user entry and/or modification of knowledge pieces within a knowledge base. The knowledge base toolkit is configured for converting knowledge in at least one knowledge base from a first knowledge base form into a second knowledge base form. A method is also provided.

  20. Water Security Toolkit

    Energy Science and Technology Software Center (ESTSC)

    2012-09-11

    The Water Security Toolkit (WST) provides software for modeling and analyzing water distribution systems to minimize the potential impact of contamination incidents. WST wraps capabilities for contaminant transport, impact assessment, and sensor network design with response action plans, including source identification, rerouting, and decontamination, to provide a range of water security planning and real-time applications.

  1. THE EPANET PROGRAMMER'S TOOLKIT FOR ANALYSIS OF WATER DISTRIBUTION SYSTEMS

    EPA Science Inventory

The EPANET Programmer's Toolkit is a collection of functions that helps simplify computer programming of water distribution network analyses. The functions can be used to read in a pipe network description file, modify selected component properties, run multiple hydraulic and wa...

  2. Parallel Climate Analysis Toolkit (ParCAT)

    Energy Science and Technology Software Center (ESTSC)

    2013-06-30

The Parallel Climate Analysis Toolkit (ParCAT) provides parallel statistical processing of large climate model simulation datasets. ParCAT provides parallel point-wise average calculations, frequency distributions, sum/differences of two datasets, and difference-of-average and average-of-difference for two datasets for arbitrary subsets of simulation time. ParCAT is a command-line utility that can be easily integrated into scripts or embedded in other applications. ParCAT supports CMIP5 post-processed datasets as well as non-CMIP5 post-processed datasets. ParCAT reads and writes standard netCDF files.
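
    The pairwise statistics named above can be illustrated with plain Python lists standing in for netCDF variables; this is a sketch of the operations, not ParCAT's interface or its parallel implementation:

```python
def pointwise_average(series):
    """Average a list of equal-length time series point by point."""
    return [sum(vals) / len(vals) for vals in zip(*series)]

def difference_of_average(a, b):
    """Average each ensemble over time steps first, then difference point-wise."""
    return [x - y for x, y in zip(pointwise_average(a), pointwise_average(b))]

def average_of_difference(a, b):
    """Difference paired members of a and b first, then average the differences."""
    diffs = [[x - y for x, y in zip(ra, rb)] for ra, rb in zip(a, b)]
    return pointwise_average(diffs)

a = [[1.0, 2.0], [3.0, 4.0]]  # two time steps, two grid points each
b = [[0.0, 1.0], [1.0, 1.0]]
```

    With fully paired, equal-length inputs the two orderings coincide by linearity; they diverge once the two datasets cover different numbers of samples, which is why both operations are offered.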

  3. Self-assessment toolkit.

    PubMed

    2016-09-01

    A new health and integration toolkit has been launched by NHS Clinical Commissioners, in partnership with the Local Government Association, NHS Confederation and the Association of Directors of Adult Services. The self-assessment tool is designed to help local health and care leaders, through health and well-being boards, to assess their ambition, capability, capacity and readiness to integrate local health and social care services. PMID:27581897

  4. Molecular model generator toolkit

    SciTech Connect

    Schneider, R.D.

    1994-07-01

    This report is a user manual for an ASCII file of Fortran source code which must be compiled before use. The software will assist in creating plastic models of molecules whose specifications are described in the Brookhaven Protein Databank. Other data files can be used if they are in the same format as the files in the databank. The output file is a program for a 3-D Systems Stereolithography Apparatus and the program is run on a SGI Indigo workstation.

  5. Network algorithms for information analysis using the Titan Toolkit.

    SciTech Connect

    McLendon, William Clarence, III; Baumes, Jeffrey; Wilson, Andrew T.; Wylie, Brian Neil; Shead, Timothy M.

    2010-07-01

The analysis of networked activities is dramatically more challenging than many traditional kinds of analysis. A network is defined by a set of entities (people, organizations, banks, computers, etc.) linked by various types of relationships. These entities and relationships are often uninteresting alone, and only become significant in aggregate. The analysis and visualization of these networks is one of the driving factors behind the creation of the Titan Toolkit. Given the broad set of problem domains and the wide-ranging databases in use by the information analysis community, the Titan Toolkit's flexible, component-based pipeline provides an excellent platform for constructing specific combinations of network algorithms and visualizations.

  6. The Weather and Climate Toolkit

    NASA Astrophysics Data System (ADS)

    Ansari, S.; Del Greco, S.; Hankins, B.

    2010-12-01

The Weather and Climate Toolkit (WCT) is free, platform independent software distributed from NOAA's National Climatic Data Center (NCDC). The WCT allows the visualization and data export of weather and climate data, including Radar, Satellite and Model data. By leveraging the NetCDF for Java library and Common Data Model, the WCT is extremely scalable and capable of supporting many new datasets in the future. Gridded NetCDF files (regular and irregularly spaced, using Climate-Forecast (CF) conventions) are supported, along with many other formats including GRIB. The WCT provides tools for custom data overlays, Web Map Service (WMS) background maps, animations and basic filtering. The export of images and movies is provided in multiple formats. The WCT Data Export Wizard allows for data export in both vector polygon/point (Shapefile, Well-Known Text) and raster (GeoTIFF, ESRI Grid, VTK, Gridded NetCDF) formats. These data export features promote the interoperability of weather and climate information with various scientific communities and common software packages such as ArcGIS, Google Earth, MatLAB, GrADS and R. The WCT also supports an embedded, integrated Google Earth instance. The Google Earth Browser Plugin allows seamless visualization of data on a native 3-D Google Earth instance linked to the standard 2-D map. Examples include Level-II NEXRAD data for Hurricane Katrina and the GPCP (Global Precipitation Product), visualized in 2-D and in the embedded Google Earth view.

  7. Radiation source search toolkit

    NASA Astrophysics Data System (ADS)

    Young, Jason S.

The newly developed Radiation Source Search Toolkit (RSST) generates gamma-ray spectroscopy data for use in testing source search algorithms. RSST is designed in a modular fashion to allow for ease of use while still maintaining accuracy in developing the output spectra. Users can define a real-world path for mobile radiation detectors to travel as well as radiation sources for possible detection. RSST can accept measured or simulated radiation spectrum data for generation into a source search simulation. RSST handles traversing the path, computing distance-related attenuation, and generating the final output spectra. RSST also has the ability to simulate anisotropic shielding as well as traffic conditions that would impede a ground-based detection platform in a real-world scenario. RSST provides a novel fusion between spectral data and geospatial source search data generation. By utilizing the RSST, researchers can easily generate multiple datasets for testing detection algorithms without the need for actual radiation sources and mobile detector platforms.
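
    The distance-related attenuation presumably includes inverse-square geometric falloff; a sketch under that assumption, ignoring shielding and air attenuation, with hypothetical channel counts:

```python
def scale_spectrum(counts, r_ref, r):
    """Scale a gamma-ray spectrum measured at distance r_ref to distance r,
    assuming free-field inverse-square falloff (no shielding, no air attenuation)."""
    factor = (r_ref / r) ** 2
    return [c * factor for c in counts]

# Hypothetical 4-channel spectrum measured at 1 m, predicted at 2 m
predicted = scale_spectrum([400.0, 100.0, 40.0, 4.0], r_ref=1.0, r=2.0)
```

    Applying this scaling at each point along the detector's path, then adding background, is the kind of per-step computation a simulator like RSST must perform before emitting output spectra.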

  8. The MIS Pipeline Toolkit

    NASA Astrophysics Data System (ADS)

    Teuben, Peter J.; Pound, M. W.; Storm, S.; Mundy, L. G.; Salter, D. M.; Lee, K.; Kwon, W.; Fernandez Lopez, M.; Plunkett, A.

    2013-01-01

A pipeline toolkit was developed to help organize, reduce, and analyze a large number of near-identical datasets. This is a very general problem, for which many different solutions have been implemented. In this poster we present one such solution that lends itself to users of the Unix command line, builds on the Unix "make" utility, and adapts easily to observational as well as theoretical projects. Two examples are given, one from the CARMA CLASSy survey, and another from a simulated kinematic survey of early galaxy-forming disks. The CLASSy survey (discussed in more detail in three accompanying posters) consists of 5 different star-forming regions, observed with CARMA, each containing roughly 10-20 datasets in continuum and 3 different molecular lines, that need to be combined into final data cubes and maps. The strength of such a pipeline toolkit shows itself as new data are accumulated and the data reduction steps are improved and easily re-applied to previously taken data. For this we employed a master script that was run nightly, and collaborators submitted improved scripts and/or pipeline parameters that control these scripts. MIS is freely available for download.

  9. Multiphysics Application Coupling Toolkit

    SciTech Connect

    Campbell, Michael T.

    2013-12-02

This particular consortium implementation of the software integration infrastructure will, in large part, refactor portions of the Rocstar multiphysics infrastructure. Development of this infrastructure originated at the University of Illinois DOE ASCI Center for Simulation of Advanced Rockets (CSAR) to support the center's massively parallel multiphysics simulation application, Rocstar, and has continued at IllinoisRocstar, a small company formed near the end of the University-based program. IllinoisRocstar is now licensing these new developments as free, open source software, in the hope of improving their own and others' access to infrastructure that can be readily utilized in developing coupled or composite software systems, with particular attention to more rapid production and utilization of multiphysics applications in the HPC environment. There are two major pieces to the consortium implementation: the Application Component Toolkit (ACT) and the Multiphysics Application Coupling Toolkit (MPACT). The current development focus is the ACT, which is (and will be) the substrate for MPACT. The ACT itself is built up from the components described in the technical approach. In particular, the ACT has the following major components: 1. The Component Object Manager (COM): the COM package provides encapsulation of user applications and their data; COM also provides the inter-component function call mechanism. 2. The System Integration Manager (SIM): the SIM package provides constructs and mechanisms for orchestrating composite systems of multiply integrated pieces.

  10. Mission Simulation Toolkit

    NASA Technical Reports Server (NTRS)

    Pisaich, Gregory; Flueckiger, Lorenzo; Neukom, Christian; Wagner, Mike; Buchanan, Eric; Plice, Laura

    2007-01-01

The Mission Simulation Toolkit (MST) is a flexible software system for autonomy research. It was developed as part of the Mission Simulation Facility (MSF) project that was started in 2001 to facilitate the development of autonomous planetary robotic missions. Autonomy is a key enabling factor for robotic exploration. There has been a large gap between autonomy software (at the research level) and software that is ready for insertion into near-term space missions. The MST bridges this gap by providing a simulation framework and a suite of tools for supporting research and maturation of autonomy. MST uses a distributed framework based on the High Level Architecture (HLA) standard. A key feature of the MST framework is the ability to plug in new models to replace existing ones with the same services. This enables significant simulation flexibility, particularly the mixing and control of fidelity level. In addition, the MST provides automatic code generation from robot interfaces defined with the Unified Modeling Language (UML), methods for maintaining synchronization across distributed simulation systems, XML-based robot description, and an environment server. Finally, the MSF supports a number of third-party products including dynamic models and terrain databases. Although the communication objects and some of the simulation components that are provided with this toolkit are specifically designed for terrestrial surface rovers, the MST can be applied to any other domain, such as aerial, aquatic, or space.

  11. NAIF Toolkit - Extended

    NASA Technical Reports Server (NTRS)

    Acton, Charles H., Jr.; Bachman, Nathaniel J.; Semenov, Boris V.; Wright, Edward D.

    2010-01-01

    The Navigation Ancillary Information Facility (NAIF) at JPL, acting under the direction of NASA's Office of Space Science, has built a data system named SPICE (Spacecraft Planet Instrument C-matrix Events) to assist scientists in planning and interpreting scientific observations (see figure). SPICE provides geometric and some other ancillary information needed to recover the full value of science instrument data, including correlation of individual instrument data sets with data from other instruments on the same or other spacecraft. This data system is used to produce space mission observation geometry data sets known as SPICE kernels. It is also used to read SPICE kernels and to compute derived quantities such as positions, orientations, lighting angles, etc. The SPICE toolkit consists of a subroutine/function library, executable programs (both large applications and simple utilities that focus on kernel management), and simple examples of using SPICE toolkit subroutines. This software is very accurate, thoroughly tested, and portable to all computers. It is extremely stable and reusable on all missions. Since the previous version, three significant capabilities have been added: an Interactive Data Language (IDL) interface, a MATLAB interface, and a geometric event finder subsystem.
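
    Conceptually, a kernel is a table of ephemeris data from which geometric quantities are derived on demand. The sketch below is purely illustrative: a toy linear-interpolation lookup, not the SPICE toolkit API (real SPK kernels typically store polynomial coefficients, and the function names here are invented).

```python
import bisect
import math

# Toy "kernel": a sorted table of (epoch_s, (x, y, z)) states in km.
# Illustrative only -- not the SPICE data format or API.
KERNEL = [
    (0.0,   (1000.0, 0.0,   0.0)),
    (60.0,  (1000.0, 120.0, 0.0)),
    (120.0, (1000.0, 240.0, 30.0)),
]

def position(et):
    """Linearly interpolate the tabulated state at ephemeris time et."""
    times = [t for t, _ in KERNEL]
    i = bisect.bisect_right(times, et) - 1
    i = max(0, min(i, len(KERNEL) - 2))
    t0, p0 = KERNEL[i]
    t1, p1 = KERNEL[i + 1]
    f = (et - t0) / (t1 - t0)
    return tuple(a + f * (b - a) for a, b in zip(p0, p1))

def range_to_target(et):
    # A derived geometric quantity, analogous in spirit to what the SPICE
    # toolkit computes from kernels (positions, orientations, angles).
    return math.dist((0.0, 0.0, 0.0), position(et))
```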

  12. Multiphysics Application Coupling Toolkit

    Energy Science and Technology Software Center (ESTSC)

    2013-12-02

    This particular consortium implementation of the software integration infrastructure will, in large part, refactor portions of the Rocstar multiphysics infrastructure. Development of this infrastructure originated at the University of Illinois DOE ASCI Center for Simulation of Advanced Rockets (CSAR) to support the center's massively parallel multiphysics simulation application, Rocstar, and has continued at IllinoisRocstar, a small company formed near the end of the University-based program. IllinoisRocstar is now licensing these new developments as free, open-source software, in the hope of improving its own and others' access to infrastructure that can be readily utilized in developing coupled or composite software systems, with particular attention to more rapid production and utilization of multiphysics applications in the HPC environment. There are two major pieces to the consortium implementation: the Application Component Toolkit (ACT) and the Multiphysics Application Coupling Toolkit (MPACT). The current development focus is the ACT, which is (and will be) the substrate for MPACT. The ACT itself is built up from the components described in the technical approach. In particular, the ACT has the following major components: 1. The Component Object Manager (COM): the COM package provides encapsulation of user applications and their data. COM also provides the inter-component function call mechanism. 2. The System Integration Manager (SIM): the SIM package provides constructs and mechanisms for orchestrating composite systems of multiply integrated pieces.

  13. Einstein Toolkit for Relativistic Astrophysics

    NASA Astrophysics Data System (ADS)

    Collaborative Effort

    2011-02-01

    The Einstein Toolkit is a collection of software components and tools for simulating and analyzing general relativistic astrophysical systems. Such systems include gravitational wave space-times, collisions of compact objects such as black holes or neutron stars, accretion onto compact objects, core collapse supernovae and Gamma-Ray Bursts. The Einstein Toolkit builds on numerous software efforts in the numerical relativity community including CactusEinstein, Whisky, and Carpet. The Einstein Toolkit currently uses the Cactus Framework as the underlying computational infrastructure that provides large-scale parallelization, general computational components, and a model for collaborative, portable code development.

  14. Mesh Quality Improvement Toolkit

    Energy Science and Technology Software Center (ESTSC)

    2002-11-15

    MESQUITE is a linkable software library to be used by simulation and mesh generation tools to improve the quality of meshes. Mesh quality is improved by node movement and/or local topological modifications. Various aspects of mesh quality, such as smoothness, element shape, size, and orientation, are controlled by choosing an appropriate mesh quality metric, an objective function template, and a numerical optimization solver to optimize the quality of meshes. MESQUITE uses the TSTT mesh interface specification to provide an interoperable toolkit that can be used by applications which adopt the standard. A flexible code design makes it easy for meshing researchers to add additional mesh quality metrics, templates, and solvers to develop new quality improvement algorithms by making use of the MESQUITE infrastructure.
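
    Quality improvement by node movement can be illustrated with simple Laplacian smoothing, where each free node moves to the average of its neighbors. This is an illustrative sketch only; MESQUITE's metrics, objective function templates, and solvers are far more general.

```python
# Minimal sketch of mesh quality improvement by node movement
# (Laplacian smoothing). Fixed boundary nodes never move.
def smooth(coords, neighbors, fixed, iters=10):
    """coords: {node: (x, y)}; neighbors: {node: [nodes]}; fixed: set."""
    pts = dict(coords)
    for _ in range(iters):
        new = {}
        for n, p in pts.items():
            if n in fixed:
                new[n] = p
                continue
            nbrs = [pts[m] for m in neighbors[n]]
            new[n] = (sum(q[0] for q in nbrs) / len(nbrs),
                      sum(q[1] for q in nbrs) / len(nbrs))
        pts = new
    return pts

# Interior node 4 starts badly placed inside a unit square of fixed corners;
# smoothing pulls it to the centroid, improving element shape.
coords = {0: (0, 0), 1: (1, 0), 2: (1, 1), 3: (0, 1), 4: (0.9, 0.9)}
neighbors = {4: [0, 1, 2, 3]}
out = smooth(coords, neighbors, fixed={0, 1, 2, 3})
```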

  15. TOOLKIT FOR ADVANCED OPTIMIZATION

    Energy Science and Technology Software Center (ESTSC)

    2000-10-13

    The TAO project focuses on the development of software for large scale optimization problems. TAO uses an object-oriented design to create a flexible toolkit with strong emphasis on the reuse of external tools where appropriate. Our design enables bi-directional connection to lower level linear algebra support (for example, parallel sparse matrix data structures) as well as higher level application frameworks. The Toolkit for Advanced Optimization (TAO) is aimed at the solution of large-scale optimization problems on high-performance architectures. Our main goals are portability, performance, scalable parallelism, and an interface independent of the architecture. TAO is suitable for both single-processor and massively-parallel architectures. The current version of TAO has algorithms for unconstrained and bound-constrained optimization.
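
    As a flavor of the problem class, a bound-constrained minimization can be attacked with a projected-gradient iteration. This sketch is purely illustrative and far simpler than TAO's actual solvers; the function names are invented.

```python
# Projected-gradient sketch for bound-constrained minimization:
# take a gradient step, then clip each coordinate back into [lo, hi].
def minimize_bounded(grad, x, lo, hi, step=0.1, iters=200):
    for _ in range(iters):
        x = [min(hi[i], max(lo[i], x[i] - step * g))
             for i, g in enumerate(grad(x))]
    return x

# Minimize (x-3)^2 + (y+1)^2 subject to 0 <= x <= 2, 0 <= y <= 2.
# The unconstrained minimum (3, -1) lies outside the box, so the
# solution lands on the boundary at (2, 0).
grad = lambda v: [2 * (v[0] - 3), 2 * (v[1] + 1)]
sol = minimize_bounded(grad, [1.0, 1.0], lo=[0, 0], hi=[2, 2])
```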

  16. A Prototype Search Toolkit

    NASA Astrophysics Data System (ADS)

    Knepper, Margaret M.; Fox, Kevin L.; Frieder, Ophir

    Information overload is now a reality. We no longer worry about obtaining a sufficient volume of data; we now are concerned with sifting and understanding the massive volumes of data available to us. To do so, we developed an integrated information processing toolkit that provides the user with a variety of ways to view their information. The views include keyword search results, a domain specific ranking system that allows for adaptively capturing topic vocabularies to customize and focus the search results, navigation pages for browsing, and a geospatial and temporal component to visualize results in time and space, and provide “what if” scenario playing. Integrating the information from different tools and sources gives the user additional information and another way to analyze the data. An example of the integration is illustrated on reports of the avian influenza (bird flu).

  17. ParCAT: Parallel Climate Analysis Toolkit

    SciTech Connect

    Smith, Brian E.; Steed, Chad A.; Shipman, Galen M.; Ricciuto, Daniel M.; Thornton, Peter E.; Wehner, Michael; Williams, Dean N.

    2013-01-01

    Climate science is employing increasingly complex models and simulations to analyze the past and predict the future of Earth's climate. This growth in complexity is creating a widening gap between the data being produced and the ability to analyze the datasets. Parallel computing tools are necessary to analyze, compare, and interpret the simulation data. The Parallel Climate Analysis Toolkit (ParCAT) provides basic tools to efficiently use parallel computing techniques to make analysis of these datasets manageable. The toolkit provides the ability to compute spatio-temporal means, differences between runs or differences between averages of runs, and histograms of the values in a data set. ParCAT is implemented as a command-line utility written in C. This allows for easy integration in other tools and allows for use in scripts. This also makes it possible to run ParCAT on many platforms from laptops to supercomputers. ParCAT outputs NetCDF files so it is compatible with existing utilities such as Panoply and UV-CDAT. This paper describes ParCAT and presents results from some example runs on the Titan system at ORNL.
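
    The map-reduce pattern behind such parallel analysis can be sketched in Python (ParCAT itself is a C command-line tool operating on NetCDF files; the names and data layout below are illustrative): each worker reduces its slab of timesteps, and the partial sums are combined into a spatio-temporal mean.

```python
from concurrent.futures import ThreadPoolExecutor

def slab_sum(slab):
    # slab: list of 2D grids (one per timestep); returns (sum, count)
    total = sum(v for grid in slab for row in grid for v in row)
    count = sum(len(row) for grid in slab for row in grid)
    return total, count

def spatio_temporal_mean(timesteps, workers=4):
    # Partition timesteps round-robin across workers, reduce in parallel,
    # then combine the partial sums into one global mean.
    slabs = [timesteps[i::workers] for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        parts = list(pool.map(slab_sum, slabs))
    total = sum(p[0] for p in parts)
    count = sum(p[1] for p in parts)
    return total / count

# Two timesteps of a 2x2 grid.
data = [[[1.0, 2.0], [3.0, 4.0]], [[5.0, 6.0], [7.0, 8.0]]]
mean = spatio_temporal_mean(data)
```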

  18. Introducing the Ginga FITS Viewer and Toolkit

    NASA Astrophysics Data System (ADS)

    Jeschke, E.; Inagaki, T.; Kackley, R.

    2013-10-01

    We introduce Ginga, a new open-source FITS viewer and toolkit based on Python astronomical packages such as pyfits, numpy, scipy, matplotlib, and pywcs. For developers, we present a set of Python classes for viewing FITS files under the modern Gtk and Qt widget sets and a more full-featured viewer that has a plugin architecture. We further describe how plugins can be written to extend the viewer with many different capabilities. The software may be of interest to software developers who are looking for a solution for integrating FITS visualization into their Python programs and end users interested in a new and different FITS viewer that is not based on Tcl/Tk widget technology. The software has been released under a BSD license.
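
    The plugin architecture mentioned above can be pictured as a registry the viewer consults on events. This is a hedged sketch of the general pattern, not Ginga's actual plugin API; all names are invented.

```python
# Illustrative plugin registry: plugins register under a name and the
# viewer invokes each one's callback when a file is opened.
class Viewer:
    def __init__(self):
        self._plugins = {}

    def register(self, name, plugin):
        self._plugins[name] = plugin

    def open_file(self, path):
        # Notify every registered plugin and collect what each did.
        events = []
        for name, plugin in self._plugins.items():
            events.append((name, plugin.on_open(path)))
        return events

class ZScaleCut:
    """A toy plugin that pretends to apply an automatic intensity cut."""
    def on_open(self, path):
        return f"autocut applied to {path}"

viewer = Viewer()
viewer.register("zscale", ZScaleCut())
result = viewer.open_file("m51.fits")
```

    New capabilities are added by registering more plugin objects, without modifying the viewer itself, which is the extensibility the abstract describes.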

  19. Simplifying operations with an uplink/downlink integration toolkit

    NASA Technical Reports Server (NTRS)

    Murphy, Susan C.; Miller, Kevin J.; Guerrero, Ana Maria; Joe, Chester; Louie, John J.; Aguilera, Christine

    1994-01-01

    The Operations Engineering Lab (OEL) at JPL has developed a simple, generic toolkit to integrate the uplink/downlink processes (often called "closing the loop") in JPL's Multimission Ground Data System. This toolkit provides capabilities for integrating telemetry verification points with predicted spacecraft commands and ground events in the Mission Sequence Of Events (SOE) document. In the JPL ground data system, the uplink processing functions and the downlink processing functions are separate subsystems that are not well integrated because of the nature of planetary missions with large one-way light times for spacecraft-to-ground communication. Our new closed-loop monitoring tool allows an analyst or mission controller to view and save uplink commands and ground events with their corresponding downlinked telemetry values regardless of the delay in downlink telemetry and without requiring real-time intervention by the user. An SOE document is a time-ordered list of all the planned ground and spacecraft events, including all commands, sequence loads, ground events, significant mission activities, spacecraft status, and resource allocations. The SOE document is generated by expansion and integration of spacecraft sequence files, ground station allocations, navigation files, and other ground event files. This SOE generation process has been automated within the OEL and includes a graphical, object-oriented SOE editor and real-time viewing tool running under X/Motif. The SOE toolkit was used as the framework for the integrated implementation. The SOE is used by flight engineers to coordinate their operations tasks, serving as a predict data set in ground operations and mission control. The closed-loop SOE toolkit allows simple, automated integration of predicted uplink events with correlated telemetry points in a single SOE document for on-screen viewing and archiving. It automatically interfaces with existing real-time or non real-time sources of information, to
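
    The closed-loop matching described above, pairing each predicted uplink event with the telemetry point that confirms it after the one-way light time, can be sketched as follows. The record fields (`name`, `t`, `verify`, `point`, `value`) are invented for illustration and are not the SOE format.

```python
# For each predicted command in the SOE, find the first telemetry sample
# of its verification point at or after command time + one-way light time.
# No real-time intervention is needed: matching is purely by timestamp.
def close_the_loop(soe_events, telemetry, owlt_s):
    paired = []
    for cmd in soe_events:
        expected = cmd["t"] + owlt_s
        confirm = next((tm for tm in telemetry
                        if tm["point"] == cmd["verify"] and tm["t"] >= expected),
                       None)
        paired.append((cmd["name"], confirm["value"] if confirm else None))
    return paired

soe = [{"name": "HTR_ON", "t": 100.0, "verify": "HTR_STATE"}]
tlm = [{"point": "HTR_STATE", "t": 99.0, "value": "OFF"},     # stale, pre-command
       {"point": "HTR_STATE", "t": 2500.0, "value": "ON"}]    # post light-time
pairs = close_the_loop(soe, tlm, owlt_s=2400.0)
```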

  20. ParCAT: A Parallel Climate Analysis Toolkit

    NASA Astrophysics Data System (ADS)

    Haugen, B.; Smith, B.; Steed, C.; Ricciuto, D. M.; Thornton, P. E.; Shipman, G.

    2012-12-01

    Climate science has employed increasingly complex models and simulations to analyze the past and predict the future of our climate. The size and dimensionality of climate simulation data has been growing with the complexity of the models. This growth in data is creating a widening gap between the data being produced and the tools necessary to analyze large, high dimensional data sets. With single run data sets increasing into 10's, 100's and even 1000's of gigabytes, parallel computing tools are becoming a necessity in order to analyze and compare climate simulation data. The Parallel Climate Analysis Toolkit (ParCAT) provides basic tools that efficiently use parallel computing techniques to narrow the gap between data set size and analysis tools. ParCAT was created as a collaborative effort between climate scientists and computer scientists in order to provide efficient parallel implementations of the computing tools that are of use to climate scientists. Some of the basic functionalities included in the toolkit are the ability to compute spatio-temporal means and variances, differences between two runs and histograms of the values in a data set. ParCAT is designed to facilitate the "heavy lifting" that is required for large, multidimensional data sets. The toolkit does not focus on performing the final visualizations and presentation of results but rather, reducing large data sets to smaller, more manageable summaries. The output from ParCAT is provided in commonly used file formats (NetCDF, CSV, ASCII) to allow for simple integration with other tools. The toolkit is currently implemented as a command line utility, but will likely also provide a C library for developers interested in tighter software integration. Elements of the toolkit are already being incorporated into projects such as UV-CDAT and CMDX. There is also an effort underway to implement portions of the CCSM Land Model Diagnostics package using ParCAT in conjunction with Python and gnuplot. Par

  1. A Scalable Analysis Toolkit

    NASA Technical Reports Server (NTRS)

    Aiken, Alexander

    2001-01-01

    The Scalable Analysis Toolkit (SAT) project aimed to demonstrate that it is feasible and useful to statically detect software bugs in very large systems. The technical focus of the project was on a relatively new class of constraint-based techniques for software analysis, where the desired facts about programs (e.g., the presence of a particular bug) are phrased as constraint problems to be solved. At the beginning of this project, the most successful forms of formal software analysis were limited forms of automatic theorem proving (as exemplified by the analyses used in language type systems and optimizing compilers), semi-automatic theorem proving for full verification, and model checking. With a few notable exceptions these approaches had not been demonstrated to scale to software systems of even 50,000 lines of code. Realistic approaches to large-scale software analysis cannot hope to make every conceivable formal method scale. Thus, the SAT approach is to mix different methods in one application by using coarse and fast but still adequate methods at the largest scales, and reserving the use of more precise but also more expensive methods at smaller scales for critical aspects (that is, aspects critical to the analysis problem under consideration) of a software system. The principled method proposed for combining a heterogeneous collection of formal systems with different scalability characteristics is mixed constraints. This idea had been used previously in small-scale applications with encouraging results: using mostly coarse methods and narrowly targeted precise methods, useful information (meaning the discovery of bugs in real programs) was obtained with excellent scalability.

  2. Pizza.py Toolkit

    SciTech Connect

    Plimpton, Steve; Jones, Matt; Crozier, Paul

    2006-01-01

    Pizza.py is a loosely integrated collection of tools, many of which provide support for the LAMMPS molecular dynamics and ChemCell cell modeling packages. There are tools to create input files, convert between file formats, process log and dump files, create plots, and visualize and animate simulation snapshots. Software packages that are wrapped by Pizza.py, so they can be invoked from within Python, include GnuPlot, MatLab, Raster3d, and RasMol. Pizza.py is written in Python and runs on any platform that supports Python. Pizza.py enhances the standard Python interpreter in a few simple ways. Its tools are Python modules which can be invoked interactively, from scripts, or from GUIs when appropriate. Some of the tools require additional Python packages to be installed as part of the user's Python. Others are wrappers on software packages (as listed above) which must be available on the user's system. It is easy to modify or extend Pizza.py with new functionality or new tools, which need not have anything to do with LAMMPS or ChemCell.

  3. Pizza.py Toolkit

    Energy Science and Technology Software Center (ESTSC)

    2006-01-01

    Pizza.py is a loosely integrated collection of tools, many of which provide support for the LAMMPS molecular dynamics and ChemCell cell modeling packages. There are tools to create input files, convert between file formats, process log and dump files, create plots, and visualize and animate simulation snapshots. Software packages that are wrapped by Pizza.py, so they can be invoked from within Python, include GnuPlot, MatLab, Raster3d, and RasMol. Pizza.py is written in Python and runs on any platform that supports Python. Pizza.py enhances the standard Python interpreter in a few simple ways. Its tools are Python modules which can be invoked interactively, from scripts, or from GUIs when appropriate. Some of the tools require additional Python packages to be installed as part of the user's Python. Others are wrappers on software packages (as listed above) which must be available on the user's system. It is easy to modify or extend Pizza.py with new functionality or new tools, which need not have anything to do with LAMMPS or ChemCell.

  4. toolkit computational mesh conceptual model.

    SciTech Connect

    Baur, David G.; Edwards, Harold Carter; Cochran, William K.; Williams, Alan B.; Sjaardema, Gregory D.

    2010-03-01

    The Sierra Toolkit computational mesh is a software library intended to support massively parallel multi-physics computations on dynamically changing unstructured meshes. This domain of intended use is inherently complex due to distributed memory parallelism, parallel scalability, heterogeneity of physics, heterogeneous discretization of an unstructured mesh, and runtime adaptation of the mesh. Management of this inherent complexity begins with a conceptual analysis and modeling of this domain of intended use; i.e., development of a domain model. The Sierra Toolkit computational mesh software library is designed and implemented based upon this domain model. Software developers using, maintaining, or extending the Sierra Toolkit computational mesh library must be familiar with the concepts/domain model presented in this report.

  5. ProtoMD: A prototyping toolkit for multiscale molecular dynamics

    NASA Astrophysics Data System (ADS)

    Somogyi, Endre; Mansour, Andrew Abi; Ortoleva, Peter J.

    2016-05-01

    ProtoMD is a toolkit that facilitates the development of algorithms for multiscale molecular dynamics (MD) simulations. It is designed for multiscale methods which capture the dynamic transfer of information across multiple spatial scales, such as the atomic to the mesoscopic scale, via coevolving microscopic and coarse-grained (CG) variables. ProtoMD can also be used to calibrate parameters needed in traditional CG-MD methods. The toolkit integrates 'GROMACS wrapper' to initiate MD simulations, and 'MDAnalysis' to analyze and manipulate trajectory files. It facilitates experimentation with a spectrum of coarse-grained variables, prototyping rare events (such as chemical reactions), or simulating nanocharacterization experiments such as terahertz spectroscopy, AFM, nanopore, and time-of-flight mass spectroscopy. ProtoMD is written in Python and is freely available under the GNU General Public License from

  6. ProtoMD: A prototyping toolkit for multiscale molecular dynamics

    NASA Astrophysics Data System (ADS)

    Somogyi, Endre; Mansour, Andrew Abi; Ortoleva, Peter J.

    2016-05-01

    ProtoMD is a toolkit that facilitates the development of algorithms for multiscale molecular dynamics (MD) simulations. It is designed for multiscale methods which capture the dynamic transfer of information across multiple spatial scales, such as the atomic to the mesoscopic scale, via coevolving microscopic and coarse-grained (CG) variables. ProtoMD can also be used to calibrate parameters needed in traditional CG-MD methods. The toolkit integrates 'GROMACS wrapper' to initiate MD simulations, and 'MDAnalysis' to analyze and manipulate trajectory files. It facilitates experimentation with a spectrum of coarse-grained variables, prototyping rare events (such as chemical reactions), or simulating nanocharacterization experiments such as terahertz spectroscopy, AFM, nanopore, and time-of-flight mass spectroscopy. ProtoMD is written in Python and is freely available under the GNU General Public License from github.com/CTCNano/proto_md.
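
    The microscopic-to-coarse-grained mapping at the heart of such multiscale methods can be illustrated by collapsing atom groups into center-of-mass beads. This is an illustrative sketch of the concept, not the ProtoMD API; function and variable names are invented.

```python
# Map groups of atoms to coarse-grained beads located at each group's
# center of mass: a simple example of a microscopic -> CG variable.
def cg_beads(positions, masses, groups):
    beads = []
    for group in groups:
        m = sum(masses[i] for i in group)
        beads.append(tuple(
            sum(masses[i] * positions[i][k] for i in group) / m
            for k in range(3)))
    return beads

# Three atoms mapped onto two CG beads.
pos = [(0.0, 0.0, 0.0), (2.0, 0.0, 0.0), (0.0, 4.0, 0.0)]
mass = [1.0, 1.0, 2.0]
beads = cg_beads(pos, mass, groups=[[0, 1], [2]])
```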

  7. MCS Systems Administration Toolkit

    Energy Science and Technology Software Center (ESTSC)

    2001-09-30

    This package contains a number of systems administration utilities to assist a team of system administrators in managing a computer environment by automating routine tasks and centralizing information. Included are utilities to help install software on a network of computers and programs to make an image of a disk drive, to manage and distribute configuration files for a number of systems, and to run self-tests on systems, as well as an example of using a database to manage host information and various utilities.

  8. Web-based Toolkit for Dynamic Generation of Data Processors

    NASA Astrophysics Data System (ADS)

    Patel, J.; Dascalu, S.; Harris, F. C.; Benedict, K. K.; Gollberg, G.; Sheneman, L.

    2011-12-01

    All computation-intensive scientific research uses structured datasets, including hydrology and all other types of climate-related research. When it comes to testing their hypotheses, researchers might use the same dataset differently, and modify, transform, or convert it to meet their research needs. Currently, many researchers spend a good amount of time performing data processing and building tools to speed up this process. They might routinely repeat the same process activities for new research projects, spending precious time that otherwise could be dedicated to analyzing and interpreting the data. Numerous tools are available to run tests on prepared datasets and many of them work with datasets in different formats. However, there is still a significant need for applications that can comprehensively handle data transformation and conversion activities and help prepare the various processed datasets required by the researchers. We propose a web-based application (a software toolkit) that dynamically generates data processors capable of performing data conversions, transformations, and customizations based on user-defined mappings and selections. As a first step, the proposed solution allows the users to define various data structures and, in the next step, to select various file formats and data conversions for their datasets of interest. In a simple scenario, the core of the proposed web-based toolkit allows the users to define direct mappings between input and output data structures. The toolkit will also support defining complex mappings involving the use of pre-defined sets of mathematical, statistical, date/time, and text manipulation functions. Furthermore, the users will be allowed to define logical cases for input data filtering and sampling. At the end of the process, the toolkit is designed to generate reusable source code and executable binary files for download and use by the scientists. The application is also designed to store all data

  9. STAR: Software Toolkit for Analysis Research

    SciTech Connect

    Doak, J.E.; Prommel, J.M.; Whiteson, R.; Hoffbauer, B.L.; Thomas, T.R.; Helman, P.

    1993-08-01

    Analyzing vast quantities of data from diverse information sources is an increasingly important element for nonproliferation and arms control analysis. Much of the work in this area has used human analysts to assimilate, integrate, and interpret complex information gathered from various sources. With the advent of fast computers, we now have the capability to automate this process thereby shifting this burden away from humans. In addition, there now exist huge data storage capabilities which have made it possible to formulate large integrated databases comprising many terabytes of information spanning a variety of subjects. We are currently designing a Software Toolkit for Analysis Research (STAR) to address these issues. The goal of STAR is to produce a research tool that facilitates the development and interchange of algorithms for locating phenomena of interest to nonproliferation and arms control experts. One major component deals with the preparation of information. The ability to manage and effectively transform raw data into a meaningful form is a prerequisite for analysis by any methodology. The relevant information to be analyzed can be either unstructured text, structured data, signals, or images. Text can be numerical and/or character, stored in raw data files, databases, streams of bytes, or compressed into bits in formats ranging from fixed, to character-delimited, to a count followed by content. The data can be analyzed in real-time or batch mode. Once the data are preprocessed, different analysis techniques can be applied. Some are built using expert knowledge. Others are trained using data collected over a period of time. Currently, we are considering three classes of analyzers for use in our software toolkit: (1) traditional machine learning techniques, (2) purely statistical systems, and (3) expert systems.

  10. A Toolkit for Teacher Engagement

    ERIC Educational Resources Information Center

    Grantmakers for Education, 2014

    2014-01-01

    Teachers are critical to the success of education grantmaking strategies, yet in talking with them we discovered that the world of philanthropy is often a mystery. GFE's Toolkit for Teacher Engagement aims to assist funders in authentically and effectively involving teachers in the education reform and innovation process. Built directly from the…

  11. Build an Assistive Technology Toolkit

    ERIC Educational Resources Information Center

    Ahrens, Kelly

    2011-01-01

    Assistive technology (AT) by its very nature consists of a variety of personal and customized tools for multiple learning styles and physical challenges. The author not only encourages students, parents, and educators to advocate for AT services, she also wants them to go a step further and build their own AT toolkits that can instill independence…

  12. pypet: A Python Toolkit for Data Management of Parameter Explorations

    PubMed Central

    Meyer, Robert; Obermayer, Klaus

    2016-01-01

    pypet (Python parameter exploration toolkit) is a new multi-platform Python toolkit for managing numerical simulations. Sampling the space of model parameters is a key aspect of simulations and numerical experiments. pypet is designed to allow easy and arbitrary sampling of trajectories through a parameter space beyond simple grid searches. pypet collects and stores both simulation parameters and results in a single HDF5 file. This collective storage allows fast and convenient loading of data for further analyses. pypet provides various additional features such as multiprocessing and parallelization of simulations, dynamic loading of data, integration of git version control, and supervision of experiments via the electronic lab notebook Sumatra. pypet supports a rich set of data formats, including native Python types, Numpy and Scipy data, Pandas DataFrames, and BRIAN(2) quantities. Besides these formats, users can easily extend the toolkit to allow customized data types. pypet is a flexible tool suited for both short Python scripts and large scale projects. pypet's various features, especially the tight link between parameters and results, promote reproducible research in computational neuroscience and simulation-based disciplines. PMID:27610080
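
    pypet's core idea, keeping every result tied to the exact parameter combination that produced it in a single store, can be illustrated with a stdlib-only sketch. This is not the pypet API (which uses HDF5-backed trajectories and richer exploration than the plain grid shown here); all names below are invented.

```python
from itertools import product

# Explore a parameter space and keep parameters and results together in
# one store, so every result stays linked to the settings behind it.
def explore(run, space):
    keys = sorted(space)
    store = []
    for values in product(*(space[k] for k in keys)):
        params = dict(zip(keys, values))
        store.append({"params": params, "result": run(params)})
    return store

store = explore(lambda p: p["x"] * p["y"], {"x": [1, 2], "y": [10, 20]})
```

    Because parameters and results live in the same record, any result can be traced back to its settings, the reproducibility property the abstract emphasizes.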

  13. pypet: A Python Toolkit for Data Management of Parameter Explorations.

    PubMed

    Meyer, Robert; Obermayer, Klaus

    2016-01-01

    pypet (Python parameter exploration toolkit) is a new multi-platform Python toolkit for managing numerical simulations. Sampling the space of model parameters is a key aspect of simulations and numerical experiments. pypet is designed to allow easy and arbitrary sampling of trajectories through a parameter space beyond simple grid searches. pypet collects and stores both simulation parameters and results in a single HDF5 file. This collective storage allows fast and convenient loading of data for further analyses. pypet provides various additional features such as multiprocessing and parallelization of simulations, dynamic loading of data, integration of git version control, and supervision of experiments via the electronic lab notebook Sumatra. pypet supports a rich set of data formats, including native Python types, Numpy and Scipy data, Pandas DataFrames, and BRIAN(2) quantities. Besides these formats, users can easily extend the toolkit to allow customized data types. pypet is a flexible tool suited for both short Python scripts and large scale projects. pypet's various features, especially the tight link between parameters and results, promote reproducible research in computational neuroscience and simulation-based disciplines. PMID:27610080

  14. Parallel Power Grid Simulation Toolkit

    SciTech Connect

    Smith, Steve; Kelley, Brian; Banks, Lawrence; Top, Philip; Woodward, Carol

    2015-09-14

    ParGrid is a 'wrapper' that integrates a coupled Power Grid Simulation toolkit consisting of a library to manage the synchronization and communication of independent simulations. The included library code in ParGrid, named FSKIT, is intended to support the coupling of multiple continuous and discrete-event parallel simulations. The code is designed using modern object-oriented C++ methods utilizing C++11 and current Boost libraries to ensure compatibility with multiple operating systems and environments.
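
    The synchronization such a library manages can be pictured as a lockstep loop in which coupled simulators exchange state at coordinated time steps. The two toy simulators below are invented for illustration and sketched in Python rather than the toolkit's C++; nothing here is the FSKIT interface.

```python
# Two independent simulators coupled through exchanged boundary values.
class GeneratorSim:
    def __init__(self):
        self.output = 1.0
    def advance(self, t, load):
        self.output = 1.0 + 0.1 * load   # respond to the coupled load

class LoadSim:
    def __init__(self):
        self.load = 2.0
    def advance(self, t, supply):
        self.load = 2.0 * supply         # respond to the coupled supply

def cosimulate(steps, dt=1.0):
    gen, load = GeneratorSim(), LoadSim()
    for k in range(steps):
        t = k * dt
        # Synchronization point: snapshot both states, then advance both,
        # so each simulator sees the other's state from the same step.
        g_out, l_out = gen.output, load.load
        gen.advance(t, l_out)
        load.advance(t, g_out)
    return gen.output, load.load

g, l = cosimulate(50)   # the coupled pair converges to a fixed point
```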

  15. User's guide for SDDS toolkit Version 1.4

    SciTech Connect

    Borland, M.

    1995-07-06

    The Self Describing Data Sets (SDDS) file protocol is the basis for a powerful and expanding toolkit of over 40 generic programs. These programs are used for simulation postprocessing, graphics, data preparation, program interfacing, and experimental data analysis. This document describes Version 1.4 of the SDDS command-line toolkit. Those wishing to write programs using SDDS should consult the Application Programmer's Guide for SDDS Version 1.4. The first section of the present document is shared with this reference. This document does not describe SDDS-compliant EPICS applications, of which there are presently 25.
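
    The "self-describing" idea is that a file's header declares its own column names and types, so generic tools can parse it without external metadata. The toy format below illustrates the concept only; it is not the actual SDDS protocol, and the header syntax is invented.

```python
import io

# Toy self-describing table: the first line declares column names and
# types, so the reader needs no out-of-band schema.
def write_sds(f, columns, rows):
    f.write("&columns " + ",".join(f"{n}:{t}" for n, t in columns) + "\n")
    for row in rows:
        f.write("\t".join(str(v) for v in row) + "\n")

def read_sds(f):
    header = f.readline().strip()[len("&columns "):]
    casts = {"double": float, "string": str}
    cols = [c.split(":") for c in header.split(",")]
    rows = [[casts[t](v) for (n, t), v in zip(cols, line.split("\t"))]
            for line in f.read().splitlines()]
    return [n for n, _ in cols], rows

buf = io.StringIO()
write_sds(buf, [("s", "double"), ("name", "string")], [[1.5, "quad"]])
buf.seek(0)
names, rows = read_sds(buf)
```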

  16. The Connectome Viewer Toolkit: An Open Source Framework to Manage, Analyze, and Visualize Connectomes

    PubMed Central

    Gerhard, Stephan; Daducci, Alessandro; Lemkaddem, Alia; Meuli, Reto; Thiran, Jean-Philippe; Hagmann, Patric

    2011-01-01

    Advanced neuroinformatics tools are required for methods of connectome mapping, analysis, and visualization. The inherent multi-modality of connectome datasets poses new challenges for data organization, integration, and sharing. We have designed and implemented the Connectome Viewer Toolkit – a set of free and extensible open source neuroimaging tools written in Python. The key components of the toolkit are as follows: (1) The Connectome File Format is an XML-based container format to standardize multi-modal data integration and structured metadata annotation. (2) The Connectome File Format Library enables management and sharing of connectome files. (3) The Connectome Viewer is an integrated research and development environment for visualization and analysis of multi-modal connectome data. The Connectome Viewer's plugin architecture supports extensions with network analysis packages and an interactive scripting shell, to enable easy development and community contributions. Integration with tools from the scientific Python community allows the leveraging of numerous existing libraries for powerful connectome data mining, exploration, and comparison. We demonstrate the applicability of the Connectome Viewer Toolkit using Diffusion MRI datasets processed by the Connectome Mapper. The Connectome Viewer Toolkit is available from http://www.cmtk.org/ PMID:21713110
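
    The XML-based container idea can be illustrated with Python's standard library: a manifest describes each multi-modal data item in the bundle with structured metadata. The element and attribute names below are invented for illustration and are not the Connectome File Format schema.

```python
import xml.etree.ElementTree as ET

# Build a toy XML manifest listing the data items bundled in a container.
def build_manifest(items):
    root = ET.Element("connectome")
    for name, kind, src in items:
        ET.SubElement(root, "item", name=name, kind=kind, src=src)
    return ET.tostring(root, encoding="unicode")

manifest = build_manifest([
    ("network1", "connectivity-network", "networks/net1.graphml"),
    ("t1", "volume", "volumes/t1.nii.gz"),
])
parsed = ET.fromstring(manifest)  # any XML-aware tool can read it back
```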

  17. Design Optimization Toolkit: Users' Manual

    SciTech Connect

    Aguilo Valentin, Miguel Alejandro

    2014-07-01

    The Design Optimization Toolkit (DOTk) is a stand-alone C++ software package intended to solve complex design optimization problems. The DOTk software package provides a range of solution methods suited for gradient- and nongradient-based optimization, large-scale constrained optimization, and topology optimization. DOTk was designed with a flexible user interface to allow easy access to DOTk solution methods from external engineering software packages. This inherent flexibility makes DOTk minimally intrusive to other engineering software packages. As part of this flexibility, the DOTk software package provides an easy-to-use MATLAB interface that enables users to call DOTk solution methods directly from the MATLAB command window.
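
    To make "gradient-based optimization" concrete, here is the simplest instance of the family of methods such a toolkit wraps: plain steepest descent on a smooth objective. This is a generic sketch, not DOTk's actual algorithms or API:

```python
# Steepest descent: repeatedly step against the gradient until it vanishes.
def steepest_descent(grad, x0, step=0.1, tol=1e-8, max_iter=1000):
    x = list(x0)
    for _ in range(max_iter):
        g = grad(x)
        if sum(gi * gi for gi in g) < tol:   # converged: gradient near zero
            break
        x = [xi - step * gi for xi, gi in zip(x, g)]
    return x

# Minimize f(x, y) = (x - 3)^2 + (y + 1)^2; its gradient is (2(x-3), 2(y+1)).
xmin = steepest_descent(lambda x: [2 * (x[0] - 3), 2 * (x[1] + 1)], [0.0, 0.0])
```

    Production toolkits layer line searches, constraints, and preconditioning on top of this basic iteration, but the calling pattern (objective and gradient in, minimizer out) is the same interface a MATLAB or C++ front end exposes.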

  18. Python-ARM Radar Toolkit

    SciTech Connect

    Jonathan Helmus, Scott Collis

    2013-03-17

    The Python-ARM Radar Toolkit (Py-ART) is a collection of radar quality-control and retrieval codes which all work on two unifying Python objects: the PyRadar and PyGrid objects. By building ingests for several popular radar formats and then abstracting the interface, Py-ART greatly simplifies data processing compared with several other available utilities. In addition, Py-ART uses NumPy arrays as its primary storage mechanism, enabling use of existing and extensive community software tools.
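
    The "unifying object" pattern described above can be sketched in a few lines: per-format ingest functions all return one common structure, so downstream code never sees the format. The `Radar` class and reader names here are illustrative only, not the real Py-ART API:

```python
# Every ingest normalizes its format into the same Radar object.
class Radar:
    def __init__(self, fields, metadata):
        self.fields = fields        # field name -> data values
        self.metadata = metadata    # provenance and instrument info

def read_format_a(blob):
    return Radar({"reflectivity": blob["Z"]}, {"source": "format-a"})

def read_format_b(blob):
    return Radar({"reflectivity": blob["dbz"]}, {"source": "format-b"})

READERS = {"a": read_format_a, "b": read_format_b}

def read_any(fmt, blob):
    """Dispatch on format; callers only ever handle Radar objects."""
    return READERS[fmt](blob)

radar = read_any("b", {"dbz": [5.0, 12.0]})
```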

  19. The REACH Youth Program Learning Toolkit

    ERIC Educational Resources Information Center

    Sierra Health Foundation, 2011

    2011-01-01

    Believing in the value of using video documentaries and data as learning tools, members of the REACH technical assistance team collaborated to develop this toolkit. The learning toolkit was designed using and/or incorporating components of the "Engaging Youth in Community Change: Outcomes and Lessons Learned from Sierra Health Foundation's REACH…

  20. An Introduction to the Einstein Toolkit

    NASA Astrophysics Data System (ADS)

    Zilhão, Miguel; Löffler, Frank

    2013-09-01

    We give an introduction to the Einstein Toolkit, a mature, open-source computational infrastructure for numerical relativity based on the Cactus Framework, aimed at new users. The toolkit is composed of several different modules, is developed by researchers from different institutions throughout the world, and is under active, continuous development. Documentation for the toolkit and its several modules is often scattered across different locations, a difficulty new users may struggle with. Scientific papers exist that describe the toolkit and its methods in detail, but they can be overwhelming at first. With these lecture notes we hope to provide an initial overview for new users. We cover how to obtain, compile, and run the toolkit, and give an overview of some of the tools and modules provided with it.

  1. The SCRAM tool-kit

    NASA Technical Reports Server (NTRS)

    Tamir, David; Flanigan, Lee A.; Weeks, Jack L.; Siewert, Thomas A.; Kimbrough, Andrew G.; Mcclure, Sidney R.

    1994-01-01

    This paper proposes a new series of on-orbit capabilities to support the near-term Hubble Space Telescope, Extended Duration Orbiter, Long Duration Orbiter, Space Station Freedom, other orbital platforms, and even future manned Lunar/Mars missions. These proposed capabilities form a toolkit termed Space Construction, Repair, and Maintenance (SCRAM). SCRAM addresses both Intra-Vehicular Activity (IVA) and Extra-Vehicular Activity (EVA) needs. SCRAM provides a variety of tools which enable welding, brazing, cutting, coating, heating, and cleaning, as well as corresponding nondestructive examination. Near-term IVA-SCRAM applications include repair and modification of fluid lines, structure, and laboratory equipment inside a shirt-sleeve environment (i.e., inside Spacelab or Space Station). Near-term EVA-SCRAM applications include construction of fluid lines and structural members, repair of punctures by orbital debris, and refurbishment of surfaces eroded by contaminants. The SCRAM tool-kit also promises future EVA applications involving mass-production tasks automated by robotics and artificial intelligence, for construction of large truss, aerobrake, and nuclear reactor shadow shield structures. The leading candidate tool processes for SCRAM, currently undergoing research and development, include Electron Beam, Gas Tungsten Arc, Plasma Arc, and Laser Beam. A series of strategic space flight experiments would make SCRAM available to help conquer the space frontier.

  2. ADMIT: ALMA Data Mining Toolkit

    NASA Astrophysics Data System (ADS)

    Friedel, Douglas N.; Xu, Lisa; Looney, Leslie; Teuben, Peter J.; Pound, Marc W.; Rauch, Kevin P.; Mundy, Lee G.; Kern, Jeffrey S.

    2015-01-01

    ADMIT (ALMA Data Mining Toolkit) is a toolkit for the creation and analysis of new science products from ALMA data. ADMIT is an ALMA Development Project written purely in Python. While specifically targeted for ALMA science and production use after the ALMA pipeline, it is designed to be generally applicable to radio-astronomical data. ADMIT quickly provides users with a detailed overview of their science products: line identifications, line 'cutout' cubes, moment maps, emission type analysis (e.g., feature detection), etc. Users can download the small ADMIT pipeline product (< 20MB), analyze the results, then fine-tune and re-run the ADMIT pipeline (or any part thereof) on their own machines and interactively inspect the results. ADMIT will have both a GUI and command line interface available for this purpose. By analyzing multiple data cubes simultaneously, data mining between many astronomical sources and line transitions will be possible. Users will also be able to enhance the capabilities of ADMIT by creating customized ADMIT tasks satisfying any special processing needs. Future implementations of ADMIT may include EVLA and other instruments.

  3. Admit: Alma Data Mining Toolkit

    NASA Astrophysics Data System (ADS)

    Friedel, Douglas; Looney, Leslie; Xu, Lisa; Pound, Marc W.; Teuben, Peter J.; Rauch, Kevin P.; Mundy, Lee; Kern, Jeffrey S.

    2015-06-01

    ADMIT (ALMA Data Mining Toolkit) is a toolkit for the creation and analysis of new science products from ALMA data. ADMIT is an ALMA Development Project written purely in Python. While specifically targeted for ALMA science and production use after the ALMA pipeline, it is designed to be generally applicable to radio-astronomical data. ADMIT quickly provides users with a detailed overview of their science products: line identifications, line 'cutout' cubes, moment maps, emission type analysis (e.g., feature detection), etc. Users can download the small ADMIT pipeline product (<20MB), analyze the results, then fine-tune and re-run the ADMIT pipeline (or any part thereof) on their own machines and interactively inspect the results. ADMIT will have both a GUI and command line interface available for this purpose. By analyzing multiple data cubes simultaneously, data mining between many astronomical sources and line transitions will be possible. Users will also be able to enhance the capabilities of ADMIT by creating customized ADMIT tasks satisfying any special processing needs. Future implementations of ADMIT may include EVLA and other instruments.

  4. The NetLogger Toolkit V2.0

    SciTech Connect

    Gunter, Dan; Lee, Jason; Stoufer, Martin; Tierney, Brian

    2003-03-28

    The NetLogger Toolkit is designed to monitor, under actual operating conditions, the behavior of all the elements of the application-to-application communication path in order to determine exactly where time is spent within a complex system. Using NetLogger, distributed application components are modified to produce timestamped logs of "interesting" events at all the critical points of the distributed system. Events from each component are correlated, which allows one to characterize the performance of all aspects of the system and network in detail. The NetLogger Toolkit itself consists of four components: an API and library of functions to simplify the generation of application-level event logs, a set of tools for collecting and sorting log files, an event archive system, and a tool for visualization and analysis of the log files. In order to instrument an application to produce event logs, the application developer inserts calls to the NetLogger API at all the critical points in the code, then links the application with the NetLogger library. All the tools in the NetLogger Toolkit share a common log format and assume the existence of accurate and synchronized system clocks. NetLogger messages can be logged using an easy-to-read text format based on the IETF-proposed ULM format, or a binary format that can still be used through the same API but that is several times faster and smaller, with performance comparable to or better than binary message formats such as MPI, XDR, SDDF-Binary, and PBIO. The NetLogger binary format is both highly efficient and self-describing, and is thus optimized for the dynamic message construction and parsing of application instrumentation. NetLogger includes an "activation" API that allows NetLogger logging to be turned on, off, or modified by changing an external file. This is useful for activating logging in daemons/services (e.g., the GridFTP server). The NetLogger reliability API provides the ability to specify backup logging locations and
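
    The instrumentation pattern described above, timestamped records of named events at critical points that a later tool can correlate, can be sketched in a few lines. The `log_event` helper and field names are illustrative, not NetLogger's actual API or wire format:

```python
# Minimal event-log instrumentation: each call records what happened and when.
import time

events = []

def log_event(name, **fields):
    rec = {"ts": time.time(), "event": name}
    rec.update(fields)          # arbitrary context, e.g. host, byte counts
    events.append(rec)
    return rec

log_event("transfer.start", host="nodeA", nbytes=1024)
log_event("transfer.end", host="nodeA")

# Because records carry timestamps, elapsed time per operation falls out
# of correlating matching start/end events.
elapsed = events[-1]["ts"] - events[0]["ts"]
```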

  5. The DLESE Evaluation Toolkit Project

    NASA Astrophysics Data System (ADS)

    Buhr, S. M.; Barker, L. J.; Marlino, M.

    2002-12-01

    The Evaluation Toolkit and Community project is a new Digital Library for Earth System Education (DLESE) collection designed to raise awareness of project evaluation within the geoscience education community, and to enable principal investigators, teachers, and evaluators to implement project evaluation more readily. This new resource is grounded in the needs of geoscience educators, and will provide a virtual home for a geoscience education evaluation community. The goals of the project are to 1) provide a robust collection of evaluation resources useful for Earth systems educators, 2) establish a forum and community for evaluation dialogue within DLESE, and 3) disseminate the resources through the DLESE infrastructure and through professional society workshops and proceedings. Collaboration and expertise in education, geoscience and evaluation are necessary if we are to conduct the best possible geoscience education. The Toolkit allows users to engage in evaluation at whichever level best suits their needs, get more evaluation professional development if desired, and access the expertise of other segments of the community. To date, a test web site has been built and populated, initial community feedback from the DLESE and broader community is being garnered, and we have begun to heighten awareness of geoscience education evaluation within our community. The web site contains features that allow users to access professional development about evaluation, search and find evaluation resources, submit resources, find or offer evaluation services, sign up for upcoming workshops, take the user survey, and submit calendar items. The evaluation resource matrix currently contains resources that have met our initial review. The resources are currently organized by type; they will become searchable on multiple dimensions of project type, audience, objectives and evaluation resource type as efforts to develop a collection-specific search engine mature. The peer review

  6. Electronic toolkit for nursing education.

    PubMed

    Trangenstein, Patricia A

    2008-12-01

    In an increasingly hectic and mobile society, Web-based instructional tools can enhance and supplement student learning, improve communication and collaboration among participants, give rapid feedback on one's progress, and address diverse ways of learning. Web-based formats offer distinct advantages by allowing learners to view course materials when they choose, from any Internet connection, and as often as they want. The challenge for nurse educators is to assimilate the knowledge and expertise to understand and appropriately use these tools. A variety of Web-based instructional tools are described in this article. As nurse educators increase their awareness of these potential adjuncts, they can select appropriate applications that are supported by their institution to construct their own "toolkit." PMID:18940410

  7. Flightspeed Integral Image Analysis Toolkit

    NASA Technical Reports Server (NTRS)

    Thompson, David R.

    2009-01-01

    The Flightspeed Integral Image Analysis Toolkit (FIIAT) is a C library that provides image analysis functions in a single, portable package. It provides basic low-level filtering, texture analysis, and subwindow descriptors for applications dealing with image interpretation and object recognition. Designed with spaceflight in mind, it addresses ease of integration (minimal external dependencies); fast, real-time operation using integer arithmetic where possible (useful for platforms lacking a dedicated floating-point processor); implementation entirely in C (easily modified); mostly static memory allocation; and 8-bit image data. The basic goal of the FIIAT library is to compute meaningful numerical descriptors for images or rectangular image regions. These n-vectors can then be used directly for novelty detection or pattern recognition, or as a feature space for higher-level pattern recognition tasks. The library provides routines for leveraging training data to derive descriptors that are most useful for a specific data set. Its runtime algorithms exploit a structure known as the "integral image." This is a caching method that permits fast summation of values within rectangular regions of an image. This integral frame facilitates a wide range of fast image-processing functions. The toolkit has applicability to a wide range of autonomous image analysis tasks in the space-flight domain, including novelty detection, object and scene classification, target detection for autonomous instrument placement, and science analysis of geomorphology. It makes real-time texture and pattern recognition possible for platforms with severe computational restraints. The software provides an order-of-magnitude speed increase over alternative software libraries currently in use by the research community. FIIAT can commercially support intelligent video cameras used in intelligent surveillance. It is also useful for object recognition by robots or other autonomous vehicles.
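
    The "integral image" (summed-area table) named above is a standard construction: each cell holds the sum of all pixels above and to the left, after which any rectangle sum costs only four lookups. A pure-Python sketch of the technique (not FIIAT's C implementation):

```python
# Build a summed-area table with one extra row/column of zeros for padding.
def integral_image(img):
    h, w = len(img), len(img[0])
    ii = [[0] * (w + 1) for _ in range(h + 1)]
    for y in range(h):
        for x in range(w):
            ii[y + 1][x + 1] = (img[y][x] + ii[y][x + 1]
                                + ii[y + 1][x] - ii[y][x])
    return ii

def rect_sum(ii, x0, y0, x1, y1):
    """Sum of pixels in rows y0..y1-1, columns x0..x1-1, in O(1):
    four corner lookups of the precomputed table."""
    return ii[y1][x1] - ii[y0][x1] - ii[y1][x0] + ii[y0][x0]

img = [[1, 2, 3],
       [4, 5, 6],
       [7, 8, 9]]
ii = integral_image(img)
```

    The table is built once in O(pixels), after which every subwindow descriptor that needs region sums (means, box filters, Haar-like features) runs in constant time per rectangle.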

  8. Pydpiper: a flexible toolkit for constructing novel registration pipelines.

    PubMed

    Friedel, Miriam; van Eede, Matthijs C; Pipitone, Jon; Chakravarty, M Mallar; Lerch, Jason P

    2014-01-01

    Using neuroimaging technologies to elucidate the relationship between genotype and phenotype and brain and behavior will be a key contribution to biomedical research in the twenty-first century. Among the many methods for analyzing neuroimaging data, image registration deserves particular attention due to its wide range of applications. Finding strategies to register together many images and analyze the differences between them can be a challenge, particularly given that different experimental designs require different registration strategies. Moreover, writing software that can handle different types of image registration pipelines in a flexible, reusable and extensible way can be challenging. In response to this challenge, we have created Pydpiper, a neuroimaging registration toolkit written in Python. Pydpiper is an open-source, freely available software package that provides multiple modules for various image registration applications. Pydpiper offers five key innovations. Specifically: (1) a robust file handling class that allows access to outputs from all stages of registration at any point in the pipeline; (2) the ability of the framework to eliminate duplicate stages; (3) reusable, easy to subclass modules; (4) a development toolkit written for non-developers; (5) four complete applications that run complex image registration pipelines "out-of-the-box." In this paper, we will discuss both the general Pydpiper framework and the various ways in which component modules can be pieced together to easily create new registration pipelines. This will include a discussion of the core principles motivating code development and a comparison of Pydpiper with other available toolkits. We also provide a comprehensive, line-by-line example to orient users with limited programming knowledge and highlight some of the most useful features of Pydpiper. In addition, we will present the four current applications of the code. PMID:25126069
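
    Innovation (2), eliminating duplicate stages, amounts to keying each stage on its command and inputs so that two parts of a pipeline requesting the same work share one stage. A toy sketch of that bookkeeping (the `Pipeline` class here is illustrative, not Pydpiper's actual API):

```python
# Deduplicate pipeline stages by their (command, inputs) signature.
class Pipeline:
    def __init__(self):
        self._stages = {}           # signature -> stage record

    def add_stage(self, cmd, inputs, outputs):
        sig = (cmd, tuple(inputs))
        if sig not in self._stages:  # repeated request: reuse existing stage
            self._stages[sig] = {"cmd": cmd, "inputs": list(inputs),
                                 "outputs": list(outputs)}
        return self._stages[sig]

    def __len__(self):
        return len(self._stages)

p = Pipeline()
p.add_stage("register", ["a.mnc", "b.mnc"], ["a_to_b.xfm"])
p.add_stage("register", ["a.mnc", "b.mnc"], ["a_to_b.xfm"])  # collapsed
p.add_stage("resample", ["a.mnc"], ["a_res.mnc"])
```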

  9. Pydpiper: a flexible toolkit for constructing novel registration pipelines

    PubMed Central

    Friedel, Miriam; van Eede, Matthijs C.; Pipitone, Jon; Chakravarty, M. Mallar; Lerch, Jason P.

    2014-01-01

    Using neuroimaging technologies to elucidate the relationship between genotype and phenotype and brain and behavior will be a key contribution to biomedical research in the twenty-first century. Among the many methods for analyzing neuroimaging data, image registration deserves particular attention due to its wide range of applications. Finding strategies to register together many images and analyze the differences between them can be a challenge, particularly given that different experimental designs require different registration strategies. Moreover, writing software that can handle different types of image registration pipelines in a flexible, reusable and extensible way can be challenging. In response to this challenge, we have created Pydpiper, a neuroimaging registration toolkit written in Python. Pydpiper is an open-source, freely available software package that provides multiple modules for various image registration applications. Pydpiper offers five key innovations. Specifically: (1) a robust file handling class that allows access to outputs from all stages of registration at any point in the pipeline; (2) the ability of the framework to eliminate duplicate stages; (3) reusable, easy to subclass modules; (4) a development toolkit written for non-developers; (5) four complete applications that run complex image registration pipelines “out-of-the-box.” In this paper, we will discuss both the general Pydpiper framework and the various ways in which component modules can be pieced together to easily create new registration pipelines. This will include a discussion of the core principles motivating code development and a comparison of Pydpiper with other available toolkits. We also provide a comprehensive, line-by-line example to orient users with limited programming knowledge and highlight some of the most useful features of Pydpiper. In addition, we will present the four current applications of the code. PMID:25126069

  10. Development of an Integrated Human Factors Toolkit

    NASA Technical Reports Server (NTRS)

    Resnick, Marc L.

    2003-01-01

    An effective integration of human abilities and limitations is crucial to the success of all NASA missions. The Integrated Human Factors Toolkit facilitates this integration by assisting system designers and analysts to select the human factors tools that are most appropriate for the needs of each project. The HF Toolkit contains information about a broad variety of human factors tools addressing human requirements in the physical, information processing and human reliability domains. Analysis of each tool includes consideration of the most appropriate design stage, the amount of expertise in human factors that is required, the amount of experience with the tool and the target job tasks that are needed, and other factors that are critical for successful use of the tool. The benefits of the Toolkit include improved safety, reliability and effectiveness of NASA systems throughout the agency. This report outlines the initial stages of development for the Integrated Human Factors Toolkit.

  11. Jefferson Lab Plotting Toolkit for accelerator controls

    SciTech Connect

    Chen, J; Keesee, M; Larrieu, C; Lei, G

    1999-03-01

    Experimental physics generates numerous data sets that scientists analyze using plots, graphs, etc. The Jefferson Lab Plotting Toolkit (JPT), a graphical user interface toolkit, was developed at Jefferson Lab for data plotting. JPT provides data structures for sets of data; analyzes the range of the data; calculates reasonable maximum, minimum, and scale values for axes; sets line styles and marker styles; plots curves; and fills areas.

  12. SlideToolkit: An Assistive Toolset for the Histological Quantification of Whole Slide Images

    PubMed Central

    Nelissen, Bastiaan G. L.; van Herwaarden, Joost A.; Moll, Frans L.; van Diest, Paul J.; Pasterkamp, Gerard

    2014-01-01

    The demand for accurate and reproducible phenotyping of a disease trait increases with the rising number of biobanks and genome-wide association studies. Detailed analysis of histology is a powerful way of phenotyping human tissues. Nonetheless, purely visual assessment of histological slides is time-consuming and liable to sampling variation, optical illusions, and thereby observer variation, and external validation may be cumbersome. Therefore, within our own biobank, computerized quantification of digitized histological slides is often preferred as a more precise and reproducible, and sometimes more sensitive, approach. Relatively few free toolkits are, however, available for fully digitized microscopic slides, usually known as whole slide images. In order to meet this need, we developed the slideToolkit as a fast method to handle large quantities of low-contrast whole slide images using advanced cell-detection algorithms. The slideToolkit has been developed for modern personal computers and high-performance clusters (HPCs) and is available as an open-source project on github.com. We here illustrate the power of the slideToolkit by a repeated measurement of 303 digital slides containing CD3-stained (DAB) abdominal aortic aneurysm tissue from a tissue biobank. Our workflow consists of four consecutive steps. In the first step (acquisition), whole slide images are collected and converted to TIFF files. In the second step (preparation), files are organized. The third step (tiles) creates multiple manageable tiles to count. In the fourth step (analysis), tissue is analyzed and results are stored in a data set. Using this method, two consecutive measurements of 303 slides showed an intraclass correlation of 0.99. In conclusion, the slideToolkit provides a free, powerful and versatile collection of tools for automated feature analysis of whole slide images to create reproducible and meaningful phenotypic data sets. PMID:25372389
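
    The tiling step of such a workflow is simple arithmetic: cover the huge slide with fixed-size boxes, clipping at the borders. A minimal sketch of that step (bounds only, no actual image I/O; `tile_bounds` is a hypothetical helper, not part of the slideToolkit):

```python
# Cover a width x height image with tile x tile boxes, clipped at the edges.
def tile_bounds(width, height, tile):
    """Yield (x0, y0, x1, y1) pixel boxes, row by row."""
    for y0 in range(0, height, tile):
        for x0 in range(0, width, tile):
            yield (x0, y0, min(x0 + tile, width), min(y0 + tile, height))

boxes = list(tile_bounds(2500, 1000, 1024))
```

    Each box can then be cropped, analyzed, and counted independently, which is what makes the approach fit both a personal computer and an HPC job array.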

  13. ISO/IEEE 11073 PHD message generation toolkit to standardize healthcare device.

    PubMed

    Lim, Joon-Ho; Park, Chanyong; Park, Soo-Jun; Lee, Kyu-Chul

    2011-01-01

    As the senior population increases, various healthcare devices and services are being developed, such as fall detection devices, home hypertension management services, etc. However, to vitalize the market for healthcare devices and services, standardization for interoperability between device and service must come first. To achieve this standardization goal, the IEEE 11073 Personal Health Device (PHD) group has standardized many healthcare devices, but until now there are few devices compatible with the PHD standard. One of the main reasons is that it is not easy for device manufacturers to implement a standard communication module by analyzing standard documents of over 600 pages. In this paper, we propose a standard message generation toolkit to easily standardize existing non-standard healthcare devices. The proposed toolkit generates standard PHD messages from the supplied device information, and the generated messages are adapted to the device with the standard state machine file. For the experiments, we develop a reference H/W and test the proposed toolkit with three healthcare devices: a blood pressure monitor, a weighing scale, and a glucose meter. The proposed toolkit has the advantage that even users who do not know the standard in detail can easily standardize non-standard healthcare devices. PMID:22254521
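
    The toolkit's core idea, turning a plain device description into a standard-shaped message so the manufacturer never reads the specification directly, can be sketched as a small generator. The message layout below is purely illustrative and is not the actual ISO/IEEE 11073 PHD encoding:

```python
# Generate a standard-shaped measurement message from plain device info.
def make_message(device, value, unit):
    return {
        "device-type": device["type"],
        "handle": device["handle"],
        "observation": {"value": value, "unit": unit},
    }

bp_monitor = {"type": "blood-pressure", "handle": 1}
msg = make_message(bp_monitor, {"systolic": 120, "diastolic": 80}, "mmHg")
```

    The real toolkit additionally drives the exchange through the standard's state machine, but the input/output contract (device info in, conformant messages out) is the part that spares manufacturers the 600 pages.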

  14. The detector simulation toolkit HORUS

    NASA Astrophysics Data System (ADS)

    Becker, J.; Pennicard, D.; Graafsma, H.

    2012-10-01

    In recent years, X-ray detectors used and developed at synchrotron sources and Free Electron Lasers (FELs) have become increasingly powerful and versatile. However, as the capabilities of modern X-ray cameras grew, so did their complexity, and their response functions are therefore far from trivial. Since understanding the detecting system and its behavior is vital for any physical experiment, the need for dedicated, powerful simulation tools arose. The HPAD Output Response fUnction Simulator (HORUS) was originally developed to analyze the performance implications of certain design choices for the Adaptive Gain Integrating Pixel Detector (AGIPD) and over the years grew into a more universal detector simulation toolkit covering the relevant physics in the energy range from below 1 keV to a few hundred keV. HORUS has already been used to study possible improvements of the AGIPD for X-ray Photon Correlation Spectroscopy (XPCS) at the European XFEL and its performance at low beam energies. It is currently being used to study the optimum detector layout for Coherent Diffraction Imaging (CDI) at the European XFEL. Simulations of the charge summing mode of the Medipix3 chip have been essential for the improvements of the charge summing mode in the Medipix3 RX chip. HORUS is universal enough to support arbitrary hybrid pixel detector systems (within limitations). To date, the following detector systems are predefined within HORUS: the AGIPD, the Large Pixel Detector (LPD), the Cornell-Stanford Pixel Array Detector (CSPAD), the Mixed-Mode (MMPAD) and KEKPAD, and the Medipix2, Medipix3 and Medipix3 RX chips.

  15. Integrated Systems Health Management (ISHM) Toolkit

    NASA Technical Reports Server (NTRS)

    Venkatesh, Meera; Kapadia, Ravi; Walker, Mark; Wilkins, Kim

    2013-01-01

    A framework of software components has been implemented to facilitate the development of ISHM systems according to a methodology based on Reliability Centered Maintenance (RCM). This framework is collectively referred to as the Toolkit and was developed using General Atomics' Health MAP (TM) technology. The toolkit is intended to provide assistance to software developers of mission-critical system health monitoring applications in the specification, implementation, configuration, and deployment of such applications. In addition to software tools designed to facilitate these objectives, the toolkit also provides direction to software developers in accordance with an ISHM specification and development methodology. The development tools are based on an RCM approach for the development of ISHM systems. This approach focuses on defining, detecting, and predicting the likelihood of system functional failures and their undesirable consequences.

  16. Desensitized Optimal Filtering and Sensor Fusion Toolkit

    NASA Technical Reports Server (NTRS)

    Karlgaard, Christopher D.

    2015-01-01

    Analytical Mechanics Associates, Inc., has developed a software toolkit that filters and processes navigational data from multiple sensor sources. A key component of the toolkit is a trajectory optimization technique that reduces the sensitivity of Kalman filters with respect to model parameter uncertainties. The sensor fusion toolkit also integrates recent advances in adaptive Kalman and sigma-point filters for problems with non-Gaussian error statistics. This Phase II effort provides new filtering and sensor fusion techniques in a convenient package that can be used as a stand-alone application for ground support and/or onboard use. Its modular architecture enables ready integration with existing tools. A suite of sensor models and noise distributions, as well as a Monte Carlo analysis capability, are included to enable statistical performance evaluations.
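
    For orientation, the simplest instance of the filtering such a toolkit builds on is the one-dimensional Kalman predict/update cycle; the desensitized and sigma-point variants mentioned above extend this same cycle. A minimal sketch for a constant scalar state:

```python
# One Kalman predict/update cycle for a constant scalar state.
def kalman_step(x, p, z, q, r):
    """x, p: prior estimate and its variance; z: new measurement;
    q, r: process and measurement noise variances."""
    # Predict: the state model is constant, so only uncertainty grows.
    p = p + q
    # Update: blend prediction and measurement by the Kalman gain.
    k = p / (p + r)
    x = x + k * (z - x)
    p = (1 - k) * p
    return x, p

# Track a quantity whose true value is about 1.0 from noisy readings.
x, p = 0.0, 1.0
for z in [1.2, 0.9, 1.1, 1.0]:
    x, p = kalman_step(x, p, z, q=1e-4, r=0.5)
```

    After a few measurements the estimate converges toward the readings while its variance shrinks, which is the behavior the toolkit's trajectory optimization then desensitizes against model parameter errors.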

  17. Cluster-based parallel image processing toolkit

    NASA Astrophysics Data System (ADS)

    Squyres, Jeffery M.; Lumsdaine, Andrew; Stevenson, Robert L.

    1995-03-01

    Many image processing tasks exhibit a high degree of data locality and parallelism and map quite readily to specialized massively parallel computing hardware. However, as network technologies continue to mature, workstation clusters are becoming a viable and economical parallel computing resource, so it is important to understand how to use these environments for parallel image processing as well. In this paper we discuss our implementation of a parallel image processing software library (the Parallel Image Processing Toolkit). The Toolkit uses a message-passing model of parallelism designed around the Message Passing Interface (MPI) standard. Experimental results are presented to demonstrate the parallel speedup obtained with the Parallel Image Processing Toolkit in a typical workstation cluster over a wide variety of image processing tasks. We also discuss load balancing and the potential for parallelizing portions of image processing tasks that seem to be inherently sequential, such as visualization and data I/O.
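
    The data-parallel pattern behind such a toolkit is: split the image into row blocks, let each worker filter its block, and reassemble. In this self-contained sketch the "workers" are a plain `map` so it runs anywhere; the Toolkit itself distributes the blocks over MPI instead:

```python
# Partition an image into roughly equal row blocks, one per worker.
def split_rows(img, nworkers):
    n = len(img)
    bounds = [(i * n // nworkers, (i + 1) * n // nworkers)
              for i in range(nworkers)]
    return [img[a:b] for a, b in bounds if a < b]

def invert_block(block):              # stand-in per-pixel operation
    return [[255 - px for px in row] for row in block]

img = [[0, 128], [255, 64], [32, 16], [8, 4]]
blocks = split_rows(img, 3)
# Each block is independent, so this map is trivially parallelizable.
result = [row for blk in map(invert_block, blocks) for row in blk]
```

    Load balancing, discussed in the paper, amounts to choosing the block boundaries (or handing out blocks dynamically) so that no worker idles while another is still filtering.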

  18. The Ames MER Microscopic Imager Toolkit

    NASA Technical Reports Server (NTRS)

    Sargent, Randy; Deans, Matthew; Kunz, Clayton; Sims, Michael; Herkenhoff, Ken

    2005-01-01

    The Mars Exploration Rovers, Spirit and Opportunity, have spent several successful months on Mars, returning gigabytes of images and spectral data to scientists on Earth. One of the instruments on the MER rovers, the Athena Microscopic Imager (MI), is a fixed-focus, megapixel camera providing a ±3 mm depth of field and a 31×31 mm field of view at a working distance of 63 mm from the lens to the object being imaged. In order to maximize the science return from this instrument, we developed the Ames MI Toolkit and supported its use during the primary mission. The MI Toolkit is a set of programs that operate on collections of MI images, with the goal of making the data more understandable to the scientists on the ground. Because of the limited depth of field of the camera, and the often highly variable topography of the terrain being imaged, MI images of a given rock are often taken as a stack, with the Instrument Deployment Device (IDD) moving along a computed normal vector, pausing every few millimeters for the MI to acquire an image. The MI Toolkit provides image registration and focal section merging, which combine these images to form a single, maximally in-focus image, while compensating for changes in lighting as well as parallax due to the motion of the camera. The MI Toolkit also provides a 3-D reconstruction of the surface being imaged using stereo and can embed 2-D MI images as texture maps into 3-D meshes produced by other imagers on board the rover to provide context. The 2-D images and 3-D meshes output from the Toolkit are easily viewed by scientists using other mission tools, such as Viz or the MI Browser. This paper describes the MI Toolkit in detail, as well as our experience using it with scientists at JPL during the primary MER mission.
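
    The focal-section-merging idea can be sketched in miniature: for each pixel, keep the value from whichever image in the stack is locally sharpest, here estimated by a crude neighbor-difference contrast on toy 1-D "images". This is only an illustration of the principle; the real Toolkit also registers the images and compensates for lighting and parallax:

```python
# Focus stacking in one dimension: pick the locally sharpest source per pixel.
def sharpness(img, i):
    """Local contrast at position i: absolute difference of neighbors."""
    lo, hi = max(i - 1, 0), min(i + 1, len(img) - 1)
    return abs(img[hi] - img[lo])

def focus_merge(stack):
    return [max(stack, key=lambda img: sharpness(img, i))[i]
            for i in range(len(stack[0]))]

blurry = [10, 11, 12, 13]      # low contrast: out-of-focus section
sharp = [0, 40, 80, 120]       # high contrast: in-focus section
merged = focus_merge([blurry, sharp])
```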

  19. The Ames MER microscopic imager toolkit

    USGS Publications Warehouse

    Sargent, R.; Deans, Matthew; Kunz, C.; Sims, M.; Herkenhoff, K.

    2005-01-01

    The Mars Exploration Rovers, Spirit and Opportunity, have spent several successful months on Mars, returning gigabytes of images and spectral data to scientists on Earth. One of the instruments on the MER rovers, the Athena Microscopic Imager (MI), is a fixed focus, megapixel camera providing a ±3 mm depth of field and a 31×31 mm field of view at a working distance of 63 mm from the lens to the object being imaged. In order to maximize the science return from this instrument, we developed the Ames MI Toolkit and supported its use during the primary mission. The MI Toolkit is a set of programs that operate on collections of MI images, with the goal of making the data more understandable to the scientists on the ground. Because of the limited depth of field of the camera, and the often highly variable topography of the terrain being imaged, MI images of a given rock are often taken as a stack, with the Instrument Deployment Device (IDD) moving along a computed normal vector, pausing every few millimeters for the MI to acquire an image. The MI Toolkit provides image registration and focal section merging, which combine these images to form a single, maximally in-focus image, while compensating for changes in lighting as well as parallax due to the motion of the camera. The MI Toolkit also provides a 3-D reconstruction of the surface being imaged using stereo and can embed 2-D MI images as texture maps into 3-D meshes produced by other imagers on board the rover to provide context. The 2-D images and 3-D meshes output from the Toolkit are easily viewed by scientists using other mission tools, such as Viz or the MI Browser. This paper describes the MI Toolkit in detail, as well as our experience using it with scientists at JPL during the primary MER mission. © 2005 IEEE.

  20. "Handy Manny" and the Emergent Literacy Technology Toolkit

    ERIC Educational Resources Information Center

    Hourcade, Jack J.; Parette, Howard P., Jr.; Boeckmann, Nichole; Blum, Craig

    2010-01-01

    This paper outlines the use of a technology toolkit to support emergent literacy curriculum and instruction in early childhood education settings. Components of the toolkit include hardware and software that can facilitate key emergent literacy skills. Implementation of the comprehensive technology toolkit enhances the development of these…

  1. Anchor Toolkit - a secure mobile agent system

    SciTech Connect

    Mudumbai, Srilekha S.; Johnston, William; Essiari, Abdelilah

    1999-05-19

    Mobile agent technology facilitates intelligent operation in software systems with less human interaction. A major challenge to the deployment of mobile agents is secure transmission of agents and prevention of unauthorized access to resources between interacting systems, as either hosts, or agents, or both can act maliciously. The Anchor toolkit, designed by LBNL, handles the transmission and secure management of mobile agents in a heterogeneous distributed computing environment. It provides users with the option of incorporating their own security managers. This paper concentrates on the architecture, features, access control and deployment of the Anchor toolkit. Application of this toolkit in a secure distributed CVS environment is discussed as a case study.

  2. SIERRA Toolkit v. 1.0

    Energy Science and Technology Software Center (ESTSC)

    2010-02-24

    The SIERRA Toolkit is a collection of libraries to facilitate the development of parallel engineering analysis applications. These libraries supply basic core services that an engineering analysis application may need, such as a parallel distributed and dynamic mesh database (for unstructured meshes), mechanics algorithm support (parallel infrastructure only), interfaces to parallel solvers, parallel mesh and data I/O, and various utilities (timers, diagnostic tools, etc.). The toolkit is intended to reduce the effort required to develop an engineering analysis application by removing the need to develop core capabilities that most every application would require.

  3. Autism Speaks Toolkits: Resources for Busy Physicians.

    PubMed

    Bellando, Jayne; Fussell, Jill J; Lopez, Maya

    2016-02-01

    Given the increased prevalence of autism spectrum disorders (ASD), it is likely that busy primary care providers (PCPs) are providing care to individuals with ASD in their practice. Autism Speaks provides a wealth of educational, medical, and treatment/intervention information resources for PCPs and families, including at least 32 toolkits. This article serves to familiarize PCPs and families with the different toolkits that are available on the Autism Speaks website. This article is intended to increase physicians' knowledge of the issues that families of children with ASD frequently encounter and to increase their ability to share evidence-based information to guide treatment and care for affected families in their practice. PMID:26149848

  4. TRSkit: A Simple Digital Library Toolkit

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.; Esler, Sandra L.

    1997-01-01

    This paper introduces TRSkit, a simple and effective toolkit for building digital libraries on the World Wide Web. The toolkit was developed for the creation of the Langley Technical Report Server and the NASA Technical Report Server, but is applicable to most simple distribution paradigms. TRSkit contains a handful of freely available software components designed to be run under the UNIX operating system and served via the World Wide Web. The intended customer is the person who must continuously and synchronously distribute anywhere from 100 to 100,000s of information units and does not have extensive resources to devote to the problem.

  5. Medical Applications of the Geant4 Toolkit

    NASA Astrophysics Data System (ADS)

    Agostinelli, S.; Chauvie, S.; Foppiano, F.; Garelli, S.; Marchetto, F.; Pia, M. G.; Nieminen, P.; Rolando, V.; Solano, A.

    A powerful and suitable tool for attacking the problem of the production and transport of different beams in biological matter is offered by the Geant4 Simulation Toolkit. Various activities in progress in the domain of medical applications are presented: studies on calibration of brachytherapy sources and thermoluminescent dosimeters, studies of a complete 3-D inline dosimeter, development of general tools for a CT interface for treatment planning, studies involving neutron transport, etc. A novel approach, based on the Geant4 Toolkit, for the study of radiation damage at the cellular and DNA level is also presented.

  6. The NetLogger Toolkit V2.0

    Energy Science and Technology Software Center (ESTSC)

    2003-03-28

    The NetLogger Toolkit is designed to monitor, under actual operating conditions, the behavior of all the elements of the application-to-application communication path in order to determine exactly where time is spent within a complex system. Using NetLogger, distributed application components are modified to produce timestamped logs of "interesting" events at all the critical points of the distributed system. Events from each component are correlated, which allows one to characterize the performance of all aspects of the system and network in detail. The NetLogger Toolkit itself consists of four components: an API and library of functions to simplify the generation of application-level event logs, a set of tools for collecting and sorting log files, an event archive system, and a tool for visualization and analysis of the log files. In order to instrument an application to produce event logs, the application developer inserts calls to the NetLogger API at all the critical points in the code, then links the application with the NetLogger library. All the tools in the NetLogger Toolkit share a common log format, and assume the existence of accurate and synchronized system clocks. NetLogger messages can be logged using an easy-to-read text-based format based on the IETF-proposed ULM format, or a binary format that can still be used through the same API but that is several times faster and smaller, with performance comparable or better than binary message formats such as MPI, XDR, SDDF-Binary, and PBIO. The NetLogger binary format is both highly efficient and self-describing, thus optimized for the dynamic message construction and parsing of application instrumentation. NetLogger includes an "activation" API that allows NetLogger logging to be turned on, off, or modified by changing an external file. This is useful for activating logging in daemons/services (e.g. GridFTP server). The NetLogger reliability API provides the ability to specify backup logging locations and
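The instrumentation style described above can be illustrated with a toy logger. This is a hedged sketch, not the real NetLogger API: the `EventLog` class, its `write` method, and the field names are all invented for illustration; only the general shape of a timestamped, ULM-style key=value line follows the abstract.

```python
import datetime
import socket

class EventLog:
    """Toy NetLogger-style event logger (illustrative only, not the real
    NetLogger API): each call at a critical point in the code emits one
    timestamped, ULM-like key=value line."""

    def __init__(self, prog):
        self.prog = prog
        self.lines = []

    def write(self, event, **fields):
        # ISO-style UTC timestamp; real tools assume synchronized clocks.
        stamp = datetime.datetime.now(datetime.timezone.utc).strftime(
            "%Y-%m-%dT%H:%M:%S.%fZ")
        extras = "".join(f" {k.upper()}={v}" for k, v in sorted(fields.items()))
        line = (f"DATE={stamp} HOST={socket.gethostname()} "
                f"PROG={self.prog} NL.EVNT={event}{extras}")
        self.lines.append(line)
        return line

# Bracket an operation with start/end events so a correlator can
# later compute where the time went.
log = EventLog("file_xfer")
log.write("TransferStart", size=1048576)
log.write("TransferEnd", size=1048576)
```

Correlating matched start/end pairs across components is what lets a tool attribute elapsed time to each hop of the communication path.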

  7. Construction aggregates

    USGS Publications Warehouse

    Tepordei, V.V.

    1995-01-01

    Part of the 1994 Industrial Minerals Review. The production, consumption, and applications of construction aggregates are reviewed. In 1994, the production of construction aggregates, which includes crushed stone and construction sand and gravel combined, increased 7.7 percent to 2.14 Gt compared with the previous year. These record production levels are mostly a result of funding for highway construction work provided by the Intermodal Surface Transportation Efficiency Act of 1991. Demand is expected to increase for construction aggregates in 1995.

  8. Services development toolkit for Open Research Data (Promarket)

    NASA Astrophysics Data System (ADS)

    Som de Cerff, W.; Schwichtenberg, H.; Gemünd, A.; Claus, S.; Reichwald, J.; Denvil, S.; Mazetti, P.; Nativi, S.

    2012-04-01

    According to the declaration of the Organisation for Economic Co-operation and Development (OECD) on Open Access, "OECD Principles and Guidelines for Access to Research Data from Public Funding", research data should be available to everyone, and Europe follows these directions (Digital Agenda, N. Kroes). Data being 'open' does not mean it is directly usable: research data are often complex to use and difficult to interpret by non-experts. Also, if extra services are needed, e.g. certain delivery guarantees, SLAs need to be negotiated. Comparable to OSS development models, where software is open and services and support are paid for, there is a large potential for commercial activities and services around this open and free research data; e.g., climate, weather or instrument data can generate business value when offered as easy and reliable services for app integration. The project will design a toolkit for developers in research data centres. The tools will allow them to develop services that provide research data and to map business processes, e.g. automatic service level agreements, onto those services, making open research data attractive for commercial and academic use by the centre and others. It will enable them to build and deploy open, reliable and scalable services and end products, e.g. accessible from end-user devices such as smart phones. Researchers, interested citizens and company developers will be enabled to access open data as an "easy-to-use" service and aggregate it with other services. The project will address a broad range of developers and give them a toolkit in well-known settings: portable, scalable, open and usable in public development environments and tools. This topic will be addressed technically by utilizing service-oriented approaches based on open standards and protocols, combined with new programming models and techniques.

  9. Teacher Quality Toolkit. 2nd Edition

    ERIC Educational Resources Information Center

    Lauer, Patricia A.; Dean, Ceri B.; Martin-Glenn, Mya L.; Asensio, Margaret L.

    2005-01-01

    The Teacher Quality Toolkit addresses the continuum of teacher learning by providing tools that can be used to improve both preservice, and inservice teacher education. Each chapter provides self assessment tools that can guide progress toward improved teacher quality and describes resources for designing exemplary programs and practices. Chapters…

  10. A toolkit for building earth system models

    SciTech Connect

    Foster, I.

    1993-03-01

    An earth system model is a computer code designed to simulate the interrelated processes that determine the earth's weather and climate, such as atmospheric circulation, atmospheric physics, atmospheric chemistry, oceanic circulation, and biosphere. I propose a toolkit that would support a modular, or object-oriented, approach to the implementation of such models.

  11. A toolkit for building earth system models

    SciTech Connect

    Foster, I.

    1993-03-01

    An earth system model is a computer code designed to simulate the interrelated processes that determine the earth's weather and climate, such as atmospheric circulation, atmospheric physics, atmospheric chemistry, oceanic circulation, and biosphere. I propose a toolkit that would support a modular, or object-oriented, approach to the implementation of such models.

  12. A Toolkit for the Effective Teaching Assistant

    ERIC Educational Resources Information Center

    Tyrer, Richard; Gunn, Stuart; Lee, Chris; Parker, Maureen; Pittman, Mary; Townsend, Mark

    2004-01-01

    This book offers the notion of a "toolkit" to allow Teaching Assistants (TAs) and colleagues to review and revise their thinking and practice about real issues and challenges in managing individuals, groups, colleagues and themselves in school. In a rapidly changing educational environment the book focuses on combining the underpinning knowledge…

  13. Plus 50: Business Community Outreach Toolkit

    ERIC Educational Resources Information Center

    American Association of Community Colleges (NJ1), 2009

    2009-01-01

    This toolkit is designed to support you in building partnerships with the business community. It includes a series of fact sheets you can distribute to employers that discuss the value in hiring plus 50 workers. Individual sections contain footnotes. (Contains 5 web resources.)

  14. Portable Extensible Toolkit for Scientific Computation

    Energy Science and Technology Software Center (ESTSC)

    1995-06-30

    PETSc 2.0 is a software toolkit for the portable, parallel (and serial) numerical solution of partial differential equations and minimization problems. It includes software for the solution of linear and nonlinear systems of equations. These codes are written in a data-structure-neutral manner to enable easy reuse and flexibility.
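The data-structure-neutral style mentioned above can be illustrated with a small conjugate-gradient sketch. This is not PETSc code; it simply shows the design idea: the solver reaches the matrix only through a matvec callback, so dense, sparse, or matrix-free operators can be swapped in without changing the solver.

```python
import numpy as np

def cg(matvec, b, tol=1e-10, maxiter=200):
    """Conjugate gradient for SPD systems, written 'data-structure-neutral':
    the matrix appears only as the callback matvec(v) -> A @ v."""
    x = np.zeros_like(b, dtype=float)
    r = b - matvec(x)          # initial residual
    p = r.copy()               # initial search direction
    rs = r @ r
    for _ in range(maxiter):
        Ap = matvec(p)
        alpha = rs / (p @ Ap)  # optimal step along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

# Demo: a dense 1-D Laplacian plugged in through the callback.
A = np.array([[2.0, -1.0, 0.0],
              [-1.0, 2.0, -1.0],
              [0.0, -1.0, 2.0]])
x = cg(lambda v: A @ v, np.array([1.0, 0.0, 1.0]))
```

The same `cg` would accept a sparse-matrix product or a stencil routine in place of the lambda, which is the reuse that the abstract's "data-structure-neutral" phrasing points at.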

  15. The Two-Way Immersion Toolkit

    ERIC Educational Resources Information Center

    Howard, Elizabeth; Sugarman, Julie; Perdomo, Marleny; Adger, Carolyn Temple

    2005-01-01

    This Toolkit is meant to be a resource for teachers, parents, and administrators involved with two-way immersion (TWI) programs, particularly those at the elementary level. Two-way immersion is a form of dual language instruction that brings together students from two native language groups for language, literacy, and academic content instruction…

  16. Healthy People 2010: Oral Health Toolkit

    ERIC Educational Resources Information Center

    Isman, Beverly

    2007-01-01

    The purpose of this Toolkit is to provide guidance, technical tools, and resources to help states, territories, tribes and communities develop and implement successful oral health components of Healthy People 2010 plans as well as other oral health plans. These plans are useful for: (1) promoting, implementing and tracking oral health objectives;…

  17. Media Toolkit for Anti-Drug Action.

    ERIC Educational Resources Information Center

    Office of National Drug Control Policy, Washington, DC.

    This toolkit provides proven methods, models, and templates for tying anti-drug efforts to the National Youth Anti-Drug Media Campaign. It helps organizations deliver the Campaign's messages to the media and to other groups and individuals who care about keeping the nation's youth drug free. Eight sections focus on: (1) "Campaign Overview"…

  18. Ready, Set, Respect! GLSEN's Elementary School Toolkit

    ERIC Educational Resources Information Center

    Gay, Lesbian and Straight Education Network (GLSEN), 2012

    2012-01-01

    "Ready, Set, Respect!" provides a set of tools to help elementary school educators ensure that all students feel safe and respected and develop respectful attitudes and behaviors. It is not a program to be followed but instead is designed to help educators prepare themselves for teaching about and modeling respect. The toolkit responds to…

  19. Integrated System Health Management Development Toolkit

    NASA Technical Reports Server (NTRS)

    Figueroa, Jorge; Smith, Harvey; Morris, Jon

    2009-01-01

    This software toolkit is designed to model complex systems for the implementation of embedded Integrated System Health Management (ISHM) capability, which focuses on determining the condition (health) of every element in a complex system (detect anomalies, diagnose causes, and predict future anomalies), and to provide data, information, and knowledge (DIaK) to control systems for safe and effective operation.

  20. Fragment Impact Toolkit: A Toolkit for Modeling Fragment Generation and Impacts on Targets

    NASA Astrophysics Data System (ADS)

    Shevitz, Daniel

    2005-07-01

    In this talk we will detail the status of the Fragment Impact Toolkit. The toolkit is used to model nearby explosion problems and assess probabilities of user-specified outcomes. The toolkit offers a framework, without locking the user into any particular set of states, assumptions, or constraints. The toolkit breaks a fragment impact problem into five components, all of which are extendable: (1) source description that includes the geometry of the source; (2) fragment generation that comprises the fragmentation process, including fragment size distributions (if required) and assignment of initial conditions, such as velocity; (3) fragment flight that includes what occurs to fragments while airborne; (4) target intersection that includes specification of target geometry, position, and orientation; and (5) target consequence that includes what occurs when fragments hit a target. Two notable contributions of the toolkit are the ability to have sources that break up with position-dependent and user-specifiable size probability distributions and then impact targets of arbitrary complexity. In this paper we will show examples of how to use the toolkit and simulate targets, including airplanes and stacks of munitions.
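The five-component breakdown above can be sketched as a toy 2-D Monte Carlo. Everything concrete here is an illustrative assumption, not the toolkit's: the uniform launch azimuth, the exponential size distribution, the arc-shaped target, and the size threshold that stands in for the consequence model.

```python
import math
import random

def fragment_hits(n_fragments, size_dist, target_arc, seed=1):
    """Toy sketch of the five-stage pipeline (source -> fragment generation
    -> flight -> target intersection -> consequence).  2-D fragments leave
    the origin at a uniform random azimuth; the target occupies the azimuth
    interval target_arc = (lo, hi); a fragment 'damages' the target if it
    flies into that interval and its sampled size exceeds a threshold."""
    rng = random.Random(seed)
    lo, hi = target_arc
    hits = 0
    for _ in range(n_fragments):
        size = size_dist(rng)                 # fragment generation
        theta = rng.uniform(0.0, 2 * math.pi) # flight direction
        if lo <= theta <= hi and size > 1.0:  # intersection + consequence
            hits += 1
    return hits / n_fragments

# Target spans half the azimuth; sizes are exponential with mean 2,
# so the expected damage fraction is roughly 0.5 * exp(-0.5) ~ 0.30.
p = fragment_hits(20000, lambda rng: rng.expovariate(0.5), (0.0, math.pi))
```

Each stage is a separate callable or parameter, mirroring the extendability the abstract emphasizes: a user could swap in a position-dependent size distribution or a richer target geometry without touching the driver loop.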

  1. Global Arrays Parallel Programming Toolkit

    SciTech Connect

    Nieplocha, Jaroslaw; Krishnan, Manoj Kumar; Palmer, Bruce J.; Tipparaju, Vinod; Harrison, Robert J.; Chavarría-Miranda, Daniel

    2011-01-01

    The two predominant classes of programming models for parallel computing are distributed memory and shared memory. Both shared memory and distributed memory models have advantages and shortcomings. The shared memory model is much easier to use, but it ignores data locality/placement. Given the hierarchical nature of the memory subsystems in modern computers, this characteristic can have a negative impact on performance and scalability. Careful code restructuring to increase data reuse and replacing fine grain load/stores with block access to shared data can address the problem and yield performance for shared memory that is competitive with message-passing. However, this performance comes at the cost of compromising the ease of use that the shared memory model advertises. Distributed memory models, such as message-passing or one-sided communication, offer performance and scalability but they are difficult to program. The Global Arrays toolkit attempts to offer the best features of both models. It implements a shared-memory programming model in which data locality is managed by the programmer. This management is achieved by calls to functions that transfer data between a global address space (a distributed array) and local storage. In this respect, the GA model has similarities to the distributed shared-memory models that provide an explicit acquire/release protocol. However, the GA model acknowledges that remote data is slower to access than local data and allows data locality to be specified by the programmer and hence managed. GA is related to the global address space languages such as UPC, Titanium, and, to a lesser extent, Co-Array Fortran. In addition, by providing a set of data-parallel operations, GA is also related to data-parallel languages such as HPF, ZPL, and Data Parallel C. However, the Global Array programming model is implemented as a library that works with most languages used for technical computing and does not rely on compiler technology for achieving
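The put/get style of the GA model can be illustrated with a toy single-process mock-up. This is not the real Global Arrays API; the class and method names are invented. The point it shows is the one in the abstract: a block-distributed "global" array whose elements are reached only through explicit transfers between global and local storage, so locality (which rank owns an element) is visible to the programmer.

```python
import numpy as np

class GlobalArray:
    """Toy sketch of the GA programming model: a 1-D 'global' array is
    block-distributed over ranks; data moves only via explicit put/get."""

    def __init__(self, n, nranks):
        self.blocks = np.array_split(np.zeros(n), nranks)
        # bounds[r] is the first global index owned by rank r
        self.bounds = np.cumsum([0] + [len(b) for b in self.blocks])

    def _locate(self, i):
        rank = int(np.searchsorted(self.bounds, i, side="right")) - 1
        return rank, i - self.bounds[rank]

    def put(self, i, value):
        """Local -> global transfer; returns the owning rank so the
        programmer can see (and manage) locality."""
        rank, off = self._locate(i)
        self.blocks[rank][off] = value
        return rank

    def get(self, i):
        """Global -> local transfer."""
        rank, off = self._locate(i)
        return self.blocks[rank][off]

# Ten elements over three ranks: blocks of 4, 3, 3.
ga = GlobalArray(10, nranks=3)
owner = ga.put(7, 42.0)
```

In the real library the blocks live in different address spaces and `get` on a remote block costs a communication step, which is exactly the cost the GA model makes explicit rather than hiding.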

  2. Weighted aggregation

    NASA Technical Reports Server (NTRS)

    Feiveson, A. H. (Principal Investigator)

    1979-01-01

    The use of a weighted aggregation technique to improve the precision of the overall LACIE estimate is considered. The manner in which a weighted aggregation technique is implemented given a set of weights is described. The problem of variance estimation is discussed and the question of how to obtain the weights in an operational environment is addressed.
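A weighted aggregation of the kind discussed above can be sketched as follows. This is a hypothetical stratum-combination step, not LACIE's actual procedure: per-stratum estimates are combined with given weights, and the variance of the aggregate is propagated under an independence assumption.

```python
import numpy as np

def weighted_aggregate(estimates, variances, weights):
    """Combine independent per-stratum estimates into one overall estimate.
    Aggregate = sum(w_i * e_i); Var = sum(w_i^2 * var_i) assuming the
    stratum estimates are uncorrelated (an illustrative assumption)."""
    w = np.asarray(weights, dtype=float)
    est = float(w @ np.asarray(estimates, dtype=float))
    var = float((w ** 2) @ np.asarray(variances, dtype=float))
    return est, var

# Two equally weighted strata with different precisions.
est, var = weighted_aggregate([100.0, 200.0], [4.0, 9.0], [0.5, 0.5])
```

The abstract's open questions map directly onto this sketch: how to obtain the `weights` operationally, and how to estimate the `variances` that feed the propagation.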

  3. The Reconstruction Toolkit (RTK), an open-source cone-beam CT reconstruction toolkit based on the Insight Toolkit (ITK)

    NASA Astrophysics Data System (ADS)

    Rit, S.; Vila Oliva, M.; Brousmiche, S.; Labarbe, R.; Sarrut, D.; Sharp, G. C.

    2014-03-01

    We propose the Reconstruction Toolkit (RTK, http://www.openrtk.org), an open-source toolkit for fast cone-beam CT reconstruction, based on the Insight Toolkit (ITK) and using GPU code extracted from Plastimatch. RTK is developed by an open consortium (see affiliations) under the non-contaminating Apache 2.0 license. The quality of the platform is daily checked with regression tests in partnership with Kitware, the company supporting ITK. Several features are already available: Elekta, Varian and IBA inputs, multi-threaded Feldkamp-Davis-Kress reconstruction on CPU and GPU, Parker short scan weighting, multi-threaded CPU and GPU forward projectors, etc. Each feature is either accessible through command line tools or C++ classes that can be included in independent software. A MIDAS community has been opened to share CatPhan datasets of several vendors (Elekta, Varian and IBA). RTK will be used in the upcoming cone-beam CT scanner developed by IBA for proton therapy rooms. Many features are under development: new input format support, iterative reconstruction, hybrid Monte Carlo / deterministic CBCT simulation, etc. RTK has been built to freely share tomographic reconstruction developments between researchers and is open for new contributions.

  4. Construction aggregates

    USGS Publications Warehouse

    Langer, W.H.; Tepordei, V.V.; Bolen, W.P.

    2000-01-01

    Construction aggregates consist primarily of crushed stone and construction sand and gravel. Total estimated production of construction aggregates increased in 1999 by about 2% to 2.39 Gt (2.64 billion st) compared with 1998. This record production level continued an expansion that began in 1992. By commodities, crushed stone production increased 3.3%, while sand and gravel production increased by about 0.5%.

  5. Construction aggregates

    USGS Publications Warehouse

    Tepordei, V.V.

    1994-01-01

    Part of a special section on industrial minerals in 1993. The 1993 production of construction aggregates increased 6.3 percent over the 1992 figure, to reach 2.01 Gt. This represents the highest estimated annual production of combined crushed stone and construction sand and gravel ever recorded in the U.S. The outlook for construction aggregates and the issues facing the industry are discussed.

  6. The Interactive Learning Toolkit: supporting interactive classrooms

    NASA Astrophysics Data System (ADS)

    Dutta, S.; McCauley, V.; Mazur, E.

    2004-05-01

    Research-based interactive learning techniques have dramatically improved student understanding. We have created the 'Interactive Learning Toolkit' (ILT), a web-based learning management system, to help implement two such pedagogies: Just in Time Teaching and Peer Instruction. Our main goal in developing this toolkit is to save the instructor time and effort and to use technology to facilitate the interaction between the students and the instructor (and between students themselves). After a brief review of both pedagogies, we will demonstrate the many exciting new features of the ILT. We will show how technology can not only implement, but also supplement and improve these pedagogies. We would like to acknowledge grants from NSF and DEAS, Harvard University.

  7. A toolkit for detecting technical surprise.

    SciTech Connect

    Trahan, Michael Wayne; Foehse, Mark C.

    2010-10-01

    The detection of a scientific or technological surprise within a secretive country or institute is very difficult. The ability to detect such surprises would allow analysts to identify the capabilities that could be a military or economic threat to national security. Sandia's current approach utilizing ThreatView has been successful in revealing potential technological surprises. However, as data sets become larger, it becomes critical to use algorithms as filters along with the visualization environments. Our two-year LDRD had two primary goals. First, we developed a tool, a Self-Organizing Map (SOM), to extend ThreatView and improve our understanding of the issues involved in working with textual data sets. Second, we developed a toolkit for detecting indicators of technical surprise in textual data sets. Our toolkit has been successfully used to perform technology assessments for the Science & Technology Intelligence (S&TI) program.
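The Self-Organizing Map named above can be illustrated with a minimal 1-D SOM. The grid size, learning rate, and decay schedule here are illustrative choices, not Sandia's: each sample pulls its best-matching unit, and that unit's grid neighbours, toward itself with a shrinking neighbourhood radius.

```python
import numpy as np

def train_som(data, n_units=4, epochs=50, lr=0.5, seed=0):
    """Minimal 1-D self-organizing map: units live on a line, and a sample
    updates its best-matching unit (BMU) plus nearby units, with both the
    learning rate and the neighbourhood radius decaying over epochs."""
    rng = np.random.default_rng(seed)
    units = rng.normal(size=(n_units, data.shape[1]))
    for t in range(epochs):
        radius = max(1.0, n_units / 2 * (1 - t / epochs))
        for x in data:
            bmu = np.argmin(((units - x) ** 2).sum(axis=1))
            grid_dist = np.abs(np.arange(n_units) - bmu)
            h = np.exp(-(grid_dist ** 2) / (2 * radius ** 2))[:, None]
            units += lr * (1 - t / epochs) * h * (x - units)
    return units
```

On text-derived feature vectors, units that end up far from all training data would be the "surprising" regions an analyst might inspect; here the map simply organises two well-separated clusters.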

  8. Early origin of the bilaterian developmental toolkit

    PubMed Central

    Erwin, Douglas H.

    2009-01-01

    Whole-genome sequences from the choanoflagellate Monosiga brevicollis, the placozoan Trichoplax adhaerens and the cnidarian Nematostella vectensis have confirmed results from comparative evolutionary developmental studies that much of the developmental toolkit once thought to be characteristic of bilaterians appeared much earlier in the evolution of animals. The diversity of transcription factors and signalling pathway genes in animals with a limited number of cell types and a restricted developmental repertoire is puzzling, particularly in light of claims that such highly conserved elements among bilaterians provide evidence of a morphologically complex protostome–deuterostome ancestor. Here, I explore the early origination of elements of what became the bilaterian toolkit, and suggest that placozoans and cnidarians represent a depauperate residue of a once more diverse assemblage of early animals, some of which may be represented in the Ediacaran fauna (c. 585–542 Myr ago). PMID:19571245

  9. HVAC Fault Detection and Diagnosis Toolkit

    Energy Science and Technology Software Center (ESTSC)

    2004-12-31

    This toolkit supports component-level model-based fault detection methods in commercial building HVAC systems. The toolbox consists of five basic modules: a parameter estimator for model calibration, a preprocessor, an AHU model simulator, a steady-state detector, and a comparator. Each of these modules and the fuzzy logic rules for fault diagnosis are described in detail. The toolbox is written in C++ and also invokes the SPARK simulation program.
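The steady-state detector and comparator modules can be sketched as follows. The abstract does not give the toolbox's actual criteria, so the sliding-window slope test and the residual threshold below are assumptions chosen only to show the roles the two modules play.

```python
import statistics

def steady_state(window, slope_tol=0.05):
    """Illustrative steady-state test: least-squares slope of the most
    recent samples against time; steady when the slope is near zero."""
    n = len(window)
    xs = range(n)
    xbar, ybar = (n - 1) / 2, statistics.fmean(window)
    slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, window)) \
            / sum((x - xbar) ** 2 for x in xs)
    return abs(slope) < slope_tol

def fault(measured, predicted, threshold=2.0):
    """Comparator stage: flag a fault when the residual between the
    measurement and the (calibrated) model prediction is large."""
    return abs(measured - predicted) > threshold
```

In a model-based scheme like the one described, the comparator is only trusted once the steady-state detector fires, since transient residuals would otherwise raise spurious faults.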

  10. chemf: A purely functional chemistry toolkit

    PubMed Central

    2012-01-01

    Background Although programming in a type-safe and referentially transparent style offers several advantages over working with mutable data structures and side effects, this style of programming has not seen much use in chemistry-related software. Since functional programming languages were designed with referential transparency in mind, these languages offer a lot of support when writing immutable data structures and side-effects free code. We therefore started implementing our own toolkit based on the above programming paradigms in a modern, versatile programming language. Results We present our initial results with functional programming in chemistry by first describing an immutable data structure for molecular graphs together with a couple of simple algorithms to calculate basic molecular properties before writing a complete SMILES parser in accordance with the OpenSMILES specification. Along the way we show how to deal with input validation, error handling, bulk operations, and parallelization in a purely functional way. At the end we also analyze and improve our algorithms and data structures in terms of performance and compare it to existing toolkits both object-oriented and purely functional. All code was written in Scala, a modern multi-paradigm programming language with a strong support for functional programming and a highly sophisticated type system. Conclusions We have successfully made the first important steps towards a purely functional chemistry toolkit. The data structures and algorithms presented in this article perform well while at the same time they can be safely used in parallelized applications, such as computer aided drug design experiments, without further adjustments. This stands in contrast to existing object-oriented toolkits where thread safety of data structures and algorithms is a deliberate design decision that can be hard to implement. 
Finally, the level of type-safety achieved by Scala greatly increased the reliability of our code.
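The immutable molecular-graph idea can be sketched with a frozen dataclass. This is a hypothetical Python analogue for illustration only; the real chemf toolkit is written in Scala and its API differs. The atom table, class, and method names are all invented.

```python
from dataclasses import dataclass

# Tiny demo table of atomic masses (illustrative subset).
ATOMIC_MASS = {"H": 1.008, "C": 12.011, "O": 15.999}

@dataclass(frozen=True)
class Molecule:
    """Immutable molecular graph: atoms as a tuple of element symbols,
    bonds as a frozenset of unordered index pairs.  Because every field
    is frozen, instances can be shared across threads without locking."""
    atoms: tuple
    bonds: frozenset

    def molecular_weight(self):
        return round(sum(ATOMIC_MASS[a] for a in self.atoms), 3)

    def degree(self, i):
        """Number of bonds incident on atom i."""
        return sum(1 for b in self.bonds if i in b)

# CO2: carbon at index 0 double-counted here simply as two edges.
co2 = Molecule(("C", "O", "O"),
               frozenset({frozenset({0, 1}), frozenset({0, 2})}))
```

Any "modification" (say, adding an atom) would construct a new `Molecule` rather than mutate this one, which is the property that makes parallel use safe by construction, as the abstract argues.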

  11. Application experiences with the Globus toolkit.

    SciTech Connect

    Brunett, S.

    1998-06-09

    The Globus grid toolkit is a collection of software components designed to support the development of applications for high-performance distributed computing environments, or ''computational grids'' [14]. The Globus toolkit is an implementation of a ''bag of services'' architecture, which provides application and tool developers not with a monolithic system but rather with a set of stand-alone services. Each Globus component provides a basic service, such as authentication, resource allocation, information, communication, fault detection, and remote data access. Different applications and tools can combine these services in different ways to construct ''grid-enabled'' systems. The Globus toolkit has been used to construct the Globus Ubiquitous Supercomputing Testbed, or GUSTO: a large-scale testbed spanning 20 sites and including over 4000 compute nodes for a total compute power of over 2 TFLOPS. Over the past six months, we and others have used this testbed to conduct a variety of application experiments, including multi-user collaborative environments (tele-immersion), computational steering, distributed supercomputing, and high throughput computing. The goal of this paper is to review what has been learned from these experiments regarding the effectiveness of the toolkit approach. To this end, we describe two of the application experiments in detail, noting what worked well and what worked less well. The two applications are a distributed supercomputing application, SF-Express, in which multiple supercomputers are harnessed to perform large distributed interactive simulations; and a tele-immersion application, CAVERNsoft, in which the focus is on connecting multiple people to a distributed simulated world.

  12. Mission Operations and Navigation Toolkit Environment

    NASA Technical Reports Server (NTRS)

    Sunseri, Richard F.; Wu, Hsi-Cheng; Hanna, Robert A.; Mossey, Michael P.; Duncan, Courtney B.; Evans, Scott E.; Evans, James R.; Drain, Theodore R.; Guevara, Michelle M.; Martin Mur, Tomas J.; Attiyah, Ahlam A.

    2009-01-01

    MONTE (Mission Operations and Navigation Toolkit Environment) Release 7.3 is an extensible software system designed to support trajectory and navigation analysis/design for space missions. MONTE is intended to replace the current navigation and trajectory analysis software systems, which, at the time of this reporting, are used by JPL's Navigation and Mission Design section. The software provides an integrated, simplified, and flexible system that can be easily maintained to serve the needs of future missions in need of navigation services.

  13. VIDE: The Void IDentification and Examination toolkit

    NASA Astrophysics Data System (ADS)

    Sutter, P. M.; Lavaux, G.; Hamaus, N.; Pisani, A.; Wandelt, B. D.; Warren, M.; Villaescusa-Navarro, F.; Zivick, P.; Mao, Q.; Thompson, B. B.

    2015-03-01

    We present VIDE, the Void IDentification and Examination toolkit, an open-source Python/C++ code for finding cosmic voids in galaxy redshift surveys and N-body simulations, characterizing their properties, and providing a platform for more detailed analysis. At its core, VIDE uses a substantially enhanced version of ZOBOV (Neyrinck 2008) to calculate a Voronoi tessellation for estimating the density field and performing a watershed transform to construct voids. Additionally, VIDE provides significant functionality for both pre- and post-processing: for example, VIDE can work with volume- or magnitude-limited galaxy samples with arbitrary survey geometries, or dark matter particles or halo catalogs in a variety of common formats. It can also randomly subsample inputs and includes a Halo Occupation Distribution model for constructing mock galaxy populations. VIDE uses the watershed levels to place voids in a hierarchical tree, outputs a summary of void properties in plain ASCII, and provides a Python API to perform many analysis tasks, such as loading and manipulating void catalogs and particle members, filtering, plotting, computing clustering statistics, stacking, comparing catalogs, and fitting density profiles. While centered around ZOBOV, the toolkit is designed to be as modular as possible and accommodate other void finders. VIDE has been in development for several years and has already been used to produce a wealth of results, which we summarize in this work to highlight the capabilities of the toolkit. VIDE is publicly available at
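The Voronoi density-estimation step at the core of the pipeline can be sketched in 2-D. This is a simplified illustration, not VIDE/ZOBOV code: each tracer's density is taken as the inverse area of its Voronoi cell, unbounded edge cells are skipped, and the watershed transform that groups cells into voids is omitted entirely.

```python
import numpy as np
from scipy.spatial import ConvexHull, Voronoi

def voronoi_density(points):
    """Tracer density as 1 / (Voronoi cell volume); in 2-D, scipy's
    ConvexHull.volume is the cell's area.  Cells that touch the unbounded
    exterior (region containing vertex index -1) are left as NaN."""
    vor = Voronoi(points)
    dens = np.full(len(points), np.nan)
    for i, region_index in enumerate(vor.point_region):
        region = vor.regions[region_index]
        if len(region) == 0 or -1 in region:
            continue  # unbounded edge cell: no finite volume
        dens[i] = 1.0 / ConvexHull(vor.vertices[region]).volume
    return dens

# 5x5 unit grid plus one crowding tracer near the centre:
# interior grid cells have unit area, the crowded pair smaller cells.
pts = [(float(x), float(y)) for x in range(5) for y in range(5)] + [(2.1, 2.0)]
dens = voronoi_density(np.array(pts))
```

A watershed transform would then flow each cell toward its lowest-density neighbour, merging basins into voids; the density field above is the input to that stage.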

  14. An Overview of the Geant4 Toolkit

    SciTech Connect

    Apostolakis, John; Wright, Dennis H.

    2007-03-19

    Geant4 is a toolkit for the simulation of the transport of radiation through matter. With a flexible kernel and a choice of physics modeling options, it has been tailored to the requirements of a wide range of applications. With the toolkit a user can describe a setup's or detector's geometry and materials, navigate inside it, simulate the physical interactions using a choice of physics engines, underlying physics cross-sections and models, and visualise and store results. Physics models describing electromagnetic and hadronic interactions are provided, as are decays and processes for optical photons. Several models, with different precision and performance, are available for many processes. The toolkit includes coherent physics model configurations, which are called physics lists. Users can choose an existing physics list or create their own, depending on their requirements and the application area. A clear structure and readable code enable the user to investigate the origin of physics results. Application areas include detector simulation and background simulation in High Energy Physics experiments, simulation of accelerator setups, studies in medical imaging and treatment, and the study of the effects of solar radiation on spacecraft instruments.

  15. An Overview of the GEANT4 Toolkit

    SciTech Connect

    Apostolakis, John; Wright, Dennis H.; /SLAC

    2007-10-05

    Geant4 is a toolkit for the simulation of the transport of radiation through matter. With a flexible kernel and a choice of physics modeling options, it has been tailored to the requirements of a wide range of applications. With the toolkit a user can describe a setup's or detector's geometry and materials, navigate inside it, simulate the physical interactions using a choice of physics engines, underlying physics cross-sections and models, and visualize and store results. Physics models describing electromagnetic and hadronic interactions are provided, as are decays and processes for optical photons. Several models, with different precision and performance, are available for many processes. The toolkit includes coherent physics model configurations, which are called physics lists. Users can choose an existing physics list or create their own, depending on their requirements and the application area. A clear structure and readable code enable the user to investigate the origin of physics results. Application areas include detector simulation and background simulation in High Energy Physics experiments, simulation of accelerator setups, studies in medical imaging and treatment, and the study of the effects of solar radiation on spacecraft instruments.

  16. Capturing Petascale Application Characteristics with the Sequoia Toolkit

    SciTech Connect

    Vetter, Jeffrey S; Bhatia, Nikhil; Grobelny, Eric M; Roth, Philip C

    2005-09-01

    Characterization of the computation, communication, memory, and I/O demands of current scientific applications is crucial for identifying which technologies will enable petascale scientific computing. In this paper, we present the Sequoia Toolkit for characterizing HPC applications. The Sequoia Toolkit consists of the Sequoia trace capture library and the Sequoia Event Analysis Library (SEAL), which facilitates the development of tools for analyzing Sequoia event traces. Using the Sequoia Toolkit, we have characterized the behavior of application runs with up to 2048 application processes. To illustrate the use of the Sequoia Toolkit, we present a preliminary characterization of LAMMPS, a molecular dynamics application of great interest to the computational biology community.
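    The capture/analysis split described above can be sketched as follows. The class and function names are hypothetical stand-ins, not the Sequoia or SEAL APIs: a capture layer appends timestamped events during the run, and a separate analysis layer summarizes the resulting trace.

```python
import json

class TraceCapture:
    """Hypothetical trace-capture layer: records timestamped events
    (e.g. communication or I/O calls) and serializes them for analysis."""
    def __init__(self):
        self.events = []

    def record(self, kind, start, end, **attrs):
        self.events.append({"kind": kind, "start": start, "end": end, **attrs})

    def dump(self):
        return json.dumps(self.events)

def summarize(trace_json):
    """Hypothetical analysis pass: total time spent per event kind."""
    totals = {}
    for ev in json.loads(trace_json):
        totals[ev["kind"]] = totals.get(ev["kind"], 0.0) + ev["end"] - ev["start"]
    return totals
```

    A real trace library must also keep per-process overhead and trace volume low at thousands of ranks, which is the hard part this sketch omits.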

  17. An evaluation capacity building toolkit for principal investigators of undergraduate research experiences: A demonstration of transforming theory into practice.

    PubMed

    Rorrer, Audrey S

    2016-04-01

    This paper describes the approach and process undertaken to develop evaluation capacity among the leaders of a federally funded undergraduate research program. An evaluation toolkit was developed for Computer and Information Sciences and Engineering Research Experiences for Undergraduates (CISE REU) programs to address the ongoing need for evaluation capacity among principal investigators who manage program evaluation. The toolkit was the result of collaboration within the CISE REU community, with the purpose of providing targeted instructional resources and tools for quality program evaluation. Challenges were to balance the desire for standardized assessment with the responsibility to account for individual program contexts. Toolkit contents included instructional materials about evaluation practice, a standardized applicant management tool, and a modulated outcomes measure. Resulting benefits from toolkit deployment were having cost effective, sustainable evaluation tools, a community evaluation forum, and aggregate measurement of key program outcomes for the national program. Lessons learned included the imperative of understanding the evaluation context, engaging stakeholders, and building stakeholder trust. Results from project measures are presented along with a discussion of guidelines for facilitating evaluation capacity building that will serve a variety of contexts. PMID:26788814

  18. Water Security Toolkit User Manual Version 1.2.

    SciTech Connect

    Klise, Katherine A.; Siirola, John Daniel; Hart, David; Hart, William Eugene; Phillips, Cynthia Ann; Haxton, Terranna; Murray, Regan; Janke, Robert; Taxon, Thomas; Laird, Carl; Seth, Arpan; Hackebeil, Gabriel; McGee, Shawn; Mann, Angelica

    2014-08-01

    The Water Security Toolkit (WST) is a suite of open source software tools that can be used by water utilities to create response strategies to reduce the impact of contamination in a water distribution network. WST includes hydraulic and water quality modeling software, optimization methodologies, and visualization tools to identify: (1) sensor locations to detect contamination, (2) locations in the network in which the contamination was introduced, (3) hydrants to remove contaminated water from the distribution system, (4) locations in the network to inject decontamination agents to inactivate, remove, or destroy contaminants, (5) locations in the network to take grab samples to help identify the source of contamination, and (6) valves to close in order to isolate contaminated areas of the network. This user manual describes the different components of WST, along with examples and case studies. License Notice: The Water Security Toolkit (WST) v.1.2, Copyright (c) 2012 Sandia Corporation. Under the terms of Contract DE-AC04-94AL85000, there is a non-exclusive license for use of this work by or on behalf of the U.S. government. This software is distributed under the Revised BSD License (see below).
In addition, WST leverages a variety of third-party software packages, which have separate licensing policies: Acro (Revised BSD License); argparse (Python Software Foundation License); Boost (Boost Software License); Coopr (Revised BSD License); Coverage (BSD License); Distribute (Python Software Foundation License / Zope Public License); EPANET (Public Domain); EPANET-ERD (Revised BSD License); EPANET-MSX (GNU Lesser General Public License (LGPL) v.3); gcovr (Revised BSD License); GRASP (AT&T Commercial License for noncommercial use; includes randomsample and sideconstraints executable files); LZMA SDK (Public Domain); nose (GNU Lesser General Public License (LGPL) v.2.1); ordereddict (MIT License); pip (MIT License); PLY (BSD License); PyEPANET (Revised BSD License); Pyro (MIT License); PyUtilib (Revised BSD License); Py

  19. NGS QC Toolkit: A Toolkit for Quality Control of Next Generation Sequencing Data

    PubMed Central

    Patel, Ravi K.; Jain, Mukesh

    2012-01-01

    Next generation sequencing (NGS) technologies provide a high-throughput means to generate large amount of sequence data. However, quality control (QC) of sequence data generated from these technologies is extremely important for meaningful downstream analysis. Further, highly efficient and fast processing tools are required to handle the large volume of datasets. Here, we have developed an application, NGS QC Toolkit, for quality check and filtering of high-quality data. This toolkit is a standalone and open source application freely available at http://www.nipgr.res.in/ngsqctoolkit.html. All the tools in the application have been implemented in Perl programming language. The toolkit is comprised of user-friendly tools for QC of sequencing data generated using Roche 454 and Illumina platforms, and additional tools to aid QC (sequence format converter and trimming tools) and analysis (statistics tools). A variety of options have been provided to facilitate the QC at user-defined parameters. The toolkit is expected to be very useful for the QC of NGS data to facilitate better downstream analysis. PMID:22312429
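    The core filtering step such a toolkit performs can be sketched as follows (a hedged illustration only; NGS QC Toolkit itself is implemented in Perl): parse FASTQ records and keep reads whose mean Phred quality clears a cutoff, assuming Phred+33 quality encoding.

```python
def parse_fastq(lines):
    """Yield (header, sequence, quality-string) triples from the
    four-line-per-record FASTQ format."""
    it = iter(lines)
    for header in it:
        seq, _, qual = next(it), next(it), next(it)
        yield header.strip(), seq.strip(), qual.strip()

def mean_phred(qual):
    """Mean Phred score of a quality string, assuming Phred+33 encoding."""
    return sum(ord(ch) - 33 for ch in qual) / len(qual)

def filter_reads(lines, min_mean_q=20):
    """Keep only reads whose mean quality meets the cutoff."""
    return [(h, s, q) for h, s, q in parse_fastq(lines)
            if mean_phred(q) >= min_mean_q]
```

    Real QC tools add platform-specific handling (Roche 454 vs. Illumina), adapter trimming, and per-base statistics on top of this basic filter.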

  20. Accelerator physics analysis with an integrated toolkit

    SciTech Connect

    Holt, J.A.; Michelotti, L.; Satogata, T.

    1992-08-01

    Work is in progress on an integrated software toolkit for linear and nonlinear accelerator design, analysis, and simulation. As a first application, the "beamline" and "MXYZPTLK" (differential algebra) class libraries were used with an X Windows graphics library to build a user-friendly, interactive phase space tracker which, additionally, finds periodic orbits. This program was used to analyse a theoretical lattice which contains octupoles and decapoles, to find the 20th-order stable and unstable periodic orbits, and to explore the local phase space structure.
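    The idea of tracking phase space and hunting for periodic orbits can be sketched with a toy one-turn map, a linear rotation plus a cubic, octupole-like kick. This is illustrative Python under stated assumptions, not the C++ beamline/MXYZPTLK libraries; a period-n orbit is a point that returns to itself after n applications of the map.

```python
import math

def one_turn(x, p, mu=0.7, k=1.0):
    """One pass through a toy lattice: rotate in (x, p) phase space by
    phase advance mu, then apply a thin cubic (octupole-like) kick."""
    x, p = (math.cos(mu) * x + math.sin(mu) * p,
            -math.sin(mu) * x + math.cos(mu) * p)
    return x, p + k * x ** 3

def is_periodic(x0, p0, n, tol=1e-9):
    """True if (x0, p0) returns to itself after n turns, i.e. the point
    lies on a period-n orbit of the map."""
    x, p = x0, p0
    for _ in range(n):
        x, p = one_turn(x, p)
    return math.hypot(x - x0, p - p0) < tol
```

    A tracker like the one described scans launch points, iterates the map, and refines near-returning points (e.g. by a Newton search on the return map) to locate high-order periodic orbits.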

  1. Tips from the toolkit: 1 - know yourself.

    PubMed

    Steer, Neville

    2010-01-01

    High performance organisations review their strategy and business processes as part of usual business operations. If you are new to the field of general practice, do you have a career plan for the next 5-10 years? If you are an experienced general practitioner, are you using much the same business model and processes as when you started out? The following article sets out some ideas you might use to have a fresh approach to your professional career. It is based on The Royal Australian College of General Practitioners' 'General practice management toolkit'. PMID:20369141

  2. MCS Large Cluster Systems Software Toolkit

    Energy Science and Technology Software Center (ESTSC)

    2002-11-01

    This package contains a number of systems utilities for managing a set of computers joined in a "cluster". The utilities assist a team of systems administrators in managing the cluster by automating routine tasks, centralizing information, and monitoring individual computers within the cluster. Included in the toolkit are scripts used to boot a computer from a floppy, a program to turn the power to a system on and off, and a system for using a database to organize cluster information.

  3. Graph algorithms in the titan toolkit.

    SciTech Connect

    McLendon, William Clarence, III; Wylie, Brian Neil

    2009-10-01

    Graph algorithms are a key component in a wide variety of intelligence analysis activities. The Graph-Based Informatics for Non-Proliferation and Counter-Terrorism project addresses the critical need of making these graph algorithms accessible to Sandia analysts in a manner that is both intuitive and effective. Specifically we describe the design and implementation of an open source toolkit for doing graph analysis, informatics, and visualization that provides Sandia with novel analysis capability for non-proliferation and counter-terrorism.
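    The abstract stays at a high level; as a hedged illustration of the kind of graph algorithm such a toolkit makes accessible to analysts, here is connected-component labeling by breadth-first search (generic Python, not the Titan API).

```python
from collections import deque

def connected_components(adjacency):
    """adjacency: {node: iterable of neighbors}. Returns a list of
    components, each a list of nodes reachable from one another."""
    seen, components = set(), []
    for start in adjacency:
        if start in seen:
            continue
        comp, queue = [], deque([start])
        seen.add(start)
        while queue:                      # standard BFS from this start node
            node = queue.popleft()
            comp.append(node)
            for nb in adjacency[node]:
                if nb not in seen:
                    seen.add(nb)
                    queue.append(nb)
        components.append(comp)
    return components
```

    In an analyst-facing toolkit the same traversal underlies reachability queries ("what is connected to this entity?"), which is the intuition such tools expose graphically.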

  4. An Incident Management Preparedness and Coordination Toolkit

    SciTech Connect

    Koch, Daniel B; Payne, Patricia W

    2012-01-01

    Although the use of Geographic Information Systems (GIS) by centrally-located operations staff is well established in the area of emergency response, utilization by first responders in the field is uneven. Cost, complexity, and connectivity are often the deciding factors preventing wider adoption. For the past several years, Oak Ridge National Laboratory (ORNL) has been developing a mobile GIS solution using free and open-source software targeting the needs of front-line personnel. Termed IMPACT, for Incident Management Preparedness and Coordination Toolkit, this ORNL application can complement existing GIS infrastructure and extend its power and capabilities to responders first on the scene of a natural or man-made disaster.

  5. Veterinary Immunology Committee Toolkit Workshop 2010: Progress and plans

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The Third Veterinary Immunology Committee (VIC) Toolkit Workshop took place at the Ninth International Veterinary Immunology Symposium (IVIS) in Tokyo, Japan on August 18, 2010. The Workshop built on previous Toolkit Workshops and covered various aspects of reagent development, commercialisation an...

  6. 77 FR 73022 - U.S. Environmental Solutions Toolkit

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-07

    ... International Trade Administration U.S. Environmental Solutions Toolkit AGENCY: International Trade... web- based U.S. Environmental Solutions Toolkit to be used by foreign environmental officials and foreign end-users of environmental technologies that will outline U.S. approaches to a series...

  7. 77 FR 73023 - U.S. Environmental Solutions Toolkit

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-07

    ... International Trade Administration U.S. Environmental Solutions Toolkit AGENCY: International Trade.... Environmental Solutions Toolkit to be used by foreign environmental officials and foreign end-users of environmental technologies that will outline U.S. ] approaches to a series of environmental problems...

  8. Quality Assurance Toolkit for Distance Higher Education Institutions and Programmes

    ERIC Educational Resources Information Center

    Rama, Kondapalli, Ed.; Hope, Andrea, Ed.

    2009-01-01

    The Commonwealth of Learning is proud to partner with the Sri Lankan Ministry of Higher Education and UNESCO to produce this "Quality Assurance Toolkit for Distance Higher Education Institutions and Programmes". The Toolkit has been prepared with three features. First, it is a generic document on quality assurance, complete with a glossary of…

  9. The Best Ever Alarm System Toolkit

    SciTech Connect

    Kasemir, Kay; Chen, Xihui; Danilova, Katia

    2009-01-01

    Learning from our experience with the standard Experimental Physics and Industrial Control System (EPICS) alarm handler (ALH), as well as a similar intermediate approach based on script-generated operator screens, we developed the Best Ever Alarm System Toolkit (BEAST). It is based on Java and Eclipse on the Control System Studio (CSS) platform, using a relational database (RDB) to store the configuration and log actions. It employs a Java Message Service (JMS) for communication between the modular pieces of the toolkit, which include an Alarm Server to maintain the current alarm state, an arbitrary number of Alarm Client user interfaces (GUI), and tools to annunciate alarms or log alarm-related actions. Web reports allow us to monitor the alarm system performance and spot deficiencies in the alarm configuration. The Alarm Client GUI not only gives the end users various ways to view alarms in tree and table form, but also makes it easy to access the guidance information, the related operator displays, and other CSS tools. It also allows the configuration to be modified online from the GUI. Coupled with a good "alarm philosophy" on how to provide useful alarms, we can finally improve the configuration to achieve an effective alarm system.
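    The Alarm Server's job of maintaining current alarm state can be sketched as a small latching state machine (hypothetical Python, not the BEAST API): an alarm that has fired stays visible after the underlying condition clears, until an operator acknowledges it.

```python
class AlarmState:
    """Latching alarm state for one monitored signal (illustrative only)."""
    def __init__(self):
        self.active = False    # is the underlying condition bad right now?
        self.latched = False   # has an unacknowledged alarm occurred?

    def update(self, in_alarm):
        self.active = in_alarm
        if in_alarm:
            self.latched = True   # latch: the alarm stays visible

    def acknowledge(self):
        if not self.active:       # only fully clears once the condition is gone
            self.latched = False

    @property
    def severity(self):
        if self.active:
            return "ALARM"
        if self.latched:
            return "CLEARED-UNACKED"
        return "OK"
```

    Latching is what guarantees operators see transient alarms; a server maintains one such state per alarm point and publishes changes to the client GUIs.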

  10. MIS: A Miriad Interferometry Singledish Toolkit

    NASA Astrophysics Data System (ADS)

    Pound, Marc; Teuben, Peter

    2011-10-01

    MIS is a pipeline toolkit using the package MIRIAD to combine Interferometric and Single Dish data. This was prompted by our observations made with the Combined Array For Research in Millimeter-wave Astronomy (CARMA) interferometer of the star-forming region NGC 1333, a large survey highlighting the new 23-element and singledish observing modes. The project consists of 20 CARMA datasets each containing interferometric as well as simultaneously obtained single dish data, for 3 molecular spectral lines and continuum, in 527 different pointings, covering an area of about 8 by 11 arcminutes. A small group of collaborators then shared this toolkit and their parameters via CVS, and scripts were developed to ensure uniform data reduction across the group. The pipeline was run end-to-end each night that new observations were obtained, producing maps that contained all the data to date. This approach could serve as a model for repeated calibration and mapping of large mixed-mode correlation datasets from ALMA.

  11. MIS: A MIRIAD Interferometry Singledish Toolkit

    NASA Astrophysics Data System (ADS)

    Pound, M. W.; Teuben, P.

    2012-09-01

    Building on the “drPACS” contribution at ADASS XX of a simple Unix pipeline infrastructure, we implemented a pipeline toolkit using the package MIRIAD to combine Interferometric and Single Dish data (MIS). This was prompted by our observations made with the Combined Array For Research in Millimeter-wave Astronomy (CARMA) interferometer of the star-forming region NGC 1333, a large survey highlighting the new 23-element and singledish observing modes. The project consists of 20 CARMA datasets each containing interferometric as well as simultaneously obtained single dish data, for 3 molecular spectral lines and continuum, in 527 different pointings, covering an area of about 8 by 11 arcminutes. A small group of collaborators then shared this toolkit and their parameters via CVS, and scripts were developed to ensure uniform data reduction across the group. The pipeline was run end-to-end each night as new observations were obtained, producing maps that contained all the data to date. We will show examples of the scripts and data products. This approach could serve as a model for repeated calibration and mapping of large mixed-mode correlation datasets from ALMA.

  12. ADMIT: The ALMA Data Mining Toolkit

    NASA Astrophysics Data System (ADS)

    Teuben, P.; Pound, M.; Mundy, L.; Rauch, K.; Friedel, D.; Looney, L.; Xu, L.; Kern, J.

    2015-09-01

    ADMIT (ALMA Data Mining ToolkiT), a toolkit for the creation of new science products from ALMA data, is being developed as an ALMA Development Project. It is written in Python and, while specifically targeted for a uniform analysis of the ALMA science products that come out of the ALMA pipeline, it is designed to be generally applicable to (radio) astronomical data. It first provides users with a detailed view of their science products created by ADMIT inside the ALMA pipeline: line identifications, line 'cutout' cubes, moment maps, emission type analysis (e.g., feature detection). Using descriptor vectors the ALMA data archive is enriched with useful information to make archive data mining possible. Users can also opt to download the (small) ADMIT pipeline product, then fine-tune and re-run the pipeline and inspect their hopefully improved data. By running many projects in a parallel fashion, data mining between many astronomical sources and line transitions will also be possible. Future implementations of ADMIT may include EVLA and other instruments.
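    Moment maps, one of the science products mentioned above, collapse a spectral cube to 2D images: moment 0 is the integrated intensity along the spectral axis, and moment 1 the intensity-weighted mean velocity. A minimal sketch in plain Python, illustrating the computation only and not ADMIT's API:

```python
def moments(cube, velocities):
    """cube[k][y][x]: intensity in spectral channel k at pixel (y, x);
    velocities[k]: velocity of channel k. Returns (mom0, mom1) maps."""
    nchan = len(cube)
    ny, nx = len(cube[0]), len(cube[0][0])
    # Moment 0: sum of intensity over channels (integrated intensity).
    mom0 = [[sum(cube[k][y][x] for k in range(nchan))
             for x in range(nx)] for y in range(ny)]
    # Moment 1: intensity-weighted mean velocity (0 where there is no signal).
    mom1 = [[(sum(cube[k][y][x] * velocities[k] for k in range(nchan))
              / mom0[y][x]) if mom0[y][x] else 0.0
             for x in range(nx)] for y in range(ny)]
    return mom0, mom1
```

    Production code would also mask noise below a threshold before computing moment 1, since the weighted mean is unstable where the summed intensity is near zero.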

  13. UTGB toolkit for personalized genome browsers

    PubMed Central

    Saito, Taro L.; Yoshimura, Jun; Sasaki, Shin; Ahsan, Budrul; Sasaki, Atsushi; Kuroshu, Reginaldo; Morishita, Shinichi

    2009-01-01

    The advent of high-throughput DNA sequencers has increased the pace of collecting enormous amounts of genomic information, yielding billions of nucleotides on a weekly basis. This advance represents an improvement of two orders of magnitude over traditional Sanger sequencers in terms of the number of nucleotides per unit time, allowing even small groups of researchers to obtain huge volumes of genomic data over a fairly short period. Consequently, a pressing need exists for the development of personalized genome browsers for analyzing these immense amounts of locally stored data. The UTGB (University of Tokyo Genome Browser) Toolkit is designed to meet three major requirements for personalization of genome browsers: easy installation of the system with minimum effort, browsing locally stored data and rapid interactive design of web interfaces tailored to individual needs. The UTGB Toolkit is licensed under an open source license. Availability: The software is freely available at http://utgenome.org/. Contact: moris@cb.k.u-tokyo.ac.jp PMID:19497937

  14. Construction aggregates

    USGS Publications Warehouse

    Tepordei, V.V.

    1993-01-01

    Part of a special section on the market performance of industrial minerals in 1992. Production of construction aggregates increased by 4.6 percent in 1992. This increase was due, in part, to the increased funding for transportation and infrastructure projects. The U.S. produced about 1.05 Gt of crushed stone and an estimated 734 Mt of construction sand and gravel in 1992. Demand is expected to increase by about 5 percent in 1993.

  15. Construction aggregates

    USGS Publications Warehouse

    Tepordei, V.V.

    1996-01-01

    Part of the Annual Commodities Review 1995. Production of construction aggregates such as crushed stone and construction sand and gravel showed a marginal increase in 1995. Most of the 1995 increases were due to funding for highway construction work. The major areas of concern to the industry included issues relating to wetlands classification and the classification of crystalline silica as a probable human carcinogen. Despite this, an increase in demand is anticipated for 1996.

  16. Construction aggregates

    USGS Publications Warehouse

    Nelson, T.I.; Bolen, W.P.

    2007-01-01

    Construction aggregates, primarily stone, sand and gravel, are recovered from widespread naturally occurring mineral deposits and processed for use primarily in the construction industry. They are mined, crushed, sorted by size and sold loose or combined with portland cement or asphaltic cement to make concrete products to build roads, houses, buildings, and other structures. Much smaller quantities are used in agriculture, cement manufacture, chemical and metallurgical processes, glass production and many other products.

  17. The Image-Guided Surgery ToolKit IGSTK: an open source C++ software toolkit

    NASA Astrophysics Data System (ADS)

    Cheng, Peng; Ibanez, Luis; Gobbi, David; Gary, Kevin; Aylward, Stephen; Jomier, Julien; Enquobahrie, Andinet; Zhang, Hui; Kim, Hee-su; Blake, M. Brian; Cleary, Kevin

    2007-03-01

    The Image-Guided Surgery Toolkit (IGSTK) is an open source C++ software library that provides the basic components needed to develop image-guided surgery applications. The focus of the toolkit is on robustness using a state machine architecture. This paper presents an overview of the project based on a recent book which can be downloaded from igstk.org. The paper includes an introduction to open source projects, a discussion of our software development process and the best practices that were developed, and an overview of requirements. The paper also presents the architecture framework and main components. This presentation is followed by a discussion of the state machine model that was incorporated and the associated rationale. The paper concludes with an example application.

  18. Construction aggregates

    USGS Publications Warehouse

    Bolen, W.P.; Tepordei, V.V.

    2001-01-01

    The estimated production during 2000 of construction aggregates, crushed stone, and construction sand and gravel increased by about 2.6% to 2.7 Gt (3 billion st), compared with 1999. The expansion that started in 1992 continued with record production levels for the ninth consecutive year. By commodity, construction sand and gravel production increased by 4.5% to 1.16 Gt (1.28 billion st), while crushed stone production increased by 1.3% to 1.56 Gt (1.72 billion st).

  19. The image-guided surgery toolkit IGSTK: an open source C++ software toolkit.

    PubMed

    Enquobahrie, Andinet; Cheng, Patrick; Gary, Kevin; Ibanez, Luis; Gobbi, David; Lindseth, Frank; Yaniv, Ziv; Aylward, Stephen; Jomier, Julien; Cleary, Kevin

    2007-11-01

    This paper presents an overview of the image-guided surgery toolkit (IGSTK). IGSTK is an open source C++ software library that provides the basic components needed to develop image-guided surgery applications. It is intended for fast prototyping and development of image-guided surgery applications. The toolkit was developed through a collaboration between academic and industry partners. Because IGSTK was designed for safety-critical applications, the development team has adopted lightweight software processes that emphasize safety and robustness while, at the same time, supporting geographically separated developers. A software process that is philosophically similar to agile software methods was adopted, emphasizing iterative, incremental, and test-driven development principles. The guiding principle in the architecture design of IGSTK is patient safety. The IGSTK team implemented a component-based architecture and used state machine software design methodologies to improve the reliability and safety of the components. Every IGSTK component has a well-defined set of features that are governed by state machines. The state machine ensures that the component is always in a valid state and that all state transitions are valid and meaningful. Realizing that the continued success and viability of an open source toolkit depends on a strong user community, the IGSTK team is following several key strategies to build an active user community. These include maintaining a users' and developers' mailing list, providing documentation (application programming interface reference document and book), presenting demonstration applications, and delivering tutorial sessions at relevant scientific conferences. PMID:17703338
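    The state-machine discipline described can be sketched as follows (illustrative Python; IGSTK itself is C++, and the tracker states and events below are hypothetical): transitions are declared up front, and any request that is not a declared transition is rejected rather than putting the component into an undefined state.

```python
class StateMachine:
    """Component governed by an explicit transition table (illustrative)."""
    def __init__(self, initial, transitions):
        self.state = initial
        self.transitions = transitions   # {(state, event): next_state}

    def process(self, event):
        key = (self.state, event)
        if key not in self.transitions:
            return False                 # invalid request: state unchanged
        self.state = self.transitions[key]
        return True

# A hypothetical tracking-device component: every reachable state and
# every legal transition is enumerated; anything else is rejected.
tracker = StateMachine("Idle", {
    ("Idle", "Open"): "Communicating",
    ("Communicating", "StartTracking"): "Tracking",
    ("Tracking", "StopTracking"): "Communicating",
    ("Communicating", "Close"): "Idle",
})
```

    Rejecting undeclared transitions, instead of raising or silently proceeding, is what keeps a safety-critical component in a known, valid state at all times.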

  20. Monitoring Extreme-scale Lustre Toolkit

    SciTech Connect

    Brim, Michael J; Lothian, Josh

    2015-01-01

    We discuss the design and ongoing development of the Monitoring Extreme-scale Lustre Toolkit (MELT), a unified Lustre performance monitoring and analysis infrastructure that provides continuous, low-overhead summary information on the health and performance of Lustre, as well as on-demand, in-depth problem diagnosis and root-cause analysis. The MELT infrastructure leverages a distributed overlay network to enable monitoring of center-wide Lustre filesystems where clients are located across many network domains. We preview interactive command-line utilities that help administrators and users to observe Lustre performance at various levels of resolution, from individual servers or clients to whole filesystems, including job-level reporting. Finally, we discuss our future plans for automating the root-cause analysis of common Lustre performance problems.

  1. Introduction to the Geant4 Simulation toolkit

    SciTech Connect

    Guatelli, S.; Cutajar, D.; Rosenfeld, A. B.; Oborn, B.

    2011-05-05

    Geant4 is a Monte Carlo simulation toolkit describing the interactions of particles with matter. Geant4 is widely used in radiation physics research, from High Energy Physics to medical physics and space science, thanks to its sophisticated physics component, coupled with advanced functionality in geometry description. Geant4 is widely used at the Centre for Medical Radiation Physics (CMRP), at the University of Wollongong, to characterise and optimise novel detector concepts, radiotherapy treatments, and imaging solutions. This lecture consists of an introduction to the Monte Carlo method and to Geant4. Particular attention will be devoted to the Geant4 physics component, and to the physics models describing electromagnetic and hadronic physics interactions. The second part of the lecture focuses on the methodology for developing a Geant4 simulation application.

  2. TEVA-SPOT Toolkit 1.2

    Energy Science and Technology Software Center (ESTSC)

    2007-07-26

    The TEVA-SPOT Toolkit (SPOT) supports the design of contaminant warning systems (CWSs) that use real-time sensors to detect contaminants in municipal water distribution networks. Specifically, SPOT provides the capability to select the locations for installing sensors in order to maximize the utility and effectiveness of the CWS. SPOT models the sensor placement process as an optimization problem, and the user can specify a wide range of performance objectives for contaminant warning system design, including population health effects, time to detection, extent of contamination, volume consumed, and number of failed detections. For example, a SPOT user can integrate expert knowledge during the design process by specifying required sensor placements or designating network locations as forbidden. Further, cost considerations can be integrated by limiting the design with user-specified installation costs at each location.
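    Sensor placement as an optimization problem can be sketched with a greedy coverage heuristic, a hedged illustration only (SPOT's actual solvers and objectives, such as health effects and time to detection, are richer): each candidate location detects some set of contamination scenarios, and we repeatedly pick the location that detects the most still-undetected scenarios.

```python
def greedy_placement(detects, budget):
    """detects: {location: set of contamination-scenario ids that a sensor
    at that location would detect}. Greedily choose up to `budget`
    locations maximizing the number of scenarios covered."""
    chosen, covered = [], set()
    for _ in range(budget):
        best = max(detects, key=lambda loc: len(detects[loc] - covered))
        if not detects[best] - covered:
            break                     # no remaining marginal benefit
        chosen.append(best)
        covered |= detects[best]
    return chosen, covered
```

    The greedy rule carries a classical approximation guarantee for coverage-type objectives, which is one reason it is a common baseline for sensor-placement problems; hard constraints like required or forbidden locations can be applied by pre-filtering the candidate set.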

  3. Introduction to the Geant4 Simulation toolkit

    NASA Astrophysics Data System (ADS)

    Guatelli, S.; Cutajar, D.; Oborn, B.; Rosenfeld, A. B.

    2011-05-01

    Geant4 is a Monte Carlo simulation Toolkit, describing the interactions of particles with matter. Geant4 is widely used in radiation physics research, from High Energy Physics, to medical physics and space science, thanks to its sophisticated physics component, coupled with advanced functionality in geometry description. Geant4 is widely used at the Centre for Medical Radiation Physics (CMRP), at the University of Wollongong, to characterise and optimise novel detector concepts, radiotherapy treatments, and imaging solutions. This lecture consists of an introduction to Monte Carlo method, and to Geant4. Particular attention will be devoted to the Geant4 physics component, and to the physics models describing electromagnetic and hadronic physics interactions. The second part of the lecture will be focused on the methodology to adopt to develop a Geant4 simulation application.

  4. Data Exploration Toolkit for serial diffraction experiments

    DOE PAGESBeta

    Zeldin, Oliver B.; Brewster, Aaron S.; Hattne, Johan; Uervirojnangkoorn, Monarin; Lyubimov, Artem Y.; Zhou, Qiangjun; Zhao, Minglei; Weis, William I.; Sauter, Nicholas K.; Brunger, Axel T.

    2015-01-23

    Ultrafast diffraction at X-ray free-electron lasers (XFELs) has the potential to yield new insights into important biological systems that produce radiation-sensitive crystals. An unavoidable feature of the 'diffraction before destruction' nature of these experiments is that images are obtained from many distinct crystals and/or different regions of the same crystal. Combined with other sources of XFEL shot-to-shot variation, this introduces significant heterogeneity into the diffraction data, complicating processing and interpretation. To enable researchers to get the most from their collected data, a toolkit is presented that provides insights into the quality of, and the variation present in, serial crystallography data sets. These tools operate on the unmerged, partial intensity integration results from many individual crystals, and can be used on two levels: firstly to guide the experimental strategy during data collection, and secondly to help users make informed choices during data processing.
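    One simple statistic over unmerged data of the kind described can be sketched as follows (a hypothetical helper, not the toolkit's actual API): group observations by Miller index across crystals and report multiplicity, mean intensity, and the coefficient of variation as a rough handle on shot-to-shot heterogeneity.

```python
from statistics import mean, pstdev

def reflection_stats(observations):
    """observations: list of (hkl, crystal_id, intensity) tuples from
    unmerged, partially integrated data. Returns per-reflection
    multiplicity, mean intensity, and coefficient of variation."""
    groups = {}
    for hkl, _crystal, intensity in observations:
        groups.setdefault(hkl, []).append(intensity)
    stats = {}
    for hkl, vals in groups.items():
        m = mean(vals)
        stats[hkl] = {
            "multiplicity": len(vals),
            "mean": m,
            # CV flags reflections with large crystal-to-crystal spread.
            "cv": pstdev(vals) / m if m else float("inf"),
        }
    return stats
```

    Run during collection, statistics like these indicate whether multiplicity is still growing usefully; during processing, they help spot outlier crystals before merging.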

  5. NBII-SAIN Data Management Toolkit

    USGS Publications Warehouse

    Burley, Thomas E.; Peine, John D.

    2009-01-01

    percent of the cost of a spatial information system is associated with spatial data collection and management (U.S. General Accounting Office, 2003). These figures indicate that the resources (time, personnel, money) of many agencies and organizations could be used more efficiently and effectively. Dedicated and conscientious data management coordination and documentation is critical for reducing such redundancy. Substantial cost savings and increased efficiency are direct results of a pro-active data management approach. In addition, details of projects as well as data and information are frequently lost as a result of real-world occurrences such as the passing of time, job turnover, and equipment changes and failure. A standardized, well documented database allows resource managers to identify issues, analyze options, and ultimately make better decisions in the context of adaptive management (National Land and Water Resources Audit and the Australia New Zealand Land Information Council on behalf of the Australian National Government, 2003). Many environmentally focused, scientific, or natural resource management organizations collect and create both spatial and non-spatial data in some form. Data management appropriate for those data will be contingent upon the project goal(s) and objectives and thus will vary on a case-by-case basis. This project and the resulting Data Management Toolkit, hereafter referred to as the Toolkit, is therefore not intended to be comprehensive in terms of addressing all of the data management needs of all projects that contain biological, geospatial, and other types of data. The Toolkit emphasizes the idea of connecting a project's data and the related management needs to the defined project goals and objectives from the outset. In that context, the Toolkit presents and describes the fundamental components of sound data and information management that are common to projects involving biological, geospatial, and other related data

  6. Data Exploration Toolkit for serial diffraction experiments

    PubMed Central

    Zeldin, Oliver B.; Brewster, Aaron S.; Hattne, Johan; Uervirojnangkoorn, Monarin; Lyubimov, Artem Y.; Zhou, Qiangjun; Zhao, Minglei; Weis, William I.; Sauter, Nicholas K.; Brunger, Axel T.

    2015-01-01

    Ultrafast diffraction at X-ray free-electron lasers (XFELs) has the potential to yield new insights into important biological systems that produce radiation-sensitive crystals. An unavoidable feature of the ‘diffraction before destruction’ nature of these experiments is that images are obtained from many distinct crystals and/or different regions of the same crystal. Combined with other sources of XFEL shot-to-shot variation, this introduces significant heterogeneity into the diffraction data, complicating processing and interpretation. To enable researchers to get the most from their collected data, a toolkit is presented that provides insights into the quality of, and the variation present in, serial crystallography data sets. These tools operate on the unmerged, partial intensity integration results from many individual crystals, and can be used on two levels: firstly to guide the experimental strategy during data collection, and secondly to help users make informed choices during data processing. PMID:25664746

  7. A Perl toolkit for LIMS development

    PubMed Central

    Morris, James A; Gayther, Simon A; Jacobs, Ian J; Jones, Christopher

    2008-01-01

    Background High throughput laboratory techniques generate huge quantities of scientific data. Laboratory Information Management Systems (LIMS) are a necessary requirement, dealing with sample tracking, data storage and data reporting. Commercial LIMS solutions are available, but these can be both costly and overly complex for the task. The development of bespoke LIMS solutions offers a number of advantages, including the flexibility to fulfil all of a laboratory's requirements at a fraction of the price of a commercial system. The programming language Perl is well suited to LIMS development because of its powerful yet simple database and web interaction; it is also well known for enabling rapid application development and deployment, and boasts a very active and helpful developer community. However, developing an in-house LIMS from scratch can take considerable time and resources, so programming tools that enable the rapid development of LIMS applications are essential, yet none currently exist for Perl. Results We have developed ArrayPipeline, a Perl toolkit providing object oriented methods that facilitate the rapid development of bespoke LIMS applications. The toolkit includes Perl objects that encapsulate key components of a LIMS, providing methods for creating interactive web pages, interacting with databases, error tracking and reporting, and user and session management. The MT_Plate object provides methods for manipulation and management of microtitre plates, while a given LIMS can be encapsulated by extension of the core modules, providing system specific methods for database interaction and web page management. Conclusion This important addition to the Perl developer's library will make the development of in-house LIMS applications quicker and easier, encouraging laboratories to create bespoke LIMS applications to meet their specific data management requirements. PMID:18353174

  8. Handheld access to radiology teaching files: an automated system for format conversion and content creation

    NASA Astrophysics Data System (ADS)

    Raman, Raghav; Raman, Lalithakala; Raman, Bhargav; Gold, Garry; Beaulieu, Christopher F.

    2002-05-01

    Current handheld Personal Digital Assistants (PDAs) can be used to view radiology teaching files. We have developed a toolkit that allows rapid creation of radiology teaching files in handheld formats from existing repositories. Our toolkit incorporated a desktop converter, a web conversion server and an application programming interface (API). Our API was integrated with an existing pilot teaching file database. We evaluated our system by obtaining test DICOM and JPEG images from our PACS system, our pilot database and from personal collections and converting them on a Windows workstation (Microsoft, Redmond, WA) and on other platforms using the web server. Our toolkit anonymized, annotated and categorized images using DICOM header information and data entered by the authors. Image processing was automatically customized for the target handheld device. We used freeware handheld image viewers as well as our custom applications that allowed window/level manipulation and viewing of additional textual information. Our toolkit provides desktop and web access to image conversion tools to produce organized handheld teaching file packages for most handheld devices, and our API allows existing teaching file databases to incorporate handheld compatibility. The distribution of radiology teaching files on PDAs can increase the accessibility of radiology teaching.

  9. The Galley Parallel File System

    NASA Technical Reports Server (NTRS)

    Nieuwejaar, Nils; Kotz, David

    1996-01-01

    Most current multiprocessor file systems are designed to use multiple disks in parallel, using the high aggregate bandwidth to meet the growing I/O requirements of parallel scientific applications. Many multiprocessor file systems provide applications with a conventional Unix-like interface, allowing the application to access multiple disks transparently. This interface conceals the parallelism within the file system, increasing the ease of programmability, but making it difficult or impossible for sophisticated programmers and libraries to use knowledge about their I/O needs to exploit that parallelism. In addition to providing an insufficient interface, most current multiprocessor file systems are optimized for a different workload than they are being asked to support. We introduce Galley, a new parallel file system that is intended to efficiently support realistic scientific multiprocessor workloads. We discuss Galley's file structure and application interface, as well as the performance advantages offered by that interface.
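
The declustering idea described above, a single logical file striped across multiple disks to obtain aggregate bandwidth, can be sketched as a toy model. The `stripe`/`unstripe` helpers below are hypothetical illustrations, not Galley's actual interface:

```python
from itertools import chain, zip_longest

def stripe(data: bytes, n_disks: int, stripe_unit: int):
    """Distribute a byte string over n_disks lists of stripe units, round-robin."""
    disks = [[] for _ in range(n_disks)]
    for i in range(0, len(data), stripe_unit):
        disks[(i // stripe_unit) % n_disks].append(data[i:i + stripe_unit])
    return disks

def unstripe(disks):
    """Reassemble the original byte string by reading units round-robin."""
    return b"".join(u for u in chain.from_iterable(zip_longest(*disks)) if u)

data = bytes(range(100))
disks = stripe(data, n_disks=4, stripe_unit=8)
```

A real parallel file system would issue the per-disk reads concurrently; the round-robin layout is what lets independent disks serve one logical file in parallel.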

  10. General relativistic magneto-hydrodynamics with the Einstein Toolkit

    NASA Astrophysics Data System (ADS)

    Moesta, Philipp; Mundim, Bruno; Faber, Joshua; Noble, Scott; Bode, Tanja; Haas, Roland; Loeffler, Frank; Ott, Christian; Reisswig, Christian; Schnetter, Erik

    2013-04-01

    The Einstein Toolkit Consortium is developing and supporting open software for relativistic astrophysics. Its aim is to provide the core computational tools that can enable new science, broaden our community, facilitate interdisciplinary research and take advantage of petascale computers and advanced cyberinfrastructure. The Einstein Toolkit currently consists of an open set of over 100 modules for the Cactus framework, primarily for computational relativity along with associated tools for simulation management and visualization. The toolkit includes solvers for vacuum spacetimes as well as relativistic magneto-hydrodynamics. This talk will present the current capabilities of the Einstein Toolkit, with a particular focus on recent improvements made to the general relativistic magneto-hydrodynamics modeling, and will point to information on how to leverage it for future research.

  11. Charon Message-Passing Toolkit for Scientific Computations

    NASA Technical Reports Server (NTRS)

    VanderWijngaart, Rob F.; Saini, Subhash (Technical Monitor)

    1998-01-01

    The Charon toolkit for piecemeal development of high-efficiency parallel programs for scientific computing is described. The portable toolkit, callable from C and Fortran, provides flexible domain decompositions and high-level distributed constructs for easy translation of serial legacy code or design to distributed environments. Gradual tuning can subsequently be applied to obtain high performance, possibly by using explicit message passing. Charon also features general structured communications that support stencil-based computations with complex recurrences. Through the separation of partitioning and distribution, the toolkit can also be used for blocking of uni-processor code, and for debugging of parallel algorithms on serial machines. An elaborate review of recent parallelization aids is presented to highlight the need for a toolkit like Charon. Some performance results of parallelizing the NAS Parallel Benchmark SP program using Charon are given, showing good scalability.
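
The separation of partitioning and distribution that Charon provides starts from a block domain decomposition. A minimal sketch of such a decomposition (plain Python with a hypothetical `block_partition` helper, not Charon's C/Fortran API):

```python
def block_partition(n, p):
    """Split n grid points over p processes into contiguous blocks whose
    sizes differ by at most one (the remainder goes to the leading ranks)."""
    base, rem = divmod(n, p)
    bounds, start = [], 0
    for rank in range(p):
        size = base + (1 if rank < rem else 0)
        bounds.append((start, start + size))
        start += size
    return bounds

# Example: a 10-point 1-D domain over 3 "processes".
parts = block_partition(10, 3)
```

Stencil-based computations then exchange a halo of boundary points between neighbouring blocks, which is the kind of structured communication such a toolkit automates.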

  12. Charon Message-Passing Toolkit for Scientific Computations

    NASA Technical Reports Server (NTRS)

    VanderWijngaart, Rob F.; Saini, Subhash (Technical Monitor)

    1998-01-01

    The Charon toolkit for piecemeal development of high-efficiency parallel programs for scientific computing is described. The portable toolkit, callable from C and Fortran, provides flexible domain decompositions and high-level distributed constructs for easy translation of serial legacy code or design to distributed environments. Gradual tuning can subsequently be applied to obtain high performance, possibly by using explicit message passing. Charon also features general structured communications that support stencil-based computations with complex recurrences. Through the separation of partitioning and distribution, the toolkit can also be used for blocking of uni-processor code, and for debugging of parallel algorithms on serial machines. An elaborate review of recent parallelization aids is presented to highlight the need for a toolkit like Charon. Some performance results of parallelizing the NAS Parallel Benchmark SP program using Charon are given, showing good scalability.

  13. The Radar Software Toolkit: Analysis software for the ITM community

    NASA Astrophysics Data System (ADS)

    Barnes, R. J.; Greenwald, R.

    2005-05-01

    The Radar Software Toolkit is a collection of data analysis, modelling and visualization tools originally developed for the SuperDARN project. It has evolved over the years into a robust, multi-platform software toolkit for working with a variety of ITM data sets, including data from the Polar, TIMED and ACE spacecraft, ground based magnetometers, incoherent scatter radars, and SuperDARN. The toolkit includes implementations of the Altitude Adjusted Corrected Geomagnetic coordinate system (AACGM), the International Geomagnetic Reference Field (IGRF), SGP4 and a set of coordinate transform functions. It also includes a sophisticated XML based data visualization system. The toolkit is written using a combination of ANSI C, Java and the Interactive Data Language (IDL) and has been tested on a variety of platforms.

  14. Toolkit for evaluating impacts of public participation in scientific research

    NASA Astrophysics Data System (ADS)

    Bonney, R.; Phillips, T.

    2011-12-01

    The Toolkit for Evaluating Impacts of Public Participation in Scientific Research is being developed to meet a major need in the field of visitor studies: to provide project developers and other professionals, especially those with limited knowledge or understanding of evaluation techniques, with a systematic method for assessing project impact that facilitates longitudinal and cross-project comparisons. The need for the toolkit was first identified at the Citizen Science workshop held at the Cornell Lab of Ornithology in 2007 (McEver et al. 2007) and reaffirmed by a CAISE inquiry group that produced the recent report "Public Participation in Scientific Research: Defining the Field and Assessing its Potential for Informal Science Education" (Bonney et al. 2009). This presentation will introduce the Toolkit, show how it is intended to be used, and describe ways that project directors can use their programmatic goals together with the toolkit materials to outline a plan for evaluating the impacts of their projects.

  15. Integrating legacy software toolkits into China-VO system

    NASA Astrophysics Data System (ADS)

    Wang, Xiao-Qian; Cui, Chen-Zhou; Zhao, Yong-Heng

    2005-12-01

    Virtual Observatory (VO) is a collection of data archives and software toolkits. It aims to provide astronomers with research resources through uniform interfaces, using advanced information technologies. In this article, we first discuss the necessity and feasibility of integrating legacy software toolkits into the China-VO system, and then analyse the appropriate granularity of integration. Three general integration methods are given in detail. Finally, we introduce the integration of ImageMagick, a software package for image processing, and discuss further aspects of VO integration.

  16. The Trick Simulation Toolkit: A NASA/Opensource Framework for Running Time Based Physics Models

    NASA Technical Reports Server (NTRS)

    Penn, John M.

    2016-01-01

    The Trick Simulation Toolkit is a simulation development environment used to create high fidelity training and engineering simulations at the NASA Johnson Space Center and many other NASA facilities. Its purpose is to generate a simulation executable from a collection of user-supplied models and a simulation definition file. For each Trick-based simulation, Trick automatically provides job scheduling, numerical integration, the ability to write and restore human readable checkpoints, data recording, interactive variable manipulation, a run-time interpreter, and many other commonly needed capabilities. This allows simulation developers to concentrate on their domain expertise and the algorithms and equations of their models. Also included in Trick are tools for plotting recorded data and various other supporting utilities and libraries. Trick is written in C/C++ and Java and supports both Linux and MacOSX computer operating systems. This paper describes Trick's design and use at NASA Johnson Space Center.
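
The scheduling-plus-integration loop described above can be sketched in miniature. The `run` executive and falling-ball model below are hypothetical illustrations in Python (Trick itself is C/C++ and Java), not Trick's actual API:

```python
def run(sim_time, dt, state, deriv, jobs):
    """Minimal fixed-step executive: advance `state` with explicit Euler and
    call each (period, fn) job whenever its period has elapsed."""
    next_call = [0.0] * len(jobs)
    t = 0.0
    for _ in range(round(sim_time / dt)):
        for i, (period, fn) in enumerate(jobs):
            if t + 1e-12 >= next_call[i]:   # tolerate float drift in t
                fn(t, state)
                next_call[i] += period
        d = deriv(t, state)
        for k in state:
            state[k] += dt * d[k]
        t += dt
    return state

# Hypothetical model: a falling ball, dx/dt = v, dv/dt = -g,
# with a 10 Hz data-recording job.
g = 9.81
log = []
final = run(
    sim_time=1.0, dt=0.001,
    state={"x": 0.0, "v": 0.0},
    deriv=lambda t, s: {"x": s["v"], "v": -g},
    jobs=[(0.1, lambda t, s: log.append((round(t, 3), s["x"])))],
)
```

Separating the model (`deriv`, the jobs) from the executive (`run`) is the design point: the developer supplies equations and recording jobs, and scheduling and integration are provided for them.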

  17. Multi-Physics Demonstration Problem with the SHARP Reactor Simulation Toolkit

    SciTech Connect

    Merzari, E.; Shemon, E. R.; Yu, Y. Q.; Thomas, J. W.; Obabko, A.; Jain, Rajeev; Mahadevan, Vijay; Tautges, Timothy; Solberg, Jerome; Ferencz, Robert Mark; Whitesides, R.

    2015-12-21

    This report describes the use of SHARP to perform a first-of-a-kind analysis of the core radial expansion phenomenon in an SFR. This effort required significant advances in the framework used to drive the coupled simulations, manipulate the mesh in response to the deformation of the geometry, and generate the necessary modified mesh files. Furthermore, the model geometry is fairly complex, and consistent mesh generation for the three physics modules required significant effort. Fully-integrated simulations of a 7-assembly mini-core test problem have been performed, and the results are presented here. Physics models of a full-core model of the Advanced Burner Test Reactor (ABTR) have also been developed for each of the three physics modules. Standalone results of each of the three physics modules for the ABTR are presented here, which provides a demonstration of the feasibility of the fully-integrated simulation.

  18. Biomechanical ToolKit: Open-source framework to visualize and process biomechanical data.

    PubMed

    Barre, Arnaud; Armand, Stéphane

    2014-04-01

    The C3D file format is widely used in the biomechanical field by companies and laboratories to store motion capture system data. However, few software packages can visualize and modify the entirety of the data in a C3D file. Our objective was to develop an open-source and multi-platform framework to read, write, modify and visualize data from any motion analysis system using the standard (C3D) and proprietary file formats (used by many companies producing motion capture systems). The Biomechanical ToolKit (BTK) was developed to provide cost-effective and efficient tools for the biomechanical community to easily deal with motion analysis data. A large panel of operations is available to read, modify and process data through a C++ API, bindings for high-level languages (Matlab, Octave, and Python), and a standalone application (Mokka). All these tools are open-source and cross-platform and run on all major operating systems (Windows, Linux, MacOS X). PMID:24548899

  19. Security Assessment Simulation Toolkit (SAST) Final Report

    SciTech Connect

    Meitzler, Wayne D.; Ouderkirk, Steven J.; Hughes, Chad O.

    2009-11-15

    The Department of Defense Technical Support Working Group (DoD TSWG) investment in the Pacific Northwest National Laboratory (PNNL) Security Assessment Simulation Toolkit (SAST) research planted a technology seed that germinated into a suite of follow-on Research and Development (R&D) projects culminating in software that is used by multiple DoD organizations. The DoD TSWG technology transfer goal for SAST is already in progress. The Defense Information Systems Agency (DISA), the Defense-wide Information Assurance Program (DIAP), the Marine Corps, the Office of Naval Research (ONR) National Center For Advanced Secure Systems Research (NCASSR) and the Office of the Secretary of Defense International Exercise Program (OSD NII) are currently investing to take SAST to the next level. PNNL currently distributes the software to over 6 government organizations and 30 DoD users. For the past five DoD-wide Bulwark Defender exercises, the adoption of this new technology created an expanding role for SAST. In 2009, SAST was also used in the OSD NII International Exercise and is currently scheduled for use in 2010.

  20. CART—a chemical annotation retrieval toolkit

    PubMed Central

    Deghou, Samy; Zeller, Georg; Iskar, Murat; Driessen, Marja; Castillo, Mercedes; van Noort, Vera; Bork, Peer

    2016-01-01

    Motivation: Data on bioactivities of drug-like chemicals are rapidly accumulating in public repositories, creating new opportunities for research in computational systems pharmacology. However, integrative analysis of these data sets is difficult due to prevailing ambiguity between chemical names and identifiers and a lack of cross-references between databases. Results: To address this challenge, we have developed CART, a Chemical Annotation Retrieval Toolkit. As a key functionality, it matches an input list of chemical names into a comprehensive reference space to assign unambiguous chemical identifiers. In this unified space, bioactivity annotations can be easily retrieved from databases covering a wide variety of chemical effects on biological systems. Subsequently, CART can determine annotations enriched in the input set of chemicals and display these in tabular format and interactive network visualizations, thereby facilitating integrative analysis of chemical bioactivity data. Availability and Implementation: CART is available as a Galaxy web service (cart.embl.de). Source code and an easy-to-install command line tool can also be obtained from the web site. Contact: bork@embl.de Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27256313
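
The core matching step described above, mapping free-text chemical names into a unified identifier space, can be sketched with a toy normalization-and-lookup approach. The reference dictionary, synonyms and identifiers below are illustrative placeholders, not CART's actual reference space or algorithm:

```python
def normalize(name: str) -> str:
    """Crude canonical form: lowercase and strip non-alphanumerics, so that
    spelling variants of the same name collide on one key."""
    return "".join(ch for ch in name.lower() if ch.isalnum())

# Hypothetical reference space: normalized synonym -> stable identifier.
REFERENCE = {
    normalize(syn): ident
    for ident, syns in {
        "CID2244": ["aspirin", "acetylsalicylic acid", "2-acetoxybenzoic acid"],
        "CID3672": ["ibuprofen", "(RS)-ibuprofen"],
    }.items()
    for syn in syns
}

def match(names):
    """Map input chemical names to identifiers; None when unmatched."""
    return {n: REFERENCE.get(normalize(n)) for n in names}

result = match(["Aspirin", "Acetylsalicylic  Acid", "caffeine"])
```

With all inputs resolved to unambiguous identifiers, annotations from different bioactivity databases can be joined on the identifier column rather than on ambiguous names.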

  1. UQ Toolkit v 2.0

    Energy Science and Technology Software Center (ESTSC)

    2013-10-03

    The Uncertainty Quantification (UQ) Toolkit is a software library for the characterization and propagation of uncertainties in computational models. For the characterization of uncertainties, Bayesian inference tools are provided to infer uncertain model parameters, as well as Bayesian compressive sensing methods for discovering sparse representations of high-dimensional input-output response surfaces, and also Karhunen-Loève expansions for representing stochastic processes. Uncertain parameters are treated as random variables and represented with Polynomial Chaos expansions (PCEs). The library implements several spectral basis function types (e.g. Hermite basis functions in terms of Gaussian random variables or Legendre basis functions in terms of uniform random variables) that can be used to represent random variables with PCEs. For propagation of uncertainty, tools are provided to propagate PCEs that describe the input uncertainty through the computational model using either intrusive methods (Galerkin projection of equations onto basis functions) or non-intrusive methods (performing deterministic operations at sampled values of the random variables and projecting the obtained results onto basis functions).
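
The non-intrusive route mentioned above, projecting model outputs onto a polynomial basis, can be illustrated for a single Gaussian variable and probabilists' Hermite polynomials. This is a from-scratch numerical sketch of the projection idea, not the UQ Toolkit's API:

```python
import math

def hermite(k, x):
    """Probabilists' Hermite polynomial He_k(x) via the recurrence
    He_{n+1}(x) = x*He_n(x) - n*He_{n-1}(x)."""
    h0, h1 = 1.0, x
    if k == 0:
        return h0
    for n in range(1, k):
        h0, h1 = h1, x * h1 - n * h0
    return h1

def pce_coeffs(f, order, n=20000, lo=-10.0, hi=10.0):
    """Non-intrusive projection c_k = E[f(X) He_k(X)] / k! for X ~ N(0,1),
    with the expectation computed by midpoint quadrature."""
    dx = (hi - lo) / n
    coeffs = []
    for k in range(order + 1):
        acc = 0.0
        for i in range(n):
            x = lo + (i + 0.5) * dx
            w = math.exp(-0.5 * x * x) / math.sqrt(2 * math.pi)
            acc += f(x) * hermite(k, x) * w * dx
        coeffs.append(acc / math.factorial(k))
    return coeffs

# f(x) = x**2 has the exact two-term expansion He_0 + He_2,
# i.e. coefficients [1, 0, 1, 0].
c = pce_coeffs(lambda x: x * x, order=3)
```

Once the coefficients are known, moments come cheaply: for this basis the mean is c_0 and the variance is the sum over k >= 1 of k! * c_k^2.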

  2. Expanding the conceptual toolkit of organ gifting.

    PubMed

    M Shaw, Rhonda

    2015-07-01

    In jurisdictions where the sale of body tissue and organs is illegal, organ transplantation is often spoken of as a gift of life. In the social sciences and bioethics this concept has been subject to critique over the course of the last two decades for failing to reflect the complexities of organ and tissue exchange. I suggest that a new ethical model of organ donation and transplantation is needed to capture the range of experiences in this domain. The proposed model is both analytical and empirically oriented, and draws on research findings linking a series of qualitative sociological studies undertaken in New Zealand between 2007 and 2013. The studies were based on document analysis, field notes and 127 semi-structured in-depth interviews with people from different cultural and constituent groups directly involved in organ transfer processes. The aim of the article is to contribute to sociological knowledge about organ exchange and to expand the conceptual toolkit of organ donation to include the unconditional gift, the gift relation, gift exchange, body project, and body work. The rationale for the proposed model is to provide an explanatory framework for organ donors and transplant recipients and to assist the development of ethical guidelines and health policy discourse. PMID:25728628

  3. Data Exploration Toolkit for serial diffraction experiments

    SciTech Connect

    Zeldin, Oliver B.; Brewster, Aaron S.; Hattne, Johan; Uervirojnangkoorn, Monarin; Lyubimov, Artem Y.; Zhou, Qiangjun; Zhao, Minglei; Weis, William I.; Sauter, Nicholas K.; Brunger, Axel T.

    2015-02-01

    This paper describes a set of tools allowing experimentalists insight into the variation present within large serial data sets. Ultrafast diffraction at X-ray free-electron lasers (XFELs) has the potential to yield new insights into important biological systems that produce radiation-sensitive crystals. An unavoidable feature of the ‘diffraction before destruction’ nature of these experiments is that images are obtained from many distinct crystals and/or different regions of the same crystal. Combined with other sources of XFEL shot-to-shot variation, this introduces significant heterogeneity into the diffraction data, complicating processing and interpretation. To enable researchers to get the most from their collected data, a toolkit is presented that provides insights into the quality of, and the variation present in, serial crystallography data sets. These tools operate on the unmerged, partial intensity integration results from many individual crystals, and can be used on two levels: firstly to guide the experimental strategy during data collection, and secondly to help users make informed choices during data processing.

  4. UQ Toolkit v 2.0

    SciTech Connect

    2013-10-03

    The Uncertainty Quantification (UQ) Toolkit is a software library for the characterization and propagation of uncertainties in computational models. For the characterization of uncertainties, Bayesian inference tools are provided to infer uncertain model parameters, as well as Bayesian compressive sensing methods for discovering sparse representations of high-dimensional input-output response surfaces, and also Karhunen-Loève expansions for representing stochastic processes. Uncertain parameters are treated as random variables and represented with Polynomial Chaos expansions (PCEs). The library implements several spectral basis function types (e.g. Hermite basis functions in terms of Gaussian random variables or Legendre basis functions in terms of uniform random variables) that can be used to represent random variables with PCEs. For propagation of uncertainty, tools are provided to propagate PCEs that describe the input uncertainty through the computational model using either intrusive methods (Galerkin projection of equations onto basis functions) or non-intrusive methods (performing deterministic operations at sampled values of the random variables and projecting the obtained results onto basis functions).

  5. Asteroids Outreach Toolkit Development: Using Iterative Feedback In Informal Education

    NASA Astrophysics Data System (ADS)

    White, Vivian; Berendsen, M.; Gurton, S.; Dusenbery, P. B.

    2011-01-01

    The Night Sky Network is a collaboration of close to 350 astronomy clubs across the US that actively engage in public outreach within their communities. Since 2004, the Astronomical Society of the Pacific has been creating outreach ToolKits filled with carefully crafted sets of physical materials designed to help these volunteer clubs explain the wonders of the night sky to the public. The effectiveness of the ToolKit activities and demonstrations is the direct result of a thorough testing and vetting process. Find out how this iterative assessment process can help other programs create useful tools for both formal and informal educators. The current Space Rocks Outreach ToolKit focuses on explaining asteroids, comets, and meteorites to the general public using quick, big-picture activities that get audiences involved. Eight previous ToolKits cover a wide range of topics from the Moon to black holes. In each case, amateur astronomers and the public helped direct the development of the activities along the way through surveys, focus groups, and active field-testing. The resulting activities have been embraced by the larger informal learning community and are enthusiastically being delivered to millions of people across the US and around the world. Each ToolKit is delivered free of charge to active Night Sky Network astronomy clubs. All activity write-ups are available free to download at the website listed here. Amateur astronomers receive frequent questions from the public about Earth impacts, meteors, and comets, so this set of activities will help them explain the dynamics of these phenomena to the public. The Space Rocks ToolKit resources complement the Great Balls of Fire museum exhibit produced by Space Science Institute's National Center for Interactive Learning and scheduled for release in 2011. NSF has funded this national traveling exhibition and outreach ToolKit under Grant DRL-0813528.

  6. The Configuration Space Toolkit (C-Space Toolkit or CSTK) Ver. 2.5 beta

    Energy Science and Technology Software Center (ESTSC)

    2010-02-24

    The C-Space Toolkit provides a software library that makes it easier to program motion planning, simulation, robotics, and virtual reality codes using the Configuration Space abstraction. Key functionality (1) enables the user to create representations of movable and stationary rigid geometric objects, and (2) performs fast distance, interference (clash) detection, collision detection, closest-feature-pair, and contact queries in terms of object configuration. Not only can queries be computed at any given point in configuration space, but they can be done exactly over linear-translational path segments and approximately for rotational path segments. Interference detection and distance computations can be done with respect to the Minkowski sum of the original geometry and a piece of convex geometry. The Toolkit takes as raw model input (1) collections of convex polygons that form the boundaries of models and (2) convex polyhedra, cones, cylinders, and discs that are models and model components. Configurations are given in terms of homogeneous transforms. A simple OpenGL-based system for displaying and animating the geometric objects is included in the implementation. This version, 2.5 Beta, incorporates feature additions and enhancements, improvements in algorithms, improved robustness, bug fixes and cleaned-up source code, better compliance with standards and recent programming conventions, changes to the build process for the software, support for more recent hardware and software platforms, and improvements to documentation and source-code comments.
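
The kind of exact query over a linear-translational path segment described above can be illustrated in the simplest case: a disc translating among point obstacles, where the swept region is the Minkowski sum of the segment with the disc. This is a 2-D toy sketch of the idea, not the CSTK API:

```python
import math

def seg_point_dist(a, b, p):
    """Distance from point p to the segment from a to b (2-D)."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    L2 = dx * dx + dy * dy
    # Parameter of the closest point on the segment, clamped to [0, 1].
    t = 0.0 if L2 == 0 else max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / L2))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def path_collides(a, b, radius, obstacles):
    """Exact interference test for a disc of given radius translating from
    configuration a to b: collision iff some obstacle lies inside the
    Minkowski sum of the segment ab with the disc."""
    return any(seg_point_dist(a, b, o) < radius for o in obstacles)
```

Because the distance from a point to a segment is computed in closed form, the interference answer is exact over the whole path segment, with no need to sample intermediate configurations.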

  7. The Configuration Space Toolkit (C-Space Toolkit or CSTK) Ver. 2.5 beta

    SciTech Connect

    Chen, Pang-Chieh; Hwang, Yong; Xavier, Patrick; Lewis, Christopher; Lafarge, Robert; Watterberg, Peter

    2010-02-24

    The C-Space Toolkit provides a software library that makes it easier to program motion planning, simulation, robotics, and virtual reality codes using the Configuration Space abstraction. Key functionality (1) enables the user to create representations of movable and stationary rigid geometric objects, and (2) performs fast distance, interference (clash) detection, collision detection, closest-feature-pair, and contact queries in terms of object configuration. Not only can queries be computed at any given point in configuration space, but they can be done exactly over linear-translational path segments and approximately for rotational path segments. Interference detection and distance computations can be done with respect to the Minkowski sum of the original geometry and a piece of convex geometry. The Toolkit takes as raw model input (1) collections of convex polygons that form the boundaries of models and (2) convex polyhedra, cones, cylinders, and discs that are models and model components. Configurations are given in terms of homogeneous transforms. A simple OpenGL-based system for displaying and animating the geometric objects is included in the implementation. This version, 2.5 Beta, incorporates feature additions and enhancements, improvements in algorithms, improved robustness, bug fixes and cleaned-up source code, better compliance with standards and recent programming conventions, changes to the build process for the software, support for more recent hardware and software platforms, and improvements to documentation and source-code comments.

  8. A Toolkit for Eye Recognition of LAMOST Spectroscopy

    NASA Astrophysics Data System (ADS)

    Yuan, H.; Zhang, H.; Zhang, Y.; Lei, Y.; Dong, Y.; Zhao, Y.

    2014-05-01

    The Large Sky Area Multi-Object Fiber Spectroscopic Telescope (LAMOST, also named the Guo Shou Jing Telescope) finished its pilot survey and began the regular survey at the end of September 2012. Millions of targets have already been observed, including thousands of quasar candidates. Because automatic identification of quasar spectra is difficult, eye recognition remains necessary and efficient; however, identifying massive numbers of spectra by eye is a huge job. To improve the efficiency and effectiveness of spectral identification, a toolkit for eye recognition of LAMOST spectroscopy was developed. Spectral cross-correlation templates from the Sloan Digital Sky Survey (SDSS) are applied as references, including O star, O/B transition star, B star, A star, F/A transition star, F star, G star, K star, M1 star, M3 star, M5 star, M8 star, L1 star, magnetic white dwarf, carbon star, white dwarf, B white dwarf, low-metallicity K sub-dwarf, "early-type" galaxy, galaxy, "later-type" galaxy, luminous red galaxy, QSO, QSO with some BAL activity, and high-luminosity QSO. By adjusting the redshift and flux ratio of the template spectra in an interactive graphical interface, the spectral type of the target can be discriminated in an easy and feasible way, and the redshift is estimated at the same time with a precision of about one part in a thousand. The tool's advantage in dealing with low-quality spectra is demonstrated. Spectra from the LAMOST pilot survey are used as examples, and spectra from SDSS are also tested for comparison. Target spectra in both image format and FITS format are supported. For convenience, several spectra-access methods are provided. All the spectra from the LAMOST pilot survey can be located and acquired via VOTable files on the internet, as suggested by the International Virtual Observatory Alliance (IVOA). After the construction of the Simple Spectral Access Protocol (SSAP) service by the Chinese Astronomical Data Center (CAsDC), spectra can be

  9. The Einstein Toolkit: a community computational infrastructure for relativistic astrophysics

    NASA Astrophysics Data System (ADS)

    Löffler, Frank; Faber, Joshua; Bentivegna, Eloisa; Bode, Tanja; Diener, Peter; Haas, Roland; Hinder, Ian; Mundim, Bruno C.; Ott, Christian D.; Schnetter, Erik; Allen, Gabrielle; Campanelli, Manuela; Laguna, Pablo

    2012-06-01

    We describe the Einstein Toolkit, a community-driven, freely accessible computational infrastructure intended for use in numerical relativity, relativistic astrophysics, and other applications. The toolkit, developed by a collaboration involving researchers from multiple institutions around the world, combines a core set of components needed to simulate astrophysical objects such as black holes, compact objects, and collapsing stars, as well as a full suite of analysis tools. The Einstein Toolkit is currently based on the Cactus framework for high-performance computing and the Carpet adaptive mesh refinement driver. It implements spacetime evolution via the BSSN evolution system and general relativistic hydrodynamics in a finite-volume discretization. The toolkit is under continuous development and contains many new code components that have been publicly released for the first time and are described in this paper. We discuss the motivation behind the release of the toolkit, the philosophy underlying its development, and the goals of the project. A summary of the implemented numerical techniques is included, as are results of numerical tests covering a variety of sample astrophysical problems.

  10. Cyber Security Audit and Attack Detection Toolkit

    SciTech Connect

    Peterson, Dale

    2012-05-31

    The goal of this project was to develop cyber security audit and attack detection tools for industrial control systems (ICS). Digital Bond developed and released a tool named Bandolier that audits ICS components commonly used in the energy sector against an optimal security configuration. The Portaledge Project developed a capability for the PI Historian, the most widely used Historian in the energy sector, to aggregate security events and detect cyber attacks.

  11. Validation of Power Output for the WIND Toolkit

    SciTech Connect

    King, J.; Clifton, A.; Hodge, B. M.

    2014-09-01

    Renewable energy integration studies require wind data sets of high quality with realistic representations of the variability, ramping characteristics, and forecast performance for current wind power plants. The Wind Integration National Data Set (WIND) Toolkit is meant to be an update for and expansion of the original data sets created for the weather years from 2004 through 2006 during the Western Wind and Solar Integration Study and the Eastern Wind Integration Study. The WIND Toolkit expands these data sets to include the entire continental United States, increasing the total number of sites represented, and it includes the weather years from 2007 through 2012. In addition, the WIND Toolkit has a finer resolution for both the temporal and geographic dimensions. Three separate data sets will be created: a meteorological data set, a wind power data set, and a forecast data set. This report describes the validation of the wind power data set.

  12. Evolution of the plant-microbe symbiotic 'toolkit'.

    PubMed

    Delaux, Pierre-Marc; Séjalon-Delmas, Nathalie; Bécard, Guillaume; Ané, Jean-Michel

    2013-06-01

    Beneficial associations between plants and arbuscular mycorrhizal fungi play a major role in terrestrial environments and in the sustainability of agroecosystems. Proteins, microRNAs, and small molecules have been identified in model angiosperms as required for the establishment of arbuscular mycorrhizal associations and define a symbiotic 'toolkit' used for other interactions such as the rhizobia-legume symbiosis. Based on recent studies, we propose an evolutionary framework for this toolkit. Some components appeared recently in angiosperms, whereas others are highly conserved even in land plants unable to form arbuscular mycorrhizal associations. The exciting finding that some components pre-date the appearance of arbuscular mycorrhizal fungi suggests the existence of unknown roles for this toolkit and even the possibility of symbiotic associations in charophyte green algae. PMID:23462549

  13. The PRIDE (Partnership to Improve Diabetes Education) Toolkit

    PubMed Central

    Wolff, Kathleen; Chambers, Laura; Bumol, Stefan; White, Richard O.; Gregory, Becky Pratt; Davis, Dianne; Rothman, Russell L.

    2016-01-01

    Purpose Patients with low literacy, low numeracy, and/or linguistic needs can experience challenges understanding diabetes information and applying concepts to their self-management. The authors designed a toolkit of education materials that are sensitive to patients' literacy and numeracy levels, language preferences, and cultural norms and that encourage shared goal setting to improve diabetes self-management and health outcomes. The Partnership to Improve Diabetes Education (PRIDE) toolkit was developed to facilitate diabetes self-management education and support. Methods The PRIDE toolkit includes a comprehensive set of 30 interactive education modules in English and Spanish to support diabetes self-management activities. The toolkit builds upon the authors' previously validated Diabetes Literacy and Numeracy Education Toolkit (DLNET) by adding a focus on shared goal setting, addressing the needs of Spanish-speaking patients, and including a broader range of diabetes management topics. Each PRIDE module was evaluated using the Suitability Assessment of Materials (SAM) instrument to determine the material's cultural appropriateness and its sensitivity to the needs of patients with low literacy and low numeracy. Reading grade level was also assessed using the Automated Readability Index (ARI), Coleman-Liau, Flesch-Kincaid, Fry, and SMOG formulas. Conclusions The average reading grade level of the materials was 5.3 (SD 1.0), with a mean SAM of 91.2 (SD 5.4). All of the 30 modules received a “superior” score (SAM >70%) when evaluated by 2 independent raters. The PRIDE toolkit modules can be used by all members of a multidisciplinary team to assist patients with low literacy and low numeracy in managing their diabetes. PMID:26647414
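    The readability formulas cited above are simple functions of character, word, and sentence counts. For instance, the Automated Readability Index can be computed as follows (a minimal sketch; the naive tokenization here is an assumption, not the validated procedure used to evaluate the toolkit):

```python
import re

def automated_readability_index(text):
    # ARI = 4.71 * (characters / words) + 0.5 * (words / sentences) - 21.43
    # Naive tokenization: words are whitespace-separated; sentences end in . ! ?
    words = text.split()
    chars = sum(len(re.sub(r"[^A-Za-z0-9]", "", w)) for w in words)
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    return 4.71 * (chars / len(words)) + 0.5 * (len(words) / sentences) - 21.43
```

    The result approximates the U.S. school grade level needed to read the text; very simple sentences can score below zero.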

  14. XML Files

    MedlinePlus

    ... nlm.nih.gov/medlineplus/xml.html MedlinePlus XML Files To use the sharing features on this page, ... information on all English and Spanish topic groups. Files generated on July 09, 2016 MedlinePlus Health Topic ...

  15. WIRM: An Open Source Toolkit for Building Biomedical Web Applications

    PubMed Central

    Jakobovits, Rex M.; Rosse, Cornelius; Brinkley, James F.

    2002-01-01

    This article describes an innovative software toolkit that allows the creation of web applications that facilitate the acquisition, integration, and dissemination of multimedia biomedical data over the web, thereby reducing the cost of knowledge sharing. There is a lack of high-level web application development tools suitable for use by researchers, clinicians, and educators who are not skilled programmers. Our Web Interfacing Repository Manager (WIRM) is a software toolkit that reduces the complexity of building custom biomedical web applications. WIRM’s visual modeling tools enable domain experts to describe the structure of their knowledge, from which WIRM automatically generates full-featured, customizable content management systems. PMID:12386108

  16. User's manual for the two-dimensional transputer graphics toolkit

    NASA Technical Reports Server (NTRS)

    Ellis, Graham K.

    1988-01-01

    The user manual for the 2-D graphics toolkit for a transputer-based parallel processor is presented. The toolkit consists of a package of 2-D display routines that can be used for simulation visualization. It supports multiple windows, double-buffered screens for animations, and simple graphics transformations such as translation, rotation, and scaling. The display routines are written in occam to take advantage of the multiprocessing features available on transputers. The package is designed to run on a transputer separate from the graphics board.

  17. RAVE—a Detector-independent vertex reconstruction toolkit

    NASA Astrophysics Data System (ADS)

    Waltenberger, Wolfgang; Mitaroff, Winfried; Moser, Fabian

    2007-10-01

    A detector-independent toolkit for vertex reconstruction (RAVE ) is being developed, along with a standalone framework (VERTIGO ) for testing, analyzing and debugging. The core algorithms represent state of the art for geometric vertex finding and fitting by both linear (Kalman filter) and robust estimation methods. Main design goals are ease of use, flexibility for embedding into existing software frameworks, extensibility, and openness. The implementation is based on modern object-oriented techniques, is coded in C++ with interfaces for Java and Python, and follows an open-source approach. A beta release is available. VERTIGO = "vertex reconstruction toolkit and interface to generic objects".
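    As a toy illustration of geometric vertex fitting (not the RAVE API; RAVE works with full track parameters and Kalman-filter or robust estimators), a least-squares vertex for straight 2-D tracks minimizes the summed squared distances to the track lines and has a closed-form solution:

```python
import math

def fit_vertex(tracks):
    # tracks: list of ((px, py), (dx, dy)) pairs, each a point on a straight
    # 2-D track and its direction. Minimize sum_i (n_i . (v - p_i))^2, where
    # n_i is the unit normal of track i; this yields the 2x2 linear system
    # (sum n_i n_i^T) v = sum (n_i . p_i) n_i.
    a11 = a12 = a22 = b1 = b2 = 0.0
    for (px, py), (dx, dy) in tracks:
        norm = math.hypot(dx, dy)
        nx, ny = -dy / norm, dx / norm      # unit normal to the track direction
        c = nx * px + ny * py               # n_i . p_i
        a11 += nx * nx; a12 += nx * ny; a22 += ny * ny
        b1 += c * nx;   b2 += c * ny
    det = a11 * a22 - a12 * a12             # assumes the tracks are not all parallel
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a12 * b1) / det)
```

    Robust variants would down-weight outlier tracks iteratively instead of treating all tracks equally.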

  18. Incident Management Preparedness and Coordination Toolkit

    Energy Science and Technology Software Center (ESTSC)

    2013-04-01

    As with many professions, safety planners and first responders tend to be specialists in certain areas. To be truly useful, tools should be tailored to meet their specific needs. Thus, general software suites aimed at the professional geographic information system (GIS) community might not be the best solution for a first responder with little training in GIS terminology and techniques. On the other hand, commonly used web-based map viewers may not have the capability to be customized for the planning, response, and recovery (PR&R) mission. Data formats should be open and foster easy information flow among local, state, and federal partners. Tools should be free or low-cost to address real-world budget constraints at the local level. They also need to work both with and without a network connection to be robust. The Incident Management Preparedness and Coordination Toolkit (IMPACT) can satisfy many of these needs while working in harmony with established systems at the local, state, and federal levels. The IMPACT software framework, termed the Geospatial Integrated Problem Solving Environment (GIPSE), organizes tasks, tools, and resources for the end user. It uses the concept of software wizards to both customize and extend its functionality. On the Tasks panel are a number of buttons used to initiate various operations. Similar to macros, these task buttons launch scripts that utilize the full functionality of the underlying foundational components such as the SQL spatial database and ORNL-developed map editor. The user is presented with a series of instruction pages which are implemented with HTML for interactivity. On each page are links which initiate specific actions such as creating a map showing various features. Additional tasks may be quickly programmed and added to the panel. The end user can customize the graphical interface to facilitate its use during an emergency. One of the major components of IMPACT is the ORNL Geospatial Viewer (OGV). It is

  19. Incident Management Preparedness and Coordination Toolkit

    SciTech Connect

    Koch, Daniel B.

    2013-04-01

    As with many professions, safety planners and first responders tend to be specialists in certain areas. To be truly useful, tools should be tailored to meet their specific needs. Thus, general software suites aimed at the professional geographic information system (GIS) community might not be the best solution for a first responder with little training in GIS terminology and techniques. On the other hand, commonly used web-based map viewers may not have the capability to be customized for the planning, response, and recovery (PR&R) mission. Data formats should be open and foster easy information flow among local, state, and federal partners. Tools should be free or low-cost to address real-world budget constraints at the local level. They also need to work both with and without a network connection to be robust. The Incident Management Preparedness and Coordination Toolkit (IMPACT) can satisfy many of these needs while working in harmony with established systems at the local, state, and federal levels. The IMPACT software framework, termed the Geospatial Integrated Problem Solving Environment (GIPSE), organizes tasks, tools, and resources for the end user. It uses the concept of software wizards to both customize and extend its functionality. On the Tasks panel are a number of buttons used to initiate various operations. Similar to macros, these task buttons launch scripts that utilize the full functionality of the underlying foundational components such as the SQL spatial database and ORNL-developed map editor. The user is presented with a series of instruction pages which are implemented with HTML for interactivity. On each page are links which initiate specific actions such as creating a map showing various features. Additional tasks may be quickly programmed and added to the panel. The end user can customize the graphical interface to facilitate its use during an emergency. One of the major components of IMPACT is the ORNL Geospatial Viewer (OGV). It is used to

  20. SatelliteDL: a Toolkit for Analysis of Heterogeneous Satellite Datasets

    NASA Astrophysics Data System (ADS)

    Galloy, M. D.; Fillmore, D.

    2014-12-01

    SatelliteDL is an IDL toolkit for the analysis of satellite Earth observations from a diverse set of platforms and sensors. The core function of the toolkit is the spatial and temporal alignment of satellite swath and geostationary data. The design features an abstraction layer that allows for easy inclusion of new datasets in a modular way. Our overarching objective is to create utilities that automate the mundane aspects of satellite data analysis, are extensible and maintainable, and do not place limitations on the analysis itself. IDL has a powerful suite of statistical and visualization tools that can be used in conjunction with SatelliteDL. Toward this end we have constructed SatelliteDL to include (1) HTML and LaTeX API document generation, (2) a unit test framework, (3) automatic message and error logs, (4) HTML and LaTeX plot and table generation, and (5) several real world examples with bundled datasets available for download. For ease of use, datasets, variables and optional workflows may be specified in a flexible-format configuration file. Configuration statements may specify, for example, a region and date range, and the creation of images, plots and statistical summary tables for a long list of variables. SatelliteDL enforces data provenance; all data should be traceable and reproducible. The output NetCDF file metadata holds a complete history of the original datasets and their transformations, and a method exists to reconstruct a configuration file from this information. Release 0.1.0 distributes with ingest methods for GOES, MODIS, VIIRS and CERES radiance data (L1) as well as select 2D atmosphere products (L2) such as aerosol and cloud (MODIS and VIIRS) and radiant flux (CERES). Future releases will provide ingest methods for ocean and land surface products, gridded and time averaged datasets (L3 Daily, Monthly and Yearly), and support for 3D products such as temperature and water vapor profiles. Emphasis will be on NPP Sensor, Environmental and
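    A configuration file of the kind described above might look like the following sketch. The section and key names are illustrative assumptions, not SatelliteDL's actual grammar; the parsing is shown with Python's configparser purely to make the idea concrete.

```python
import configparser

# Hypothetical configuration: a region, a date range, and output options.
CONFIG_TEXT = """
[region]
lat_min = 25.0
lat_max = 50.0
lon_min = -125.0
lon_max = -65.0

[time]
start = 2013-01-01
end   = 2013-01-31

[output]
variables = aerosol_optical_depth, cloud_fraction
images = yes
summary_tables = yes
"""

parser = configparser.ConfigParser()
parser.read_string(CONFIG_TEXT)

# Pull typed values out of the parsed configuration.
region = {key: parser.getfloat("region", key) for key in parser["region"]}
variables = [v.strip() for v in parser.get("output", "variables").split(",")]
make_images = parser.getboolean("output", "images")
```

    Keeping the workflow in a declarative file like this is what makes it possible to regenerate a configuration from output-file metadata, as the abstract describes.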

  1. Geospatial Toolkits and Resource Maps for Selected Countries from the National Renewable Energy Laboratory (NREL)

    DOE Data Explorer

    NREL developed the Geospatial Toolkit (GsT), a map-based software application that integrates resource data and geographic information systems (GIS) for integrated resource assessment. A variety of agencies within countries, along with global datasets, provided country-specific data. Originally developed in 2005, the Geospatial Toolkit was completely redesigned and re-released in November 2010 to provide a more modern, easier-to-use interface with considerably faster analytical querying capabilities. Toolkits are available for 21 countries and each one can be downloaded separately. The source code for the toolkit is also available. [Taken and edited from http://www.nrel.gov/international/geospatial_toolkits.html]

  2. Cubit Mesh Generation Toolkit V11.1

    Energy Science and Technology Software Center (ESTSC)

    2009-03-25

    CUBIT prepares models to be used in computer-based simulation of real-world events. CUBIT is a full-featured software toolkit for robust generation of two- and three-dimensional finite element meshes (grids) and geometry preparation. Its main goal is to reduce the time to generate meshes, particularly large hex meshes of complicated, interlocking assemblies.

  3. A Beginning Rural Principal's Toolkit: A Guide for Success

    ERIC Educational Resources Information Center

    Ashton, Brian; Duncan, Heather E.

    2012-01-01

    The purpose of this article is to explore both the challenges and skills needed to effectively assume a leadership position and thus to create an entry plan or "toolkit" for a new rural school leader. The entry plan acts as a guide beginning principals may use to navigate the unavoidable confusion that comes with leadership. It also assists…

  4. Using AASL's "Health and Wellness" and "Crisis Toolkits"

    ERIC Educational Resources Information Center

    Logan, Debra Kay

    2009-01-01

    Whether a school library program is the picture of good health in a state that mandates a professionally staffed library media center in every building or is suffering in a low-wealth district that is facing drastic cuts, the recently launched toolkits by the American Association of School Librarians (AASL) are stocked with useful strategies and…

  5. The Archivists' Toolkit: Another Step toward Streamlined Archival Processing

    ERIC Educational Resources Information Center

    Westbrook, Bradley D.; Mandell, Lee; Shepherd, Kelcy; Stevens, Brian; Varghese, Jason

    2006-01-01

    The Archivists' Toolkit is a software application currently in development and designed to support the creation and management of archival information. This article summarizes the development of the application, including some of the problems the application is designed to resolve. Primary emphasis is placed on describing the application's…

  6. A Guide to Today's Teacher Recruitment Challenges. RNT Toolkit.

    ERIC Educational Resources Information Center

    Simmons, Anne

    This publication is the introductory guide to the Recruiting New Teachers, Inc.'s Toolkit, which is designed to help states and school districts meet their teacher recruitment and retention challenges. The guide provides information on: today's national teacher shortage crisis; how to make a case for stepping up recruitment efforts for diverse…

  7. Roles of the Volunteer in Development: Toolkits for Building Capacity.

    ERIC Educational Resources Information Center

    Slater, Marsha; Allsman, Ava; Savage, Ron; Havens, Lani; Blohm, Judee; Raftery, Kate

    This document, which was developed to assist Peace Corps volunteers and those responsible for training them, presents an introductory booklet and six toolkits for use in the training provided to and by volunteers involved in community development. All the materials emphasize long-term participatory approaches to sustainable development and a…

  8. 78 FR 58520 - U.S. Environmental Solutions Toolkit

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-24

    ... International Trade Administration U.S. Environmental Solutions Toolkit AGENCY: International Trade... from U.S. businesses capable of exporting their goods or services relevant to (a) arsenic removal... wastewater treatment. The Department of Commerce continues to develop the web-based U.S....

  9. The Data Toolkit: Ten Tools for Supporting School Improvement

    ERIC Educational Resources Information Center

    Hess, Robert T.; Robbins, Pam

    2012-01-01

    Using data for school improvement is a key goal of Race to the Top, and now is the time to make data-driven school improvement a priority. However, many educators are drowning in data. Boost your professional learning community's ability to translate data into action with this new book from Pam Robbins and Robert T. Hess. "The Data Toolkit"…

  10. New MISR Toolkit Version 1.4.1 Available

    Atmospheric Science Data Center

    2014-09-03

    ... of the MISR Toolkit (MTK) is now available from the The Open Channel Foundation .  The MTK is a simplified programming ... HDF-EOS to access MISR Level 1B2, Level 2, and ancillary data products. It also handles the MISR conventional format. The interface ...

  11. A Toolkit to Implement Graduate Attributes in Geography Curricula

    ERIC Educational Resources Information Center

    Spronken-Smith, Rachel; McLean, Angela; Smith, Nell; Bond, Carol; Jenkins, Martin; Marshall, Stephen; Frielick, Stanley

    2016-01-01

    This article uses findings from a project on engagement with graduate outcomes across higher education institutions in New Zealand to produce a toolkit for implementing graduate attributes in geography curricula. Key facets include strong leadership; academic developers to facilitate conversations about graduate attributes and teaching towards…

  12. Resource Toolkit for Working with Education Service Providers

    ERIC Educational Resources Information Center

    National Association of Charter School Authorizers (NJ1), 2005

    2005-01-01

    This resource toolkit for working education service providers contains four sections. Section 1, "Roles Responsibilities, and Relationships," contains: (1) "Purchasing Services from an Educational Management Organization," excerpted from "The Charter School Administrative and Governance Guide" (Massachusetts Dept. of Education); (2) ESP Agreement…

  13. Policy to Performance Toolkit: Transitioning Adults to Opportunity

    ERIC Educational Resources Information Center

    Alamprese, Judith A.; Limardo, Chrys

    2012-01-01

    The "Policy to Performance Toolkit" is designed to provide state adult education staff and key stakeholders with guidance and tools to use in developing, implementing, and monitoring state policies and their associated practices that support an effective state adult basic education (ABE) to postsecondary education and training transition…

  14. Manufacturer’s CORBA Interface Testing Toolkit: Overview

    PubMed Central

    Flater, David

    1999-01-01

    The Manufacturer’s CORBA Interface Testing Toolkit (MCITT) is a software package that supports testing of CORBA components and interfaces. It simplifies the testing of complex distributed systems by producing “dummy components” from Interface Testing Language and Component Interaction Specifications and by automating some error-prone programming tasks. It also provides special commands to support conformance, performance, and stress testing.

  15. ELCAT: An E-Learning Content Adaptation Toolkit

    ERIC Educational Resources Information Center

    Clements, Iain; Xu, Zhijie

    2005-01-01

    Purpose: The purpose of this paper is to present an e-learning content adaptation toolkit--ELCAT--that helps to achieve the objectives of the KTP project No. 3509. Design/methodology/approach: The chosen methodology is absolutely practical. The tool was put into motion and results were observed as university and the collaborating company members…

  16. Educating Globally Competent Citizens: A Toolkit. Second Edition

    ERIC Educational Resources Information Center

    Elliott-Gower, Steven; Falk, Dennis R.; Shapiro, Martin

    2012-01-01

    Educating Globally Competent Citizens, a product of AASCU's American Democracy Project and its Global Engagement Initiative, introduces readers to a set of global challenges facing society based on the Center for Strategic and International Studies' 7 Revolutions. The toolkit is designed to aid faculty in incorporating global challenges into new…

  17. Mentoring Immigrant & Refugee Youth: A Toolkit for Program Coordinators

    ERIC Educational Resources Information Center

    MENTOR, 2011

    2011-01-01

    "Mentoring Immigrant Youth: A Toolkit for Program Coordinators" is a comprehensive resource that is designed to offer program staff important background information, promising program practices and strategies to build and sustain high-quality mentoring relationships for different categories of immigrant youth. Included in this resource are five…

  18. Capturing and Using Knowledge about the Use of Visualization Toolkits

    SciTech Connect

    Del Rio, Nicholas R.; Pinheiro da Silva, Paulo

    2012-11-02

    When constructing visualization pipelines using toolkits such as Visualization Toolkit (VTK) and Generic Mapping Tools (GMT), developers must understand (1) what toolkit operators will transform their data from its raw state to some required view state and (2) what viewers are available to present the generated view. Traditionally, developers learn how to construct visualization pipelines by reading documentation and inspecting code examples, which can be costly in terms of the time and effort expended. Once an initial pipeline is constructed, developers may still have to undergo a trial-and-error process before a satisfactory visualization is generated. This paper presents the Visualization Knowledge Project (VisKo), which is built on a knowledge base of visualization toolkit operators and how they can be piped together to form visualization pipelines. Developers may now rely on VisKo to guide them when constructing visualization pipelines; in cases where VisKo has complete knowledge about a set of operators (i.e., their sequencing and parameter settings), it can automatically generate a fully functional visualization pipeline.
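    The pipeline-planning idea can be sketched as a graph search over operator signatures (a toy model, not VisKo's actual knowledge representation): each operator maps an input format to an output format, and a pipeline is a path from the raw-data format to the desired view format.

```python
from collections import deque

# Hypothetical operator knowledge base: (operator name, input format, output format).
OPERATORS = [
    ("csv_to_grid",      "csv",      "grid"),
    ("grid_to_contours", "grid",     "contours"),
    ("contours_to_png",  "contours", "png"),
    ("grid_to_png",      "grid",     "png"),
]

def plan_pipeline(source_format, view_format):
    # Breadth-first search over formats: returns the shortest operator
    # sequence transforming source_format into view_format, or None.
    queue = deque([(source_format, [])])
    seen = {source_format}
    while queue:
        fmt, path = queue.popleft()
        if fmt == view_format:
            return path
        for name, src, dst in OPERATORS:
            if src == fmt and dst not in seen:
                seen.add(dst)
                queue.append((dst, path + [name]))
    return None
```

    A real planner would also carry operator parameter settings along each edge, which is the "complete knowledge" case in which VisKo can emit a runnable pipeline.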

  19. The Complete Guide to RTI: An Implementation Toolkit

    ERIC Educational Resources Information Center

    Burton, Dolores; Kappenberg, John

    2012-01-01

    This comprehensive toolkit will bring you up to speed on why RTI is one of the most important educational initiatives in recent history and sets the stage for its future role in teacher education and practice. The authors demonstrate innovative ways to use RTI to inform instruction and guide curriculum development in inclusive classroom settings.…

  20. GMH: A Message Passing Toolkit for GPU Clusters

    SciTech Connect

    Chen, Jie; Watson, W.; Mao, Weizhen

    2011-01-01

    Driven by the market demand for high-definition 3D graphics, commodity graphics processing units (GPUs) have evolved into highly parallel, multi-threaded, many-core processors, which are ideal for data parallel computing. Many applications have been ported to run on a single GPU with tremendous speedups using general C-style programming languages such as CUDA. However, large applications require multiple GPUs and demand explicit message passing. This paper presents a message passing toolkit, called GMH (GPU Message Handler), on NVIDIA GPUs. This toolkit utilizes a data-parallel thread group as a way to map multiple GPUs on a single host to an MPI rank, and introduces a notion of virtual GPUs as a way to bind a thread to a GPU automatically. This toolkit provides high performance MPI style point-to-point and collective communication, but more importantly, facilitates event-driven APIs to allow an application to be managed and executed by the toolkit at runtime.
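    The idea of binding a thread to a GPU automatically can be illustrated with a small sketch (purely conceptual; GMH itself manages CUDA devices under MPI ranks): each worker thread is lazily assigned a "virtual GPU" id, cycling round-robin over the physical devices on the host.

```python
import itertools
import threading

class VirtualGPUPool:
    # Hypothetical sketch: hand out virtual-GPU ids to threads on first use,
    # cycling over the physical devices available on the host.
    def __init__(self, num_devices):
        self._counter = itertools.count()
        self._num_devices = num_devices
        self._local = threading.local()

    def device_for_current_thread(self):
        # Thread-local caching makes the binding sticky: a thread keeps the
        # same virtual GPU for its lifetime.
        if not hasattr(self._local, "device"):
            self._local.device = next(self._counter) % self._num_devices
        return self._local.device

pool = VirtualGPUPool(num_devices=2)
assignments = []

def worker():
    assignments.append(pool.device_for_current_thread())

threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

    With two devices and four threads, the pool spreads the threads evenly, two per device, which mirrors the toolkit's mapping of a data-parallel thread group onto the GPUs of a single host.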

  1. Practitioner Toolkit: Working with Adult English Language Learners.

    ERIC Educational Resources Information Center

    Lieshoff, Sylvia Cobos; Aguilar, Noemi; McShane, Susan; Burt, Miriam; Peyton, Joy Kreeft; Terrill, Lynda; Van Duzer, Carol

    2004-01-01

    This document is designed to give support to adult education and family literacy instructors who are new to serving adult English language learners and their families in rural, urban, and faith- and community-based programs. The Toolkit is designed to have a positive impact on the teaching and learning in these programs. The results of two…

  2. Using Toolkits to Achieve STEM Enterprise Learning Outcomes

    ERIC Educational Resources Information Center

    Watts, Carys A.; Wray, Katie

    2012-01-01

    Purpose: The purpose of this paper is to evaluate the effectiveness of using several commercial tools in science, technology, engineering and maths (STEM) subjects for enterprise education at Newcastle University, UK. Design/methodology/approach: The paper provides an overview of existing toolkit use in higher education, before reviewing where and…

  3. Dataset of aggregate producers in New Mexico

    USGS Publications Warehouse

    Orris, Greta J.

    2000-01-01

    This report presents data, including latitude and longitude, for aggregate sites in New Mexico that were believed to be active in the period 1997-1999. The data are presented in paper form in Part A of this report and as Microsoft Excel 97 and Data Interchange Format (DIF) files in Part B. The work was undertaken as part of the effort to update information for the National Atlas. This compilation includes data from: the files of the U.S. Geological Survey (USGS); company contacts; the New Mexico Bureau of Mines and Mineral Resources, New Mexico Bureau of Mine Inspection, and the Mining and Minerals Division of the New Mexico Energy, Minerals and Natural Resources Department (Hatton and others, 1998); the Bureau of Land Management; and direct communications with some of the aggregate operators. Additional information on most of the sites is available in Hatton and others (1998).

  4. Platelet aggregation test

    MedlinePlus

... this page: //medlineplus.gov/ency/article/003669.htm The platelet aggregation blood test checks how well platelets , a ...

  5. Thermodynamics of Protein Aggregation

    NASA Astrophysics Data System (ADS)

    Osborne, Kenneth L.; Barz, Bogdan; Bachmann, Michael; Strodel, Birgit

Amyloid protein aggregation characterizes many neurodegenerative disorders, including Alzheimer's, Parkinson's, and Creutzfeldt-Jakob disease. Evidence suggests that amyloid aggregates may share similar aggregation pathways, implying simulation of full-length amyloid proteins is not necessary for understanding amyloid formation. In this study we simulate GNNQQNY, the N-terminal prion-determining domain of the yeast protein Sup35, to investigate the thermodynamics of structural transitions during aggregation. We use a coarse-grained model with replica-exchange molecular dynamics to investigate the association of 3-, 6-, and 12-chain GNNQQNY systems and we determine the aggregation pathway by studying aggregation states of GNNQQNY. We find that the aggregation of the hydrophilic GNNQQNY sequence is mainly driven by H-bond formation, leading to the formation of β-sheets from the very beginning of the assembly process. Condensation (aggregation) and ordering take place simultaneously, which is underpinned by the occurrence of a single heat capacity peak only.
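The single heat-capacity peak cited above is the standard canonical-ensemble fluctuation estimator evaluated at each replica temperature. A minimal sketch (not the authors' coarse-grained code; names are illustrative):

```python
import numpy as np

def heat_capacity(energies, T, kB=1.0):
    """Heat capacity from energy fluctuations sampled at temperature T:
    C_V = (<E^2> - <E>^2) / (kB * T^2).
    Applied per replica temperature in an REMD run, a single peak in
    C_V(T) signals that condensation and ordering coincide."""
    E = np.asarray(energies, dtype=float)
    return (np.mean(E ** 2) - np.mean(E) ** 2) / (kB * T ** 2)
```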

  6. Platelet aggregation test

    MedlinePlus

The platelet aggregation blood test checks how well platelets, a part of blood, clump together and cause blood to clot. ... Decreased platelet aggregation may be due to: Autoimmune ... Fibrin degradation products Inherited platelet function defects ...

  7. Dissemination of Earth Remote Sensing Data for Use in the NOAA/NWS Damage Assessment Toolkit

    NASA Technical Reports Server (NTRS)

    Molthan, Andrew; Burks, Jason; Camp, Parks; McGrath, Kevin; Bell, Jordan

    2015-01-01

The National Weather Service has developed the Damage Assessment Toolkit (DAT), an application for smartphones and tablets that allows for the collection, geolocation, and aggregation of various damage indicators that are collected during storm surveys. The DAT supports the often labor-intensive process where meteorologists venture into the storm-affected area, allowing them to acquire geotagged photos of the observed damage while also assigning estimated EF-scale categories based upon their observations. Once the data are collected, the DAT infrastructure aggregates the observations into a server that allows other meteorologists to perform quality control and other analysis steps before completing their survey and making the resulting data available to the public. In addition to in-person observations, Earth remote sensing from operational, polar-orbiting satellites can support the damage assessment process by identifying portions of damage tracks that may be missed due to road limitations, access to private property, or time constraints. Products resulting from change detection techniques can identify damage to vegetation and the land surface, aiding in the survey process. In addition, higher-resolution commercial imagery can corroborate ground-based surveys. As part of an ongoing collaboration, NASA and NOAA are working to integrate near real-time Earth remote sensing observations into the NOAA/NWS Damage Assessment Toolkit. This presentation will highlight recent developments in a streamlined approach for disseminating Earth remote sensing data via web mapping services and a new menu interface that has been integrated within the DAT. A review of current and future products will be provided, including products derived from MODIS and VIIRS for preliminary track identification, along with conduits for higher-resolution Landsat, ASTER, and commercial imagery as they become available. In addition to tornado damage

  8. Geological hazards: from early warning systems to public health toolkits.

    PubMed

    Samarasundera, Edgar; Hansell, Anna; Leibovici, Didier; Horwell, Claire J; Anand, Suchith; Oppenheimer, Clive

    2014-11-01

    Extreme geological events, such as earthquakes, are a significant global concern and sometimes their consequences can be devastating. Geographic information plays a critical role in health protection regarding hazards, and there are a range of initiatives using geographic information to communicate risk as well as to support early warning systems operated by geologists. Nevertheless we consider there to remain shortfalls in translating information on extreme geological events into health protection tools, and suggest that social scientists have an important role to play in aiding the development of a new generation of toolkits aimed at public health practitioners. This viewpoint piece reviews the state of the art in this domain and proposes potential contributions different stakeholder groups, including social scientists, could bring to the development of new toolkits. PMID:25255167

  9. Developing Mixed Reality Educational Applications: The Virtual Touch Toolkit.

    PubMed

    Mateu, Juan; Lasala, María José; Alamán, Xavier

    2015-01-01

    In this paper, we present Virtual Touch, a toolkit that allows the development of educational activities through a mixed reality environment such that, using various tangible elements, the interconnection of a virtual world with the real world is enabled. The main goal of Virtual Touch is to facilitate the installation, configuration and programming of different types of technologies, abstracting the creator of educational applications from the technical details involving the use of tangible interfaces and virtual worlds. Therefore, it is specially designed to enable teachers to themselves create educational activities for their students in a simple way, taking into account that teachers generally lack advanced knowledge in computer programming and electronics. The toolkit has been used to develop various educational applications that have been tested in two secondary education high schools in Spain. PMID:26334275

  10. Toolkit of automated database creation and cross-match

    NASA Astrophysics Data System (ADS)

    Zhang, Yanxia; Zheng, Hongwen; Pei, Tong; Zhao, Yongheng

    2012-09-01

Astronomy has stepped into a full-wavelength, data-avalanche era: astronomical data are measured in terabytes, even petabytes. How to store, manage, and analyze such massive data is an important issue in astronomy. To free astronomers from the data-processing burden and let them concentrate on science, VO projects have developed various valuable and convenient tools (e.g. Aladin, VOSpec, VOPlot). To meet this requirement, we developed a toolkit for automated database creation, automated database index creation, and cross-matching. The toolkit provides a good interface for users. Cross-match tasks may be run between local databases, between remote databases, or between a local and a remote database. Large-scale cross-matches are also easily achieved, and the speed of large-scale cross-matching is rather satisfactory.
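A brute-force version of the positional cross-match the abstract describes might look as follows (a sketch only; the toolkit's actual interface and indexing scheme are not given in the abstract, and a production tool would use a spatial index such as HTM or HEALPix):

```python
import numpy as np

def cross_match(cat1, cat2, radius_deg):
    """Match each source in cat1 (N x 2 array of RA, Dec in degrees)
    to its nearest source in cat2 within radius_deg.
    Returns (index1, index2, separation_deg) tuples."""
    ra1, dec1 = np.radians(np.asarray(cat1, float)).T
    ra2, dec2 = np.radians(np.asarray(cat2, float)).T
    matches = []
    for i in range(len(ra1)):
        # Great-circle separation via the spherical law of cosines
        cos_sep = (np.sin(dec1[i]) * np.sin(dec2) +
                   np.cos(dec1[i]) * np.cos(dec2) * np.cos(ra1[i] - ra2))
        sep = np.degrees(np.arccos(np.clip(cos_sep, -1.0, 1.0)))
        j = int(np.argmin(sep))
        if sep[j] <= radius_deg:
            matches.append((i, j, float(sep[j])))
    return matches
```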

  11. Developing Mixed Reality Educational Applications: The Virtual Touch Toolkit

    PubMed Central

    Mateu, Juan; Lasala, María José; Alamán, Xavier

    2015-01-01

    In this paper, we present Virtual Touch, a toolkit that allows the development of educational activities through a mixed reality environment such that, using various tangible elements, the interconnection of a virtual world with the real world is enabled. The main goal of Virtual Touch is to facilitate the installation, configuration and programming of different types of technologies, abstracting the creator of educational applications from the technical details involving the use of tangible interfaces and virtual worlds. Therefore, it is specially designed to enable teachers to themselves create educational activities for their students in a simple way, taking into account that teachers generally lack advanced knowledge in computer programming and electronics. The toolkit has been used to develop various educational applications that have been tested in two secondary education high schools in Spain. PMID:26334275

  12. Monitoring the grid with the globus toolkit MDS4.

    SciTech Connect

    Schopf, J. M.; Pearlman, L.; Miller, N.; Kesselman, C.; Foster, I.; D'Arcy, M.; Chervenak, A.; Mathematics and Computer Science; Univ. of Chicago; Univ. of Southern California; Univ. of Edinburgh

    2006-01-01

    The Globus Toolkit Monitoring and Discovery System (MDS4) defines and implements mechanisms for service and resource discovery and monitoring in distributed environments. MDS4 is distinguished from previous similar systems by its extensive use of interfaces and behaviors defined in the WS-Resource Framework and WS-Notification specifications, and by its deep integration into essentially every component of the Globus Toolkit. We describe the MDS4 architecture and the Web service interfaces and behaviors that allow users to discover resources and services, monitor resource and service states, receive updates on current status, and visualize monitoring results. We present two current deployments to provide insights into the functionality that can be achieved via the use of these mechanisms.

13. Pervasive Collaborative Computing Environment Jabber Toolkit

    Energy Science and Technology Software Center (ESTSC)

    2004-05-15

PCCE Project background: Our experience in building distributed collaboratories has shown us that there is a growing need for simple, non-intrusive, and flexible ways to stay in touch and work together. Towards this goal we are developing a Pervasive Collaborative Computing Environment (PCCE) within which participants can rendezvous and interact with each other. The PCCE aims to support continuous or ad hoc collaboration, target daily tasks and base connectivity, be easy to use and install across multiple platforms, leverage off of existing components when possible, use standards-based components, and leverage off of Grid services (e.g., security and directory services). A key concept for this work is "incremental trust", which allows the system's "trust" of a given user to change dynamically. PCCE Jabber client software: This leverages Jabber, an open Instant Messaging (IM) protocol, and the related Internet Engineering Task Force (IETF) standards "XMPP" and "XMPP-IM" to allow collaborating parties to chat either one-on-one or in "chat rooms". Standard Jabber clients will work within this framework, but the software will also include extensions to a (multi-platform) GUI client (Gaim) for X.509-based security, search, and incremental trust. This software also includes Web interfaces for managing user registration to a Jabber server. PCCE Jabber server software: Extensions to the code, database, and configuration files for the dominant open-source Jabber server, "jabberd". Extensions for search, X.509 security, and incremental trust. Note that the jabberd software is not included as part of this software.

14. Pervasive Collaborative Computing Environment Jabber Toolkit

    SciTech Connect

    Gunter, Dan; Lee, Jason

    2004-05-15

PCCE Project background: Our experience in building distributed collaboratories has shown us that there is a growing need for simple, non-intrusive, and flexible ways to stay in touch and work together. Towards this goal we are developing a Pervasive Collaborative Computing Environment (PCCE) within which participants can rendezvous and interact with each other. The PCCE aims to support continuous or ad hoc collaboration, target daily tasks and base connectivity, be easy to use and install across multiple platforms, leverage off of existing components when possible, use standards-based components, and leverage off of Grid services (e.g., security and directory services). A key concept for this work is "incremental trust", which allows the system's "trust" of a given user to change dynamically. PCCE Jabber client software: This leverages Jabber, an open Instant Messaging (IM) protocol, and the related Internet Engineering Task Force (IETF) standards "XMPP" and "XMPP-IM" to allow collaborating parties to chat either one-on-one or in "chat rooms". Standard Jabber clients will work within this framework, but the software will also include extensions to a (multi-platform) GUI client (Gaim) for X.509-based security, search, and incremental trust. This software also includes Web interfaces for managing user registration to a Jabber server. PCCE Jabber server software: Extensions to the code, database, and configuration files for the dominant open-source Jabber server, "jabberd". Extensions for search, X.509 security, and incremental trust. Note that the jabberd software is not included as part of this software.

  15. Bayesian Analysis Toolkit: 1.0 and beyond

    NASA Astrophysics Data System (ADS)

    Beaujean, Frederik; Caldwell, Allen; Greenwald, D.; Kluth, S.; Kröninger, Kevin; Schulz, O.

    2015-12-01

    The Bayesian Analysis Toolkit is a C++ package centered around Markov-chain Monte Carlo sampling. It is used in high-energy physics analyses by experimentalists and theorists alike. The software has matured over the last few years. We present new features to enter version 1.0, then summarize some of the software-engineering lessons learned and give an outlook on future versions.

  16. A medical imaging and visualization toolkit in Java.

    PubMed

    Huang, Su; Baimouratov, Rafail; Xiao, Pengdong; Ananthasubramaniam, Anand; Nowinski, Wieslaw L

    2006-03-01

    Medical imaging research and clinical applications usually require combination and integration of various techniques ranging from image processing and analysis to realistic visualization to user-friendly interaction. Researchers with different backgrounds coming from diverse areas have been using numerous types of hardware, software, and environments to obtain their results. We also observe that students often build their tools from scratch resulting in redundant work. A generic and flexible medical imaging and visualization toolkit would be helpful in medical research and educational institutes to reduce redundant development work and hence increase research efficiency. This paper presents our experience in developing a Medical Imaging and Visualization Toolkit (BIL-kit) that is a set of comprehensive libraries as well as a number of interactive tools. The BIL-kit covers a wide range of fundamental functions from image conversion and transformation, image segmentation, and analysis to geometric model generation and manipulation, all the way up to 3D visualization and interactive simulation. The toolkit design and implementation emphasize the reusability and flexibility. BIL-kit is implemented in the Java language so that it works in hybrid and dynamic research and educational environments. This also allows the toolkit to extend its usage for the development of Web-based applications. Several BIL-kit-based tools and applications are presented including image converter, image processor, general anatomy model simulator, vascular modeling environment, and volume viewer. BIL-kit is a suitable platform for researchers and students to develop visualization and simulation prototypes, and it can also be used for the development of clinical applications. PMID:16323064

  17. MAVEN IDL Toolkit: Integrated Data Access and Visualization

    NASA Astrophysics Data System (ADS)

    Larsen, K. W.; Martin, J.; De Wolfe, A. W.; Brain, D. A.

    2014-12-01

The Mars Atmosphere and Volatile EvolutioN (MAVEN) mission has arrived at Mars and begun its investigations into the state of the upper atmosphere. Because atmospheric processes are subject to a wide variety of internal and external variables, understanding the overall forces driving the composition, structure, and dynamics requires an integrated analysis from all the available data. Eight instruments on the spacecraft are collecting in-situ and remote sensing data on the fields and particles, neutral and ionized, that make up Mars' upper atmosphere. As the scientific questions MAVEN is designed to answer require an understanding of the data from multiple instruments, the project has designed a software toolkit to facilitate the access, analysis, and visualization of the disparate data. The toolkit provides mission scientists with easy access to the data from within the IDL environment, designed to ensure that users are always working with the most recent data available and to eliminate the usual difficulties of combining data from a variety of sources and formats. The Toolkit also includes 1-D, 2-D, and interactive 3-D visualizations to enable the scientists to examine the inter-relations between data from all instruments, as well as from external models. The data and visualizations have been designed to be as flexible and extensible as possible, allowing the scientists to rapidly and easily examine and manipulate the data in the context of the mission and their wider research programs.

  18. On combining computational differentiation and toolkits for parallel scientific computing.

    SciTech Connect

    Bischof, C. H.; Buecker, H. M.; Hovland, P. D.

    2000-06-08

Automatic differentiation is a powerful technique for evaluating derivatives of functions given in the form of a high-level programming language such as Fortran, C, or C++. The program is treated as a potentially very long sequence of elementary statements to which the chain rule of differential calculus is applied over and over again. Combining automatic differentiation and the organizational structure of toolkits for parallel scientific computing provides a mechanism for evaluating derivatives by exploiting mathematical insight on a higher level. In these toolkits, algorithmic structures such as BLAS-like operations, linear and nonlinear solvers, or integrators for ordinary differential equations can be identified by their standardized interfaces and recognized as high-level mathematical objects rather than as a sequence of elementary statements. In this note, the differentiation of a linear solver with respect to some parameter vector is taken as an example. Mathematical insight is used to reformulate this problem into the solution of multiple linear systems that share the same coefficient matrix but differ in their right-hand sides. The experiments reported here use ADIC, a tool for the automatic differentiation of C programs, and PETSc, an object-oriented toolkit for the parallel solution of scientific problems modeled by partial differential equations.
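The reformulation described above comes from differentiating A(p) x(p) = b(p), which yields one linear system per parameter, all sharing the coefficient matrix A. A minimal dense sketch (illustrative only; ADIC and PETSc expose their own interfaces):

```python
import numpy as np

def solver_derivatives(A, b, dA, db):
    """Given A x = b, return x and dx/dp_k for each parameter p_k.
    Differentiating A(p) x(p) = b(p) gives
        A (dx/dp_k) = db/dp_k - (dA/dp_k) x,
    so a single factorization of A serves every right-hand side.
    dA and db are lists of dA/dp_k matrices and db/dp_k vectors."""
    x = np.linalg.solve(A, b)
    # Assemble all right-hand sides as columns and solve them together.
    rhs = np.stack([db_k - dA_k @ x for dA_k, db_k in zip(dA, db)], axis=1)
    dx = np.linalg.solve(A, rhs)   # column k holds dx/dp_k
    return x, dx
```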

  19. Guide to Using the WIND Toolkit Validation Code

    SciTech Connect

    Lieberman-Cribbin, W.; Draxl, C.; Clifton, A.

    2014-12-01

    In response to the U.S. Department of Energy's goal of using 20% wind energy by 2030, the Wind Integration National Dataset (WIND) Toolkit was created to provide information on wind speed, wind direction, temperature, surface air pressure, and air density on more than 126,000 locations across the United States from 2007 to 2013. The numerical weather prediction model output, gridded at 2-km and at a 5-minute resolution, was further converted to detail the wind power production time series of existing and potential wind facility sites. For users of the dataset it is important that the information presented in the WIND Toolkit is accurate and that errors are known, as then corrective steps can be taken. Therefore, we provide validation code written in R that will be made public to provide users with tools to validate data of their own locations. Validation is based on statistical analyses of wind speed, using error metrics such as bias, root-mean-square error, centered root-mean-square error, mean absolute error, and percent error. Plots of diurnal cycles, annual cycles, wind roses, histograms of wind speed, and quantile-quantile plots are created to visualize how well observational data compares to model data. Ideally, validation will confirm beneficial locations to utilize wind energy and encourage regional wind integration studies using the WIND Toolkit.
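The error metrics listed in the abstract can be sketched as follows (illustrative Python, not the published R validation code; function and variable names are assumptions):

```python
import numpy as np

def validation_metrics(obs, mod):
    """Compare modeled wind speeds (mod) against observations (obs)
    using bias, RMSE, centered RMSE, MAE, and percent error."""
    obs = np.asarray(obs, dtype=float)
    err = np.asarray(mod, dtype=float) - obs
    bias = err.mean()
    rmse = np.sqrt((err ** 2).mean())
    # Centered RMSE removes the bias component: crmse^2 + bias^2 = rmse^2
    crmse = np.sqrt(((err - bias) ** 2).mean())
    mae = np.abs(err).mean()
    pct = 100.0 * bias / obs.mean()
    return {"bias": bias, "rmse": rmse, "crmse": crmse,
            "mae": mae, "percent_error": pct}
```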

  20. Risk of Resource Failure and Toolkit Variation in Small-Scale Farmers and Herders

    PubMed Central

    Collard, Mark; Ruttle, April; Buchanan, Briggs; O’Brien, Michael J.

    2012-01-01

    Recent work suggests that global variation in toolkit structure among hunter-gatherers is driven by risk of resource failure such that as risk of resource failure increases, toolkits become more diverse and complex. Here we report a study in which we investigated whether the toolkits of small-scale farmers and herders are influenced by risk of resource failure in the same way. In the study, we applied simple linear and multiple regression analysis to data from 45 small-scale food-producing groups to test the risk hypothesis. Our results were not consistent with the hypothesis; none of the risk variables we examined had a significant impact on toolkit diversity or on toolkit complexity. It appears, therefore, that the drivers of toolkit structure differ between hunter-gatherers and small-scale food-producers. PMID:22844421

  1. Matlab based Toolkits used to Interface with Optical Design Software for NASA's James Webb Space Telescope

    NASA Technical Reports Server (NTRS)

    Howard, Joseph

    2007-01-01

    The viewgraph presentation provides an introduction to the James Webb Space Telescope (JWST). The first part provides a brief overview of Matlab toolkits including CodeV, OSLO, and Zemax Toolkits. The toolkit overview examines purpose, layout, how Matlab gets data from CodeV, function layout, and using cvHELP. The second part provides examples of use with JWST, including wavefront sensitivities and alignment simulations.

  2. GMinterp, A Matlab Based Toolkit for Gravity and Magnetic Data Analysis: Example Application to the Airborne Magnetic Anomalies of Biga Peninsula, NW Turkey

    NASA Astrophysics Data System (ADS)

    Ekinci, Y. L.; Yiǧitbaş, E.

    2012-04-01

The analysis of gravity and magnetic field data is becoming increasingly significant for the earth sciences as a whole: these potential field methods efficiently assist in working out both shallow and deep geologic problems and play an important role in modeling and interpretation procedures. The main advantage of some gravity and magnetic data processing techniques is to reveal subtle details in the data that are not clearly identified in anomaly maps, without requiring any prior information about the nature of the source bodies. If the data quality permits, many analysis techniques can be applied that help build a general understanding of the details and parameters of the shallower or deeper causative body distributions, such as depth, thickness, and lateral and vertical extent. Gravity and magnetic field data are usually analyzed by means of analytic signal (via directional derivatives) methods, linear transformations, regional and residual anomaly separation techniques, spectral methods, filtering, and forward and inverse modeling techniques. Some commercial software packages are commonly used for analyzing potential field data by employing some of the techniques specified above. Additionally, many freeware and open-source codes can be found in the literature, but unfortunately they focus on special issues of the potential fields. In this study, a toolkit that performs numerous interpretation and modeling techniques for potential field data is presented. The toolkit, named GMinterp, is MATLAB-based, consisting of a series of linked functions along with a graphical user interface (GUI). GMinterp performs complex processing such as transformations and filtering, editing, gridding, mapping, digitizing, extracting cross-sections, forward and inverse modeling, and interpretation tasks. The toolkit accepts both profile and gridded data as input files. Tests on the theoretically produced data showed the reliability of

  3. Practical Aspects of the Cellular Force Inference Toolkit (CellFIT)

    PubMed Central

    Veldhuis, Jim H.; Mashburn, David; Hutson, M. Shane; Brodland, G. Wayne

    2016-01-01

    If we are to fully understand the reasons that cells and tissues move and acquire their distinctive geometries during processes such as embryogenesis and wound healing, we will need detailed maps of the forces involved. One of the best current prospects for obtaining this information is force-from-images techniques such as CellFIT, the Cellular Force Inference Toolkit, whose various steps are discussed here. Like other current quasi-static approaches, this one assumes that cell shapes are produced by interactions between interfacial tensions and intracellular pressures. CellFIT, however, allows cells to have curvilinear boundaries, which can significantly improve inference accuracy and reduce noise sensitivity. The quality of a CellFIT analysis depends on how accurately the junction angles and edge curvatures are measured, and a software tool we describe facilitates determination and evaluation of this information. Special attention is required when edges are crenulated or significantly different in shape from a circular arc. Because the tension and pressure equations are overdetermined, a select number of edges can be removed from the analysis, and these might include edges that are poorly defined in the source image, too short to provide accurate angles or curvatures, or non-circular. The approach works well for aggregates with as many as 1000 cells, and introduced errors have significant effects on only a few adjacent cells. An understanding of these considerations will help CellFIT users to get the most out of this promising new technique. PMID:25640437
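The overdetermined tension equations described above can be illustrated with a toy least-squares solve: at each junction the unit tangents t_i of the meeting edges must satisfy Σ γ_i t_i = 0, and the homogeneous system is closed with a normalization such as mean(γ) = 1. This is a sketch under the abstract's quasi-static assumptions, not the CellFIT API:

```python
import numpy as np

def infer_tensions(junction_tangents, n_edges):
    """Infer edge tensions gamma from force balance at junctions.
    junction_tangents: list (one entry per junction) of lists of
    (edge_index, (tx, ty)) pairs, tangents pointing away from the junction.
    Stacks the x and y balance rows for every junction into G gamma = 0,
    adds the normalization mean(gamma) = 1, and solves by least squares."""
    rows = []
    for edges in junction_tangents:
        for comp in (0, 1):                  # x and y balance equations
            row = np.zeros(n_edges)
            for e, t in edges:
                row[e] = t[comp]
            rows.append(row)
    rows.append(np.ones(n_edges) / n_edges)  # normalization row
    G = np.vstack(rows)
    rhs = np.zeros(len(rows))
    rhs[-1] = 1.0
    gamma, *_ = np.linalg.lstsq(G, rhs, rcond=None)
    return gamma
```

For a symmetric triple junction with edges meeting at 120 degrees, the inferred tensions come out equal, as the force balance requires.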

  4. The GBIF Integrated Publishing Toolkit: Facilitating the Efficient Publishing of Biodiversity Data on the Internet

    PubMed Central

    Robertson, Tim; Döring, Markus; Guralnick, Robert; Bloom, David; Wieczorek, John; Braak, Kyle; Otegui, Javier; Russell, Laura; Desmet, Peter

    2014-01-01

    The planet is experiencing an ongoing global biodiversity crisis. Measuring the magnitude and rate of change more effectively requires access to organized, easily discoverable, and digitally-formatted biodiversity data, both legacy and new, from across the globe. Assembling this coherent digital representation of biodiversity requires the integration of data that have historically been analog, dispersed, and heterogeneous. The Integrated Publishing Toolkit (IPT) is a software package developed to support biodiversity dataset publication in a common format. The IPT’s two primary functions are to 1) encode existing species occurrence datasets and checklists, such as records from natural history collections or observations, in the Darwin Core standard to enhance interoperability of data, and 2) publish and archive data and metadata for broad use in a Darwin Core Archive, a set of files following a standard format. Here we discuss the key need for the IPT, how it has developed in response to community input, and how it continues to evolve to streamline and enhance the interoperability, discoverability, and mobilization of new data types beyond basic Darwin Core records. We close with a discussion how IPT has impacted the biodiversity research community, how it enhances data publishing in more traditional journal venues, along with new features implemented in the latest version of the IPT, and future plans for more enhancements. PMID:25099149

  5. PAT: a protein analysis toolkit for integrated biocomputing on the web

    PubMed Central

    Gracy, Jérôme; Chiche, Laurent

    2005-01-01

    PAT, for Protein Analysis Toolkit, is an integrated biocomputing server. The main goal of its design was to facilitate the combination of different processing tools for complex protein analyses and to simplify the automation of repetitive tasks. The PAT server provides a standardized web interface to a wide range of protein analysis tools. It is designed as a streamlined analysis environment that implements many features which strongly simplify studies dealing with protein sequences and structures and improve productivity. PAT is able to read and write data in many bioinformatics formats and to create any desired pipeline by seamlessly sending the output of a tool to the input of another tool. PAT can retrieve protein entries from identifier-based queries by using pre-computed database indexes. Users can easily formulate complex queries combining different analysis tools with few mouse clicks, or via a dedicated macro language, and a web session manager provides direct access to any temporary file generated during the user session. PAT is freely accessible on the Internet at . PMID:15980554
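The tool-chaining pattern PAT exposes, seamlessly sending the output of one tool to the input of another, reduces to function composition (tool names here are placeholders, not PAT's actual macro language):

```python
def pipeline(data, *tools):
    """Run data through a sequence of analysis steps, feeding each
    tool's output into the next tool's input."""
    for tool in tools:
        data = tool(data)
    return data

# Toy "tools": uppercase a sequence, then reverse it.
result = pipeline("acgt", str.upper, lambda s: s[::-1])
```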

  6. FASTAptamer: A Bioinformatic Toolkit for High-throughput Sequence Analysis of Combinatorial Selections

    PubMed Central

    Alam, Khalid K; Chang, Jonathan L; Burke, Donald H

    2015-01-01

    High-throughput sequence (HTS) analysis of combinatorial selection populations accelerates lead discovery and optimization and offers dynamic insight into selection processes. An underlying principle is that selection enriches high-fitness sequences as a fraction of the population, whereas low-fitness sequences are depleted. HTS analysis readily provides the requisite numerical information by tracking the evolutionary trajectory of individual sequences in response to selection pressures. Unlike genomic data, for which a number of software solutions exist, user-friendly tools are not readily available for the combinatorial selections field, leading many users to create custom software. FASTAptamer was designed to address the sequence-level analysis needs of the field. The open source FASTAptamer toolkit counts, normalizes and ranks read counts in a FASTQ file, compares populations for sequence distribution, generates clusters of sequence families, calculates fold-enrichment of sequences throughout the course of a selection and searches for degenerate sequence motifs. While originally designed for aptamer selections, FASTAptamer can be applied to any selection strategy that can utilize next-generation DNA sequencing, such as ribozyme or deoxyribozyme selections, in vivo mutagenesis and various surface display technologies (peptide, antibody fragment, mRNA, etc.). FASTAptamer software, sample data and a user's guide are available for download at http://burkelab.missouri.edu/fastaptamer.html. PMID:25734917
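The counting, normalization (reads per million), and fold-enrichment steps described above can be sketched as follows (illustrative; the real FASTAptamer scripts use their own output format, and these function names are assumptions):

```python
from collections import Counter

def count_and_rank(fastq_text):
    """Count unique sequences in FASTQ text, normalize each count to
    reads per million (RPM), and rank by abundance.
    Returns (rank, sequence, count, rpm) tuples, most abundant first."""
    lines = fastq_text.strip().splitlines()
    # In FASTQ, the sequence is the second line of each 4-line record.
    seqs = [lines[i + 1] for i in range(0, len(lines), 4)]
    total = len(seqs)
    ranked = Counter(seqs).most_common()
    return [(rank + 1, seq, n, 1e6 * n / total)
            for rank, (seq, n) in enumerate(ranked)]

def fold_enrichment(rpm_later, rpm_earlier):
    """Fold enrichment of a sequence between two selection rounds,
    computed from its reads-per-million values."""
    return rpm_later / rpm_earlier
```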

  7. molSimplify: A toolkit for automating discovery in inorganic chemistry.

    PubMed

    Ioannidis, Efthymios I; Gani, Terry Z H; Kulik, Heather J

    2016-08-15

    We present an automated, open source toolkit for the first-principles screening and discovery of new inorganic molecules and intermolecular complexes. Challenges remain in the automatic generation of candidate inorganic molecule structures due to the high variability in coordination and bonding, which we overcome through a divide-and-conquer tactic that flexibly combines force-field preoptimization of organic fragments with alignment to first-principles-trained metal-ligand distances. Exploration of chemical space is enabled through random generation of ligands and intermolecular complexes from large chemical databases. We validate the generated structures with the root mean squared (RMS) gradients evaluated from density functional theory (DFT), which are around 0.02 Ha/au across a large 150 molecule test set. Comparison of molSimplify results to full optimization with the universal force field reveals that RMS DFT gradients are improved by 40%. Seamless generation of input files, preparation and execution of electronic structure calculations, and post-processing for each generated structure aids interpretation of underlying chemical and energetic trends. © 2016 Wiley Periodicals, Inc. PMID:27364957

  8. The GBIF integrated publishing toolkit: facilitating the efficient publishing of biodiversity data on the internet.

    PubMed

    Robertson, Tim; Döring, Markus; Guralnick, Robert; Bloom, David; Wieczorek, John; Braak, Kyle; Otegui, Javier; Russell, Laura; Desmet, Peter

    2014-01-01

The planet is experiencing an ongoing global biodiversity crisis. Measuring the magnitude and rate of change more effectively requires access to organized, easily discoverable, and digitally-formatted biodiversity data, both legacy and new, from across the globe. Assembling this coherent digital representation of biodiversity requires the integration of data that have historically been analog, dispersed, and heterogeneous. The Integrated Publishing Toolkit (IPT) is a software package developed to support biodiversity dataset publication in a common format. The IPT's two primary functions are to 1) encode existing species occurrence datasets and checklists, such as records from natural history collections or observations, in the Darwin Core standard to enhance interoperability of data, and 2) publish and archive data and metadata for broad use in a Darwin Core Archive, a set of files following a standard format. Here we discuss the key need for the IPT, how it has developed in response to community input, and how it continues to evolve to streamline and enhance the interoperability, discoverability, and mobilization of new data types beyond basic Darwin Core records. We close with a discussion of how the IPT has impacted the biodiversity research community and how it enhances data publishing in more traditional journal venues, along with new features implemented in the latest version of the IPT and future plans for further enhancements. PMID:25099149
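At its core, the Darwin Core Archive mentioned above bundles a delimited data table, whose columns are standard Darwin Core terms, with XML descriptors (meta.xml and EML metadata). As a rough sketch of the core table only, using a small, illustrative subset of real Darwin Core terms:

```python
import csv
import io

# A small subset of standard Darwin Core terms, used as column headers.
DWC_FIELDS = ["occurrenceID", "basisOfRecord", "scientificName",
              "eventDate", "decimalLatitude", "decimalLongitude"]

def to_darwin_core(records):
    """Render occurrence records as a tab-delimited core data file.

    A real Darwin Core Archive zips this table together with a meta.xml
    descriptor and EML metadata; only the table is sketched here.
    """
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=DWC_FIELDS, delimiter="\t")
    writer.writeheader()
    for rec in records:
        writer.writerow({f: rec.get(f, "") for f in DWC_FIELDS})
    return buf.getvalue()
```

Encoding heterogeneous source data against a fixed term vocabulary like this is what makes datasets from different institutions interoperable downstream.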

  9. Aggregations in Flatworms.

    ERIC Educational Resources Information Center

    Liffen, C. L.; Hunter, M.

    1980-01-01

Described is a school project to investigate aggregations in flatworms, which may be influenced by light intensity, temperature, and some form of chemical stimulus released by already-aggregating flatworms. Such investigations could be adapted to suit many educational levels of science laboratory activities. (DS)

  10. Using the PhenX Toolkit to Add Standard Measures to a Study.

    PubMed

    Hendershot, Tabitha; Pan, Huaqin; Haines, Jonathan; Harlan, William R; Marazita, Mary L; McCarty, Catherine A; Ramos, Erin M; Hamilton, Carol M

    2015-01-01

    The PhenX (consensus measures for Phenotypes and eXposures) Toolkit (https://www.phenxtoolkit.org/) offers high-quality, well-established measures of phenotypes and exposures for use by the scientific community. The goal is to promote the use of standard measures, enhance data interoperability, and help investigators identify opportunities for collaborative and translational research. The Toolkit contains 395 measures drawn from 22 research domains (fields of research), along with additional collections of measures for Substance Abuse and Addiction (SAA) research, Mental Health Research (MHR), and Tobacco Regulatory Research (TRR). Additional measures for TRR that are expected to be released in 2015 include Obesity, Eating Disorders, and Sickle Cell Disease. Measures are selected by working groups of domain experts using a consensus process that includes input from the scientific community. The Toolkit provides a description of each PhenX measure, the rationale for including it in the Toolkit, protocol(s) for collecting the measure, and supporting documentation. Users can browse measures in the Toolkit or can search the Toolkit using the Smart Query Tool or a full text search. PhenX Toolkit users select measures of interest to add to their Toolkit. Registered Toolkit users can save their Toolkit and return to it later to revise or complete. They then have options to download a customized Data Collection Worksheet that specifies the data to be collected, and a Data Dictionary that describes each variable included in the Data Collection Worksheet. The Toolkit also has a Register Your Study feature that facilitates cross-study collaboration by allowing users to find other investigators using the same PhenX measures. PMID:26132000

  11. Census of Population and Housing, 1980: Summary Tape File 1F, School Districts. Technical Documentation.

    ERIC Educational Resources Information Center

    Bureau of the Census (DOC), Washington, DC. Data User Services Div.

    This report provides technical documentation associated with a 1980 Census of Population and Housing Summary Tape File 1F--the School Districts File. The file contains complete-count data of population and housing aggregated by school district. Population items tabulated include age, race (provisional data), sex, marital status, Spanish origin…

  12. 11 CFR 104.5 - Filing dates (2 U.S.C. 434(a)(2)).

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... campaign committee of a candidate for President shall file reports on the dates specified at 11 CFR 104.5(b... committee filing under 11 CFR 104.5(b)(1)(ii) receives contributions aggregating $100,000 or makes... be waived if under 11 CFR 104.5(c)(1)(ii) a pre-election report is required to be filed during...

  13. Cross-File Searching: How Vendors Help--And Don't Help--Improve Compatibility.

    ERIC Educational Resources Information Center

    Milstead, Jessica L.

    1999-01-01

    Reports how a cross-section of database producers, search services, and aggregators are using vocabulary management to facilitate cross-file searching. Discusses the range of subject areas and audiences; indexing; vocabulary control within databases; machine aids to indexing; and aids to cross-file searching. A chart contains sources of files and…

  14. Adoption of Test Driven Development and Continuous Integration for the Development of the Trick Simulation Toolkit

    NASA Technical Reports Server (NTRS)

    Penn, John M.

    2013-01-01

    This paper describes the adoption of a Test-Driven Development approach and a Continuous Integration system in the development of the Trick Simulation Toolkit, a generic simulation development environment for creating high fidelity training and engineering simulations at the NASA/Johnson Space Center and many other NASA facilities. It describes what was learned and the significant benefits seen, such as fast, thorough, and clear test feedback every time code is checked in to the code repository. It also describes a system that encourages development of code that is much more flexible, maintainable, and reliable. The Trick Simulation Toolkit development environment provides a common architecture for user-defined simulations. Trick builds executable simulations using user-supplied simulation-definition files (S_define) and user-supplied "model code". For each Trick-based simulation, Trick automatically provides job scheduling, checkpoint/restore, data recording, interactive variable manipulation (variable server), and an input processor. Also included are tools for plotting recorded data and various other supporting tools and libraries. Trick is written in C/C++ and Java and supports both Linux and Mac OS X. Prior to adopting this new development approach, Trick testing consisted primarily of running a few large simulations, with the hope that their complexity and scale would exercise most of Trick's code and expose any recently introduced bugs. Unsurprisingly, this approach yielded inconsistent results. It was obvious that a more systematic, thorough approach was required. After seeing examples of some Java-based projects that used the JUnit test framework, similar test frameworks for C and C++ were sought. Several were found, all clearly inspired by JUnit. Googletest, a freely available open-source testing framework, was selected as the most appropriate and capable. 
The new approach was implemented while rewriting the Trick memory management component, to eliminate a

  15. The advanced computational testing and simulation toolkit (ACTS)

    SciTech Connect

    Drummond, L.A.; Marques, O.

    2002-05-21

    During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Distinctively, a number of these are important scientific problems ranging in scale from the atomic to the cosmic. For example, ionization is a phenomenon as ubiquitous in modern society as the glow of fluorescent lights and the etching on silicon computer chips; but it was not until 1999 that researchers finally achieved a complete numerical solution to the simplest example of ionization, the collision of a hydrogen atom with an electron. On the opposite scale, cosmologists have long wondered whether the expansion of the Universe, which began with the Big Bang, would ever reverse itself, ending the Universe in a Big Crunch. In 2000, analysis of new measurements of the cosmic microwave background radiation showed that the geometry of the Universe is flat, and thus the Universe will continue expanding forever. Both of these discoveries depended on high performance computer simulations that utilized computational tools included in the Advanced Computational Testing and Simulation (ACTS) Toolkit. The ACTS Toolkit is an umbrella project that brought together a number of general purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools, which have been developed independently, mainly at DOE laboratories, make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. 
The ACTS Toolkit Project enables the use of these tools by a much wider community of computational scientists, and promotes code portability, reusability, reduction of duplicate efforts

  16. Developing Climate Resilience Toolkit Decision Support Training Section

    NASA Astrophysics Data System (ADS)

    Livezey, M. M.; Herring, D.; Keck, J.; Meyers, J. C.

    2014-12-01

    The Climate Resilience Toolkit (CRT) is a Federal government effort to address the U.S. President's Climate Action Plan and Executive Order for Climate Preparedness. The toolkit will provide access to tools and products useful for climate-sensitive decision making. To optimize the user experience, the toolkit will also provide access to training materials. The National Oceanic and Atmospheric Administration (NOAA) has been building a climate training capability for 15 years. The target audience for the training has historically been mainly NOAA staff, with some modified training programs for external users and stakeholders. NOAA is now using this climate training capacity for the CRT. To organize the CRT training section, we collaborated with the Association of Climate Change Officers to determine the best strategy and identified four additional complementary skills needed for successful decision making: climate literacy, environmental literacy, risk assessment and management, and strategic execution and monitoring. Developing the climate literacy skills requires knowledge of climate variability and change, as well as an introduction to the suite of available products and services. For the development of an environmental literacy category, specific topics needed include knowledge of climate impacts on specific environmental systems. Climate risk assessment and management introduces a process for decision making and provides knowledge on communication of climate information and integration of climate information in planning processes. The strategic execution and monitoring category provides information on use of NOAA climate products, services, and partnership opportunities for decision making. In order to use the existing training modules, it was necessary to assess their level of complexity, catalog them, and develop guidance for users on a curriculum to take advantage of the training resources to enhance their learning experience. 
With the development of this CRT

  17. The RAVE/VERTIGO vertex reconstruction toolkit and framework

    NASA Astrophysics Data System (ADS)

    Waltenberger, W.; Mitaroff, W.; Moser, F.; Pflugfelder, B.; Riedel, H. V.

    2008-07-01

    A detector-independent toolkit for vertex reconstruction (RAVE) is being developed, along with a standalone framework (VERTIGO) for testing, analyzing and debugging. The core algorithms represent state-of-the-art for geometric vertex finding and fitting by both linear (Kalman filter) and robust estimation methods. Main design goals are ease of use, flexibility for embedding into existing software frameworks, extensibility, and openness. The implementation is based on modern object-oriented techniques, is coded in C++ with interfaces for Java and Python, and follows an open-source approach. A beta release is available.

  18. Visualization of sphere packs using a dataflow toolkit.

    PubMed

    Walton, J

    1994-12-01

    We describe the construction of a simple application for the visualization of sphere packs, with applications to molecular graphics. Our development environment is IRIS Explorer, one of the new generation of so-called dataflow toolkits. We emphasize particularly the way in which working in such an environment facilitates the design and construction process, paying special attention to tools which aid the importing of data into the application, the design of the user interface, and the extension or modification of existing tools. Some examples of the use of the application in the field of molecular modeling are presented. PMID:7696218

  19. The Wind Integration National Dataset (WIND) toolkit (Presentation)

    SciTech Connect

    Draxl, Caroline (NREL)

    2014-01-01

    Regional wind integration studies require detailed wind power output data at many locations to perform simulations of how the power system will operate under high-penetration scenarios. The wind datasets that serve as inputs into the study must realistically reflect the ramping characteristics, spatial and temporal correlations, and capacity factors of the simulated wind plants, and be time synchronized with available load profiles. As described in this presentation, the WIND Toolkit fulfills these requirements by providing a state-of-the-art national (US) wind resource, power production and forecast dataset.

  20. TECA: A Parallel Toolkit for Extreme Climate Analysis

    SciTech Connect

    Prabhat, Mr; Ruebel, Oliver; Byna, Surendra; Wu, Kesheng; Li, Fuyu; Wehner, Michael; Bethel, E. Wes

    2012-03-12

    We present TECA, a parallel toolkit for detecting extreme events in large climate datasets. Modern climate datasets expose parallelism across a number of dimensions: spatial locations, timesteps and ensemble members. We design TECA to exploit these modes of parallelism and demonstrate a prototype implementation for detecting and tracking three classes of extreme events: tropical cyclones, extra-tropical cyclones and atmospheric rivers. We process a modern TB-sized CAM5 simulation dataset with TECA, and demonstrate good runtime performance for the three case studies.

  1. Charon Toolkit for Parallel, Implicit Structured-Grid Computations: Functional Design

    NASA Technical Reports Server (NTRS)

    VanderWijngaart, Rob F.; Kutler, Paul (Technical Monitor)

    1997-01-01

    Charon is a software toolkit that enables engineers to develop high-performing message-passing programs in a convenient and piecemeal fashion. Emphasis is on rapid program development and prototyping. In this report a detailed description of the functional design of the toolkit is presented. It is illustrated by the stepwise parallelization of two representative code examples.

  2. Language Access Toolkit: An Organizing and Advocacy Resource for Community-Based Youth Programs

    ERIC Educational Resources Information Center

    Beyersdorf, Mark Ro

    2013-01-01

    Asian American Legal Defense and Education Fund (AALDEF) developed this language access toolkit to share the expertise and experiences of National Asian American Education Advocates Network (NAAEA) member organizations with other community organizations interested in developing language access campaigns. This toolkit includes an overview of…

  3. Toolkit for a Workshop on Building a Culture of Data Use. REL 2015-063

    ERIC Educational Resources Information Center

    Gerzon, Nancy; Guckenburg, Sarah

    2015-01-01

    The Culture of Data Use Workshop Toolkit helps school and district teams apply research to practice as they establish and support a culture of data use in their educational setting. The field-tested workshop toolkit guides teams through a set of structured activities to develop an understanding of data-use research in schools and to analyze…

  4. Building Emergency Contraception Awareness among Adolescents. A Toolkit for Schools and Community-Based Organizations.

    ERIC Educational Resources Information Center

    Simkin, Linda; Radosh, Alice; Nelsesteun, Kari; Silverstein, Stacy

    This toolkit presents emergency contraception (EC) as a method to help adolescent women avoid pregnancy and abortion after unprotected sexual intercourse. The sections of this toolkit are designed to help increase your knowledge of EC and stay up to date. They provide suggestions for increasing EC awareness in the workplace, whether it is a school…

  5. Growing and Sustaining Parent Engagement: A Toolkit for Parents and Community Partners

    ERIC Educational Resources Information Center

    Center for the Study of Social Policy, 2010

    2010-01-01

    The Toolkit is a quick and easy guide to help support and sustain parent engagement. It provides how to's for implementing three powerful strategies communities can use to maintain and grow parent engagement work that is already underway: Creating a Parent Engagement 1) Roadmap, 2) Checklist and 3) Support Network. This toolkit includes…

  6. A Data Audit and Analysis Toolkit To Support Assessment of the First College Year.

    ERIC Educational Resources Information Center

    Paulson, Karen

    This "toolkit" provides a process by which institutions can identify and use information resources to enhance the experiences and outcomes of first-year students. The toolkit contains a "Technical Manual" designed for use by the technical personnel who will be conducting the data audit and associated analyses. Administrators who want more…

  7. The Customer Flow Toolkit: A Framework for Designing High Quality Customer Services.

    ERIC Educational Resources Information Center

    New York Association of Training and Employment Professionals, Albany.

    This document presents a toolkit to assist staff involved in the design and development of New York's one-stop system. Section 1 describes the preplanning issues to be addressed and the intended outcomes that serve as the framework for creation of the customer flow toolkit. Section 2 outlines the following strategies to assist in designing local…

  8. The Development of a Curriculum Toolkit with American Indian and Alaska Native Communities

    ERIC Educational Resources Information Center

    Thompson, Nicole L.; Hare, Dwight; Sempier, Tracie T.; Grace, Cathy

    2008-01-01

    This article explains the creation of the "Growing and Learning with Young Native Children" curriculum toolkit. The curriculum toolkit was designed to give American Indian and Alaska Native early childhood educators who work in a variety of settings the framework for developing a research-based, developmentally appropriate, tribally specific…

  9. Toolkit for Professional Developers: Training Targets 3-6 Grade Teachers

    ERIC Educational Resources Information Center

    McMunn, Nancy; Dunnivant, Michael; Williamson, Jan; Reagan, Hope

    2004-01-01

    The professional development CAR Toolkit is focused on the assessment of reading process at the text level, rather than at the word level. Most students in grades 3-6 generally need support in comprehending text, not just decoding words. While the assessment of reading methods in the CAR Toolkit will help teachers pinpoint difficulties at the word…

  10. Toolkit for Evaluating Alignment of Instructional and Assessment Materials to the Common Core State Standards

    ERIC Educational Resources Information Center

    Achieve, Inc., 2014

    2014-01-01

    In joint partnership, Achieve, The Council of Chief State School Officers, and Student Achievement Partners have developed a Toolkit for Evaluating the Alignment of Instructional and Assessment Materials to the Common Core State Standards (CCSS). The Toolkit is a set of interrelated, freely available instruments for evaluating alignment to the…

  12. Charged Dust Aggregate Interactions

    NASA Astrophysics Data System (ADS)

    Matthews, Lorin; Hyde, Truell

    2015-11-01

    A proper understanding of the behavior of dust particle aggregates immersed in a complex plasma first requires a knowledge of the basic properties of the system. Among the most important of these are the net electrostatic charge and higher multipole moments on the dust aggregate as well as the manner in which the aggregate interacts with the local electrostatic fields. The formation of elongated, fractal-like aggregates levitating in the sheath electric field of a weakly ionized RF generated plasma discharge has recently been observed experimentally. The resulting data has shown that as aggregates approach one another, they can both accelerate and rotate. At equilibrium, aggregates are observed to levitate with regular spacing, rotating about their long axis aligned parallel to the sheath electric field. Since gas drag tends to slow any such rotation, energy must be constantly fed into the system in order to sustain it. A numerical model designed to analyze this motion provides both the electrostatic charge and higher multipole moments of the aggregate while including the forces due to thermophoresis, neutral gas drag, and the ion wakefield. This model will be used to investigate the ambient conditions leading to the observed interactions. This research is funded by NSF Grant 1414523.

  13. Aggregate and the environment

    USGS Publications Warehouse

    Langer, William H.; Drew, Lawrence J.; Sachs, J.S.

    2004-01-01

    This book is designed to help you understand our aggregate resources: their importance, where they come from, how they are processed for our use, the environmental concerns related to their mining and processing, how those concerns are addressed, and the policies and regulations designed to safeguard workers, neighbors, and the environment from the negative impacts of aggregate mining. We hope this understanding will help prepare you to be involved in decisions that need to be made, individually and as a society, to be good stewards of our aggregate resources and our living planet.

  14. The Bioperl toolkit: Perl modules for the life sciences.

    PubMed

    Stajich, Jason E; Block, David; Boulez, Kris; Brenner, Steven E; Chervitz, Stephen A; Dagdigian, Chris; Fuellen, Georg; Gilbert, James G R; Korf, Ian; Lapp, Hilmar; Lehväslaiho, Heikki; Matsalla, Chad; Mungall, Chris J; Osborne, Brian I; Pocock, Matthew R; Schattner, Peter; Senger, Martin; Stein, Lincoln D; Stupka, Elia; Wilkinson, Mark D; Birney, Ewan

    2002-10-01

    The Bioperl project is an international open-source collaboration of biologists, bioinformaticians, and computer scientists that has evolved over the past 7 yr into the most comprehensive library of Perl modules available for managing and manipulating life-science information. Bioperl provides an easy-to-use, stable, and consistent programming interface for bioinformatics application programmers. The Bioperl modules have been successfully and repeatedly used to reduce otherwise complex tasks to only a few lines of code. The Bioperl object model has been proven to be flexible enough to support enterprise-level applications such as EnsEMBL, while maintaining an easy learning curve for novice Perl programmers. Bioperl is capable of executing analyses and processing results from programs such as BLAST, ClustalW, or the EMBOSS suite. Interoperation with modules written in Python and Java is supported through the evolving BioCORBA bridge. Bioperl provides access to data stores such as GenBank and SwissProt via a flexible series of sequence input/output modules, and to the emerging common sequence data storage format of the Open Bioinformatics Database Access project. This study describes the overall architecture of the toolkit, the problem domains that it addresses, and gives specific examples of how the toolkit can be used to solve common life-sciences problems. We conclude with a discussion of how the open-source nature of the project has contributed to the development effort. PMID:12368254

  15. Developing an evidence-based, multimedia group counseling curriculum toolkit.

    PubMed

    Brooks, Adam C; Diguiseppi, Graham; Laudet, Alexandre; Rosenwasser, Beth; Knoblach, Dan; Carpenedo, Carolyn M; Carise, Deni; Kirby, Kimberly C

    2012-09-01

    Training community-based addiction counselors in empirically supported treatments (ESTs) far exceeds the ever-decreasing resources of publicly funded treatment agencies. This feasibility study describes the development and pilot testing of a group counseling toolkit (an approach adapted from the education field) focused on relapse prevention (RP). When counselors (N = 17) used the RP toolkit after 3 hours of training, their content adherence scores on "coping with craving" and "drug refusal skills" showed significant improvement, as indicated by very large effect sizes (Cohen's d = 1.49 and 1.34, respectively). Counselor skillfulness, in the "adequate-to-average" range at baseline, did not change. Although this feasibility study indicates some benefit to counselor EST acquisition, it is important to note that the impact of the curriculum on client outcomes is unknown. Because a majority of addiction treatment is delivered in group format, a multimedia curriculum approach may assist counselors in applying ESTs in the context of actual service delivery. PMID:22301082

  17. Regulatory and Permitting Information Desktop (RAPID) Toolkit (Poster)

    SciTech Connect

    Young, K. R.; Levine, A.

    2014-09-01

    The Regulatory and Permitting Information Desktop (RAPID) Toolkit combines the former Geothermal Regulatory Roadmap, National Environmental Policy Act (NEPA) Database, and other resources into a Web-based tool that gives the regulatory and utility-scale geothermal developer communities rapid and easy access to permitting information. RAPID currently comprises five tools - Permitting Atlas, Regulatory Roadmap, Resource Library, NEPA Database, and Best Practices. A beta release of an additional tool, the Permitting Wizard, is scheduled for late 2014. Because of the huge amount of information involved, RAPID was developed in a wiki platform to allow industry and regulatory agencies to maintain the content in the future so that it continues to provide relevant and accurate information to users. In 2014, the content was expanded to include regulatory requirements for utility-scale solar and bulk transmission development projects. Going forward, development of the RAPID Toolkit will focus on expanding the capabilities of current tools, developing additional tools, including additional technologies, and continuing to increase stakeholder involvement.

  18. Protein Colloidal Aggregation Project

    NASA Technical Reports Server (NTRS)

    Oliva-Buisson, Yvette J. (Compiler)

    2014-01-01

    To investigate the pathways and kinetics of protein aggregation to allow accurate predictive modeling of the process and evaluation of potential inhibitors to prevalent diseases including cataract formation, chronic traumatic encephalopathy, Alzheimer's Disease, Parkinson's Disease and others.

  19. Compress Your Files

    ERIC Educational Resources Information Center

    Branzburg, Jeffrey

    2005-01-01

    File compression enables data to be squeezed together, greatly reducing file size. Why would someone want to do this? Reducing file size enables the sending and receiving of files over the Internet more quickly, the ability to store more files on the hard drive, and the ability to pack many related files into one archive (for example, all files…
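The pack-and-restore workflow this article describes can be sketched with Python's standard-library zipfile module, which both compresses related files into one archive and extracts them again:

```python
import zipfile
from pathlib import Path

def pack(archive_path, files):
    """Pack related files into one compressed .zip archive.

    DEFLATE compression squeezes redundant data, so text files shrink substantially.
    """
    with zipfile.ZipFile(archive_path, "w", compression=zipfile.ZIP_DEFLATED) as zf:
        for f in files:
            zf.write(f, arcname=Path(f).name)

def unpack(archive_path, dest):
    """Restore the original files from the archive into a destination directory."""
    with zipfile.ZipFile(archive_path) as zf:
        zf.extractall(dest)
```

Because compression is lossless, the extracted files are byte-for-byte identical to the originals; only the archive in transit or on disk is smaller.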

  20. Integration of Earth Remote Sensing into the NOAA/NWS Damage Assessment Toolkit

    NASA Astrophysics Data System (ADS)

    Molthan, A.; Burks, J. E.; Camp, P.; McGrath, K.; Bell, J. R.

    2014-12-01

    Following the occurrence of severe weather, NOAA/NWS meteorologists are tasked with performing a storm damage survey to assess the type and severity of the weather event, primarily focused on the confirmation and assessment of tornadoes. This labor-intensive process requires meteorologists to venture into the affected area, acquire damage indicators through photos, eyewitness accounts, and other documentation, then aggregate the data to make a final determination of the tornado path length, width, maximum intensity, and other characteristics. Earth remote sensing from operational, polar-orbiting satellites can support the damage assessment process by applying change detection techniques to identify portions of damage tracks that are difficult to access due to road limitations or time constraints. In addition, higher-resolution commercial imagery can corroborate ground-based surveys. As part of an ongoing collaboration, NASA and NOAA are working to integrate near real-time Earth remote sensing observations into the NOAA/NWS Damage Assessment Toolkit (DAT), a suite of applications that includes a handheld application used by meteorologists in the survey process. The team has recently developed a more streamlined approach for delivering data via a web mapping service and menu interface, allowing for caching of imagery before field deployment. Near real-time products have been developed using MODIS and VIIRS imagery and change detection for preliminary track identification, along with conduits for higher-resolution Landsat, ASTER, and commercial imagery as they become available. In addition to tornado damage assessments, the team is also investigating the use of near real-time imagery for identifying hail damage to vegetation, which also results in large swaths of damage, particularly in the central United States during the peak growing season
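Satellite change detection over vegetation, of the kind described above, is often done by differencing a vegetation index between pre- and post-event images. The sketch below is illustrative only, not the DAT's actual algorithm: the NDVI choice and the 0.2 drop threshold are assumptions:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized difference vegetation index from near-infrared and red reflectance."""
    # Clip the denominator to avoid division by zero over dark pixels.
    return (nir - red) / np.clip(nir + red, 1e-6, None)

def damage_mask(ndvi_before, ndvi_after, drop=0.2):
    """Flag pixels whose NDVI fell sharply between pre- and post-event images.

    Healthy vegetation has high NDVI; a tornado or hail swath scours it,
    so a large drop marks candidate damage pixels.
    """
    return (ndvi_before - ndvi_after) > drop
```

Stringing the flagged pixels together along a consistent bearing is what turns a per-pixel mask into a preliminary damage-track candidate for surveyors to verify on the ground.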

  1. Integration of Earth Remote Sensing into the NOAA/NWS Damage Assessment Toolkit

    NASA Technical Reports Server (NTRS)

    Molthan, Andrew; Burks, Jason; Camp, Parks; McGrath, Kevin; Bell, Jordan

    2014-01-01

    Following the occurrence of severe weather, NOAA/NWS meteorologists are tasked with performing a storm damage survey to assess the type and severity of the weather event, primarily focused on the confirmation and assessment of tornadoes. This labor-intensive process requires meteorologists to venture into the affected area, acquire damage indicators through photos, eyewitness accounts, and other documentation, and then aggregate the data in order to make a final determination of the tornado path length, width, maximum intensity, and other characteristics. Earth remote sensing from operational, polar-orbiting satellites can support the damage assessment process by applying change detection techniques to identify portions of damage tracks that are difficult to access due to road limitations or time constraints. In addition, higher-resolution commercial imagery can corroborate ground-based surveys. As part of an ongoing collaboration, NASA and NOAA are working to integrate near real-time Earth remote sensing observations into the NOAA/NWS Damage Assessment Toolkit, a handheld application used by meteorologists in the survey process. The team has recently developed a more streamlined approach for delivering data via a web mapping service and menu interface, allowing for caching of imagery before field deployment. Near real-time products have been developed using MODIS and VIIRS imagery and change detection for preliminary track identification, along with conduits for higher-resolution Landsat, ASTER, and commercial imagery as they become available. In addition to tornado damage assessments, the team is also investigating the use of near real-time imagery for identifying hail damage to vegetation, which also results in large swaths of damage, particularly in the central United States during the peak growing season months of June, July, and August. This presentation provides an overview of recent activities.
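
    The change-detection step described above can be illustrated with a toy example: difference a vegetation index before and after the event and threshold the drop. The function name, threshold, and NDVI arrays below are invented for illustration; operational MODIS/VIIRS workflows also handle cloud, water, and seasonal masking.

```python
import numpy as np

def damage_mask(ndvi_before, ndvi_after, drop_threshold=0.3):
    """Flag pixels whose vegetation index dropped sharply after the event.

    A large NDVI decrease is a crude proxy for vegetation damage along
    a tornado or hail track.
    """
    delta = ndvi_before - ndvi_after
    return delta > drop_threshold

# Hypothetical 3x3 NDVI scenes: a damage swath runs down the middle column.
before = np.array([[0.8, 0.8, 0.8],
                   [0.7, 0.9, 0.7],
                   [0.8, 0.8, 0.8]])
after = np.array([[0.8, 0.2, 0.8],
                  [0.7, 0.3, 0.7],
                  [0.8, 0.1, 0.8]])

mask = damage_mask(before, after)
print(mask[:, 1])   # middle column flagged as a preliminary damage track
```

    The resulting boolean mask is the kind of preliminary track product a surveyor could cache in the field before ground-truthing.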

  2. Use of Remote Sensing Data to Enhance NWS Storm Damage Toolkit

    NASA Astrophysics Data System (ADS)

    Jedlovec, G.; Molthan, A.; White, K.; Burks, J.; Stellman, K.; Smith, M. R.

    2012-12-01

    In the wake of a natural disaster such as a tornado, the National Weather Service (NWS) is required to provide a very detailed and timely storm damage assessment to local, state and federal homeland security officials. The Post-Storm Data Acquisition (PSDA) procedure involves the acquisition and assembly of highly perishable data necessary for accurate post-event analysis and potential integration into a geographic information system (GIS) available to its end users and associated decision makers. Information gained from the process also enables the NWS to increase its knowledge of extreme events, learn how to better use existing equipment, improve NWS warning programs, and provide accurate storm intensity and damage information to the news media and academia. To help collect and manage all of this information, forecasters in NWS Southern Region are currently developing a Storm Damage Assessment Toolkit (SDAT), which incorporates GIS-capable phones and laptops into the PSDA process by tagging damage photography, location, and storm damage details with GPS coordinates for aggregation within the GIS database. However, this tool alone does not fully integrate radar and ground-based storm damage reports, nor does it help to identify undetected storm damage regions. In many cases, information on storm damage location (beginning and ending points, swath width, etc.) from ground surveys is incomplete or difficult to obtain. Geographic factors (terrain and limited roads in rural areas), manpower limitations, and other logistical constraints often prevent the gathering of a comprehensive picture of tornado or hail damage, and may allow damage regions to go undetected. Molthan et al. (2011) have shown that high resolution satellite data can provide additional valuable information on storm damage tracks to augment this database. This paper presents initial development to integrate satellite-derived damage track information into the SDAT for near real-time use by forecasters.

  3. Use of Remote Sensing Data to Enhance NWS Storm Damage Toolkit

    NASA Technical Reports Server (NTRS)

    Jedlovec, Gary J.; Molthan, Andrew L.; White, Kris; Burks, Jason; Stellman, Keith; Smith, Mathew

    2012-01-01

    In the wake of a natural disaster such as a tornado, the National Weather Service (NWS) is required to provide a very detailed and timely storm damage assessment to local, state and federal homeland security officials. The Post-Storm Data Acquisition (PSDA) procedure involves the acquisition and assembly of highly perishable data necessary for accurate post-event analysis and potential integration into a geographic information system (GIS) available to its end users and associated decision makers. Information gained from the process also enables the NWS to increase its knowledge of extreme events, learn how to better use existing equipment, improve NWS warning programs, and provide accurate storm intensity and damage information to the news media and academia. To help collect and manage all of this information, forecasters in NWS Southern Region are currently developing a Storm Damage Assessment Toolkit (SDAT), which incorporates GIS-capable phones and laptops into the PSDA process by tagging damage photography, location, and storm damage details with GPS coordinates for aggregation within the GIS database. However, this tool alone does not fully integrate radar and ground-based storm damage reports, nor does it help to identify undetected storm damage regions. In many cases, information on storm damage location (beginning and ending points, swath width, etc.) from ground surveys is incomplete or difficult to obtain. Geographic factors (terrain and limited roads in rural areas), manpower limitations, and other logistical constraints often prevent the gathering of a comprehensive picture of tornado or hail damage, and may allow damage regions to go undetected. Molthan et al. (2011) have shown that high resolution satellite data can provide additional valuable information on storm damage tracks to augment this database. This paper presents initial development to integrate satellite-derived damage track information into the SDAT for near real-time use by forecasters.

  4. A prototype forensic toolkit for industrial-control-systems incident response

    NASA Astrophysics Data System (ADS)

    Carr, Nickolas B.; Rowe, Neil C.

    2015-05-01

    Industrial control systems (ICSs) are an important part of critical infrastructure in cyberspace. They are especially vulnerable to cyber-attacks because of their legacy hardware and software and the difficulty of changing it. We first survey the history of intrusions into ICSs, the more serious of which involved a continuing adversary presence on an ICS network. We discuss some common vulnerabilities and the categories of possible attacks, noting the frequent use of software written a long time ago. We propose a framework for designing ICS incident response under the constraints that no new software must be required and that interventions cannot impede the continuous processing that is the norm for such systems. We then discuss a prototype toolkit we built using the Windows Management Instrumentation Command-Line tool for host-based analysis and the Bro intrusion-detection software for network-based analysis. Particularly useful techniques we used were learning the historical range of parameters of numeric quantities so as to recognize anomalies, learning the usual addresses of connections to a node, observing Internet addresses (usually rare), observing anomalous network protocols such as unencrypted data transfers, observing unusual scheduled tasks, and comparing key files through registry entries and hash values to find malicious modifications. We tested our methods on actual data from ICSs including publicly-available data, voluntarily-submitted data, and researcher-provided "advanced persistent threat" data. We found instances of interesting behavior in our experiments. Intrusions were generally easy to see because of the repetitive nature of most processing on ICSs, but operators need to be motivated to look.
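
    One of the host-based techniques mentioned above, learning the historical range of a numeric parameter and flagging values outside it, can be sketched as follows. The class name, slack parameter, and sensor readings are illustrative, not part of the authors' prototype toolkit.

```python
class RangeAnomalyDetector:
    """Learn the min/max of a numeric ICS parameter from history; flag outliers."""

    def __init__(self, slack=0.0):
        self.lo = None
        self.hi = None
        self.slack = slack  # tolerance allowed beyond the observed range

    def fit(self, history):
        # "Training" is just recording the historical extremes.
        self.lo = min(history)
        self.hi = max(history)
        return self

    def is_anomalous(self, value):
        return value < self.lo - self.slack or value > self.hi + self.slack

# Hypothetical sensor readings from normal plant operation.
detector = RangeAnomalyDetector(slack=1.0).fit([48.9, 50.2, 51.0, 49.5])
print(detector.is_anomalous(50.0))   # False: inside the learned range
print(detector.is_anomalous(75.0))   # True: far outside the historical range
```

    This works well on ICS data precisely because, as the abstract notes, most processing on such systems is highly repetitive, so historical ranges are tight.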

  5. A lightweight, flow-based toolkit for parallel and distributed bioinformatics pipelines

    PubMed Central

    2011-01-01

    Background Bioinformatic analyses typically proceed as chains of data-processing tasks. A pipeline, or 'workflow', is a well-defined protocol, with a specific structure defined by the topology of data-flow interdependencies, and a particular functionality arising from the data transformations applied at each step. In computer science, the dataflow programming (DFP) paradigm defines software systems constructed in this manner, as networks of message-passing components. Thus, bioinformatic workflows can be naturally mapped onto DFP concepts. Results To enable the flexible creation and execution of bioinformatics dataflows, we have written a modular framework for parallel pipelines in Python ('PaPy'). A PaPy workflow is created from re-usable components connected by data-pipes into a directed acyclic graph, which together define nested higher-order map functions. The successive functional transformations of input data are evaluated on flexibly pooled compute resources, either local or remote. Input items are processed in batches of adjustable size, allowing one to tune the trade-off between parallelism and lazy-evaluation (memory consumption). An add-on module ('NuBio') facilitates the creation of bioinformatics workflows by providing domain specific data-containers (e.g., for biomolecular sequences, alignments, structures) and functionality (e.g., to parse/write standard file formats). Conclusions PaPy offers a modular framework for the creation and deployment of parallel and distributed data-processing workflows. Pipelines derive their functionality from user-written, data-coupled components, so PaPy also can be viewed as a lightweight toolkit for extensible, flow-based bioinformatics data-processing. The simplicity and flexibility of distributed PaPy pipelines may help users bridge the gap between traditional desktop/workstation and grid computing. PaPy is freely distributed as open-source Python code at http://muralab.org/PaPy, and includes extensive
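
    The dataflow idea above, a pipeline built from data-coupled components that apply successive transformations to a stream of items, can be sketched with plain Python generators. This is an illustration of the DFP concept only, not PaPy's actual API; the function names are invented.

```python
def source(items):
    # The pipeline's input node: emits items one at a time.
    for item in items:
        yield item

def transform(stream, func):
    # A pipeline component: applies one data transformation to each item.
    for item in stream:
        yield func(item)

def build_pipeline(items, *funcs):
    # Chain components into a linear dataflow; evaluation stays lazy
    # until the stream is consumed.
    stream = source(items)
    for func in funcs:
        stream = transform(stream, func)
    return stream

# Hypothetical stages: normalize a sequence record, then compute its length.
result = list(build_pipeline(["ACGT", "ACGTTT"], str.upper, len))
print(result)   # [4, 6]
```

    A real flow-based framework generalizes this linear chain to a directed acyclic graph and evaluates the stages on pooled local or remote workers.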

  6. Overview and Meteorological Validation of the Wind Integration National Dataset toolkit

    SciTech Connect

    Draxl, C.; Hodge, B. M.; Clifton, A.; McCaa, J.

    2015-04-13

    The Wind Integration National Dataset (WIND) Toolkit described in this report fulfills these requirements, and constitutes a state-of-the-art national wind resource data set covering the contiguous United States from 2007 to 2013 for use in a variety of next-generation wind integration analyses and wind power planning. The toolkit is a wind resource data set, wind forecast data set, and wind power production and forecast data set derived from the Weather Research and Forecasting (WRF) numerical weather prediction model. WIND Toolkit data are available online for over 116,000 land-based and 10,000 offshore sites representing existing and potential wind facilities.

  7. Demonstration of the Health Literacy Universal Precautions Toolkit: Lessons for Quality Improvement.

    PubMed

    Mabachi, Natabhona M; Cifuentes, Maribel; Barnard, Juliana; Brega, Angela G; Albright, Karen; Weiss, Barry D; Brach, Cindy; West, David

    2016-01-01

    The Agency for Healthcare Research and Quality Health Literacy Universal Precautions Toolkit was developed to help primary care practices assess and make changes to improve communication with and support for patients. Twelve diverse primary care practices implemented assigned tools over a 6-month period. Qualitative results revealed challenges practices experienced during implementation, including competing demands, bureaucratic hurdles, technological challenges, limited quality improvement experience, and limited leadership support. Practices used the Toolkit flexibly and recognized the efficiencies of implementing tools in tandem and in coordination with other quality improvement initiatives. Practices recommended reducing Toolkit density and making specific refinements. PMID:27232681

  8. Transverse space charge effect calculation in the Synergia accelerator modeling toolkit

    SciTech Connect

    Okonechnikov, Konstantin; Amundson, James; Macridin, Alexandru; /Fermilab

    2009-09-01

    This paper describes a transverse space charge effect calculation algorithm developed in the context of the Synergia accelerator modeling toolkit. An introduction to the space charge problem and a short description of the Synergia modeling toolkit are given. The developed algorithm is explained and the implementation is described in detail. As a result of this work, a new space charge solver was developed and integrated into the Synergia toolkit. The solver showed correct results in comparison to existing Synergia solvers and delivered better performance in the regime where it is applicable.
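
    One core ingredient of particle-in-cell space charge calculations like those in Synergia is depositing particle charge onto a grid before solving for the fields. A minimal 1D cloud-in-cell deposition is sketched below; the grid size and particle positions are arbitrary, and Synergia's actual solvers are far more elaborate.

```python
import numpy as np

def deposit_charge(positions, n_cells, cell_size=1.0):
    """Cloud-in-cell: split each unit charge between its two nearest grid nodes."""
    rho = np.zeros(n_cells + 1)
    for x in positions:
        i = int(x // cell_size)        # index of the grid node to the left
        frac = x / cell_size - i       # fractional distance toward the right node
        rho[i] += 1.0 - frac
        rho[i + 1] += frac
    return rho

# Two unit charges at interior positions of a 3-cell grid.
rho = deposit_charge([0.5, 1.25], n_cells=3)
print(rho.sum())   # 2.0 -- total deposited charge equals the two unit charges
```

    Charge conservation under deposition (the grid total equals the particle total) is a basic correctness check for any such solver.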

  9. MeSh ToolKit v1.2

    SciTech Connect

    Garimella, Rao V.

    2004-05-15

    MSTK or Mesh Toolkit is a mesh framework that allows users to represent, manipulate and query unstructured 3D arbitrary topology meshes in a general manner without the need to code their own data structures. MSTK is a flexible framework in that it allows (or will eventually allow) a wide variety of underlying representations for the mesh while maintaining a common interface. It will allow users to choose from different mesh representations either at initialization or during the program execution so that the optimal data structures are used for the particular algorithm. The interaction of users and applications with MSTK is through a functional interface that acts as though the mesh always contains vertices, edges, faces and regions and maintains connectivity between all these entities.
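
    The kind of interface described above, querying vertices, edges, faces, and their connectivity without exposing the underlying storage, can be mirrored in a toy sketch. MSTK itself is a C library; the Python class and method names here are invented purely to illustrate the concept.

```python
class Mesh:
    """Toy unstructured surface mesh: faces are vertex tuples; connectivity is derived."""

    def __init__(self, faces):
        self.faces = [tuple(f) for f in faces]

    def vertices(self):
        return sorted({v for f in self.faces for v in f})

    def edges(self):
        es = set()
        for f in self.faces:
            for a, b in zip(f, f[1:] + f[:1]):   # walk the face boundary
                es.add(tuple(sorted((a, b))))
        return sorted(es)

    def faces_of_vertex(self, v):
        # Upward adjacency query: which faces touch this vertex?
        return [f for f in self.faces if v in f]

# Two triangles sharing the edge (1, 2).
mesh = Mesh([(0, 1, 2), (1, 3, 2)])
print(len(mesh.edges()))         # 5 distinct edges
print(mesh.faces_of_vertex(1))   # both triangles touch vertex 1
```

    A framework like MSTK can answer the same queries over several internal representations while the calling code stays unchanged, which is the point of the common functional interface.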

  10. Migration of 1970s Minicomputer Controls to Modern Toolkit Software

    SciTech Connect

    Juras, R.C.; Meigs, M.J.; Sinclair, J.A.; Tatum, B.A.

    1999-11-13

    Controls for accelerators and associated systems at the Holifield Radioactive Ion Beam Facility (HRIBF) at Oak Ridge National Laboratory have been migrated from 1970s-vintage minicomputers to a modern system based on Vista and EPICS toolkit software. Stability and capabilities of EPICS software have motivated increasing use of EPICS for accelerator controls. In addition, very inexpensive subsystems based on EPICS and the EPICS portable CA server running on Linux PCs have been implemented to control an ion source test facility and to control a building-access badge reader system. A new object-oriented, extensible display manager has been developed for EPICS to facilitate the transition to EPICS and will be used in place of MEDM. EPICS device support has been developed for CAMAC serial highway controls.

  11. A toolkit for MSDs prevention--WHO and IEA context.

    PubMed

    Caple, David C

    2012-01-01

    Many simple MSD risk management tools have been developed by ergonomists for use by workers and employers with little or no training to undertake injury prevention programs in their workplace. However, currently there is no "toolkit" which places such tools within a holistic, participative ergonomics framework and provides guidance on how best to use individual tools. It is proposed that such a holistic approach should entail initial analysis and evaluation of underlying systems of work and related health and performance indicators, prior to focusing on assessment of MSD risks stemming from particular hazards. Depending on the context, more narrowly focused tools might then be selected to assess risk associated with jobs or tasks identified as problematic. This approach ensures that biomechanical risk factors are considered within a broad context of organizational and psychosocial risk factors. This is consistent with current research evidence on work-related causes of MSDs. PMID:22317323

  12. Family Meetings Made Simpler: A Toolkit for the ICU

    PubMed Central

    Nelson, Judith E.; Walker, Amy S.; Luhrs, Carol M.; Cortez, Therese B.; Pronovost, Peter

    2013-01-01

    Although a growing body of evidence has associated the ICU family meeting with important, favorable outcomes for critically ill patients, their families, and health care systems, these meetings often fail to occur in a timely, effective, and reliable way. In this article, we describe three specific tools that we have developed as prototypes to promote more successful implementation of family meetings in the ICU: 1) A Family Meeting Planner; 2) A Meeting Guide for Families; and 3) A Family Meeting Documentation Template. We describe the essential features of these tools and ways that they might be adapted to meet the local needs of individual ICUs and to maximize acceptability and use. We also discuss the role of such tools in structuring a performance improvement initiative. Just as simple tools have helped to reduce bloodstream infections, our hope is that the toolkit presented here will help critical care teams to meet the important communication needs of ICU families. PMID:19427757

  13. Integrating surgical robots into the next medical toolkit.

    PubMed

    Lai, Fuji; Entin, Eileen

    2006-01-01

    Surgical robots hold much promise for revolutionizing the field of surgery and improving surgical care. However, despite the potential advantages they offer, there are multiple barriers to adoption and integration into practice that may prevent these systems from realizing their full potential benefit. This study elucidated some of the most salient considerations that need to be addressed for integration of new technologies such as robotic systems into the operating room of the future as it evolves into a complex system of systems. We conducted in-depth interviews with operating room team members and other stakeholders to identify potential barriers in areas of workflow, teamwork, training, clinical acceptance, and human-system interaction. The findings of this study will inform an approach for the design and integration of robotics and related computer-assisted technologies into the next medical toolkit for "computer-enhanced surgery" to improve patient safety and healthcare quality. PMID:16404063

  14. MeSh ToolKit v1.2

    Energy Science and Technology Software Center (ESTSC)

    2004-05-15

    MSTK or Mesh Toolkit is a mesh framework that allows users to represent, manipulate and query unstructured 3D arbitrary topology meshes in a general manner without the need to code their own data structures. MSTK is a flexible framework in that it allows (or will eventually allow) a wide variety of underlying representations for the mesh while maintaining a common interface. It will allow users to choose from different mesh representations either at initialization or during the program execution so that the optimal data structures are used for the particular algorithm. The interaction of users and applications with MSTK is through a functional interface that acts as though the mesh always contains vertices, edges, faces and regions and maintains connectivity between all these entities.

  15. PHISICS toolkit: Multi-reactor transmutation analysis utility - MRTAU

    SciTech Connect

    Alfonsi, A.; Rabiti, C.; Epiney, A. S.; Wang, Y.; Cogliati, J.

    2012-07-01

    The principal idea of this paper is to present the new capabilities available in the PHISICS toolkit, connected with the implementation of the depletion code MRTAU, a generic depletion/decay/burn-up code developed at the Idaho National Laboratory. It is programmed in a modular structure and modern FORTRAN 95/2003. The code tracks the time evolution of the isotopic concentration of a given material accounting for nuclear reactions happening in presence of neutron flux and also due to natural decay. MRTAU has two different methods to perform the depletion calculation, in order to let the user choose the best one with respect to their needs. Both the methodologies and some significant results are reported in this paper. (authors)
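
    The kind of calculation a depletion/decay code performs can be illustrated with the simplest possible case: a parent-daughter decay chain solved analytically with the Bateman equations. The decay constants below are arbitrary; MRTAU itself tracks full chains including flux-induced reactions.

```python
import math

def bateman_two(n0_parent, lam1, lam2, t):
    """Analytic parent-daughter concentrations after time t.

    dN1/dt = -lam1*N1
    dN2/dt = +lam1*N1 - lam2*N2   (daughter starts at zero)
    """
    n1 = n0_parent * math.exp(-lam1 * t)
    n2 = (n0_parent * lam1 / (lam2 - lam1)
          * (math.exp(-lam1 * t) - math.exp(-lam2 * t)))
    return n1, n2

# Arbitrary decay constants (per unit time) and elapsed time.
n1, n2 = bateman_two(n0_parent=1.0, lam1=0.1, lam2=0.5, t=2.0)
print(round(n1, 4))   # 0.8187 -- parent decayed to exp(-0.2)
```

    General-purpose codes replace this closed-form solution with numerical methods (e.g. matrix exponential or chain-linearization schemes) that scale to hundreds of coupled isotopes.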

  16. Medical Applications of the Geant4 Simulation Toolkit

    NASA Astrophysics Data System (ADS)

    Perl, Joseph

    2008-03-01

    Geant4 is a toolkit for the simulation of the passage of particles through matter. While Geant4 was originally developed for High Energy Physics (HEP), applications now include Nuclear, Space and Medical Physics. Medical applications of Geant4 in North America and throughout the world have been increasing rapidly due to the overall growth of Monte Carlo use in Medical Physics and the unique qualities of Geant4 as an all-particle code able to handle complex geometry, motion and fields with the flexibility of modern programming and an open and free source code. Work has included characterizing beams and brachytherapy sources, treatment planning, retrospective studies, imaging and validation. This talk will provide an overview of these applications, with a focus on therapy, and will discuss how Geant4 has responded to the specific challenges of moving from HEP to Medical applications.

  17. HemI: a toolkit for illustrating heatmaps.

    PubMed

    Deng, Wankun; Wang, Yongbo; Liu, Zexian; Cheng, Han; Xue, Yu

    2014-01-01

    Recent high-throughput techniques have generated a flood of biological data in all aspects. The transformation and visualization of multi-dimensional and numerical gene or protein expression data in a single heatmap can provide a concise but comprehensive presentation of molecular dynamics under different conditions. In this work, we developed an easy-to-use tool named HemI (Heat map Illustrator), which can visualize either gene or protein expression data in heatmaps. Additionally, the heatmaps can be recolored, rescaled or rotated in a customized manner. In addition, HemI provides multiple clustering strategies for analyzing the data. Publication-quality figures can be exported directly. We propose that HemI can be a useful toolkit for conveniently visualizing and manipulating heatmaps. The stand-alone packages of HemI were implemented in Java and can be accessed at http://hemi.biocuckoo.org/down.php. PMID:25372567

  18. PARAMESH: A Parallel Adaptive Mesh Refinement Community Toolkit

    NASA Technical Reports Server (NTRS)

    MacNeice, Peter; Olson, Kevin M.; Mobarry, Clark; deFainchtein, Rosalinda; Packer, Charles

    1999-01-01

    In this paper, we describe a community toolkit which is designed to provide parallel support with adaptive mesh capability for a large and important class of computational models, those using structured, logically Cartesian meshes. The package of Fortran 90 subroutines, called PARAMESH, is designed to provide an application developer with an easy route to extend an existing serial code which uses a logically Cartesian structured mesh into a parallel code with adaptive mesh refinement. Alternatively, in its simplest use, and with minimal effort, it can operate as a domain decomposition tool for users who want to parallelize their serial codes, but who do not wish to use adaptivity. The package can provide them with an incremental evolutionary path for their code, converting it first to uniformly refined parallel code, and then later if they so desire, adding adaptivity.
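
    The adaptive refinement idea, recursively splitting mesh blocks wherever a refinement criterion triggers, can be sketched in 1D. PARAMESH is a Fortran 90 block-structured package; this Python toy, with an invented criterion, only mirrors the refine-where-needed concept.

```python
def refine(lo, hi, needs_refinement, max_depth=8, depth=0):
    """Recursively bisect [lo, hi] wherever the criterion demands more resolution."""
    if depth >= max_depth or not needs_refinement(lo, hi):
        return [(lo, hi)]                    # keep this block as a leaf
    mid = 0.5 * (lo + hi)
    return (refine(lo, mid, needs_refinement, max_depth, depth + 1)
            + refine(mid, hi, needs_refinement, max_depth, depth + 1))

# Hypothetical criterion: refine blocks wider than 0.25 that contain x = 0.1,
# e.g. a shock or steep gradient located there.
blocks = refine(0.0, 1.0, lambda lo, hi: hi - lo > 0.25 and lo <= 0.1 <= hi)
print(blocks)   # [(0.0, 0.25), (0.25, 0.5), (0.5, 1.0)]
```

    The result is fine blocks only near the feature of interest and coarse blocks elsewhere, which is what makes adaptive refinement cheaper than uniform refinement.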

  19. The extended PP1 toolkit: designed to create specificity

    PubMed Central

    Bollen, Mathieu; Peti, Wolfgang; Ragusa, Michael J.; Beullens, Monique

    2011-01-01

    Protein Ser/Thr phosphatase-1 (PP1) catalyzes the majority of eukaryotic protein dephosphorylation reactions in a highly regulated and selective manner. Recent studies have identified an unusually diversified PP1 interactome with the properties of a regulatory toolkit. PP1-interacting proteins (PIPs) function as targeting subunits, substrates and/or inhibitors. As targeting subunits, PIPs contribute to substrate selection by bringing PP1 into the vicinity of specific substrates and by modulating substrate specificity via additional substrate docking sites or blocking substrate-binding channels. Many of the nearly 200 established mammalian PIPs are predicted to be intrinsically disordered, a property that facilitates their binding to a large surface area of PP1 via multiple docking motifs. These novel insights offer perspectives for the therapeutic targeting of PP1 by interfering with the binding of PIPs or substrates. PMID:20399103

  20. PHISICS TOOLKIT: MULTI-REACTOR TRANSMUTATION ANALYSIS UTILITY - MRTAU

    SciTech Connect

    Andrea Alfonsi; Cristian Rabiti; Aaron S. Epiney; Yaqi Wang; Joshua Cogliati

    2012-04-01

    The principal idea of this paper is to present the new capabilities available in the PHISICS toolkit, connected with the implementation of the depletion code MRTAU, a generic depletion/decay/burn-up code developed at the Idaho National Laboratory. It is programmed in a modular structure and modern FORTRAN 95/2003. The code tracks the time evolution of the isotopic concentration of a given material accounting for nuclear reactions happening in presence of neutron flux and also due to natural decay. MRTAU has two different methods to perform the depletion calculation, in order to let the user choose the best one with respect to their needs. Both the methodologies and some significant results are reported in this paper.

  1. The interactive learning toolkit: technology and the classroom

    NASA Astrophysics Data System (ADS)

    Lukoff, Brian; Tucker, Laura

    2011-04-01

    Peer Instruction (PI) and Just-in-Time-Teaching (JiTT) have been shown to increase both students' conceptual understanding and problem-solving skills. However, the time investment for the instructor to prepare appropriate conceptual questions and manage student JiTT responses is one of the main implementation hurdles. To overcome this we have developed the Interactive Learning Toolkit (ILT), a course management system specifically designed to support PI and JiTT. We are working to integrate the ILT with a fully interactive classroom system where students can use their laptops and smartphones to respond to ConcepTests in class. The goal is to use technology to engage students in conceptual thinking both in and out of the classroom.

  2. NASA Space Radiation Program Integrative Risk Model Toolkit

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Hu, Shaowen; Plante, Ianik; Ponomarev, Artem L.; Sandridge, Chris

    2015-01-01

    NASA Space Radiation Program Element scientists have been actively involved in development of an integrative risk model toolkit that includes models for acute radiation risk and organ dose projection (ARRBOD), NASA space radiation cancer risk projection (NSCR), hemocyte dose estimation (HemoDose), the GCR event-based risk model code (GERMcode), relativistic ion tracks (RITRACKS), NASA radiation track image (NASARTI), and the On-Line Tool for the Assessment of Radiation in Space (OLTARIS). This session will introduce the components of the risk toolkit with opportunity for hands-on demonstrations. Brief descriptions of each tool are: ARRBOD for organ dose projection and acute radiation risk calculation from exposure to a solar particle event; NSCR for projection of cancer risk from exposure to space radiation; HemoDose for retrospective dose estimation by using multi-type blood cell counts; GERMcode for basic physical and biophysical properties of an ion beam, and biophysical and radiobiological properties of beam transport to the target in the NASA Space Radiation Laboratory beam line; RITRACKS for simulation of heavy ion and delta-ray track structure, radiation chemistry, DNA structure and DNA damage at the molecular scale; NASARTI for modeling of the effects of space radiation on human cells and tissue by incorporating a physical model of tracks, cell nucleus, and DNA damage foci with image segmentation for automated counting; and OLTARIS, an integrated tool set utilizing HZETRN (High Charge and Energy Transport) intended to help scientists and engineers study the effects of space radiation on shielding materials, electronics, and biological systems.

  3. Simulation Toolkit for Renewable Energy Advanced Materials Modeling

    Energy Science and Technology Software Center (ESTSC)

    2013-11-13

    STREAMM is a collection of python classes and scripts that enables and eases the setup of input files and configuration files for simulations of advanced energy materials. The core STREAMM python classes provide a general framework for storing, manipulating and analyzing atomic/molecular coordinates to be used in quantum chemistry and classical molecular dynamics simulations of soft materials systems. The design focuses on enabling the interoperability of materials simulation codes such as GROMACS, LAMMPS and Gaussian.

  4. Technology meets aggregate

    SciTech Connect

    Wilson, C.; Swan, C.

    2007-07-01

    New technology developed at Tufts University and the University of Massachusetts has created synthetic lightweight aggregate from various grades of fly ash from coal-fired power plants for use in different engineered applications. In pilot-scale manufacturing tests, an 'SLA' containing 80% fly ash and 20% mixed plastic waste from packaging was produced by 'dry blending' mixed plastic with high-carbon fly ash. A trial run was completed to produce concrete masonry unit (CMU) blocks at a full-scale facility. It has been shown that SLA can be used as a partial substitute for traditional stone aggregate in hot asphalt mix. 1 fig., 2 photos.

  5. BTK: an open-source toolkit for fetal brain MR image processing.

    PubMed

    Rousseau, François; Oubel, Estanislao; Pontabry, Julien; Schweitzer, Marc; Studholme, Colin; Koob, Mériam; Dietemann, Jean-Louis

    2013-01-01

    Studies about brain maturation aim at providing a better understanding of brain development and links between brain changes and cognitive development. Such studies are of great interest for diagnosis help and clinical course of development and treatment of illnesses. However, the processing of fetal brain MR images remains complicated which limits the translation from the research to the clinical domain. In this article, we describe an open-source image processing toolkit dedicated to these images. In this toolkit various tools are included such as: denoising, image reconstruction, super-resolution and tractography. The BTK resource program (distributed under CeCILL-B license) is developed in C++ and relies on common medical imaging libraries such as Insight Toolkit (ITK), Visualization Toolkit (VTK) and Open Multi-Processing (OpenMP). PMID:23036854

  6. 78 FR 14773 - U.S. Environmental Solutions Toolkit-Landfill Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-07

    ... environmental technologies that will outline U.S. approaches to a series of environmental problems and highlight... Toolkit will refer users in foreign markets to U.S. approaches to solving environmental problems and to...

  7. Challenges and Opportunities in Using Automatic Differentiation with Object-Oriented Toolkits for Scientific Computing

    SciTech Connect

    Hovland, P; Lee, S; McInnes, L; Norris, B; Smith, B

    2001-04-17

    The increased use of object-oriented toolkits in large-scale scientific simulation presents new opportunities and challenges for the use of automatic (or algorithmic) differentiation (AD) techniques, especially in the context of optimization. Because object-oriented toolkits use well-defined interfaces and data structures, there is potential for simplifying the AD process. Furthermore, derivative computation can be improved by exploiting high-level information about numerical and computational abstractions. However, challenges to the successful use of AD with these toolkits also exist. Among the greatest challenges is balancing the desire to limit the scope of the AD process with the desire to minimize the work required of a user. They discuss their experiences in integrating AD with the PETSc, PVODE, and TAO toolkits and their plans for future research and development in this area.
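
    The basic mechanism of automatic differentiation, propagating derivatives alongside values through every arithmetic operation, can be sketched with textbook forward-mode dual numbers. This is a conceptual illustration only, not the production AD tools used with the toolkits named above.

```python
class Dual:
    """Forward-mode AD: carry a (value, derivative) pair through arithmetic."""

    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)

    __rmul__ = __mul__

def f(x):
    return x * x + 3 * x + 1     # f'(x) = 2x + 3

x = Dual(2.0, 1.0)               # seed derivative dx/dx = 1
y = f(x)
print(y.val, y.der)              # 11.0 7.0
```

    Note that `f` is ordinary user code; the derivative comes from overloading the arithmetic operators, which is why well-defined toolkit interfaces make AD integration easier.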

  8. Development and Evaluation of a Toolkit to Assess Partnership Readiness for Community-Based Participatory Research

    PubMed Central

    Andrews, Jeannette O.; Cox, Melissa J.; Newman, Susan D.; Meadows, Otha

    2012-01-01

    An earlier investigation by academic and community co-investigators led to the development of the Partnership Readiness for Community-Based Participatory Research (CBPR) Model, which defined major dimensions and key indicators of partnership readiness. As a next step in this process, we used qualitative methods, cognitive pretesting, and expert reviews to develop a working guide, or toolkit, based on the model for academic and community partners to assess and leverage their readiness for CBPR. The 75-page toolkit is designed as a qualitative assessment promoting equal voice and transparent, bi-directional discussions among all the partners. The toolkit is formatted to direct individual partner assessments, followed by team assessments, discussions, and action plans to optimize their goodness of fit, capacity, and operations to conduct CBPR. The toolkit has been piloted with two cohorts in the Medical University of South Carolina’s (MUSC) Community Engaged Scholars (CES) Program with promising results from process and outcome evaluation data. PMID:21623021

  9. The effectiveness of toolkits as knowledge translation strategies for integrating evidence into clinical care: a systematic review

    PubMed Central

    Yamada, Janet; Shorkey, Allyson; Barwick, Melanie; Widger, Kimberley; Stevens, Bonnie J

    2015-01-01

    Objectives The aim of this systematic review was to evaluate the effectiveness of toolkits as a knowledge translation (KT) strategy for facilitating the implementation of evidence into clinical care. Toolkits include multiple resources for educating and/or facilitating behaviour change. Design Systematic review of the literature on toolkits. Methods A search was conducted on MEDLINE, EMBASE, PsycINFO and CINAHL. Studies were included if they evaluated the effectiveness of a toolkit to support the integration of evidence into clinical care, and if the KT goal(s) of the study were to inform, share knowledge, build awareness, change practice, change behaviour, and/or clinical outcomes in healthcare settings, inform policy, or to commercialise an innovation. Screening of studies, assessment of methodological quality and data extraction for the included studies were conducted by at least two reviewers. Results 39 relevant studies were included for full review; 8 were rated as moderate to strong methodologically with clinical outcomes that could be somewhat attributed to the toolkit. Three of the eight studies evaluated the toolkit as a single KT intervention, while five embedded the toolkit into a multistrategy intervention. Six of the eight toolkits were partially or mostly effective in changing clinical outcomes and six studies reported on implementation outcomes. The types of resources embedded within toolkits varied but included predominantly educational materials. Conclusions Future toolkits should be informed by high-quality evidence and theory, and should be evaluated using rigorous study designs to explain the factors underlying their effectiveness and successful implementation. PMID:25869686

  10. Aggregates, broccoli and cauliflower

    NASA Astrophysics Data System (ADS)

    Grey, Francois; Kjems, Jørgen K.

    1989-09-01

    Naturally grown structures with fractal characters like broccoli and cauliflower are discussed and compared with DLA-type aggregates. It is suggested that the branching density can be used to characterize the growth process and an experimental method to determine this parameter is proposed.
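The DLA-type growth the authors compare against can be reproduced with a minimal on-lattice diffusion-limited aggregation sketch (a generic textbook algorithm, not the authors' code; the launch and kill radii and the seed are illustrative choices):

```python
import random

def dla_cluster(n_particles, launch=15, kill=30, seed=1):
    """Grow a 2D on-lattice DLA cluster: walkers released on a square
    'launch ring' random-walk until they touch an occupied site (stick)
    or wander past the kill radius (discarded)."""
    rng = random.Random(seed)
    cluster = {(0, 0)}  # seed particle at the origin
    steps = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    while len(cluster) < n_particles + 1:
        x, y = rng.choice([(launch, rng.randint(-launch, launch)),
                           (-launch, rng.randint(-launch, launch)),
                           (rng.randint(-launch, launch), launch),
                           (rng.randint(-launch, launch), -launch)])
        while True:
            dx, dy = rng.choice(steps)
            x, y = x + dx, y + dy
            if abs(x) > kill or abs(y) > kill:
                break  # wandered too far: discard, release a new walker
            if any((x + sx, y + sy) in cluster for sx, sy in steps):
                cluster.add((x, y))  # stick next to the cluster
                break
    return cluster

def branching_fraction(cluster):
    """Fraction of sites with three or more occupied neighbours, a crude
    proxy for the branching density proposed as a growth characteristic."""
    steps = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    branch = sum(1 for (x, y) in cluster
                 if sum((x + sx, y + sy) in cluster for sx, sy in steps) >= 3)
    return branch / len(cluster)
```

Counting high-coordination sites, as `branching_fraction` does, is one simple way to quantify the branching density the abstract suggests characterizes the growth process.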

  11. Wind Integration National Dataset (WIND) Toolkit; NREL (National Renewable Energy Laboratory)

    SciTech Connect

    Draxl, Caroline; Hodge, Bri-Mathias

    2015-07-14

    A webinar about the Wind Integration National Dataset (WIND) Toolkit was presented by Bri-Mathias Hodge and Caroline Draxl on July 14, 2015. It was hosted by the Southern Alliance for Clean Energy. The toolkit is a grid integration data set that contains meteorological and power data at a 5-minute resolution across the continental United States for 7 years and hourly power forecasts.

  12. [Design of a volume-rendering toolkit using GPU-based ray-casting].

    PubMed

    Liu, Wen-Qing; Chen, Chun-Xiao; Lu, Li-Na

    2009-09-01

    This paper presents an approach to GPU-based ray-casting on a shader model 3.0 compatible graphics card. In addition, a software toolkit is designed using the proposed algorithm to take full advantage of the GPU by extending VTK. Experimental results suggest that remarkable speedups are obtained with the GPU-based algorithm, and high-quality renderings can be achieved at interactive frame rates above 60 fps. The toolkit provides a high level of usability and extendibility. PMID:20073244
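The per-pixel compositing loop at the heart of any such ray-caster can be sketched on the CPU; this is the generic front-to-back emission-absorption model with early ray termination, not the toolkit's shader code:

```python
def composite_ray(samples, step=1.0):
    """Front-to-back compositing of (color, opacity) samples along one ray,
    with opacity correction for the step size and early termination once
    the ray is effectively opaque, as a GPU fragment shader would do."""
    color, alpha = 0.0, 0.0
    for c, a in samples:
        a = 1.0 - (1.0 - a) ** step      # opacity correction for step size
        color += (1.0 - alpha) * a * c   # accumulate weighted emission
        alpha += (1.0 - alpha) * a       # accumulate opacity
        if alpha > 0.99:                 # early ray termination
            break
    return color, alpha
```

Early ray termination is one of the main reasons GPU ray-casting reaches interactive frame rates: opaque regions cut the per-pixel sample count drastically.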

  13. The MPI bioinformatics Toolkit as an integrative platform for advanced protein sequence and structure analysis.

    PubMed

    Alva, Vikram; Nam, Seung-Zin; Söding, Johannes; Lupas, Andrei N

    2016-07-01

    The MPI Bioinformatics Toolkit (http://toolkit.tuebingen.mpg.de) is an open, interactive web service for comprehensive and collaborative protein bioinformatic analysis. It offers a wide array of interconnected, state-of-the-art bioinformatics tools to experts and non-experts alike, developed both externally (e.g. BLAST+, HMMER3, MUSCLE) and internally (e.g. HHpred, HHblits, PCOILS). While a beta version of the Toolkit was released 10 years ago, the current production-level release has been available since 2008 and has serviced more than 1.6 million external user queries. The usage of the Toolkit has continued to increase linearly over the years, reaching more than 400 000 queries in 2015. In fact, through the breadth of its tools and their tight interconnection, the Toolkit has become an excellent platform for experimental scientists as well as a useful resource for teaching bioinformatic inquiry to students in the life sciences. In this article, we report on the evolution of the Toolkit over the last ten years, focusing on the expansion of the tool repertoire (e.g. CS-BLAST, HHblits) and on infrastructural work needed to remain operative in a changing web environment. PMID:27131380

  14. Designing a Composable Geometric Toolkit for Versatility in Applications to Simulation Development

    NASA Technical Reports Server (NTRS)

    Reed, Gregory S.; Campbell, Thomas

    2008-01-01

    Conceived and implemented through the development of probabilistic risk assessment simulations for Project Constellation, the Geometric Toolkit allows users to create, analyze, and visualize relationships between geometric shapes in three-space using the MATLAB computing environment. The key output of the toolkit is an analysis of how emanations from one "source" geometry (e.g., a leak in a pipe) will affect another "target" geometry (e.g., another heat-sensitive component). It can import computer-aided design (CAD) depictions of a system to be analyzed, allowing the user to reliably and easily represent components within the design and determine the relationships between them, ultimately supporting more technical or physics-based simulations that use the toolkit. We opted to develop a variety of modular, interconnecting software tools to extend the scope of the toolkit, providing the capability to support a range of applications. This concept of simulation composability allows specially-developed tools to be reused by assembling them in various combinations. As a result, the concepts described here and implemented in this toolkit have a wide range of applications outside the domain of risk assessment. To that end, the Geometric Toolkit has been evaluated for use in other unrelated applications due to the advantages provided by its underlying design.

  15. Case File: The Spazzoids

    MedlinePlus


  16. New Mexico aggregate production sites, 1997-1999

    USGS Publications Warehouse

    Orris, Greta J.

    2000-01-01

    This report presents data, including latitude and longitude, for aggregate sites in New Mexico that were believed to be active in the period 1997-1999. The data are presented in paper form in Part A of this report and as Microsoft Excel 97 and Data Interchange Format (DIF) files in Part B. The work was undertaken as part of the effort to update information for the National Atlas. This compilation includes data from: the files of U.S. Geological Survey (USGS); company contacts; the New Mexico Bureau of Mines and Mineral Resources, New Mexico Bureau of Mine Inspection, and the Mining and Minerals Division of the New Mexico Energy, Minerals and Natural Resources Department (Hatton and others, 1998); the Bureau of Land Management Information; and direct communications with some of the aggregate operators. Additional information on most of the sites is available in Hatton and others (1998).

  17. Photophoretic force on aggregate grains

    NASA Astrophysics Data System (ADS)

    Matthews, Lorin S.; Kimery, Jesse B.; Wurm, Gerhard; de Beule, Caroline; Kuepper, Markus; Hyde, Truell W.

    2016-01-01

    The photophoretic force may impact planetary formation by selectively moving solid particles based on their composition and structure. This generates collision velocities between grains of different sizes and sorts the dust in protoplanetary discs by composition. This numerical simulation studied the photophoretic force acting on fractal dust aggregates of μm-scale radii. Results show that aggregates tend to have greater photophoretic drift velocities than spheres of similar mass or radii, though with a greater spread in the velocity. While the drift velocities of compact aggregates continue to increase as the aggregates grow larger in size, fluffy aggregates have drift velocities which are relatively constant with size. Aggregates formed from an initially polydisperse size distribution of dust grains behave differently from aggregates formed from a monodisperse population, having smaller drift velocities with directions which deviate substantially from the direction of illumination. Results agree with microgravity experiments which show the difference of photophoretic forces with aggregation state.

  18. Proteins aggregation and human diseases

    NASA Astrophysics Data System (ADS)

    Hu, Chin-Kun

    2015-04-01

    Many human diseases and the death of most supercentenarians are related to protein aggregation. Neurodegenerative diseases include Alzheimer's disease (AD), Huntington's disease (HD), Parkinson's disease (PD), frontotemporal lobar degeneration, etc. Such diseases are due to progressive loss of structure or function of neurons caused by protein aggregation. For example, AD is considered to be related to aggregation of Aβ40 (peptide with 40 amino acids) and Aβ42 (peptide with 42 amino acids) and HD is considered to be related to aggregation of polyQ (polyglutamine) peptides. In this paper, we briefly review our recent discovery of key factors for protein aggregation. We used a lattice model to study the aggregation rates of proteins and found that the probability for a protein sequence to appear in the conformation of the aggregated state can be used to determine the temperature at which proteins can aggregate most quickly. We used molecular dynamics and simple models of polymer chains to study relaxation and aggregation of proteins under various conditions and found that when the bending-angle dependent and torsion-angle dependent interactions are zero or very small, protein chains tend to aggregate at lower temperatures. All-atom models were used to identify a key peptide chain for the aggregation of insulin chains and to find that two polyQ chains prefer an anti-parallel conformation. It is pointed out that in many cases, protein aggregation does not result from protein mis-folding. A potential drug from Chinese medicine was found for Alzheimer's disease.

  19. 43 CFR 4.1352 - Who may file; where to file; when to file.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 43 Public Lands: Interior 1 2010-10-01 2010-10-01 false Who may file; where to file; when to file... Indian Lands) § 4.1352 Who may file; where to file; when to file. (a) The applicant or operator may file... to file a timely request constitutes a waiver of the opportunity for a hearing before OSM makes...

  20. Scientific Visualization Using the Flow Analysis Software Toolkit (FAST)

    NASA Technical Reports Server (NTRS)

    Bancroft, Gordon V.; Kelaita, Paul G.; Mccabe, R. Kevin; Merritt, Fergus J.; Plessel, Todd C.; Sandstrom, Timothy A.; West, John T.

    1993-01-01

    Over the past few years the Flow Analysis Software Toolkit (FAST) has matured into a useful tool for visualizing and analyzing scientific data on high-performance graphics workstations. Originally designed for visualizing the results of fluid dynamics research, FAST has demonstrated its flexibility by being used in several other areas of scientific research, including earth and space sciences, acid rain and ozone modelling, and automotive design. This paper describes the current status of FAST, including the basic concepts, architecture, existing functionality and features, and some of the known applications for which FAST is being used. A few of the applications, by both NASA and non-NASA agencies, are outlined in more detail, describing the goals of each visualization project, the techniques or 'tricks' used to produce the desired results, and any custom modifications to FAST done to further enhance the analysis. Some of the future directions for FAST are also described.

  1. Using the Browser for Science: A Collaborative Toolkit for Astronomy

    NASA Astrophysics Data System (ADS)

    Connolly, A. J.; Smith, I.; Krughoff, K. S.; Gibson, R.

    2011-07-01

    Astronomical surveys have yielded hundreds of terabytes of catalogs and images that span many decades of the electromagnetic spectrum. Even when observatories provide user-friendly web interfaces, exploring these data resources remains a complex and daunting task. In contrast, gadgets and widgets have become popular in social networking (e.g. iGoogle, Facebook). They provide a simple way to make complex data easily accessible that can be customized based on the interest of the user. With ASCOT (an AStronomical COllaborative Toolkit) we expand on these concepts to provide a customizable and extensible gadget framework for use in science. Unlike iGoogle, where all of the gadgets are independent, the gadgets we develop communicate and share information, enabling users to visualize and interact with data through multiple, simultaneous views. With this approach, web-based applications for accessing and visualizing data can be generated easily and, by linking these tools together, integrated and powerful data analysis and discovery tools can be constructed.

  2. Machine learning for a Toolkit for Image Mining

    NASA Technical Reports Server (NTRS)

    Delanoy, Richard L.

    1995-01-01

    A prototype user environment is described that enables a user with very limited computer skills to collaborate with a computer algorithm to develop search tools (agents) that can be used for image analysis, creating metadata for tagging images, searching for images in an image database on the basis of image content, or as a component of computer vision algorithms. Agents are learned in an ongoing, two-way dialogue between the user and the algorithm. The user points to mistakes made in classification. The algorithm, in response, attempts to discover which image attributes discriminate between objects of interest and clutter. It then builds a candidate agent and applies it to an input image, producing an 'interest' image highlighting features that are consistent with the set of objects and clutter indicated by the user. The dialogue repeats until the user is satisfied. The prototype environment, called the Toolkit for Image Mining (TIM), is currently capable of learning spectral and textural patterns. Learning exhibits rapid convergence to reasonable levels of performance and, when thoroughly trained, TIM appears to be competitive in discrimination accuracy with other classification techniques.

  3. The Insight ToolKit image registration framework

    PubMed Central

    Avants, Brian B.; Tustison, Nicholas J.; Stauffer, Michael; Song, Gang; Wu, Baohua; Gee, James C.

    2014-01-01

    Publicly available scientific resources help establish evaluation standards, provide a platform for teaching and improve reproducibility. Version 4 of the Insight ToolKit (ITK4) seeks to establish new standards in publicly available image registration methodology. ITK4 makes several advances in comparison to previous versions of ITK. ITK4 supports both multivariate images and objective functions; it also unifies high-dimensional (deformation field) and low-dimensional (affine) transformations with metrics that are reusable across transform types and with composite transforms that allow arbitrary series of geometric mappings to be chained together seamlessly. Metrics and optimizers take advantage of multi-core resources, when available. Furthermore, ITK4 reduces the parameter optimization burden via principled heuristics that automatically set scaling across disparate parameter types (rotations vs. translations). A related approach also constrains step sizes for gradient-based optimizers. The result is that tuning for different metrics and/or image pairs is rarely necessary, allowing the researcher to focus more easily on the design and comparison of registration strategies. In total, the ITK4 contribution is intended as a structure to support reproducible research practices; it will provide a more extensive foundation against which to evaluate new work in image registration and also give application-level programmers a broad suite of tools on which to build. Finally, we contextualize this work with a reference registration evaluation study with application to pediatric brain labeling. PMID:24817849
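The composite-transform idea, chaining an arbitrary series of geometric mappings so they apply as one, can be illustrated with a small sketch (plain callables standing in for transform objects; this is an assumption-laden illustration, not the ITK4 API):

```python
import numpy as np

class CompositeTransform:
    """Chain geometric mappings so an arbitrary series of transforms
    applies as a single mapping (illustrative sketch, not ITK4 code)."""
    def __init__(self, transforms):
        self.transforms = list(transforms)

    def apply(self, point):
        # apply each transform in order to the running point
        for t in self.transforms:
            point = t(point)
        return point

# an affine map and a trivial stand-in "deformation" as plain callables
affine = lambda p: np.array([[2.0, 0.0], [0.0, 1.0]]) @ p + np.array([1.0, 0.0])
warp = lambda p: p + np.array([0.0, 0.5])
chain = CompositeTransform([affine, warp])
```

Keeping each mapping behind the same call interface is what lets metrics stay reusable across transform types, as the abstract describes.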

  4. A Highly Characterized Yeast Toolkit for Modular, Multipart Assembly.

    PubMed

    Lee, Michael E; DeLoache, William C; Cervantes, Bernardo; Dueber, John E

    2015-09-18

    Saccharomyces cerevisiae is an increasingly attractive host for synthetic biology because of its long history in industrial fermentations. However, until recently, most synthetic biology systems have focused on bacteria. While there is a wealth of resources and literature about the biology of yeast, it can be daunting to navigate and extract the tools needed for engineering applications. Here we present a versatile engineering platform for yeast, which contains both a rapid, modular assembly method and a basic set of characterized parts. This platform provides a framework in which to create new designs, as well as data on promoters, terminators, degradation tags, and copy number to inform those designs. Additionally, we describe genome-editing tools for making modifications directly to the yeast chromosomes, which we find preferable to plasmids due to reduced variability in expression. With this toolkit, we strive to simplify the process of engineering yeast by standardizing the physical manipulations and suggesting best practices that together will enable more straightforward translation of materials and data from one group to another. Additionally, by relieving researchers of the burden of technical details, they can focus on higher-level aspects of experimental design. PMID:25871405

  5. DynaMIT: the dynamic motif integration toolkit

    PubMed Central

    Dassi, Erik; Quattrone, Alessandro

    2016-01-01

    De-novo motif search is a frequently applied bioinformatics procedure to identify and prioritize recurrent elements in sequence sets for biological investigation, such as the ones derived from high-throughput differential expression experiments. Several algorithms have been developed to perform motif search, employing widely different approaches and often giving divergent results. In order to maximize the power of these investigations and ultimately be able to draft solid biological hypotheses, multiple tools need to be applied to the same sequences and the obtained results merged. However, motif reporting formats and statistical evaluation methods currently make such an integration task difficult to perform and mostly restricted to specific scenarios. We thus introduce here the Dynamic Motif Integration Toolkit (DynaMIT), a highly flexible platform that allows users to identify motifs employing multiple algorithms, integrate them by means of a user-selected strategy and visualize results in several ways; furthermore, the platform is user-extensible in all its aspects. DynaMIT is freely available at http://cibioltg.bitbucket.org. PMID:26253738

  6. Fewbody: Numerical toolkit for simulating small-N gravitational dynamics

    NASA Astrophysics Data System (ADS)

    Fregeau, John

    2012-08-01

    Fewbody is a numerical toolkit for simulating small-N gravitational dynamics. It is a general N-body dynamics code, although it was written for the purpose of performing scattering experiments, and therefore has several features that make it well-suited for this purpose. Fewbody uses the 8th-order Runge-Kutta Prince-Dormand integration method with 9th-order error estimate and adaptive timestep to advance the N-body system forward in time. It integrates the usual formulation of the N-body equations in configuration space, but allows for the option of global pairwise Kustaanheimo-Stiefel (K-S) regularization (Heggie 1974; Mikkola 1985). The code uses a binary tree algorithm to classify the N-body system into a set of independently bound hierarchies, and performs collisions between stars in the “sticky star” approximation. Fewbody contains a collection of command line utilities that can be used to perform individual scattering and N-body interactions, but is more generally a library of functions that can be used from within other codes.

  7. Targeting protein function: the expanding toolkit for conditional disruption

    PubMed Central

    Campbell, Amy E.; Bennett, Daimark

    2016-01-01

    A major objective in biological research is to understand spatial and temporal requirements for any given gene, especially in dynamic processes acting over short periods, such as catalytically driven reactions, subcellular transport, cell division, cell rearrangement and cell migration. The interrogation of such processes requires the use of rapid and flexible methods of interfering with gene function. However, many of the most widely used interventional approaches, such as RNAi or CRISPR (clustered regularly interspaced short palindromic repeats)-Cas9 (CRISPR-associated 9), operate at the level of the gene or its transcripts, meaning that the effects of gene perturbation are exhibited over longer time frames than the process under investigation. There has been much activity over the last few years to address this fundamental problem. In the present review, we describe recent advances in disruption technologies acting at the level of the expressed protein, involving inducible methods of protein cleavage, (in)activation, protein sequestration or degradation. Drawing on examples from model organisms we illustrate the utility of fast-acting techniques and discuss how different components of the molecular toolkit can be employed to dissect previously intractable biochemical processes and cellular behaviours. PMID:27574023

  8. A Cross-platform Toolkit for Mass Spectrometry and Proteomics

    PubMed Central

    Chambers, Matthew C.; Maclean, Brendan; Burke, Robert; Amodei, Dario; Ruderman, Daniel L; Neumann, Steffen; Gatto, Laurent; Fischer, Bernd; Pratt, Brian; Egertson, Jarrett; Hoff, Katherine; Kessner, Darren; Tasman, Natalie; Shulman, Nicholas; Frewen, Barbara; Baker, Tahmina A.; Brusniak, Mi-Youn; Paulse, Christopher; Creasy, David; Flashner, Lisa; Kani, Kian; Moulding, Chris; Seymour, Sean L.; Nuwaysir, Lydia M.; Lefebvre, Brent; Kuhlmann, Frank; Roark, Joe; Rainer, Paape; Detlev, Suckau; Hemenway, Tina; Huhmer, Andreas; Langridge, James; Connolly, Brian; Chadick, Trey; Holly, Krisztina; Eckels, Josh; Deutsch, Eric W.; Moritz, Robert L; Katz, Jonathan E.; Agus, David B.; MacCoss, Michael; Tabb, David L.; Mallick, Parag

    2012-01-01

    Mass-spectrometry-based proteomics has become an important component of biological research. Numerous proteomics methods have been developed to identify and quantify the proteins in biological and clinical samples [1], identify pathways affected by endogenous and exogenous perturbations [2], and characterize protein complexes [3]. Despite successes, the interpretation of vast proteomics datasets remains a challenge. There have been several calls for improvements and standardization of proteomics data analysis frameworks, as well as for an application-programming interface for proteomics data access [4,5]. In response, we have developed the ProteoWizard Toolkit, a robust set of open-source software libraries and applications designed to facilitate proteomics research. The libraries implement the first-ever, non-commercial, unified data access interface for proteomics, bridging field-standard open formats and all common vendor formats. In addition, diverse software classes enable rapid development of vendor-agnostic proteomics software. Additionally, ProteoWizard projects and applications, building upon the core libraries, are becoming standard tools for enabling significant proteomics inquiries. PMID:23051804

  9. Microgrid Design Toolkit (MDT) Technical Documentation and Component Summaries

    SciTech Connect

    Arguello, Bryan; Gearhart, Jared Lee; Jones, Katherine A.; Eddy, John P.

    2015-09-01

    The Microgrid Design Toolkit (MDT) is a decision support software tool for microgrid designers to use during the microgrid design process. The models that support the two main capabilities in MDT are described. The first capability, the Microgrid Sizing Capability (MSC), is used to determine the size and composition of a new microgrid in the early stages of the design process. MSC is a mixed-integer linear program that is focused on developing a microgrid that is economically viable when connected to the grid. The second capability is focused on refining a microgrid design for operation in islanded mode. This second capability relies on two models: the Technology Management Optimization (TMO) model and the Performance Reliability Model (PRM). TMO uses a genetic algorithm to create and refine a collection of candidate microgrid designs. It uses PRM, a simulation-based reliability model, to assess the performance of these designs. TMO produces a collection of microgrid designs that perform well with respect to one or more performance metrics.
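The TMO/PRM pairing described above, a genetic algorithm creating candidate designs that a separate model scores, can be caricatured in a few lines (bit-vector "designs" and a stand-in scoring function; purely illustrative, not the MDT implementation):

```python
import random

def genetic_refine(score, n_genes=8, pop_size=20, generations=40, seed=0):
    """Tiny genetic-algorithm loop in the spirit of TMO: 'score' plays the
    role of the (simulation-based) performance model, and the population
    is refined by truncation selection, crossover and point mutation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_genes)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=score, reverse=True)
        survivors = pop[:pop_size // 2]           # keep the better half
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, n_genes)       # one-point crossover
            child = a[:cut] + b[cut:]
            child[rng.randrange(n_genes)] ^= 1    # point mutation
            children.append(child)
        pop = survivors + children
    return max(pop, key=score)

best = genetic_refine(sum)  # maximise the number of 1-bits (stand-in metric)
```

In the real toolkit the scoring call is the expensive part, since each candidate is evaluated by a reliability simulation rather than a cheap function like `sum`.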

  10. Ten years of medical imaging standardization and prototypical implementation: the DICOM standard and the OFFIS DICOM toolkit (DCMTK)

    NASA Astrophysics Data System (ADS)

    Eichelberg, Marco; Riesmeier, Joerg; Wilkens, Thomas; Hewett, Andrew J.; Barth, Andreas; Jensch, Peter

    2004-04-01

    In 2003, the DICOM standard celebrated its 10th anniversary. Aside from the standard itself, also OFFIS' open source DICOM toolkit DCMTK, which has continuously followed the development of DICOM, turns 10 years old. On this occasion, this article looks back at the main standardization efforts in DICOM and illustrates related developments in DCMTK. Considering the development of the DICOM standard, it is possible to distinguish several phases of progress. Within the first phase at the beginning of the 1990s, basic network services for image transfer and retrieval were being introduced. The second phase, in the mid 1990s, was characterized by advances in the specification of a file format and of regulations for media interchange. In the later but partly parallel third phase, DICOM predominantly dealt with the problem of optimizing the workflow within imaging departments. As a result of the fact that it was now possible to exchange images between different systems, efforts concerning image display consistency followed in a fourth phase at the end of the 1990s. In the current fifth phase, security enhancements are being integrated into the standard. In another phase of progress, which took place over a relatively long time period concurrently to the other mentioned phases, DICOM Structured Reporting was developed.

  11. Dynamics of fire ant aggregations

    NASA Astrophysics Data System (ADS)

    Tennenbaum, Michael; Hu, David; Fernandez-Nieves, Alberto

    Fire ant aggregations are an inherently active system. Each ant harvests its own energy and can convert it into motion. The motion of individual ants contributes non-trivially to the bulk material properties of the aggregation. We have measured some of these properties using plate-plate rheology, where the response to an applied external force or deformation is measured. In this talk, we will present data pertaining to the aggregation behavior in the absence of any external force. We quantify the aggregation dynamics by monitoring the rotation of the top plate and by measuring the normal force. We then compare the results with visualizations of 2D aggregations.

  12. Tripal v1.1: a standards-based toolkit for construction of online genetic and genomic databases

    PubMed Central

    Sanderson, Lacey-Anne; Ficklin, Stephen P.; Cheng, Chun-Huai; Jung, Sook; Feltus, Frank A.; Bett, Kirstin E.; Main, Dorrie

    2013-01-01

    Tripal is an open-source freely available toolkit for construction of online genomic and genetic databases. It aims to facilitate development of community-driven biological websites by integrating the GMOD Chado database schema with Drupal, a popular website creation and content management software. Tripal provides a suite of tools for interaction with a Chado database and display of content therein. The tools are designed to be generic to support the various ways in which data may be stored in Chado. Previous releases of Tripal have supported organisms, genomic libraries, biological stocks, stock collections and genomic features, their alignments and annotations. Also, Tripal and its extension modules provided loaders for commonly used file formats such as FASTA, GFF, OBO, GAF, BLAST XML, KEGG heir files and InterProScan XML. Default generic templates were provided for common views of biological data, which could be customized using an open Application Programming Interface to change the way data are displayed. Here, we report additional tools and functionality that are part of release v1.1 of Tripal. These include (i) a new bulk loader that allows a site curator to import data stored in a custom tab delimited format; (ii) full support of every Chado table for Drupal Views (a powerful tool allowing site developers to construct novel displays and search pages); (iii) new modules including ‘Feature Map’, ‘Genetic’, ‘Publication’, ‘Project’, ‘Contact’ and the ‘Natural Diversity’ modules. Tutorials, mailing lists, download and set-up instructions, extension modules and other documentation can be found at the Tripal website located at http://tripal.info. Database URL: http://tripal.info/ PMID:24163125


  14. Common file formats.

    PubMed

    Leonard, Shonda A; Littlejohn, Timothy G; Baxevanis, Andreas D

    2007-01-01

    This appendix discusses a few of the file formats frequently encountered in bioinformatics. Specifically, it reviews the rules for generating FASTA files and provides guidance for interpreting NCBI descriptor lines, commonly found in FASTA files. In addition, it reviews the construction of GenBank, Phylip, MSF and Nexus files. PMID:18428774
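The FASTA rules the appendix reviews, a `>` descriptor line followed by one or more sequence lines, can be sketched as a minimal parser (the helper name and sample data are invented for illustration):

```python
def parse_fasta(text):
    """Split FASTA-formatted text into (descriptor, sequence) records.

    A record starts at a '>' descriptor line; all following lines up to
    the next '>' are concatenated into a single sequence string.
    """
    records = []
    descriptor, chunks = None, []
    for line in text.splitlines():
        line = line.strip()
        if not line:
            continue
        if line.startswith(">"):
            if descriptor is not None:
                records.append((descriptor, "".join(chunks)))
            descriptor, chunks = line[1:], []
        else:
            chunks.append(line)
    if descriptor is not None:
        records.append((descriptor, "".join(chunks)))
    return records

records = parse_fasta(">demo sequence 1\nATGC\nGGTA\n>demo sequence 2\nTTAA")
```

NCBI descriptor lines additionally pack database identifiers into the text after `>` (pipe-delimited accession fields), which a fuller parser would split out.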

  15. Census of Population and Housing, 1980: Summary Tape File 3F. Technical Documentation.

    ERIC Educational Resources Information Center

    Bureau of the Census (DOC), Washington, DC. Data User Services Div.

    This report provides technical documentation associated with a 1980 Census of Population and Housing Summary Tape File (STF) 3F--which contains responses to the extended questionnaire summarized in STF 3, aggregated by school district. The file contains sample data inflated to represent the total population, 100% counts, and unweighted sample…

  16. 77 FR 43897 - Self-Regulatory Organizations; Chicago Board Options Exchange, Incorporated; Notice of Filing of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-26

    ... COMMISSION Release No. 34-67478; File No. SR-CBOE-2012-066] Self-Regulatory Organizations; Chicago Board... Exchange's requirement that TPHs file reports with the Exchange for any customer who held aggregate large... ( http://www.sec.gov/rules/sro.shtml ); or Send an email to rule-comments@sec.gov . Please include...

  17. Making Graphene Resist Aggregation

    NASA Astrophysics Data System (ADS)

    Luo, Jiayan

    Graphene-based sheets have stimulated great interest in many scientific disciplines and shown promise for wide potential applications. Among various ways of creating single atomic layer carbon sheets, a promising route for bulk production is to first chemically exfoliate graphite powders to graphene oxide (GO) sheets, followed by reduction to form chemically modified graphene (CMG). Due to the strong van der Waals attraction between graphene sheets, CMG tends to aggregate. The restacking of sheets is largely uncontrollable and irreversible, thus it reduces their processability and compromises properties such as accessible surface area. Strategies based on colloidal chemistry have been applied to keep CMG dispersed in solvents by introducing electrostatic repulsion to overcome the van der Waals attraction or adding spacers to increase the inter-sheet spacing. In this dissertation, two very different ideas that can prevent CMG aggregation without extensively modifying the material or introducing foreign spacer materials are introduced. The van der Waals potential decreases with reduced overlapping area between sheets. For CMG, reducing the lateral dimension from micrometer to nanometer scale should greatly enhance their colloidal stability with additional advantages of increased charge density and decreased probability to interact. The enhanced colloidal stability of GO and CMG nanocolloids makes them especially promising for spectroscopy based bio-sensing applications. For potential applications in a compact bulk solid form, the sheets were converted into paper-ball like structure using capillary compression in evaporating aerosol droplets. The crumpled graphene balls are stabilized by locally folded pi-pi stacked ridges, and do not unfold or collapse during common processing steps. They can tightly pack without greatly reducing the surface area. This form of graphene leads to scalable performance in energy storage. 
For example, planar sheets tend to aggregate and

  18. Structure of Viral Aggregates

    NASA Astrophysics Data System (ADS)

    Barr, Stephen; Luijten, Erik

    2010-03-01

The aggregation of virus particles is a particular form of colloidal self-assembly, since viruses of a given type are monodisperse and have identical, anisotropic surface charge distributions. In small-angle X-ray scattering experiments, the Qbeta virus was found to organize in different crystal structures in the presence of divalent salt and non-adsorbing polymer. Since a simple isotropic potential cannot explain the occurrence of all observed phases, we employ computer simulations to investigate how the surface charge distribution affects the virus interactions. Using a detailed model of the virus particle, we find an asymmetric ion distribution around the virus which gives rise to the different phases observed.

  19. Parameter Sweep and Optimization of Loosely Coupled Simulations Using the DAKOTA Toolkit

    SciTech Connect

    Elwasif, Wael R; Bernholdt, David E; Pannala, Sreekanth; Allu, Srikanth; Foley, Samantha S

    2012-01-01

The increasing availability of large scale computing capabilities has accelerated the development of high-fidelity coupled simulations. Such simulations typically involve the integration of models that implement various aspects of the complex phenomena under investigation. Coupled simulations are playing an integral role in fields such as climate modeling, earth systems modeling, rocket simulations, computational chemistry, fusion research, and many other computational fields. Model coupling provides scientists with systematic ways to virtually explore the physical, mathematical, and computational aspects of the problem. Such exploration is rarely done using a single execution of a simulation, but rather by aggregating the results from many simulation runs that, together, serve to bring to light novel knowledge about the system under investigation. Furthermore, it is often the case (particularly in engineering disciplines) that the study of the underlying system takes the form of an optimization regime, where the control parameter space is explored to optimize an objective function that captures system realizability, cost, performance, or a combination thereof. Novel and flexible frameworks that facilitate the integration of the disparate models into a holistic simulation are used to perform this research, while making efficient use of the available computational resources. In this paper, we describe the integration of the DAKOTA optimization and parameter sweep toolkit with the Integrated Plasma Simulator (IPS), a component-based framework for loosely coupled simulations. The integration allows DAKOTA to exploit the internal task and resource management of the IPS to dynamically instantiate simulation instances within a single IPS instance, allowing for greater control over the trade-off between efficiency of resource utilization and time to completion. 
We present a case study showing the use of the combined DAKOTA-IPS system to aid in the design of a lithium ion
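The sweep-then-optimize pattern the abstract describes can be sketched generically (the function names and the quadratic stand-in objective are invented for illustration; this is not the DAKOTA or IPS API):

```python
from itertools import product

def run_simulation(params):
    # Stand-in for one loosely coupled simulation instance; the
    # "objective" here is a simple quadratic to be minimized.
    x, y = params["x"], params["y"]
    return (x - 2.0) ** 2 + (y + 1.0) ** 2

def parameter_sweep(grid):
    """Evaluate the objective over the full Cartesian parameter grid
    and return (best_params, best_value)."""
    names = sorted(grid)
    best = None
    for values in product(*(grid[n] for n in names)):
        params = dict(zip(names, values))
        value = run_simulation(params)
        if best is None or value < best[1]:
            best = (params, value)
    return best

best_params, best_value = parameter_sweep({"x": [0, 1, 2, 3], "y": [-2, -1, 0]})
```

In the real system, `run_simulation` would be dispatched through the IPS task manager so many instances execute concurrently; the sweep logic itself is unchanged.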

  20. SMART: Soil Moisture and Runoff Toolkit for Semi-distributed Hydrologic Modeling

    NASA Astrophysics Data System (ADS)

    Ajami, H.; Khan, U.; Tuteja, N. K.; Sharma, A.

    2015-12-01

A new GIS based semi-distributed hydrologic modeling framework is developed for simulating runoff, evapotranspiration and soil moisture at large catchment scale. The framework is based upon the delineation of contiguous and topologically connected Hydrologic Response Units (HRUs). The HRU delineation methodology consists of delineating first order sub-basins and landforms. To reduce the number of computational elements, simulations are performed across a series of cross sections or equivalent cross sections (ECS) in each first order sub-basin using a 2-dimensional, Richards' equation based distributed hydrological model. Delineation of ECSs is performed by weighting the topographic and physiographic properties of the part or entire first-order sub-basins and has the advantage of reducing the computational time while maintaining reasonable accuracy in simulated hydrologic fluxes. Simulated fluxes from every cross section or ECS are weighted by the respective area from which the cross sections or ECSs were formulated and then aggregated to obtain the catchment scale fluxes. SMART pre- and post-processing scripts are written in MATLAB to automate the cross section delineation, model simulations across multiple cross sections, and post-processing of outputs for visualization. The MATLAB Parallel Processing Toolbox is used for simultaneous simulations of cross sections, further reducing computational time. SMART pre-processing workflow consists of the following steps: 1) delineation of first order sub-basins using a digital elevation model, 2) hillslope delineation, 3) landform delineation in every first order sub-basin based on topographic and geomorphic properties of a group of sub-basins or entire catchment, 4) formulation of cross sections as well as ECSs and 5) extraction of vegetation and soil parameters using spatially distributed land cover and soil information for the 2-d distributed hydrological model. 
The post-processing tools generate streamflow at the
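The aggregation step, weighting each cross section's simulated flux by the area it represents, reduces to a weighted average; a minimal sketch with invented numbers:

```python
def aggregate_fluxes(cross_sections):
    """Combine per-cross-section fluxes into a catchment-scale flux,
    weighting each by the sub-basin area it represents."""
    total_area = sum(cs["area_km2"] for cs in cross_sections)
    return sum(cs["flux_mm"] * cs["area_km2"] for cs in cross_sections) / total_area

# Two hypothetical equivalent cross sections (ECSs)
catchment_flux = aggregate_fluxes([
    {"area_km2": 10.0, "flux_mm": 2.0},
    {"area_km2": 30.0, "flux_mm": 4.0},
])
```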

  1. Next Generation of the Java Image Science Toolkit (JIST): Visualization and Validation

    PubMed Central

    Li, Bo; Bryan, Frederick; Landman, Bennett A.

    2013-01-01

Modern medical imaging analyses often involve the concatenation of multiple steps, and neuroimaging analysis is no exception. The Java Image Science Toolkit (JIST) has provided a framework for both end users and engineers to synthesize processing modules into tailored, automatic multi-step processing pipelines (“layouts”) and rapid prototyping of module development. Since its release, JIST has facilitated substantial neuroimaging research and fulfilled much of its intended goal. However, key weaknesses must be addressed for JIST to more fully realize its potential and become accessible to an even broader community base. Herein, we identify three core challenges facing traditional JIST (JIST-I) and present the solutions in the next generation JIST (JIST-II). First, in response to community demand, we have introduced seamless data visualization; users can now click ‘show this data’ through the program interfaces and avoid the need to locate files on disk. Second, JIST is an open-source community effort by design; any developer may add modules to the distribution and extend existing functionality for release. However, the large number of developers and different use cases introduced instability into the overall JIST-I framework, causing users to freeze on different, incompatible versions of JIST-I, and the JIST community began to fracture. JIST-II addresses the problem of compilation instability by performing continuous integration checks nightly to ensure community implemented changes do not negatively impact overall JIST-II functionality. Third, JIST-II allows developers and users to ensure that functionality is preserved by running functionality checks nightly using the continuous integration framework. With JIST-II, users can submit layout test cases and quality control criteria through a new GUI. These test cases capture all runtime parameters and help to ensure that the module produces results within tolerance, despite changes in the underlying
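The nightly functionality check amounts to re-running a layout and comparing its outputs against a stored baseline within tolerance; a generic sketch (metric names and numbers are invented, not JIST's actual quality-control format):

```python
def within_tolerance(baseline, result, tolerances):
    """Return the list of metrics that drifted beyond their allowed
    tolerance between a baseline run and a fresh run."""
    failures = []
    for metric, allowed in tolerances.items():
        if abs(result[metric] - baseline[metric]) > allowed:
            failures.append(metric)
    return failures

failures = within_tolerance(
    baseline={"mean_intensity": 104.2, "volume_mm3": 1523.0},
    result={"mean_intensity": 104.5, "volume_mm3": 1601.0},
    tolerances={"mean_intensity": 1.0, "volume_mm3": 50.0},
)
```

A continuous-integration job would run such a check per layout and fail the nightly build when the failure list is non-empty.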

  2. Educational RIS/PACS simulator integrated with the HIPAA compliant auditing (HCA) toolkit

    NASA Astrophysics Data System (ADS)

    Zhou, Zheng; Liu, Brent J.; Huang, H. K.; Zhang, J.

    2005-04-01

Health Insurance Portability and Accountability Act (HIPAA), a guideline for healthcare privacy and security, has been officially instituted recently. HIPAA mandates healthcare providers to follow its privacy and security rules, one of which is to have the ability to generate audit trails on the data access for any specific patient on demand. Although most current medical imaging systems such as PACS utilize logs to record their activities, there is a lack of formal methodology to interpret these large volumes of log data and generate HIPAA compliant auditing trails. In this paper, we present a HIPAA compliant auditing (HCA) toolkit for auditing the image data flow of PACS. The toolkit can extract pertinent auditing information from the logs of various PACS components and store the information in a centralized auditing database. The HIPAA compliant audit trails can be generated based on the database, which can also be utilized for data analysis to facilitate the dynamic monitoring of the data flow of PACS. In order to demonstrate the HCA toolkit in a PACS environment, it was integrated with the PACS Simulator, which was presented as an educational tool at SPIE in 2003 and 2004. With the integration of the HCA toolkit with the PACS simulator, users can learn HIPAA audit concepts and how to generate audit trails of image data access in PACS, as well as trace the image data flow of PACS Simulator through the toolkit.
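The core extraction step, turning heterogeneous PACS logs into a per-patient audit trail that can be produced on demand, might look like this (the log format and field names are invented for illustration, not the HCA toolkit's actual schema):

```python
import re
from collections import defaultdict

# Hypothetical log line: "2005-04-01 09:00:00 archive access patient=P001 user=dr_a"
LOG_PATTERN = re.compile(
    r"(?P<timestamp>\S+ \S+) (?P<component>\S+) "
    r"access patient=(?P<patient_id>\S+) user=(?P<user>\S+)"
)

def build_audit_trail(log_lines):
    """Group image-access events by patient so a per-patient
    audit trail can be generated on demand."""
    trail = defaultdict(list)
    for line in log_lines:
        m = LOG_PATTERN.search(line)
        if m:
            event = m.groupdict()
            trail[event.pop("patient_id")].append(event)
    return trail

trail = build_audit_trail([
    "2005-04-01 09:00:00 archive access patient=P001 user=dr_a",
    "2005-04-01 09:05:00 viewer access patient=P001 user=dr_b",
    "2005-04-01 09:10:00 viewer access patient=P002 user=dr_a",
])
```

In the toolkit described, the grouped events would be persisted to the centralized auditing database rather than held in memory.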

  3. PsyToolkit: a software package for programming psychological experiments using Linux.

    PubMed

    Stoet, Gijsbert

    2010-11-01

    PsyToolkit is a set of software tools for programming psychological experiments on Linux computers. Given that PsyToolkit is freely available under the Gnu Public License, open source, and designed such that it can easily be modified and extended for individual needs, it is suitable not only for technically oriented Linux users, but also for students, researchers on small budgets, and universities in developing countries. The software includes a high-level scripting language, a library for the programming language C, and a questionnaire presenter. The software easily integrates with other open source tools, such as the statistical software package R. PsyToolkit is designed to work with external hardware (including IoLab and Cedrus response keyboards and two common digital input/output boards) and to support millisecond timing precision. Four in-depth examples explain the basic functionality of PsyToolkit. Example 1 demonstrates a stimulus-response compatibility experiment. Example 2 demonstrates a novel mouse-controlled visual search experiment. Example 3 shows how to control light emitting diodes using PsyToolkit, and Example 4 shows how to build a light-detection sensor. The last two examples explain the electronic hardware setup such that they can even be used with other software packages. PMID:21139177
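A stimulus-response trial of the kind Example 1 implements can be sketched in plain Python (PsyToolkit itself uses its own scripting language; the helper names and the simulated response are invented, and timing here is only as precise as the OS clock):

```python
import time

def run_trial(present_stimulus, get_response):
    """Present a stimulus, wait for a response, and return
    (response, reaction_time_ms)."""
    present_stimulus()
    t0 = time.perf_counter()
    response = get_response()          # blocks until a response arrives
    rt_ms = (time.perf_counter() - t0) * 1000.0
    return response, rt_ms

# Simulated hardware: the "subject" answers after a short delay.
def fake_response():
    time.sleep(0.05)
    return "left"

response, rt_ms = run_trial(lambda: None, fake_response)
```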

  4. Mission Analysis, Operations, and Navigation Toolkit Environment (Monte) Version 040

    NASA Technical Reports Server (NTRS)

    Sunseri, Richard F.; Wu, Hsi-Cheng; Evans, Scott E.; Evans, James R.; Drain, Theodore R.; Guevara, Michelle M.

    2012-01-01

Monte is a software set designed for use in mission design and spacecraft navigation operations. The system can process measurement data, design optimal trajectories and maneuvers, and do orbit determination, all in one application. For the first time, a single software set can be used for mission design and navigation operations. This eliminates problems due to different models and fidelities used in legacy mission design and navigation software. The unique features of Monte 040 include a blowdown thruster model for GRAIL (Gravity Recovery and Interior Laboratory) with associated pressure models, as well as an updated optimal-search capability (COSMIC) that facilitated mission design for ARTEMIS. Existing legacy software lacked the capabilities necessary for these two missions. There is also a mean orbital element propagator and an osculating to mean element converter that allows long-term orbital stability analysis for the first time in compiled code. The optimized trajectory search tool COSMIC allows users to place constraints and controls on their searches without any restrictions. Constraints may be user-defined and depend on trajectory information either forward or backwards in time. In addition, a long-term orbit stability analysis tool (morbiter) existed previously as a set of scripts on top of Monte. Monte is becoming the primary tool for navigation operations, a core competency at JPL. The mission design capabilities in Monte are becoming mature enough for use in project proposals as well as post-phase A mission design. Monte has three distinct advantages over existing software. First, it is being developed in a modern paradigm: object-oriented C++ and Python. Second, the software has been developed as a toolkit, which allows users to customize their own applications and allows the development team to implement requirements quickly, efficiently, and with minimal bugs. 
Finally, the software is managed in accordance with the CMMI (Capability Maturity Model

  5. GATE: a simulation toolkit for PET and SPECT

    NASA Astrophysics Data System (ADS)

    Jan, S.; Santin, G.; Strul, D.; Staelens, S.; Assié, K.; Autret, D.; Avner, S.; Barbier, R.; Bardiès, M.; Bloomfield, P. M.; Brasse, D.; Breton, V.; Bruyndonckx, P.; Buvat, I.; Chatziioannou, A. F.; Choi, Y.; Chung, Y. H.; Comtat, C.; Donnarieix, D.; Ferrer, L.; Glick, S. J.; Groiselle, C. J.; Guez, D.; Honore, P.-F.; Kerhoas-Cavata, S.; Kirov, A. S.; Kohli, V.; Koole, M.; Krieguer, M.; van der Laan, D. J.; Lamare, F.; Largeron, G.; Lartizien, C.; Lazaro, D.; Maas, M. C.; Maigne, L.; Mayet, F.; Melot, F.; Merheb, C.; Pennacchio, E.; Perez, J.; Pietrzyk, U.; Rannou, F. R.; Rey, M.; Schaart, D. R.; Schmidtlein, C. R.; Simon, L.; Song, T. Y.; Vieira, J.-M.; Visvikis, D.; Van de Walle, R.; Wieërs, E.; Morel, C.

    2004-10-01

    Monte Carlo simulation is an essential tool in emission tomography that can assist in the design of new medical imaging devices, the optimization of acquisition protocols and the development or assessment of image reconstruction algorithms and correction techniques. GATE, the Geant4 Application for Tomographic Emission, encapsulates the Geant4 libraries to achieve a modular, versatile, scripted simulation toolkit adapted to the field of nuclear medicine. In particular, GATE allows the description of time-dependent phenomena such as source or detector movement, and source decay kinetics. This feature makes it possible to simulate time curves under realistic acquisition conditions and to test dynamic reconstruction algorithms. This paper gives a detailed description of the design and development of GATE by the OpenGATE collaboration, whose continuing objective is to improve, document and validate GATE by simulating commercially available imaging systems for PET and SPECT. Large effort is also invested in the ability and the flexibility to model novel detection systems or systems still under design. A public release of GATE licensed under the GNU Lesser General Public License can be downloaded at http://www-lphe.epfl.ch/GATE/. Two benchmarks developed for PET and SPECT to test the installation of GATE and to serve as a tutorial for the users are presented. Extensive validation of the GATE simulation platform has been started, comparing simulations and measurements on commercially available acquisition systems. References to those results are listed. The future prospects towards the gridification of GATE and its extension to other domains such as dosimetry are also discussed.

  6. Taurine and platelet aggregation

    SciTech Connect

    Nauss-Karol, C.; VanderWende, C.; Gaut, Z.N.

    1986-03-01

Taurine is a putative neurotransmitter or neuromodulator. The endogenous taurine concentration in human platelets, determined by amino acid analysis, is 15 μM/g. In spite of this high level, taurine is actively accumulated. Uptake is saturable, Na⁺ and temperature dependent, and suppressed by metabolic inhibitors, structural analogues, and several classes of centrally active substances. High, medium and low affinity transport processes have been characterized, and the platelet may represent a model system for taurine transport in the CNS. When platelets were incubated with ¹⁴C-taurine for 30 minutes, then resuspended in fresh medium and reincubated for one hour, essentially all of the taurine was retained within the cells. Taurine, at concentrations ranging from 10-1000 μM, had no effect on platelet aggregation induced by ADP or epinephrine. However, taurine may have a role in platelet aggregation since 35-39% of the taurine taken up by human platelets appears to be secreted during the release reaction induced by low concentrations of either epinephrine or ADP, respectively. This release phenomenon would imply that part of the taurine taken up is stored directly in the dense bodies of the platelet.
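Saturable, carrier-mediated uptake of the kind described is conventionally modeled with Michaelis-Menten kinetics; a sketch with illustrative constants (not the paper's fitted values), combining a high- and a low-affinity component:

```python
def uptake_rate(substrate_um, vmax, km_um):
    """Michaelis-Menten rate v = Vmax * S / (Km + S); saturates when S >> Km."""
    return vmax * substrate_um / (km_um + substrate_um)

def total_uptake(s_um):
    # Hypothetical high-affinity (low Km) plus low-affinity (high Km) carriers
    return uptake_rate(s_um, vmax=1.0, km_um=5.0) + uptake_rate(s_um, vmax=10.0, km_um=500.0)

half_max = uptake_rate(5.0, 1.0, 5.0)   # at S = Km, rate is Vmax / 2
```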

  7. SatelliteDL - An IDL Toolkit for the Analysis of Satellite Earth Observations - GOES, MODIS, VIIRS and CERES

    NASA Astrophysics Data System (ADS)

    Fillmore, D. W.; Galloy, M. D.; Kindig, D.

    2013-12-01

SatelliteDL is an IDL toolkit for the analysis of satellite Earth observations from a diverse set of platforms and sensors. The design features an abstraction layer that allows for easy inclusion of new datasets in a modular way. The core function of the toolkit is the spatial and temporal alignment of satellite swath and geostationary data. IDL has a powerful suite of statistical and visualization tools that can be used in conjunction with SatelliteDL. Our overarching objective is to create utilities that automate the mundane aspects of satellite data analysis, are extensible and maintainable, and do not place limitations on the analysis itself. Toward this end we have constructed SatelliteDL to include (1) HTML and LaTeX API document generation, (2) a unit test framework, (3) automatic message and error logs, (4) HTML and LaTeX plot and table generation, and (5) several real world examples with bundled datasets available for download. For ease of use, datasets, variables and optional workflows may be specified in a flexible format configuration file. Configuration statements may specify, for example, a region and date range, and the creation of images, plots and statistical summary tables for a long list of variables. SatelliteDL enforces data provenance; all data should be traceable and reproducible. The output NetCDF file metadata holds a complete history of the original datasets and their transformations, and a method exists to reconstruct a configuration file from this information. Release 0.1.0 of SatelliteDL is anticipated for the 2013 Fall AGU conference. It will ship with ingest methods for GOES, MODIS, VIIRS and CERES radiance data (L1) as well as select 2D atmosphere products (L2) such as aerosol and cloud (MODIS and VIIRS) and radiant flux (CERES). 
Future releases will provide ingest methods for ocean and land surface products, gridded and time averaged datasets (L3 Daily, Monthly and Yearly), and support for 3D products such as temperature and
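A configuration-driven workflow of the kind described, with a region, date range, and variable list in a flexible text format, can be sketched with Python's configparser (the section and key names are invented, not SatelliteDL's actual format, which is IDL-based):

```python
import configparser

CONFIG_TEXT = """
[region]
lat_min = -10.0
lat_max = 10.0
lon_min = 120.0
lon_max = 160.0

[time]
start = 2013-01-01
end = 2013-01-31

[output]
variables = cloud_fraction, aerosol_optical_depth
"""

def load_analysis_config(text):
    """Parse an analysis configuration into a plain dictionary."""
    cp = configparser.ConfigParser()
    cp.read_string(text)
    return {
        "region": {k: float(v) for k, v in cp["region"].items()},
        "time": dict(cp["time"]),
        "variables": [v.strip() for v in cp["output"]["variables"].split(",")],
    }

config = load_analysis_config(CONFIG_TEXT)
```

Writing this parsed dictionary back into the output file's metadata is the essence of the provenance guarantee the abstract describes: the configuration can be reconstructed from the result.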

  8. Holographic characterization of protein aggregates

    NASA Astrophysics Data System (ADS)

    Wang, Chen; Zhong, Xiao; Ruffner, David; Stutt, Alexandra; Philips, Laura; Ward, Michael; Grier, David

Holographic characterization directly measures the size distribution of subvisible protein aggregates in suspension and offers insights into their morphology. Based on holographic video microscopy, this analytical technique records and interprets holograms of individual aggregates in protein solutions as they flow down a microfluidic channel, without requiring labeling or other exceptional sample preparation. The hologram of an individual protein aggregate is analyzed in real time with the Lorenz-Mie theory of light scattering to measure that aggregate's size and optical properties. Detecting, counting and characterizing subvisible aggregates proceeds fast enough for time-resolved studies, and lends itself to tracking trends in protein aggregation arising from changing environmental factors. No other analytical technique provides such a wealth of particle-resolved characterization data in situ. Holographic characterization promises accelerated development of therapeutic protein formulations, improved process control during manufacturing, and streamlined quality assurance during storage and at the point of use. Supported by the MRSEC and MRI programs of the NSF and by Spheryx Inc.

  9. Kekule.js: An Open Source JavaScript Chemoinformatics Toolkit.

    PubMed

    Jiang, Chen; Jin, Xi; Dong, Ying; Chen, Ming

    2016-06-27

    Kekule.js is an open-source, object-oriented JavaScript toolkit for chemoinformatics. It provides methods for many common tasks in molecular informatics, including chemical data input/output (I/O), two- and three-dimensional (2D/3D) rendering of chemical structure, stereo identification, ring perception, structure comparison, and substructure search. Encapsulated widgets to display and edit chemical structures directly in web context are also supplied. Developed with web standards, the toolkit is ideal for building chemoinformatics applications over the Internet. Moreover, it is highly platform-independent and can also be used in desktop or mobile environments. Some initial applications, such as plugins for inputting chemical structures on the web and uses in chemistry education, have been developed based on the toolkit. PMID:27243272

  10. AutoMicromanager: A microscopy scripting toolkit for LABVIEW and other programming environments

    NASA Astrophysics Data System (ADS)

    Ashcroft, Brian Alan; Oosterkamp, Tjerk

    2010-11-01

    We present a scripting toolkit for the acquisition and analysis of a wide variety of imaging data by integrating the ease of use of various programming environments such as LABVIEW, IGOR PRO, MATLAB, SCILAB, and others. This toolkit is designed to allow the user to quickly program a variety of standard microscopy components for custom microscopy applications allowing much more flexibility than other packages. Included are both programming tools as well as graphical user interface classes allowing a standard, consistent, and easy to maintain scripting environment. This programming toolkit allows easy access to most commonly used cameras, stages, and shutters through the Micromanager project so the scripter can focus on their custom application instead of boilerplate code generation.

  11. AutoMicromanager: a microscopy scripting toolkit for LABVIEW and other programming environments.

    PubMed

    Ashcroft, Brian Alan; Oosterkamp, Tjerk

    2010-11-01

    We present a scripting toolkit for the acquisition and analysis of a wide variety of imaging data by integrating the ease of use of various programming environments such as LABVIEW, IGOR PRO, MATLAB, SCILAB, and others. This toolkit is designed to allow the user to quickly program a variety of standard microscopy components for custom microscopy applications allowing much more flexibility than other packages. Included are both programming tools as well as graphical user interface classes allowing a standard, consistent, and easy to maintain scripting environment. This programming toolkit allows easy access to most commonly used cameras, stages, and shutters through the Micromanager project so the scripter can focus on their custom application instead of boilerplate code generation. PMID:21133479

  12. Development of a Human Physiologically Based Pharmacokinetic (PBPK) Toolkit for Environmental Pollutants

    PubMed Central

    Ruiz, Patricia; Ray, Meredith; Fisher, Jeffrey; Mumtaz, Moiz

    2011-01-01

Physiologically Based Pharmacokinetic (PBPK) models can be used to determine the internal dose and strengthen exposure assessment. Many PBPK models are available, but they are not easily accessible for field use. The Agency for Toxic Substances and Disease Registry (ATSDR) has conducted translational research to develop a human PBPK model toolkit by recoding published PBPK models. This toolkit, when fully developed, will provide a platform that consists of a series of priority PBPK models of environmental pollutants. Presented here is work on recoded PBPK models for volatile organic compounds (VOCs) and metals. Good agreement was generally obtained between the original and the recoded models. This toolkit will be available for ATSDR scientists and public health assessors to perform simulations of exposures from contaminated environmental media at sites of concern and to help interpret biomonitoring data. It can be used as a screening tool that provides useful information for the protection of the public. PMID:22174611
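At their simplest, the kinds of models being recoded reduce to compartmental mass balance; a one-compartment sketch with first-order elimination (illustrative parameters, far simpler than a real multi-compartment PBPK model):

```python
import math

def simulate_one_compartment(dose_mg, volume_l, k_elim_per_h, hours, dt=0.01):
    """Euler integration of a one-compartment model with first-order
    elimination: dC/dt = -k_elim * C, with C(0) = dose / volume."""
    conc = dose_mg / volume_l
    t = 0.0
    while t < hours:
        conc += -k_elim_per_h * conc * dt
        t += dt
    return conc

# With k_elim = ln(2) per hour, concentration should roughly halve
# after one hour (one half-life).
c_initial = 100.0 / 50.0
c_after_half_life = simulate_one_compartment(100.0, 50.0, math.log(2), 1.0)
```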

  13. The GeoViz Toolkit: Using component-oriented coordination methods for geographic visualization and analysis

    PubMed Central

    Hardisty, Frank; Robinson, Anthony C.

    2010-01-01

    In this paper we present the GeoViz Toolkit, an open-source, internet-delivered program for geographic visualization and analysis that features a diverse set of software components which can be flexibly combined by users who do not have programming expertise. The design and architecture of the GeoViz Toolkit allows us to address three key research challenges in geovisualization: allowing end users to create their own geovisualization and analysis component set on-the-fly, integrating geovisualization methods with spatial analysis methods, and making geovisualization applications sharable between users. Each of these tasks necessitates a robust yet flexible approach to inter-tool coordination. The coordination strategy we developed for the GeoViz Toolkit, called Introspective Observer Coordination, leverages and combines key advances in software engineering from the last decade: automatic introspection of objects, software design patterns, and reflective invocation of methods. PMID:21731423
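The ingredients named for Introspective Observer Coordination, automatic introspection, the observer pattern, and reflective method invocation, combine roughly as follows (a generic Python sketch, not the GeoViz Toolkit's Java implementation; class and method names are invented):

```python
class Coordinator:
    """Wire tools together by introspecting their methods: any tool with
    an 'on_<event>' method is automatically subscribed to that event."""
    def __init__(self):
        self.listeners = {}

    def register(self, tool):
        for name in dir(tool):                        # automatic introspection
            if name.startswith("on_") and callable(getattr(tool, name)):
                self.listeners.setdefault(name[3:], []).append(tool)

    def fire(self, event, payload):
        for tool in self.listeners.get(event, []):
            getattr(tool, "on_" + event)(payload)     # reflective invocation

class MapView:
    def __init__(self):
        self.selection = None
    def on_selection(self, ids):
        self.selection = ids

class ScatterPlot:
    def __init__(self):
        self.selection = None
    def on_selection(self, ids):
        self.selection = ids

coordinator = Coordinator()
map_view, plot = MapView(), ScatterPlot()
coordinator.register(map_view)
coordinator.register(plot)
coordinator.fire("selection", [3, 7])   # both views receive the selection
```

Because subscription is discovered by introspection rather than declared, new components can be added on the fly without programming expertise, which is the coordination property the paper emphasizes.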

  14. The Visualization Toolkit (VTK): Rewriting the rendering code for modern graphics cards

    NASA Astrophysics Data System (ADS)

    Hanwell, Marcus D.; Martin, Kenneth M.; Chaudhary, Aashish; Avila, Lisa S.

    2015-09-01

    The Visualization Toolkit (VTK) is an open source, permissively licensed, cross-platform toolkit for scientific data processing, visualization, and data analysis. It is over two decades old, originally developed for a very different graphics card architecture. Modern graphics cards feature fully programmable, highly parallelized architectures with large core counts. VTK's rendering code was rewritten to take advantage of modern graphics cards, maintaining most of the toolkit's programming interfaces. This offers the opportunity to compare the performance of old and new rendering code on the same systems/cards. Significant improvements in rendering speeds and memory footprints mean that scientific data can be visualized in greater detail than ever before. The widespread use of VTK means that these improvements will reap significant benefits.

  15. The MITK image guided therapy toolkit and its application for augmented reality in laparoscopic prostate surgery

    NASA Astrophysics Data System (ADS)

    Baumhauer, Matthias; Neuhaus, Jochen; Fritzsche, Klaus; Meinzer, Hans-Peter

    2010-02-01

    Image Guided Therapy (IGT) confronts researchers with high demands and effort in system design, prototype implementation, and evaluation. The lack of standardized software tools, such as algorithm implementations, tracking device and tool setups, and data processing methods, escalates the labor of system development and sustainable system evaluation. In this paper, a new toolkit component of the Medical Imaging and Interaction Toolkit (MITK), the MITK-IGT, and its exemplary application for computer-assisted prostate surgery are presented. MITK-IGT aims to integrate software tools, algorithms, and tracking device interfaces into the MITK toolkit to provide a comprehensive software framework for computer-aided diagnosis support, therapy planning, treatment support, and radiological follow-up. An exemplary application of the MITK-IGT framework is introduced with a surgical navigation system for laparoscopic prostate surgery. It illustrates the broad range of application possibilities provided by the framework, as well as its simple extensibility with custom algorithms and other software modules.

  16. The GeoViz Toolkit: Using component-oriented coordination methods for geographic visualization and analysis.

    PubMed

    Hardisty, Frank; Robinson, Anthony C

    2011-01-01

    In this paper we present the GeoViz Toolkit, an open-source, internet-delivered program for geographic visualization and analysis that features a diverse set of software components which can be flexibly combined by users who do not have programming expertise. The design and architecture of the GeoViz Toolkit allows us to address three key research challenges in geovisualization: allowing end users to create their own geovisualization and analysis component set on-the-fly, integrating geovisualization methods with spatial analysis methods, and making geovisualization applications sharable between users. Each of these tasks necessitates a robust yet flexible approach to inter-tool coordination. The coordination strategy we developed for the GeoViz Toolkit, called Introspective Observer Coordination, leverages and combines key advances in software engineering from the last decade: automatic introspection of objects, software design patterns, and reflective invocation of methods. PMID:21731423
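The coordination strategy described in this abstract combines introspection, the observer pattern, and reflective method invocation. A minimal sketch of that idea in Python (all class and method names here are invented for illustration; the GeoViz Toolkit itself is a Java program):

```python
# Illustrative sketch of Introspective Observer Coordination: a coordinator
# inspects each registered component for handler methods named "on_<event>"
# and invokes them reflectively, so tools coordinate without knowing about
# each other. Names are hypothetical, not the GeoViz Toolkit's actual API.
class Coordinator:
    def __init__(self):
        self._components = []

    def register(self, component):
        self._components.append(component)

    def fire(self, event, payload):
        handler = "on_" + event
        for c in self._components:
            if hasattr(c, handler):            # automatic introspection
                getattr(c, handler)(payload)   # reflective invocation

class MapView:
    def __init__(self): self.selected = None
    def on_selection(self, ids): self.selected = ids

class ScatterplotView:
    def __init__(self): self.selected = None
    def on_selection(self, ids): self.selected = ids

if __name__ == "__main__":
    coord = Coordinator()
    a, b = MapView(), ScatterplotView()
    coord.register(a)
    coord.register(b)
    coord.fire("selection", [3, 7])
    print(a.selected, b.selected)  # both views now share the same selection
```

Because components only need to expose a conventionally named method, new tools can be added without editing the coordinator.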

  17. Implementing Project Based Survey Research Skills to Grade Six ELP Students with "The Survey Toolkit" and "TinkerPlots"[R]

    ERIC Educational Resources Information Center

    Walsh, Thomas, Jr.

    2011-01-01

    "Survey Toolkit Collecting Information, Analyzing Data and Writing Reports" (Walsh, 2009a) is discussed as a survey research curriculum used by the author's sixth grade students. The report describes the implementation of "The Survey Toolkit" curriculum and "TinkerPlots"[R] software to provide instruction to students learning a project based…

  18. Instructional Improvement Cycle: A Teacher's Toolkit for Collecting and Analyzing Data on Instructional Strategies. REL 2015-080

    ERIC Educational Resources Information Center

    Cherasaro, Trudy L.; Reale, Marianne L.; Haystead, Mark; Marzano, Robert J.

    2015-01-01

    This toolkit, developed by Regional Educational Laboratory (REL) Central in collaboration with York Public Schools in Nebraska, provides a process and tools to help teachers use data from their classroom assessments to evaluate promising practices. The toolkit provides teachers with guidance on how to deliberately apply and study one classroom…

  19. Toolkit of Resources for Engaging Parents and Community as Partners in Education. Part 2: Building a Cultural Bridge

    ERIC Educational Resources Information Center

    Regional Educational Laboratory Pacific, 2015

    2015-01-01

    This toolkit is designed to guide school staff in strengthening partnerships with families and community members to support student learning. This toolkit offers an integrated approach to family and community engagement, bringing together research, promising practices, and a wide range of useful tools and resources with explanations and directions…

  20. WATERSHED HEALTH ASSESSMENT TOOLS-INVESTIGATING FISHERIES (WHAT-IF): A MODELING TOOLKIT FOR WATERSHED AND FISHERIES MANAGEMENT

    EPA Science Inventory

    The Watershed Health Assessment Tools-Investigating Fisheries (WHAT-IF) is a decision-analysis modeling toolkit for personal computers that supports watershed and fisheries management. The WHAT-IF toolkit includes a relational database, help-system functions and documentation, a...

  1. Toolkit of Resources for Engaging Parents and Community as Partners in Education. Part 4: Engaging All in Data Conversations

    ERIC Educational Resources Information Center

    Regional Educational Laboratory Pacific, 2015

    2015-01-01

    This toolkit is designed to guide school staff in strengthening partnerships with families and community members to support student learning. This toolkit offers an integrated approach to family and community engagement, bringing together research, promising practices, and a wide range of useful tools and resources with explanations and directions…

  2. The Recognition of Collagen and Triple-helical Toolkit Peptides by MMP-13

    PubMed Central

    Howes, Joanna-Marie; Bihan, Dominique; Slatter, David A.; Hamaia, Samir W.; Packman, Len C.; Knauper, Vera; Visse, Robert; Farndale, Richard W.

    2014-01-01

    Remodeling of collagen by matrix metalloproteinases (MMPs) is crucial to tissue homeostasis and repair. MMP-13 is a collagenase with a substrate preference for collagen II over collagens I and III. It recognizes a specific, well-known site in the tropocollagen molecule where its binding locally perturbs the triple helix, allowing the catalytic domain of the active enzyme to cleave the collagen α chains sequentially, at Gly775–Leu776 in collagen II. However, the specific residues upon which collagen recognition depends within and surrounding this locus have not been systematically mapped. Using our triple-helical peptide Collagen Toolkit libraries in solid-phase binding assays, we found that MMP-13 shows little affinity for Collagen Toolkit III, but binds selectively to two triple-helical peptides of Toolkit II. We have identified the residues required for the adhesion of both proMMP-13 and MMP-13 to one of these, Toolkit peptide II-44, which contains the canonical collagenase cleavage site. MMP-13 was unable to bind to a linear peptide of the same sequence as II-44. We also discovered a second binding site near the N terminus of collagen II (starting at helix residue 127) in Toolkit peptide II-8. The pattern of binding of the free hemopexin domain of MMP-13 was similar to that of the full-length enzyme, but the free catalytic subunit bound none of our peptides. The susceptibility of Toolkit peptides to proteolysis in solution was independent of the very specific recognition of immobilized peptides by MMP-13; the enzyme proved able to cleave a range of dissolved collagen peptides. PMID:25008319

  3. Using stakeholder perspectives to develop an ePrescribing toolkit for NHS Hospitals: a questionnaire study

    PubMed Central

    Cresswell, Kathrin; Slee, Ann; Slight, Sarah P; Coleman, Jamie; Sheikh, Aziz

    2014-01-01

    Objective: To evaluate how an online toolkit may support ePrescribing deployments in National Health Service hospitals, by assessing the type of knowledge-based resources currently sought by key stakeholders. Design: Questionnaire-based survey of attendees at a national ePrescribing symposium. Setting: 2013 National ePrescribing Symposium in London, UK. Participants: Eighty-four delegates were eligible for inclusion in the survey, of whom 70 completed and returned the questionnaire. Main outcome measures: Estimate of the usefulness and type of content to be included in an ePrescribing toolkit. Results: Interest in a toolkit designed to support the implementation and use of ePrescribing systems was high (n = 64; 91.4%). As could be expected given the current dearth of such a resource, few respondents (n = 2; 2.9%) had access to or used an ePrescribing toolkit at the time of the survey. Anticipated users for the toolkit included implementation (n = 62; 88.6%) and information technology (n = 61; 87.1%) teams, pharmacists (n = 61; 87.1%), doctors (n = 58; 82.9%) and nurses (n = 56; 80.0%). Summary guidance for every stage of the implementation (n = 48; 68.6%), planning and monitoring tools (n = 47; 67.1%) and case studies of hospitals’ experiences (n = 45; 64.3%) were considered the most useful types of content. Conclusions: There is a clear need for reliable and up-to-date knowledge to support ePrescribing system deployments and longer term use. The findings highlight how a toolkit may become a useful instrument for the management of knowledge in the field, not least by allowing the exchange of ideas and shared learning. PMID:25383199

  4. Aggregation dynamics of rigid polyelectrolytes

    NASA Astrophysics Data System (ADS)

    Tom, Anvy Moly; Rajesh, R.; Vemparala, Satyavani

    2016-01-01

    Similarly charged polyelectrolytes are known to attract each other and aggregate into bundles when the charge density of the polymers exceeds a critical value that depends on the valency of the counterions. The dynamics of aggregation of such rigid polyelectrolytes are studied using large scale molecular dynamics simulations. We find that the morphology of the aggregates depends on the value of the charge density of the polymers. For values close to the critical value, the shape of the aggregates is cylindrical with height equal to the length of a single polyelectrolyte chain. However, for larger values of charge, the linear extent of the aggregates increases as more and more polymers aggregate. In both cases, we show that the number of aggregates decreases with time as a power law, with exponents that are not numerically distinguishable from each other and are independent of the charge density of the polymers, the valency of the counterions, density, and the length of the polyelectrolyte chain. We model the aggregation dynamics using the Smoluchowski coagulation equation with kernels determined from the molecular dynamics simulations and justify the numerically obtained value of the exponent. Our results suggest that once counterions condense, effective interactions between polyelectrolyte chains become short-ranged and the aggregation of polyelectrolytes is diffusion-limited.
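The power-law decay of the aggregate number can be illustrated with the simplest version of the Smoluchowski equation, a constant (mass-independent) kernel K, under which the total cluster number obeys dN/dt = -(K/2)N² and decays as t⁻¹ at late times. This is only a minimal sketch; the paper's kernels are measured from simulation:

```python
# Minimal illustration (not the authors' simulation): constant-kernel
# Smoluchowski coagulation. dN/dt = -(K/2)*N^2 has the exact solution
# N(t) = N0 / (1 + K*N0*t/2), i.e. a t^-1 power law at late times.
import math

def aggregate_count(n0, k, t):
    """Exact total cluster number for the constant-kernel model."""
    return n0 / (1.0 + 0.5 * k * n0 * t)

def decay_exponent(n0, k, t1, t2):
    """Estimate the power-law exponent from two late-time samples."""
    n1, n2 = aggregate_count(n0, k, t1), aggregate_count(n0, k, t2)
    return math.log(n2 / n1) / math.log(t2 / t1)

if __name__ == "__main__":
    # The late-time exponent approaches -1 regardless of n0 or k.
    print(round(decay_exponent(1000.0, 1.0, 1e4, 1e6), 3))
```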

  5. A Platform to Build Mobile Health Apps: The Personal Health Intervention Toolkit (PHIT).

    PubMed

    Eckhoff, Randall Peter; Kizakevich, Paul Nicholas; Bakalov, Vesselina; Zhang, Yuying; Bryant, Stephanie Patrice; Hobbs, Maria Ann

    2015-01-01

    Personal Health Intervention Toolkit (PHIT) is an advanced cross-platform software framework targeted at personal self-help research on mobile devices. Following the subjective and objective measurement, assessment, and plan methodology for health assessment and intervention recommendations, the PHIT platform lets researchers quickly build mobile health research Android and iOS apps. They can (1) create complex data-collection instruments using a simple extensible markup language (XML) schema; (2) use Bluetooth wireless sensors; (3) create targeted self-help interventions based on collected data via XML-coded logic; (4) facilitate cross-study reuse from the library of existing instruments and interventions such as stress, anxiety, sleep quality, and substance abuse; and (5) monitor longitudinal intervention studies via daily upload to a Web-based dashboard portal. For physiological data, Bluetooth sensors collect real-time data with on-device processing. For example, using the BinarHeartSensor, the PHIT platform processes the heart rate data into heart rate variability measures, and plots these data as time-series waveforms. Subjective data instruments are user data-entry screens, comprising a series of forms with validation and processing logic. The PHIT instrument library consists of over 70 reusable instruments for various domains including cognitive, environmental, psychiatric, psychosocial, and substance abuse. Many are standardized instruments, such as the Alcohol Use Disorder Identification Test, Patient Health Questionnaire-8, and Post-Traumatic Stress Disorder Checklist. Autonomous instruments such as battery and global positioning system location support continuous background data collection. All data are acquired using a schedule appropriate to the app's deployment. The PHIT intelligent virtual advisor (iVA) is an expert system logic layer, which analyzes the data in real time on the device. This data analysis results in a tailored app of interventions
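The XML-defined data-collection instruments described above can be sketched with the standard library. The element and attribute names below are invented for illustration and are not the actual PHIT schema:

```python
# Hypothetical sketch of an XML-defined data-collection instrument in the
# spirit of PHIT. The schema here (element/attribute names) is assumed for
# illustration only; it is not PHIT's real instrument format.
import xml.etree.ElementTree as ET

INSTRUMENT_XML = """
<instrument name="sleep-quality">
  <question id="q1" type="scale" min="0" max="10">How rested do you feel?</question>
  <question id="q2" type="boolean">Did you wake during the night?</question>
</instrument>
"""

def load_instrument(xml_text):
    """Parse an instrument definition into a list of question descriptors."""
    root = ET.fromstring(xml_text)
    return [
        {"id": q.get("id"), "type": q.get("type"), "prompt": q.text.strip()}
        for q in root.findall("question")
    ]

if __name__ == "__main__":
    for q in load_instrument(INSTRUMENT_XML):
        print(q["id"], q["type"], q["prompt"])
```

Defining instruments as data rather than code is what allows a library of 70+ instruments to be reused across studies without rebuilding the app.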

  6. A Platform to Build Mobile Health Apps: The Personal Health Intervention Toolkit (PHIT)

    PubMed Central

    2015-01-01

    Personal Health Intervention Toolkit (PHIT) is an advanced cross-platform software framework targeted at personal self-help research on mobile devices. Following the subjective and objective measurement, assessment, and plan methodology for health assessment and intervention recommendations, the PHIT platform lets researchers quickly build mobile health research Android and iOS apps. They can (1) create complex data-collection instruments using a simple extensible markup language (XML) schema; (2) use Bluetooth wireless sensors; (3) create targeted self-help interventions based on collected data via XML-coded logic; (4) facilitate cross-study reuse from the library of existing instruments and interventions such as stress, anxiety, sleep quality, and substance abuse; and (5) monitor longitudinal intervention studies via daily upload to a Web-based dashboard portal. For physiological data, Bluetooth sensors collect real-time data with on-device processing. For example, using the BinarHeartSensor, the PHIT platform processes the heart rate data into heart rate variability measures, and plots these data as time-series waveforms. Subjective data instruments are user data-entry screens, comprising a series of forms with validation and processing logic. The PHIT instrument library consists of over 70 reusable instruments for various domains including cognitive, environmental, psychiatric, psychosocial, and substance abuse. Many are standardized instruments, such as the Alcohol Use Disorder Identification Test, Patient Health Questionnaire-8, and Post-Traumatic Stress Disorder Checklist. Autonomous instruments such as battery and global positioning system location support continuous background data collection. All data are acquired using a schedule appropriate to the app’s deployment. The PHIT intelligent virtual advisor (iVA) is an expert system logic layer, which analyzes the data in real time on the device. This data analysis results in a tailored app of interventions

  7. Do-it-yourself guide: How to use the modern single molecule toolkit

    PubMed Central

    Walter, Nils G.; Huang, Cheng-Yen; Manzo, Anthony J.; Sobhy, Mohamed A.

    2008-01-01

    Single molecule microscopy has evolved into the ultimate-sensitivity toolkit to study systems from small molecules to living cells, with the prospect of revolutionizing the modern biosciences. Here we survey the current state-of-the-art in single molecule tools including fluorescence spectroscopy, tethered particle microscopy, optical and magnetic tweezers, and atomic force microscopy. Our review seeks to guide the biological scientist in choosing the right approach from the available single molecule toolkit for applications ranging as far as structural biology, enzymology, nanotechnology, and systems biology. PMID:18511916

  8. The PyRosetta Toolkit: A Graphical User Interface for the Rosetta Software Suite

    PubMed Central

    Adolf-Bryfogle, Jared; Dunbrack Jr., Roland L.

    2013-01-01

    The Rosetta Molecular Modeling suite is a command-line-only collection of applications that enable high-resolution modeling and design of proteins and other molecules. Although extremely useful, Rosetta can be difficult to learn for scientists with little computational or programming experience. To that end, we have created a Graphical User Interface (GUI) for Rosetta, called the PyRosetta Toolkit, for creating and running protocols in Rosetta for common molecular modeling and protein design tasks and for analyzing the results of Rosetta calculations. The program is highly extensible so that developers can add new protocols and analysis tools to the PyRosetta Toolkit GUI. PMID:23874400

  9. Python-based geometry preparation and simulation visualization toolkits for STEPS

    PubMed Central

    Chen, Weiliang; De Schutter, Erik

    2014-01-01

    STEPS is a stochastic reaction-diffusion simulation engine that implements a spatial extension of Gillespie's Stochastic Simulation Algorithm (SSA) in complex tetrahedral geometries. An extensive Python-based interface is provided to STEPS so that it can interact with the large number of scientific packages in Python. However, a gap existed between the interfaces of these packages and the STEPS user interface, where supporting toolkits could reduce the amount of scripting required for research projects. This paper introduces two new supporting toolkits that support geometry preparation and visualization for STEPS simulations. PMID:24782754

  10. A methodological toolkit for field assessments of artisanally mined alluvial diamond deposits

    USGS Publications Warehouse

    Chirico, Peter G.; Malpeli, Katherine C.

    2014-01-01

    This toolkit provides a standardized checklist of critical issues relevant to artisanal mining-related field research. An integrated sociophysical geographic approach to collecting data at artisanal mine sites is outlined. The implementation and results of a multistakeholder approach to data collection, carried out in the assessment of Guinea’s artisanally mined diamond deposits, also are summarized. This toolkit, based on recent and successful field campaigns in West Africa, has been developed as a reference document to assist other government agencies or organizations in collecting the data necessary for artisanal diamond mining or similar natural resource assessments.

  11. WOLF: FITS file processor

    NASA Astrophysics Data System (ADS)

    Shamir, Lior

    2012-12-01

    WOLF processes FITS files and generates photometry files, annotated JPGs, opacity maps, background, transient detection and luminance changes detection. This software was used to process data for the Night Sky Live project.
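WOLF itself is a separate program, but the FITS layout it consumes is standardized: headers are 2880-byte blocks of 80-character ASCII "cards". A minimal standard-library sketch of reading those cards (a synthetic header is used, not real telescope data):

```python
# Minimal sketch of the FITS header layout (2880-byte blocks of 80-char
# ASCII cards, terminated by an END card). This illustrates the format
# WOLF parses; it is not WOLF's own code.
def read_header_cards(raw):
    """Split a primary FITS header into its cards, stopping at END."""
    cards = []
    for i in range(0, len(raw), 80):
        card = raw[i:i + 80].decode("ascii")
        if card.startswith("END"):
            break
        cards.append(card.rstrip())
    return cards

if __name__ == "__main__":
    # Build a tiny synthetic header: cards padded to 80 chars, block to 2880.
    header = "".join(c.ljust(80) for c in
                     ["SIMPLE  =                    T",
                      "BITPIX  =                   16",
                      "END"]).ljust(2880).encode("ascii")
    print(read_header_cards(header))
```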

  12. Peptide aggregation in neurodegenerative disease.

    PubMed

    Murphy, Regina M

    2002-01-01

    In the not-so-distant past, insoluble aggregated protein was considered as uninteresting and bothersome as yesterday's trash. More recently, protein aggregates have enjoyed considerable scientific interest, as it has become clear that these aggregates play key roles in many diseases. In this review, we focus attention on three polypeptides: beta-amyloid, prion, and huntingtin, which are linked to three feared neurodegenerative diseases: Alzheimer's, "mad cow," and Huntington's disease, respectively. These proteins lack any significant primary sequence homology, yet their aggregates possess very similar features, specifically, high beta-sheet content, fibrillar morphology, relative insolubility, and protease resistance. Because the aggregates are noncrystalline, secrets of their structure at nanometer resolution are only slowly yielding to X-ray diffraction, solid-state NMR, and other techniques. Besides structure, the aggregates may possess similar pathways of assembly. Two alternative assembly pathways have been proposed: the nucleation-elongation and the template-assisted mode. These two modes may be complementary, not mutually exclusive. Strategies for interfering with aggregation, which may provide novel therapeutic approaches, are under development. The structural similarities between protein aggregates of dissimilar origin suggest that therapeutic strategies successful against one disease may have broad utility in others. PMID:12117755
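The nucleation-elongation pathway mentioned in this review has a generic kinetic signature: slow nucleus formation followed by fast fibril growth gives a lag phase and then near-complete conversion. A toy sketch under assumed rate constants (not taken from the review):

```python
# Generic nucleation-elongation kinetics sketch (rate constants assumed,
# not from the review): slow nucleation kn*m^nc creates fibril ends; fast
# elongation ke*m*f consumes monomer, giving a sigmoidal conversion curve.
def simulate(kn=1e-6, ke=5e-4, nc=2, m0=100.0, dt=0.1, steps=20000):
    """Euler integration; returns the fraction of monomer converted to fibrils."""
    m, fibrils, mass = m0, 0.0, 0.0
    for _ in range(steps):
        nucleation = kn * m ** nc        # new fibril ends per unit time
        growth = ke * m * fibrils        # monomer addition to existing ends
        fibrils += nucleation * dt
        consumed = (nc * nucleation + growth) * dt
        consumed = min(consumed, m)      # never consume more than remains
        m -= consumed
        mass += consumed
    return mass / m0

if __name__ == "__main__":
    # Short runs stay in the lag phase; long runs convert nearly all monomer.
    print(round(simulate(steps=200), 3), round(simulate(), 3))
```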

  13. Topics in Probabilistic Judgment Aggregation

    ERIC Educational Resources Information Center

    Wang, Guanchun

    2011-01-01

    This dissertation is a compilation of several studies that are united by their relevance to probabilistic judgment aggregation. In the face of complex and uncertain events, panels of judges are frequently consulted to provide probabilistic forecasts, and aggregation of such estimates in groups often yields better results than could have been made…

  14. Mineral of the month: aggregates

    USGS Publications Warehouse

    Tepordei, Valentin V.

    2005-01-01

    Natural aggregates, consisting of crushed stone, and sand and gravel, are a major contributor to economic health, and have an amazing variety of uses. Aggregates are among the most abundant mineral resources and are major basic raw materials used by construction, agriculture and other industries that employ complex chemical and metallurgical processes.

  15. Mechanics of fire ant aggregations

    NASA Astrophysics Data System (ADS)

    Tennenbaum, Michael; Liu, Zhongyang; Hu, David; Fernandez-Nieves, Alberto

    2016-01-01

    Fire ants link their bodies to form aggregations; these can adopt a variety of structures, they can drip and spread, or withstand applied loads. Here, by using oscillatory rheology, we show that fire ant aggregations are viscoelastic. We find that, at the lowest ant densities probed and in the linear regime, the elastic and viscous moduli are essentially identical over the spanned frequency range, which highlights the absence of a dominant mode of structural relaxation. As ant density increases, the elastic modulus rises, which we interpret by alluding to ant crowding and subsequent jamming. When deformed beyond the linear regime, the aggregation flows, exhibiting shear-thinning behaviour with a stress load that is comparable to the maximum load the aggregation can withstand before individual ants are torn apart. Our findings illustrate the rich, collective mechanical behaviour that can arise in aggregations of active, interacting building blocks.
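The elastic and viscous moduli reported above come from oscillatory rheology: the sample is sheared sinusoidally and the phase lag δ of the stress response splits the modulus into a storage part G' = (σ₀/γ₀)cos δ and a loss part G'' = (σ₀/γ₀)sin δ. A minimal sketch of that decomposition (an idealized single-frequency response, not the authors' data analysis):

```python
# Illustrative sketch of oscillatory rheology: strain gamma(t) =
# gamma0*sin(w*t) produces stress sigma(t) = sigma0*sin(w*t + delta);
# the phase lag delta separates elastic (G') from viscous (G'') response.
import math

def moduli(sigma0, gamma0, delta):
    """Storage modulus G' and loss modulus G'' from amplitudes and phase lag."""
    g_star = sigma0 / gamma0              # magnitude of the complex modulus
    return g_star * math.cos(delta), g_star * math.sin(delta)

if __name__ == "__main__":
    # delta = pi/4 gives G' == G'', the behaviour reported for dilute fire
    # ant aggregations, where no single relaxation mode dominates.
    g_storage, g_loss = moduli(sigma0=10.0, gamma0=0.1, delta=math.pi / 4)
    print(round(g_storage, 3), round(g_loss, 3))
```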

  16. Mechanics of fire ant aggregations.

    PubMed

    Tennenbaum, Michael; Liu, Zhongyang; Hu, David; Fernandez-Nieves, Alberto

    2016-01-01

    Fire ants link their bodies to form aggregations; these can adopt a variety of structures, they can drip and spread, or withstand applied loads. Here, by using oscillatory rheology, we show that fire ant aggregations are viscoelastic. We find that, at the lowest ant densities probed and in the linear regime, the elastic and viscous moduli are essentially identical over the spanned frequency range, which highlights the absence of a dominant mode of structural relaxation. As ant density increases, the elastic modulus rises, which we interpret by alluding to ant crowding and subsequent jamming. When deformed beyond the linear regime, the aggregation flows, exhibiting shear-thinning behaviour with a stress load that is comparable to the maximum load the aggregation can withstand before individual ants are torn apart. Our findings illustrate the rich, collective mechanical behaviour that can arise in aggregations of active, interacting building blocks. PMID:26501413

  17. Molecular aggregation of humic substances

    USGS Publications Warehouse

    Wershaw, R. L.

    1999-01-01

    Humic substances (HS) form molecular aggregates in solution and on mineral surfaces. Elucidation of the mechanism of formation of these aggregates is important for an understanding of the interactions of HS in soils and natural waters. The HS are formed mainly by enzymatic depolymerization and oxidation of plant biopolymers. These reactions transform the aromatic and lipid plant components into amphiphilic molecules, that is, molecules that consist of separate hydrophobic (nonpolar) and hydrophilic (polar) parts. The nonpolar parts of the molecules are composed of relatively unaltered segments of plant polymers and the polar parts of carboxylic acid groups. These amphiphiles form membrane-like aggregates on mineral surfaces and micelle-like aggregates in solution. The exterior surfaces of these aggregates are hydrophilic, and the interiors constitute separate hydrophobic liquid-like phases.

  18. Imbibition kinetics of spherical aggregates

    NASA Astrophysics Data System (ADS)

    Hébraud, Pascal; Lootens, Didier; Debacker, Alban

    The imbibition kinetics of a millimeter-sized aggregate of 300 nm diameter colloidal particles by a wetting pure solvent is studied. Three successive regimes are observed: in the first one, imbibition proceeds by compressing the air inside the aggregate. Then, the solvent stops when the pressure of the compressed air equals the Laplace pressure at the meniscus of the wetting solvent in the porous aggregate. The interface is pinned and the aggregate slowly degasses, up to a point where the pressure of the entrapped air stops decreasing and is controlled by the Laplace pressure of small bubbles. Depending on the curvature of the bubble, the system may then be in an unstable state. The imbibition then starts again, but with an inner pressure in equilibrium with these bubbles. This last stage leads to the complete infiltration of the aggregate.
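The first pinning condition above can be estimated with ideal-gas compression against the capillary pressure 2γ/r. The numbers below (water-like surface tension, an assumed ~100 nm pore scale between 300 nm particles) are illustrative, not values from the paper:

```python
# Order-of-magnitude sketch (assumed values, not from the paper): the
# wetting front stops when the excess pressure of the trapped air equals
# the Laplace pressure 2*gamma/r at the meniscus. Treating the trapped air
# as ideal, P0*V0 = P*V gives the fraction of pore volume filled at pinning.
def pinned_fill_fraction(gamma, pore_radius, p0=1.013e5):
    """Fraction of pore volume imbibed when air compression balances capillarity."""
    laplace = 2.0 * gamma / pore_radius   # capillary driving pressure, Pa
    # Air compressed from V0 to V0*(1 - x): P = P0/(1 - x); the front stops
    # when P - P0 = laplace, i.e. x = laplace / (P0 + laplace).
    return laplace / (p0 + laplace)

if __name__ == "__main__":
    # Water (gamma ~ 0.072 N/m) in ~100 nm pores: the Laplace pressure is
    # ~10 atm, so most of the pore space fills before the front is pinned.
    print(round(pinned_fill_fraction(0.072, 100e-9), 2))
```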

  19. Immunogenicity of Therapeutic Protein Aggregates.

    PubMed

    Moussa, Ehab M; Panchal, Jainik P; Moorthy, Balakrishnan S; Blum, Janice S; Joubert, Marisa K; Narhi, Linda O; Topp, Elizabeth M

    2016-02-01

    Therapeutic proteins have a propensity for aggregation during manufacturing, shipping, and storage. The presence of aggregates in protein drug products can induce adverse immune responses in patients that may affect safety and efficacy, and so it is of concern to both manufacturers and regulatory agencies. In this vein, there is a lack of understanding of the physicochemical determinants of immunological responses and a lack of standardized analytical methods to survey the molecular properties of aggregates associated with immune activation. In this review, we provide an overview of the basic immune mechanisms in the context of interactions with protein aggregates. We then critically examine the literature with emphasis on the underlying immune mechanisms as they relate to aggregate properties. Finally, we highlight the gaps in our current understanding of this issue and offer recommendations for future research. PMID:26869409

  20. Standard interface file handbook

    SciTech Connect

    Shapiro, A.; Huria, H.C.

    1992-10-01

    This handbook documents many of the standard interface file formats that have been adopted by the US Department of Energy to facilitate communications between and portability of, various large reactor physics and radiation transport software packages. The emphasis is on those files needed for use of the VENTURE/PC diffusion-depletion code system. File structures, contents and some practical advice on use of the various files are provided.

  1. Text File Display Program

    NASA Technical Reports Server (NTRS)

    Vavrus, J. L.

    1986-01-01

    LOOK program permits user to examine text file in pseudorandom access manner. Program provides user with way of rapidly examining contents of ASCII text file. LOOK opens text file for input only and accesses it in blockwise fashion. Handles text formatting and displays text lines on screen. User moves forward or backward in file by any number of lines or blocks. Provides ability to "scroll" text at various speeds in forward or backward directions.
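LOOK's blockwise, pseudorandom access amounts to seeking directly to a fixed-size block instead of reading the file sequentially. A hypothetical Python analogue (the original is not Python, and the block size here is assumed):

```python
# Hypothetical Python analogue of LOOK's blockwise access: open the text
# file read-only and fetch an arbitrary block by seeking, so moving
# forward or backward by blocks never requires reading the whole file.
BLOCK_SIZE = 4096  # assumed block size for illustration

def read_block(path, block_index, block_size=BLOCK_SIZE):
    """Return the text lines contained in one fixed-size block of a file."""
    with open(path, "rb") as f:
        f.seek(block_index * block_size)   # jump directly to the block
        data = f.read(block_size)
    return data.decode("utf-8", errors="replace").splitlines()

if __name__ == "__main__":
    import os
    import tempfile
    with tempfile.NamedTemporaryFile("w", delete=False, suffix=".txt") as tmp:
        tmp.write("\n".join(f"line {i}" for i in range(5000)))
        path = tmp.name
    # A block boundary may fall mid-line, so the first element can be partial.
    print(read_block(path, 1)[1])
    os.remove(path)
```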

  2. Text File Comparator

    NASA Technical Reports Server (NTRS)

    Kotler, R. S.

    1983-01-01

    File Comparator program IFCOMP, is text file comparator for IBM OS/VScompatable systems. IFCOMP accepts as input two text files and produces listing of differences in pseudo-update form. IFCOMP is very useful in monitoring changes made to software at the source code level.
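IFCOMP itself is an IBM OS/VS program, but the same idea, producing a listing of the differences between two text files, can be sketched with Python's standard difflib module:

```python
# Sketch of a text-file comparator in the spirit of IFCOMP, using the
# standard library's difflib (this is an illustration of the idea, not
# IFCOMP's pseudo-update output format).
import difflib

def diff_listing(old_lines, new_lines):
    """Return a unified-diff style listing of the changes old -> new."""
    return list(difflib.unified_diff(old_lines, new_lines,
                                     fromfile="old", tofile="new",
                                     lineterm=""))

if __name__ == "__main__":
    old = ["alpha", "beta", "gamma"]
    new = ["alpha", "BETA", "gamma"]
    for line in diff_listing(old, new):
        print(line)
```

Run against two source snapshots, this serves the same monitoring purpose the abstract describes: every changed line appears prefixed with "-" (removed) or "+" (added).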

  3. How To Create a Community Guide to Your School District's Budget. School Finance Toolkit.

    ERIC Educational Resources Information Center

    Hassel, Bryan C.

    This toolkit helps community-based organizations create a community guide to the school budget, demystifying school finance for citizens and engaging them in the process of using the school budget as a tool for school improvement. It explains the major steps organizations have used in their own initiatives, offering advice and examples of tools.…

  4. Excellence in Teaching End-of-Life Care. A New Multimedia Toolkit for Nurse Educators.

    ERIC Educational Resources Information Center

    Wilkie, Diana J.; Judge, Kay M.; Wells, Marjorie J.; Berkley, Ila Meredith

    2001-01-01

    Describes a multimedia toolkit for teaching palliative care in nursing, which contains modules on end-of-life topics: comfort, connections, ethics, grief, impact, and well-being. Other contents include myths, definitions, pre- and postassessments, teaching materials, case studies, learning activities, and resources. (SK)

  5. 78 FR 14773 - U.S. Environmental Solutions Toolkit-Medical Waste

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-07

    ... International Trade Administration U.S. Environmental Solutions Toolkit--Medical Waste AGENCY: International... of medical waste. The Department of Commerce continues to develop the web-based U.S. Environmental... address, contact information, and medical waste management category of interest from the following...

  6. 78 FR 14774 - U.S. Environmental Solutions Toolkit-Universal Waste

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-07

    ...: (a) Mercury Recycling Technology (b) E-Waste Recycling Technology (c) CRT Recycling Technology (d... International Trade Administration U.S. Environmental Solutions Toolkit--Universal Waste AGENCY: International... of universal waste. The Department of Commerce continues to develop the web-based U.S....

  7. Field trials of a novel toolkit for evaluating 'intangible' values-related dimensions of projects.

    PubMed

    Burford, Gemma; Velasco, Ismael; Janoušková, Svatava; Zahradnik, Martin; Hak, Tomas; Podger, Dimity; Piggot, Georgia; Harder, Marie K

    2013-02-01

    A novel toolkit has been developed, using an original approach to develop its components, for the purpose of evaluating 'soft' outcomes and processes that have previously been generally considered 'intangible': those which are specifically values based. This represents a step-wise, significant change in provision for the assessment of values-based achievements that are of absolutely key importance to most civil society organisations (CSOs) and values-based businesses, and fills a known gap in evaluation practice. In this paper, we demonstrate the significance and rigour of the toolkit by presenting an evaluation of it in three diverse scenarios where different CSOs use it to co-evaluate locally relevant outcomes and processes to obtain results which are both meaningful to them and potentially comparable across organisations. A key strength of the toolkit is its original use of a prior-generated, peer-elicited 'menu' of values-based indicators which provides a framework for user CSOs to localise. Principles of participatory, process-based and utilisation-focused evaluation are embedded in this toolkit and shown to be critical to its success, achieving high face-validity and wide applicability. The emerging contribution of this next-generation evaluation tool to other fields, such as environmental values, development and environmental sustainable development, shared values, business, education and organisational change is outlined. PMID:22621861

  8. Virtual venue management users manual : access grid toolkit documentation, version 2.3.

    SciTech Connect

    Judson, I. R.; Lefvert, S.; Olson, E.; Uram, T. D.; Mathematics and Computer Science

    2007-10-24

    An Access Grid Venue Server provides access to individual Virtual Venues, virtual spaces where users can collaborate using the Access Grid Venue Client software. This manual describes the Venue Server component of the Access Grid Toolkit, version 2.3. Covered here are the basic operations of starting a venue server, modifying its configuration, and modifying the configuration of the individual venues.

  9. College Access and Success for Students Experiencing Homelessness: A Toolkit for Educators and Service Providers

    ERIC Educational Resources Information Center

    Dukes, Christina

    2013-01-01

    This toolkit serves as a comprehensive resource on the issue of higher education access and success for homeless students, including information on understanding homeless students, assisting homeless students in choosing a school, helping homeless students pay for application-related expenses, assisting homeless students in finding financial aid…

  10. The Special Educator's Toolkit: Everything You Need to Organize, Manage, and Monitor Your Classroom

    ERIC Educational Resources Information Center

    Golden, Cindy

    2012-01-01

    Overwhelmed special educators: Reduce your stress and support student success with this practical toolkit for whole-classroom organization. A lifesaver for special educators in any K-12 setting, this book-and-CD set will help teachers expertly manage everything, from schedules and paperwork to student supports and behavior plans. Cindy Golden, a…

  11. University of Central Florida and the American Association of State Colleges and Universities: Blended Learning Toolkit

    ERIC Educational Resources Information Center

    EDUCAUSE, 2014

    2014-01-01

    The Blended Learning Toolkit supports the course redesign approach, and interest in its openly available clearinghouse of online tools, strategies, curricula, and other materials to support the adoption of blended learning continues to grow. When the resource originally launched in July 2011, 20 AASCU [American Association of State Colleges and…

  12. Development and Evaluation of an Integrated Pest Management Toolkit for Child Care Providers

    ERIC Educational Resources Information Center

    Alkon, Abbey; Kalmar, Evie; Leonard, Victoria; Flint, Mary Louise; Kuo, Devina; Davidson, Nita; Bradman, Asa

    2012-01-01

    Young children and early care and education (ECE) staff are exposed to pesticides used to manage pests in ECE facilities in the United States and elsewhere. The objective of this pilot study was to encourage child care programs to reduce pesticide use and child exposures by developing and evaluating an Integrated Pest Management (IPM) Toolkit for…

  13. Use of Remote Sensing Data to Enhance the National Weather Service (NWS) Storm Damage Toolkit

    NASA Technical Reports Server (NTRS)

    Jedlovec, Gary; Molthan, Andrew; White, Kris; Burks, Jason; Stellman, Keith; Smith, Matthew

    2012-01-01

SPoRT is improving the use of near real-time satellite data in response to severe weather events and other disasters, supported through NASA's Applied Sciences Program. A planned interagency collaboration will support NOAA's Damage Assessment Toolkit, with spinoff opportunities to support other entities such as USGS and FEMA.

  14. UNESCO-APQN Toolkit: Regulating the Quality of Cross-Border Education

    ERIC Educational Resources Information Center

    Online Submission, 2006

    2006-01-01

    As a result of increasing mobility of students and knowledge, cross border education, especially in higher education, is receiving general attention in the Asia and Pacific region. The toolkit serves as a reference tool to assist local policymakers in the formulation of a regulatory framework for cross-border education that is growing rapidly in…

  15. K-12 Service-Learning Project Planning Toolkit. 2009 Updated Edition

    ERIC Educational Resources Information Center

    National Service-Learning Clearinghouse, 2009

    2009-01-01

    The materials in this toolkit contain information about the five core components of a service-learning project: investigation, planning and preparation, action, reflection, and demonstration/celebration. Also included are the standards and indicators of K-12 service-learning. The information is organized into an overview and five chapters, each…

  16. Celebrating a Century of Innovation in Higher Education, 1901-2001. [Toolkit].

    ERIC Educational Resources Information Center

    American Association of Community Colleges, Washington, DC.

    The American Association of Community Colleges (AACC) and the Association of Community College Trustees (ACCT) have commissioned this toolkit to help colleges effectively publicize a major milestone: in 2001, America's community colleges will celebrate 100 years of service and achievement. This year-long celebration presents community colleges…

  17. A Generic Expert Scheduling System Architecture and Toolkit: GUESS (Generically Used Expert Scheduling System)

    NASA Technical Reports Server (NTRS)

    Liebowitz, Jay; Krishnamurthy, Vijaya; Rodens, Ira; Houston, Chapman; Liebowitz, Alisa; Baek, Seung; Radko, Joe; Zeide, Janet

    1996-01-01

    Scheduling has become an increasingly important element in today's society and workplace. Within the NASA environment, scheduling is one of the most frequently performed and challenging functions. Towards meeting NASA's scheduling needs, a research version of a generic expert scheduling system architecture and toolkit has been developed. This final report describes the development and testing of GUESS (Generically Used Expert Scheduling System).

  18. Beyond the Bell: A Toolkit for Creating Effective After-School Programs.

    ERIC Educational Resources Information Center

    Walter, Katie E.; Caplan, Judith G.; McElvain, Carol K.

    After-school programs provide an important educational setting for an increasing number of children and have been viewed as a way to help solve school problems, reduce drug use, and prevent violence and youth crime. This toolkit is designed to help school-based after-school program staff plan and make decisions in six critical areas: (1)…

  19. Early Language & Literacy Classroom Observation (ELLCO) Toolkit, Research Edition [with] User's Guide.

    ERIC Educational Resources Information Center

    Smith, Miriam W.; Dickinson, David K.

    The document is comprised of the Early Language and Literacy Classroom Observation (ELLCO) Toolkit, a field-tested observation packet to examine literacy and language practices and materials in prekindergarten through third grade classrooms, and a user's guide providing technical information and instructions for administration and scoring the…

  20. Testing Video and Social Media for Engaging Users of the U.S. Climate Resilience Toolkit

    NASA Astrophysics Data System (ADS)

    Green, C. J.; Gardiner, N.; Niepold, F., III; Esposito, C.

    2015-12-01

We developed a custom video production style and a method for analyzing social media behavior so that we may deliberately build and track audience growth for decision-support tools and case studies within the U.S. Climate Resilience Toolkit. The new style of video focuses quickly on decision processes; its 30-second format is well suited for deployment through social media. We measured both traffic and engagement with video using Google Analytics. Each video included an embedded tag, allowing us to measure viewers' behavior: whether or not they entered the toolkit website, the duration of their session on the website, and the number of pages they visited in that session. Results showed that video promotion was more effective on Facebook than Twitter: Facebook links generated twice the number of visits to the toolkit, and videos also increased Facebook interaction overall. Because most Facebook users are return visitors, this campaign did not substantially draw new site visitors. We continue to research and apply these methods in a targeted engagement and outreach campaign that utilizes the theory of social diffusion and social influence strategies to grow our audience of "influential" decision-makers and people within their social networks. Our goal is to increase access to and use of the U.S. Climate Resilience Toolkit.

  1. Development of an Online Toolkit for Measuring Commercial Building Energy Efficiency Performance -- Scoping Study

    SciTech Connect

    Wang, Na

    2013-03-13

    This study analyzes the market needs for building performance evaluation tools. It identifies the existing gaps and provides a roadmap for the U.S. Department of Energy (DOE) to develop a toolkit with which to optimize energy performance of a commercial building over its life cycle.

  2. Can a workbook work? Examining whether a practitioner evaluation toolkit can promote instrumental use.

    PubMed

    Campbell, Rebecca; Townsend, Stephanie M; Shaw, Jessica; Karim, Nidal; Markowitz, Jenifer

    2015-10-01

In large-scale, multi-site contexts, developing and disseminating practitioner-oriented evaluation toolkits is an increasingly common strategy for building evaluation capacity. Toolkits explain the evaluation process, present evaluation design choices, and offer step-by-step guidance to practitioners. To date, there has been limited research on whether such resources truly foster the successful design, implementation, and use of evaluation findings. In this paper, we describe a multi-site project in which we developed a practitioner evaluation toolkit and then studied the extent to which the toolkit and accompanying technical assistance were effective in promoting successful completion of local-level evaluations and fostering instrumental use of the findings (i.e., whether programs directly used their findings to improve practice; see Patton, 2008). Forensic nurse practitioners from six geographically dispersed service programs completed methodologically rigorous evaluations; furthermore, all six programs used the findings to create programmatic and community-level changes to improve local practice. Implications for evaluation capacity building are discussed. PMID:25996627

  3. Making Schools the Model for Healthier Environments Toolkit: What It Is

    ERIC Educational Resources Information Center

    Robert Wood Johnson Foundation, 2012

    2012-01-01

    Healthy students perform better. Poor nutrition and inadequate physical activity can affect not only academic achievement, but also other factors such as absenteeism, classroom behavior, ability to concentrate, self-esteem, cognitive performance, and test scores. This toolkit provides information to help make schools the model for healthier…

  4. Student-Centred Learning: Toolkit for Students, Staff and Higher Education Institutions

    ERIC Educational Resources Information Center

    Attard, Angele; Di Iorio, Emma; Geven, Koen; Santa, Robert

    2010-01-01

    This Toolkit forms part of the project entitled "Time for a New Paradigm in Education: Student-Centred Learning" (T4SCL), jointly led by the European Students' Union (ESU) and Education International (EI). This is an EU-funded project under the Lifelong Learning Programme (LLP) administered by the Education, Audiovisual and Culture Executive…

  5. Logic Models for Program Design, Implementation, and Evaluation: Workshop Toolkit. REL 2015-057

    ERIC Educational Resources Information Center

    Shakman, Karen; Rodriguez, Sheila M.

    2015-01-01

    The Logic Model Workshop Toolkit is designed to help practitioners learn the purpose of logic models, the different elements of a logic model, and the appropriate steps for developing and using a logic model for program evaluation. Topics covered in the sessions include an overview of logic models, the elements of a logic model, an introduction to…

  6. Development of the Mississippi communities for healthy living nutrition education toolkit

    Technology Transfer Automated Retrieval System (TEKTRAN)

The objective of our study was to develop a nutrition education toolkit for communities in the Lower Mississippi Delta (LMD) with content that is current, evidence-based, culturally relevant, and user friendly. The Mississippi Communities for Healthy Living (MCHL), an evidence-based nutrition educa...

  7. Information Skills Toolkit: Collaborative Integrated Instruction for the Middle Grades. Professional Growth Series.

    ERIC Educational Resources Information Center

    Logan, Debra Kay

    This toolkit provides tested lessons and a selection of alternative ideas for the middle grades to help adapt and integrate the teaching of information skills in a way that meets the needs of teachers and students. After an introductory section, eight chapters of Collaborative Integrated Skills Lessons are provided. The lessons are grouped by…

  8. A software toolkit for processing and analyzing spectral and trace gas flux data collected via aircraft

    NASA Astrophysics Data System (ADS)

    Xu, Y.; Garrity, S. R.; Vierling, L. A.; Martins, D. K.; Shepson, P. B.; Stirm, B. H.

    2006-12-01

In order to spatially extrapolate trace gas flux measurements made at the scale of individual flux towers to broader regions using spectral approaches, it is helpful to establish new methodologies for sampling and processing these data at scales coarser than one flux tower footprint. To this end, we mounted a dual-channel hyperspectral spectroradiometer capable of collecting spectra at ~3 Hz on an experimental twin-engine Beechcraft Duchess instrumented to also measure eddy covariance fluxes of CO2. Experimental flights were conducted over a northern hardwood, deciduous forest between 21 July and 24 July 2006. To analyze these data in ecologically meaningful ways, it was necessary to first develop a software toolkit capable of marrying the spectral and flux data in appropriate spatial and spectral contexts. The toolkit merges the spectral and flux data streams with the GPS/Inertial Navigation System of the aircraft so that data can be interactively selected by timestamp or geographic location and queried to output a variety of preset and/or user-defined spectral indices for comparison to collocated flux data. In addition, the toolkit enables the user to interactively plot the spectral target locations on any georectified image to facilitate comparisons among land cover type, topography, surface spectral characteristics, and CO2 fluxes. In this paper, we highlight the capabilities of the software toolkit and provide examples of ways in which it can be used to explore correlations among spectral and flux data collected via aircraft.
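The core of such a toolkit is aligning two independently sampled, time-stamped data streams. A minimal sketch of nearest-timestamp pairing is shown below; the function name, tuple layout, and `max_gap` tolerance are illustrative assumptions, not the toolkit's actual API or matching rule.

```python
from bisect import bisect_left

def merge_nearest(spectra, fluxes, max_gap=0.5):
    """Pair each flux record with the nearest-in-time spectral record.

    spectra, fluxes: lists of (timestamp, value) tuples sorted by timestamp.
    max_gap: maximum allowed time difference (seconds) for a valid pairing.
    """
    times = [t for t, _ in spectra]
    pairs = []
    for t, flux in fluxes:
        i = bisect_left(times, t)
        # candidates: the spectral samples just before and just after t
        candidates = [c for c in (i - 1, i) if 0 <= c < len(spectra)]
        if not candidates:
            continue
        best = min(candidates, key=lambda c: abs(times[c] - t))
        if abs(times[best] - t) <= max_gap:
            pairs.append((t, flux, spectra[best][1]))
    return pairs
```

A geographic-location query would work the same way, substituting great-circle distance for the time difference.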

  9. The Student Writing Toolkit: Enhancing Undergraduate Teaching of Scientific Writing in the Biological Sciences

    ERIC Educational Resources Information Center

    Dirrigl, Frank J., Jr.; Noe, Mark

    2014-01-01

    Teaching scientific writing in biology classes is challenging for both students and instructors. This article offers and reviews several useful "toolkit" items that improve student writing. These include sentence and paper-length templates, funnelling and compartmentalisation, and preparing compendiums of corrections. In addition,…

  10. Toolkit Approach to Integrating Library Resources into the Learning Management System

    ERIC Educational Resources Information Center

    Black, Elizabeth L.

    2008-01-01

    As use of learning management systems (LMS) increases, it is essential that librarians are there. Ohio State University Libraries took a toolkit approach to integrate library content in the LMS to facilitate creative and flexible interactions between librarians, students and faculty in Ohio State University's large and decentralized academic…

  11. EMERGO: A Methodology and Toolkit for Developing Serious Games in Higher Education

    ERIC Educational Resources Information Center

    Nadolski, Rob J.; Hummel, Hans G. K.; van den Brink, Henk J.; Hoefakker, Ruud E.; Slootmaker, Aad; Kurvers, Hub J.; Storm, Jeroen

    2008-01-01

    Societal changes demand educators to apply new pedagogical approaches. Many educational stakeholders feel that serious games could play a key role in fulfilling this demand, and they lick their chops when looking at the booming industry of leisure games. However, current toolkits for developing leisure games show severe shortcomings when applied…

  12. Building Management Information Systems to Coordinate Citywide Afterschool Programs: A Toolkit for Cities. Executive Summary

    ERIC Educational Resources Information Center

    Kingsley, Chris

    2012-01-01

    This executive summary describes highlights from the report, "Building Management Information Systems to Coordinate Citywide Afterschool Programs: A Toolkit for Cities." City-led efforts to build coordinated systems of afterschool programming are an important strategy for improving the health, safety and academic preparedness of children and…

  13. School Turnaround Leaders: Selection Toolkit. Part of the School Turnaround Collection from Public Impact

    ERIC Educational Resources Information Center

    Public Impact, 2008

    2008-01-01

    This toolkit includes the following separate sections: (1) Selection Preparation Guide; (2) Day-of-Interview Tools; (3) Candidate Rating Tools; and (4) Candidate Comparison and Decision Tools. Each of the sections is designed to be used at different stages of the selection process. The first section provides a list of competencies that would…

  14. School Turnaround Teachers: Selection Toolkit. Part of the School Turnaround Collection from Public Impact

    ERIC Educational Resources Information Center

    Public Impact, 2008

    2008-01-01

    This toolkit includes these separate sections: (1) Selection Preparation Guide; (2) Day-of-Interview Tools; (3) Candidate Rating Tools; and (4) Candidate Comparison and Decision Tools. Each of the sections is designed to be used at different stages of the selection process. The first section provides turnaround teacher competencies that are the…

  15. Examining the Use of a Representational Toolkit in a U.S. Reform-Oriented Textbook

    ERIC Educational Resources Information Center

    Davis, Jon D.

    2011-01-01

    This study examined the instances of a representational toolkit (RT) in one reform-oriented algebra textbook developed in the United States. The majority of RT uses were active (66%) as opposed to passive and connected (63%) in that the technology results were used in some other manner by either the textbook authors or students as they solved…

  16. The Automatic Parallelisation of Scientific Application Codes Using a Computer Aided Parallelisation Toolkit

    NASA Technical Reports Server (NTRS)

    Ierotheou, C.; Johnson, S.; Leggett, P.; Cross, M.; Evans, E.; Jin, Hao-Qiang; Frumkin, M.; Yan, J.; Biegel, Bryan (Technical Monitor)

    2001-01-01

The shared-memory programming model is a very effective way to achieve parallelism on shared-memory parallel computers. Historically, the lack of a programming standard for using directives and the rather limited performance due to scalability had affected the take-up of this programming model. Significant progress has since been made in hardware and software technologies; as a result, the performance of parallel programs with compiler directives has also improved. The introduction of an industry standard for shared-memory programming with directives, OpenMP, has also addressed the issue of portability. In this study, we have extended the computer-aided parallelization toolkit (developed at the University of Greenwich) to automatically generate OpenMP-based parallel programs with nominal user assistance. We outline the way in which loop types are categorized and how efficient OpenMP directives can be defined and placed using the in-depth interprocedural analysis carried out by the toolkit. We also discuss the application of the toolkit to the NAS Parallel Benchmarks and a number of real-world application codes. This work not only demonstrates the great potential of using the toolkit to quickly parallelize serial programs, but also the good performance achievable on up to 300 processors for hybrid message-passing and directive-based parallelizations.

  17. After-School Toolkit: Tips, Techniques and Templates for Improving Program Quality

    ERIC Educational Resources Information Center

    Gutierrez, Nora; Bradshaw, Molly; Furano, Kathryn

    2008-01-01

    This toolkit offers program managers a hands-on guide for implementing quality programming in the after-school hours. The kit includes tools and techniques that increased the quality of literacy programming and helped improve student reading gains in the Communities Organizing Resources to Advance Learning (CORAL) initiative of The James Irvine…

  18. Redesigning Schools to Reach Every Student with Excellent Teachers: Teacher & Staff Selection, Development, & Evaluation Toolkit

    ERIC Educational Resources Information Center

    Public Impact, 2012

    2012-01-01

    This toolkit is a companion to the school models provided on OpportunityCulture.org. The school models use job redesign and technology to extend the reach of excellent teachers to more students, for more pay, within budget. Most of these school models create new roles and collaborative teams, enabling all teachers and staff to develop and…

  19. Perspectives on Preference Aggregation.

    PubMed

    Regenwetter, Michel

    2009-07-01

    For centuries, the mathematical aggregation of preferences by groups, organizations, or society itself has received keen interdisciplinary attention. Extensive theoretical work in economics and political science throughout the second half of the 20th century has highlighted the idea that competing notions of rational social choice intrinsically contradict each other. This has led some researchers to consider coherent democratic decision making to be a mathematical impossibility. Recent empirical work in psychology qualifies that view. This nontechnical review sketches a quantitative research paradigm for the behavioral investigation of mathematical social choice rules on real ballots, experimental choices, or attitudinal survey data. The article poses a series of open questions. Some classical work sometimes makes assumptions about voter preferences that are descriptively invalid. Do such technical assumptions lead the theory astray? How can empirical work inform the formulation of meaningful theoretical primitives? Classical "impossibility results" leverage the fact that certain desirable mathematical properties logically cannot hold in all conceivable electorates. Do these properties nonetheless hold true in empirical distributions of preferences? Will future behavioral analyses continue to contradict the expectations of established theory? Under what conditions do competing consensus methods yield identical outcomes and why do they do so? PMID:26158988

  20. High-Performance, Multi-Node File Copies and Checksums for Clustered File Systems

    NASA Technical Reports Server (NTRS)

    Kolano, Paul Z.; Ciotti, Robert B.

    2012-01-01

Modern parallel file systems achieve high performance using a variety of techniques, such as striping files across multiple disks to increase aggregate I/O bandwidth and spreading disks across multiple servers to increase aggregate interconnect bandwidth. To achieve peak performance from such systems, it is typically necessary to utilize multiple concurrent readers/writers from multiple systems to overcome various single-system limitations, such as number of processors and network bandwidth. The standard cp and md5sum tools of GNU coreutils found on every modern Unix/Linux system, however, utilize a single execution thread on a single CPU core of a single system, and hence cannot take full advantage of the increased performance of clustered file systems. Mcp and msum are drop-in replacements for the standard cp and md5sum programs that utilize multiple types of parallelism and other optimizations to achieve maximum copy and checksum performance on clustered file systems. Multi-threading is used to ensure that nodes are kept as busy as possible. Read/write parallelism allows individual operations of a single copy to be overlapped using asynchronous I/O. Multi-node cooperation allows different nodes to take part in the same copy/checksum. Split-file processing allows multiple threads to operate concurrently on the same file. Finally, hash trees allow inherently serial checksums to be performed in parallel. The total speed-ups from all improvements are significant: mcp improves cp performance by over 27x, msum improves md5sum performance by almost 19x, and the combination of mcp and msum improves verified copies via cp and md5sum by almost 22x. These improvements come in the form of drop-in replacements for cp and md5sum, so they are easily used and are available for download as open source software at http://mutil.sourceforge.net.
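The hash-tree idea that lets an inherently serial checksum run in parallel can be sketched as follows: hash fixed-size chunks independently, then hash the concatenated chunk digests. This is an illustrative one-level tree, not msum's actual algorithm, chunk size, or output format.

```python
import hashlib
from concurrent.futures import ThreadPoolExecutor

CHUNK = 1 << 20  # 1 MiB chunks; a real tool would tune this per file system

def tree_checksum(path, workers=4):
    """Hash fixed-size chunks of a file in parallel, then hash the
    concatenation of the chunk digests (a one-level hash tree).

    Note: the result is a tree hash, deliberately different from a
    plain md5sum of the whole file.
    """
    with open(path, 'rb') as f:
        f.seek(0, 2)
        size = f.tell()
    offsets = range(0, max(size, 1), CHUNK)

    def hash_chunk(off):
        # each worker opens its own handle so seeks don't interfere
        with open(path, 'rb') as f:
            f.seek(off)
            return hashlib.md5(f.read(CHUNK)).digest()

    with ThreadPoolExecutor(max_workers=workers) as pool:
        digests = list(pool.map(hash_chunk, offsets))
    return hashlib.md5(b''.join(digests)).hexdigest()
```

Because chunk digests are combined in offset order, the result is deterministic no matter how many workers compute them or in what order they finish.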

  1. Exploiting Lustre File Joining for Effective Collective IO

    SciTech Connect

    Yu, Weikuan; Vetter, Jeffrey S; Canon, Richard Shane; Jiang, Song

    2007-01-01

Lustre is a parallel file system that provides high aggregate I/O bandwidth by striping file extents across many storage devices. However, our experiments indicate that excessively wide striping can cause performance degradation. Lustre supports an innovative file-joining feature that joins files in place. To mitigate striping overhead and benefit collective I/O, we propose two techniques: split writing and hierarchical striping. In split writing, a file is created as separate subfiles, each of which is striped to only a few storage devices; they are joined into a single file at file-close time. Hierarchical striping builds on top of split writing and orchestrates the span of subfiles in a hierarchical manner to avoid overlapping and achieve the appropriate coverage of storage devices. Together, these techniques avoid the overhead associated with large stripe widths while still combining the bandwidth available from many storage devices. We have prototyped these techniques in the ROMIO implementation of MPI-IO. Experimental results indicate that split writing and hierarchical striping can significantly improve the performance of Lustre collective I/O in terms of both data transfer and management operations. On a Lustre file system configured with 46 object storage targets, our implementation improves the collective write performance of a 16-process job by as much as 220%.
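Split writing can be illustrated in userspace: each writer produces its own narrowly striped subfile, and the subfiles are joined into one file at close time. Lustre's actual feature joins the files in place inside the file system without copying; the copy-based join and the `.partN` naming below are purely illustrative.

```python
import os

def split_write(path, parts):
    """Write each part to its own subfile, then join the subfiles into a
    single file at close time -- a userspace sketch of the split-writing
    idea (each subfile would be striped to only a few storage devices).
    """
    subfiles = []
    for i, data in enumerate(parts):
        sub = f"{path}.part{i}"          # hypothetical subfile naming
        with open(sub, 'wb') as f:
            f.write(data)
        subfiles.append(sub)
    # the "join" step: produce one file from the subfiles
    with open(path, 'wb') as out:
        for sub in subfiles:
            with open(sub, 'rb') as f:
                out.write(f.read())
            os.remove(sub)
    return path
```

The benefit in the real system comes from the join being a metadata operation: no data is rewritten, yet readers see a single contiguous file.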

  2. 43 CFR 4.1381 - Who may file; when to file; where to file.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

... may file; when to file; where to file. (a) Any person who receives a written decision issued by OSM under 30 CFR 773.28 on a challenge to an ownership or control listing or finding may file a request for...

  3. Phase 1 Development Report for the SESSA Toolkit.

    SciTech Connect

Knowlton, Robert G.; Melton, Brad J.; Anderson, Robert J.

    2014-09-01

operation of the SESSA toolkit, in order to give the user enough information to start using it. SESSA is currently a prototype system, and this documentation covers the initial release of the toolkit. Funding for SESSA was provided by the Department of Defense (DoD) Assistant Secretary of Defense for Research and Engineering (ASD(R&E)) Rapid Fielding (RF) organization; the project was managed by the Defense Forensic Science Center (DFSC), formerly known as the U.S. Army Criminal Investigation Laboratory (USACIL). ACKNOWLEDGEMENTS: The authors wish to acknowledge the funding support for the development of the Site Exploitation System for Situational Awareness (SESSA) toolkit from the DoD ASD(R&E) Rapid Fielding organization, and the project management of DFSC. Special thanks to Mr. Garold Warner of DFSC, who served as the Project Manager. Individuals who worked on the design, functional attributes, algorithm development, system architecture, and software programming include Robert Knowlton, Brad Melton, Robert Anderson, and Wendy Amai.

  4. Margins of safety provided by COSHH Essentials and the ILO Chemical Control Toolkit.

    PubMed

    Jones, Rachael M; Nicas, Mark

    2006-03-01

COSHH Essentials, developed by the UK Health and Safety Executive, and the Chemical Control Toolkit (Toolkit) proposed by the International Labour Organization, are 'control banding' approaches to workplace risk management intended for use by proprietors of small and medium-sized businesses. Both systems group chemical substances into hazard bands based on toxicological endpoint and potency. COSHH Essentials uses the European Union's Risk-phrases (R-phrases), whereas the Toolkit uses R-phrases and the Globally Harmonized System (GHS) of Classification and Labeling of Chemicals. Each hazard band is associated with a range of airborne concentrations, termed exposure bands, which are to be attained by the implementation of recommended control technologies. Here we analyze the margin of safety afforded by the systems and, for each hazard band, define the minimal margin as the ratio of the minimum airborne concentration that produced the toxicological endpoint of interest in experimental animals to the maximum concentration in workplace air permitted by the exposure band. We found that the minimal margins were always <100, with some ranging to <1, and inversely related to molecular weight. The Toolkit-GHS system generally produced margins equal to or larger than COSHH Essentials, suggesting that the Toolkit-GHS system is more protective of worker health. Although these systems predict exposures comparable with current occupational exposure limits, we argue that the minimal margins are better indicators of health protection. Further, given the small margins observed, we feel it is important that revisions of these systems provide the exposure bands to users, so as to permit evaluation of control technology capture efficiency. PMID:16172140
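The minimal margin defined above is a simple ratio. A sketch, with hypothetical numbers and units assumed consistent (e.g. both concentrations in mg/m^3):

```python
def minimal_margin(min_effect_conc, exposure_band_max):
    """Minimal margin of safety: the lowest airborne concentration that
    produced the toxicological endpoint in animal studies, divided by the
    highest concentration the exposure band permits in workplace air.

    Per the paper, margins below 100 (and especially below 1) suggest the
    band may not be adequately protective.
    """
    return min_effect_conc / exposure_band_max
```

For example, a substance whose endpoint appears at 50 mg/m^3 in animal studies, placed in a band permitting up to 1 mg/m^3, has a minimal margin of 50.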

  5. Accelerating climate simulation analytics via multilevel aggregation and synthesis

    NASA Astrophysics Data System (ADS)

    Anantharaj, Valentine; Ravindran, Krishnaraj; Gunasekaran, Raghul; Vazhkudai, Sudharshan; Butt, Ali

    2015-04-01

A typical set of ultra-high-resolution (0.25 deg) climate simulation experiments produces over 50,000 files, ranging in size from 10^1 MB to 10^2 GB each, for a total volume of nearly 1 PB of data. The execution of the experiments will require over 100 million CPU hours on the Titan supercomputer at the Oak Ridge Leadership Computing Facility (OLCF). The output from the simulations must then be archived, analyzed, and distributed to the project partners in a timely manner. Meeting this challenge requires efficient movement of the data, staging the simulation output to a large and fast file system that provides high-volume access to the other computational systems used to analyze the data and synthesize results. But data movement is one of the most expensive and time-consuming steps in the scientific workflow. It is expedient to complete the diagnostics and analytics before the files are archived for long-term storage; nevertheless, it is often necessary to fetch the files from the archive for further analysis. We are implementing a solution to query, extract, index, and summarize key statistical information from the individual CF-compliant netCDF files, which is then stored for ready access in a database. The contents of the database can be related back to the archived files from which they were extracted. The statistical information can be quickly aggregated to provide meaningful statistical summaries that can then be related to observations and/or other simulation results for synthesis and further inference. The scientific workflow at OLCF, augmented by expedited analytics capabilities, will allow the users of our systems to shorten the time required to derive meaningful and relevant science results. We illustrate some of the time-saving benefits via a few typical use cases, based on recent large-scale simulation experiments using the Community Earth System Model (CESM) and the DOE Accelerated Climate Model for Energy (ACME).
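The extract-index-summarize step can be sketched with a small per-file statistics table that later queries aggregate without touching the archive. The schema and field names are assumptions; a real pipeline would read `values` from the CF-compliant netCDF variables rather than take them as a list.

```python
import sqlite3
from statistics import mean

def index_file_stats(db, fname, values):
    """Store min/max/mean summary statistics for one simulation output
    file, keyed by filename so results can be related back to the
    archived file they came from."""
    db.execute("""CREATE TABLE IF NOT EXISTS file_stats
                  (fname TEXT PRIMARY KEY, vmin REAL, vmax REAL, vmean REAL)""")
    db.execute("INSERT OR REPLACE INTO file_stats VALUES (?, ?, ?, ?)",
               (fname, min(values), max(values), mean(values)))
    db.commit()

def summary(db):
    """Aggregate across all indexed files without re-reading any of them."""
    return db.execute(
        "SELECT MIN(vmin), MAX(vmax), AVG(vmean) FROM file_stats").fetchone()
```

Usage: after indexing each file once at diagnostics time, `summary(db)` answers cross-experiment questions in milliseconds instead of requiring a fetch from tape.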

  6. Aggregate breakdown of nanoparticulate titania

    NASA Astrophysics Data System (ADS)

    Venugopal, Navin

    Six nanosized titanium dioxide powders synthesized from a sulfate process were investigated. The targeted end-use of this powder was for a de-NOx catalyst honeycomb monolith. Alteration of synthesis parameters had resulted principally in differences in soluble ion level and specific surface area of the powders. The goal of this investigation was to understand the role of synthesis parameters in the aggregation behavior of these powders. Investigation via scanning electron microscopy of the powders revealed three different aggregation levels at specific length scales. Secondary and higher-order aggregate strength was investigated via oscillatory stress rheometry as a means of simulating shear conditions encountered during extrusion. G' and G'' were measured as a function of the applied oscillatory stress. Oscillatory rheometry indicated a strong variation in the viscoelastic yield strengths as a function of the sulfate level of the particles. Powder yield stresses ranged from 3.0 Pa to 24.0 Pa of oscillatory stress. Compaction curves to 750 MPa found strong similarities in the extrapolated yield point of stage I and II compaction for each of the powders (at approximately 500 MPa), suggesting that the variation in sulfate was greatest above the primary aggregate level. Scanning electron microscopy of samples at different states of shear in oscillatory rheometry confirmed the variation in the linear elastic region and the viscous flow regime. This investigation also approached aggregation from a novel perspective: aggregates are treated as loose, open structures that are highly disordered and stochastic in nature. The methodology was to measure the shear stresses required to rupture the various aggregation stages encountered and then attempt to realign the now free-flowing constituents of the aggregate into a denser configuration.
Mercury porosimetry was utilized to measure the pore size of the compact resulting from

  7. Monosized aggregates -- A new model

    SciTech Connect

    Gopal, M.

    1997-08-01

    For applications requiring colloidal particles, it is desirable that they be monosized to better control the structure and the properties. In a number of systems, the monosized particles come together to form aggregates that are also monosized. A model is presented here to explain the formation of these monosized aggregates. This is of particular importance in the fields of ceramics, catalysis, pigments, pharmacy, photographic emulsions, etc.

  8. Model for amorphous aggregation processes

    NASA Astrophysics Data System (ADS)

    Stranks, Samuel D.; Ecroyd, Heath; van Sluyter, Steven; Waters, Elizabeth J.; Carver, John A.; von Smekal, Lorenz

    2009-11-01

    The amorphous aggregation of proteins is associated with many phenomena, ranging from the formation of protein wine haze to the development of cataract in the eye lens and the precipitation of recombinant proteins during their expression and purification. While much literature exists describing models for linear protein aggregation, such as amyloid fibril formation, there are few reports of models which address amorphous aggregation. Here, we propose a model to describe the amorphous aggregation of proteins which is also more widely applicable to other situations where a similar process occurs, such as in the formation of colloids and nanoclusters. As first applications of the model, we have tested it against experimental turbidimetry data of three proteins relevant to the wine industry and biochemistry, namely, thaumatin, a thaumatin-like protein, and α-lactalbumin. The model is very robust and describes the experimental data to a high degree of accuracy. Details about the aggregation process, such as shape parameters of the aggregates and rate constants, can also be extracted.

  9. Glycation precedes lens crystallin aggregation

    SciTech Connect

    Swamy, M.S.; Perry, R.E.; Abraham, E.C.

    1987-05-01

    Non-enzymatic glycosylation (glycation) seems to have the potential to alter the structure of crystallins and make them susceptible to thiol oxidation, leading to disulfide-linked high molecular weight (HMW) aggregate formation. The authors used streptozotocin-diabetic rats during precataract and cataract stages, and long-term cell-free glycation of bovine lens crystallins, to study the relationship between glycation and lens crystallin aggregation. HMW aggregates and other protein components of the water-soluble (WS) and urea-soluble (US) fractions were separated by molecular sieve high performance liquid chromatography. Glycation was estimated by both [³H]NaBH₄ reduction and phenylboronate agarose affinity chromatography. Levels of total glycated protein (GP) in the US fractions were about 2-fold higher than in the WS fractions, and there was a linear increase in GP in both WS and US fractions. This increase was paralleled by a corresponding increase in HMW aggregates. Total GP extracted by the affinity method from the US fraction showed a predominance of HMW aggregates, and vice versa. Cell-free glycation studies with bovine crystallins confirmed the results of the animal studies. Increasing glycation caused a corresponding increase in protein insolubilization, and the insoluble fraction thus formed also contained more glycated protein. It appears that lens protein glycation, HMW aggregate formation, and protein insolubilization are interrelated.

  10. Modifiers of mutant huntingtin aggregation

    PubMed Central

    Teuling, Eva; Bourgonje, Annika; Veenje, Sven; Thijssen, Karen; de Boer, Jelle; van der Velde, Joeri; Swertz, Morris; Nollen, Ellen

    2011-01-01

    Protein aggregation is a common hallmark of a number of age-related neurodegenerative diseases, including Alzheimer’s, Parkinson’s, and polyglutamine-expansion disorders such as Huntington’s disease, but how aggregation-prone proteins lead to pathology is not known. Using a genome-wide RNAi screen in a C. elegans model for polyglutamine aggregation, we previously identified 186 genes that suppress aggregation. Using an RNAi screen for human orthologs of these genes, we here present 26 human genes that suppress aggregation of mutant huntingtin in a human cell line. Among these are genes that have not been previously linked to mutant huntingtin aggregation. They include genes encoding eukaryotic translation initiation and elongation factors, as well as genes that have been previously associated with other neurodegenerative diseases, such as the ATPase family gene 3-like 2 (AFG3L2) and ubiquitin-like modifier activating enzyme 1 (UBA1). Unravelling the role of these genes will broaden our understanding of the pathogenesis of Huntington’s disease. PMID:21915392

  11. Kinetic model for erythrocyte aggregation.

    PubMed

    Bertoluzzo, S M; Bollini, A; Rasia, M; Raynal, A

    1999-01-01

    Light transmission through blood is the most widely used method for studying erythrocyte aggregation. The curves obtained have traditionally been treated empirically as exponential functions, so the process is characterized by a single parameter that varies with all process factors indiscriminately. In the present paper a mathematical model for the RBC aggregation process is derived in accordance with von Smoluchowski's theory of the kinetics of colloidal particle agglomeration. The equation fitted the experimental pattern of the RBC suspension optical transmittance closely and contains two parameters that separately estimate the most important characteristics of the aggregation process, i.e., (1) the average size of rouleaux at equilibrium and (2) the aggregation rate. The method was evaluated against several factors affecting erythrocyte aggregation, such as temperature; plasma dilutions; Dextran 500, Dextran 70 and PVP 360 at different medium concentrations; cell membrane alteration by the alkylating agent TCEA; and decreased medium osmolarity. Results were interpreted in terms of the process characteristics estimated by the parameters, and they were also compared with similar studies carried out by other authors using other methods. This analysis allowed us to conclude that the proposed equation is reliable and useful for studying erythrocyte aggregation. PMID:10660481
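
    The two-parameter character of such a model can be illustrated with the classical constant-kernel results of von Smoluchowski's theory. The sketch below shows the standard textbook expressions, not the authors' exact transmittance equation; tau is the characteristic aggregation time (the inverse of an aggregation rate).

```python
def mean_rouleau_size(t, tau):
    """Mean aggregate size under constant-kernel Smoluchowski kinetics:
    m(t) = 1 + t/tau, so a smaller tau means faster aggregation."""
    return 1.0 + t / tau

def cluster_fraction(k, t, tau):
    """Number of k-cell aggregates at time t per initial cell
    (classical Smoluchowski size distribution)."""
    x = t / tau
    return x ** (k - 1) / (1.0 + x) ** (k + 1)
```

    Mass conservation holds in this form: summing k times cluster_fraction over all k recovers the initial cell count, so the two parameters (equilibrium size scale and rate) fully describe the kinetics.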

  12. Ash Aggregates in Proximal Settings

    NASA Astrophysics Data System (ADS)

    Porritt, L. A.; Russell, K.

    2012-12-01

    Ash aggregates are thought to have formed within and been deposited by the eruption column and plume and dilute density currents and their associated ash clouds. Moist, turbulent ash clouds are considered critical to ash aggregate formation by facilitating both collision and adhesion of particles. Consequently, they are most commonly found in distal deposits. Proximal deposits containing ash aggregates are less commonly observed but do occur. Here we describe two occurrences of vent proximal ash aggregate-rich deposits; the first within a kimberlite pipe where coated ash pellets and accretionary lapilli are found within the intra-vent sequence; and the second in a glaciovolcanic setting where cored pellets (armoured lapilli) occur within <1 km of the vent. The deposits within the A418 pipe, Diavik Diamond Mine, Canada, are the residual deposits within the conduit and vent of the volcano and are characterised by an abundance of ash aggregates. Coated ash pellets are dominant but are followed in abundance by ash pellets, accretionary lapilli and rare cored pellets. The coated ash pellets typically range from 1 - 5 mm in diameter and have core to rim ratios of approximately 10:1. The formation and preservation of these aggregates elucidates the style and nature of the explosive phase of kimberlite eruption at A418 (and other pipes?). First, these pyroclasts dictate the intensity of the kimberlite eruption; it must be energetic enough to cause intense fragmentation of the kimberlite to produce a substantial volume of very fine ash (<62 μm). Secondly, the ash aggregates indicate the involvement of moisture coupled with the presence of dilute expanded eruption clouds. The structure and distribution of these deposits throughout the kimberlite conduit demand that aggregation and deposition operate entirely within the confines of the vent; this indicates that aggregation is a rapid process. Ash aggregates within glaciovolcanic sequences are also rarely documented. The

  13. Magnetic fields of the solar system: A comparative planetology toolkit

    NASA Astrophysics Data System (ADS)

    Nicholas, J. B.; Purucker, M. E.; Johnson, C. L.; Sabaka, T. J.; Olsen, N.; Sun, Z.; Al Asad, M.; Anderson, B. J.; Korth, H.; Slavin, J. A.; Alexeev, I. I.; Belenkaya, E. S.; Phillips, R. J.; Solomon, S. C.; Lillis, R. J.; Langlais, B.; Winslow, R. M.; Russell, C. T.; Dougherty, M. K.; Zuber, M. T.

    2011-12-01

    Magnetic fields within the solar system provide a strong organizing force for processes active both within a planet or moon, and outside of it. In the interest of stimulating research and education in the field of comparative planetology, we present documented Fortran and MATLAB source codes and benchmarks to the latest models for planets and satellites that host internal magnetic fields. This presentation is made in the context of an interactive website: http://planetary-mag.net. Models are included for Earth (Comprehensive model CM4 of Sabaka et al., 2004, Geophysics J. Int.), Mercury (Anderson et al, 2011, Science), the Moon (Purucker and Nicholas, 2010, JGR), Mars (Lillis et al., 2010, JGR), and the outer planets Jupiter, Saturn, Uranus, and Neptune (Russell and Dougherty, 2010, Space Science Reviews). All models include magnetic fields of internal origin, and fields of external origin are included in the models for Mercury, the Earth, and the Moon. As models evolve, we intend to include magnetic fields of external origin for the other planets and moons. The website allows the user to select a coordinate system, such as planet-centered, heliocentric, or boundary normal, and the location within that coordinate system, and the vector magnetic field due to each of the component source fields at that location is then calculated and presented. Alternatively, the user can input a range as well as a grid spacing, and the vector magnetic field will be calculated for all points on that grid and be made available as a file for downloading.
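
    For orientation, the leading term of any such internal-field model is a centred dipole. The sketch below evaluates only that dipole term in planet-centred spherical coordinates; it is not the toolkit's code (the published models carry full spherical-harmonic expansions and external-field terms), and the default B0 is roughly Earth's equatorial surface field.

```python
import math

def dipole_field(r, theta, B0=3.12e-5, R=1.0):
    """Vector field (B_r, B_theta) of a centred dipole.
    r: radial distance in planetary radii; theta: colatitude in radians;
    B0: equatorial surface field strength in tesla."""
    f = B0 * (R / r) ** 3
    return 2.0 * f * math.cos(theta), f * math.sin(theta)
```

    At the pole (theta = 0) the surface field magnitude is 2·B0, at the equator it is B0, and the field falls off as 1/r³, which is why higher-order terms matter mainly near the surface.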

  14. MDAnalysis: a toolkit for the analysis of molecular dynamics simulations.

    PubMed

    Michaud-Agrawal, Naveen; Denning, Elizabeth J; Woolf, Thomas B; Beckstein, Oliver

    2011-07-30

    MDAnalysis is an object-oriented library for structural and temporal analysis of molecular dynamics (MD) simulation trajectories and individual protein structures. It is written in the Python language with some performance-critical code in C. It uses the powerful NumPy package to expose trajectory data as fast and efficient NumPy arrays. It has been tested on systems of millions of particles. Many common file formats of simulation packages including CHARMM, Gromacs, Amber, and NAMD and the Protein Data Bank format can be read and written. Atoms can be selected with a syntax similar to CHARMM's powerful selection commands. MDAnalysis enables both novice and experienced programmers to rapidly write their own analytical tools and access data stored in trajectories in an easily accessible manner that facilitates interactive explorative analysis. MDAnalysis has been tested on and works for most Unix-based platforms such as Linux and Mac OS X. It is freely available under the GNU General Public License from http://mdanalysis.googlecode.com. PMID:21500218

  15. KAnalyze: a fast versatile pipelined K-mer toolkit

    PubMed Central

    Audano, Peter; Vannberg, Fredrik

    2014-01-01

    Motivation: Converting nucleotide sequences into short overlapping fragments of uniform length, k-mers, is a common step in many bioinformatics applications. While existing software packages count k-mers, few are optimized for speed, offer an application programming interface (API), a graphical interface or contain features that make it extensible and maintainable. We designed KAnalyze to compete with the fastest k-mer counters, to produce reliable output and to support future development efforts through well-architected, documented and testable code. Currently, KAnalyze can output k-mer counts in a sorted tab-delimited file or stream k-mers as they are read. KAnalyze can process large datasets with 2 GB of memory. This project is implemented in Java 7, and the command line interface (CLI) is designed to integrate into pipelines written in any language. Results: As a k-mer counter, KAnalyze outperforms Jellyfish, DSK and a pipeline built on Perl and Linux utilities. Through extensive unit and system testing, we have verified that KAnalyze produces the correct k-mer counts over multiple datasets and k-mer sizes. Availability and implementation: KAnalyze is available on SourceForge: https://sourceforge.net/projects/kanalyze/ Contact: fredrik.vannberg@biology.gatech.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24642064
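
    The core operation — decomposing a sequence into overlapping k-mers of uniform length and counting them — can be shown in a few lines of Python. This is a concept sketch only, not KAnalyze's implementation, which streams and sorts inputs far larger than memory.

```python
from collections import Counter

def count_kmers(seq, k):
    """Count the overlapping k-mers of a nucleotide sequence."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

counts = count_kmers("GATTACA", 3)
# A sorted tab-delimited listing, similar in spirit to KAnalyze's output:
listing = "\n".join(f"{kmer}\t{n}" for kmer, n in sorted(counts.items()))
```

    A length-n sequence yields n - k + 1 overlapping k-mers, so "GATTACA" with k = 3 produces five: GAT, ATT, TTA, TAC, ACA.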

  16. An XML Driven Graphical User Interface and Application Management Toolkit

    SciTech Connect

    White, Greg R

    2002-01-18

    In the past, the features of a user interface were limited by those available in the existing graphical widgets it used. Now, improvements in processor speed have fostered the emergence of interpreted languages, in which the appropriate method to render a given data object can be loaded at runtime. XML can be used to precisely describe the association of data types with their graphical handling (beans), and Java provides an especially rich environment for programming the graphics. We present a graphical user interface builder based on Java Beans and XML, in which the graphical screens are described textually (in files or a database) in terms of their screen components. Each component may be a simple text read back, or a complex plot. The programming model provides for dynamic data pertaining to a component to be forwarded synchronously or asynchronously, to the appropriate handler, which may be a built-in method, or a complex applet. This work was initially motivated by the need to move the legacy VMS display interface of the SLAC Control Program to another platform while preserving all of its existing functionality. However the model allows us a powerful and generic system for adding new kinds of graphics, such as Matlab, data sources, such as EPICS, middleware, such as AIDA[1], and transport, such as XML and SOAP. The system will also include a management console, which will be able to report on the present usage of the system, for instance who is running it where and connected to which channels.

  18. MDAnalysis: A Toolkit for the Analysis of Molecular Dynamics Simulations

    PubMed Central

    Michaud-Agrawal, Naveen; Denning, Elizabeth J.; Woolf, Thomas B.; Beckstein, Oliver

    2011-01-01

    MDAnalysis is an object-oriented library for structural and temporal analysis of molecular dynamics (MD) simulation trajectories and individual protein structures. It is written in the Python language with some performance-critical code in C. It uses the powerful NumPy package to expose trajectory data as fast and efficient NumPy arrays. It has been tested on systems of millions of particles. Many common file formats of simulation packages including CHARMM, Gromacs, Amber, and NAMD and the Protein Data Bank format can be read and written. Atoms can be selected with a syntax similar to CHARMM’s powerful selection commands. MDAnalysis enables both novice and experienced programmers to rapidly write their own analytical tools and access data stored in trajectories in an easily accessible manner that facilitates interactive explorative analysis. MDAnalysis has been tested on and works for most Unix-based platforms such as Linux and Mac OS X. It is freely available under the GNU Public License from http://mdanalysis.googlecode.com. PMID:21500218

  19. Crystal aggregation in kidney stones; a polymer aggregation problem?

    NASA Astrophysics Data System (ADS)

    Wesson, J.; Beshensky, A.; Viswanathan, P.; Zachowicz, W.; Kleinman, J.

    2008-03-01

    Kidney stones most frequently form as aggregates of calcium oxalate monohydrate (COM) crystals with organic layers between them, and the organic layers contain principally proteins. The pathway leading to the formation of these crystal aggregates in affected people has not been identified, but stone forming patients are thought to have a defect in the structure or distribution of urinary proteins, which normally protect against stone formation. We have developed two polyelectrolyte models that will induce COM crystal aggregation in vitro, and both are consistent with possible urinary protein compositions. The first model was based on mixing polyanionic and polycationic proteins, in portions such that the combined protein charge is near zero. The second model was based on reducing the charge density on partially charged polyanionic proteins, specifically Tamm-Horsfall protein, the second most abundant protein in urine. Both models demonstrated polymer phase separation at solution conditions where COM crystal aggregation was observed. Correlation with data from other bulk crystallization measurements suggest that the anionic side chains form critical binding interactions with COM surfaces that are necessary along with the phase separation process to induce COM crystal aggregation.

  20. EPA FACILITY POINT LOCATION FILES

    EPA Science Inventory

    Data includes locations of facilities from which pollutants are discharged. The epapoints.tar.gz file is a gzipped tar file of 14 Arc/Info export files and text documents. The .txt files define the attributes located in the INFO point coverage files. Projections are defined in...

  1. Medical image file formats.

    PubMed

    Larobina, Michele; Murino, Loredana

    2014-04-01

    Image file format is often a confusing aspect for someone wishing to process medical images. This article presents a demystifying overview of the major file formats currently used in medical imaging: Analyze, Neuroimaging Informatics Technology Initiative (Nifti), Minc, and Digital Imaging and Communications in Medicine (Dicom). Concepts common to all file formats, such as pixel depth, photometric interpretation, metadata, and pixel data, are first presented. Then, the characteristics and strengths of the various formats are discussed. The review concludes with some predictive considerations about the future trends in medical image file formats. PMID:24338090
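
    The header concepts shared by all four formats fix the size of the raw pixel data. The helper below is an illustrative calculation only (parameter names follow DICOM-style terminology), not part of any particular format's API.

```python
def pixel_data_bytes(rows, cols, bits_allocated, samples_per_pixel=1, frames=1):
    """Bytes of uncompressed pixel data implied by the header fields:
    matrix size x frames x samples per pixel x bytes per sample."""
    return rows * cols * frames * samples_per_pixel * (bits_allocated // 8)
```

    For example, a single 512 x 512 slice at 16 bits per pixel occupies 524,288 bytes, while an 8-bit RGB image of the same matrix (3 samples per pixel) occupies 786,432.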

  2. Fractal Aggregates in Tennis Ball Systems

    ERIC Educational Resources Information Center

    Sabin, J.; Bandin, M.; Prieto, G.; Sarmiento, F.

    2009-01-01

    We present a new practical exercise to explain the mechanisms of aggregation of some colloids which are otherwise not easy to understand. We have used tennis balls to simulate, in a visual way, the aggregation of colloids under reaction-limited colloid aggregation (RLCA) and diffusion-limited colloid aggregation (DLCA) regimes. We have used the…

  3. Performance of the IBM General Parallel File System

    SciTech Connect

    Jones, T.; Koniges, A.; Yates, R.K.

    1999-09-27

    Experimental performance analysis is a necessary first step in input/output software tuning and real-time environment code performance prediction. We measure the performance and scalability of IBM's General Parallel File System (GPFS) under a variety of conditions. The measurements are based on a set of benchmark codes that allow us to vary block sizes, access patterns, etc., and to measure aggregate throughput rates. We use the data to give performance recommendations for application development and as a guide to the improvement of parallel file systems.
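
    The measurement idea — timed sequential writes at a chosen block size yielding a throughput figure — can be sketched as below. This is a minimal single-process illustration, not the benchmark codes used in the study, which also vary access patterns and aggregate rates across parallel writers.

```python
import os
import tempfile
import time

def write_throughput(path, total_bytes, block_size):
    """Sequential write throughput (bytes/s) for one block size."""
    buf = b"\0" * block_size
    start = time.perf_counter()
    with open(path, "wb") as f:
        for _ in range(total_bytes // block_size):
            f.write(buf)
        f.flush()
        os.fsync(f.fileno())  # include the flush-to-storage cost
    return total_bytes / (time.perf_counter() - start)

fd, path = tempfile.mkstemp()
os.close(fd)
rate = write_throughput(path, 1 << 20, 64 * 1024)  # 1 MiB in 64 KiB blocks
```

    Sweeping block_size (and, on a parallel file system, the number of concurrent writers) and comparing the resulting rates is the essence of such a benchmark.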

  4. MuCor: mutation aggregation and correlation

    PubMed Central

    Kroll, Karl W.; Eisfeld, Ann-Katherin; Lozanski, Gerard; Bloomfield, Clara D.; Byrd, John C.; Blachly, James S.

    2016-01-01

    Motivation: There are many tools for variant calling and effect prediction, but little to tie together large sample groups. Aggregating, sorting and summarizing variants and effects across a cohort is often done with ad hoc scripts that must be re-written for every new project. In response, we have written MuCor, a tool to gather variants from a variety of input formats (including multiple files per sample), perform database lookups and frequency calculations, and write many types of reports. In addition to use in large studies with numerous samples, MuCor can also be employed to directly compare variant calls from the same sample across two or more platforms, parameters or pipelines. A companion utility, DepthGauge, measures coverage at regions of interest to increase confidence in calls. Availability and implementation: Source code is freely available at https://github.com/blachlylab/mucor and a Docker image is available at https://hub.docker.com/r/blachlylab/mucor/ Contact: james.blachly@osumc.edu Supplementary data: Supplementary data are available at Bioinformatics online. PMID:26803155
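
    The central aggregation step — collecting per-sample calls into one table keyed by variant and computing cohort frequencies — might look like the sketch below. The locus strings and output structure are hypothetical, and the sketch omits MuCor's format normalization, database lookups and report writing.

```python
from collections import defaultdict

def aggregate_variants(calls_by_sample):
    """Merge per-sample variant calls into one table with cohort frequency."""
    table = defaultdict(list)
    for sample, calls in calls_by_sample.items():
        for locus in calls:
            table[locus].append(sample)
    n = len(calls_by_sample)
    return {locus: {"samples": sorted(samples), "frequency": len(samples) / n}
            for locus, samples in table.items()}

cohort = aggregate_variants({
    "s1": {"chr1:100:A>T"},
    "s2": {"chr1:100:A>T", "chr2:5:G>C"},
    "s3": set(),  # sample with no calls still counts toward the cohort size
})
```

    The same merge, applied to calls for one sample from two pipelines instead of two samples, gives the cross-platform comparison the abstract mentions.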

  5. Toward an Efficient Icing CFD Process Using an Interactive Software Toolkit: SmaggIce 2D

    NASA Technical Reports Server (NTRS)

    Vickerman, Mary B.; Choo, Yung K.; Schilling, Herbert W.; Baez, Marivell; Braun, Donald C.; Cotton, Barbara J.

    2001-01-01

    Two-dimensional CFD analysis for iced airfoils can be a labor-intensive task. The software toolkit SmaggIce 2D is being developed to help streamline the CFD process and provide the unique features needed for icing. When complete, it will include a combination of partially automated and fully interactive tools for all aspects of the tasks leading up to the flow analysis: geometry preparation, domain decomposition, block boundary demarcation, gridding, and linking with a flow solver. It also includes tools to perform ice shape characterization, an important aid in determining the relationship between ice characteristics and their effects on aerodynamic performance. Completed tools, work in progress, and planned features of the software toolkit are presented here.

  6. Applying natural language processing toolkits to electronic health records - an experience report.

    PubMed

    Barrett, Neil; Weber-Jahnke, Jens H

    2009-01-01

    A natural language challenge devised by Informatics for Integrating Biology and the Bedside (i2b2) was to analyze free-text health data to construct a multi-class, multi-label classification system focused on obesity and its co-morbidities. This report presents a case study in which a natural language processing (NLP) toolkit, called NLTK, was used in the challenge. This report provides a brief review of NLP in the context of EHR applications, briefly surveys and contrasts some existing NLP toolkits, and reports on our experiences with the i2b2 case study. Our efforts uncovered issues including the lack of human annotated physician notes for use as NLP training data, differences between conventional free-text and medical notes, and potential hardware and software limitations affecting future projects. PMID:19380974

  7. Efficient Genome Editing in Caenorhabditis elegans with a Toolkit of Dual-Marker Selection Cassettes.

    PubMed

    Norris, Adam D; Kim, Hyun-Min; Colaiácovo, Mónica P; Calarco, John A

    2015-10-01

    Use of the CRISPR/Cas9 RNA-guided endonuclease complex has recently enabled the generation of double-strand breaks virtually anywhere in the C. elegans genome. Here, we present an improved strategy that makes all steps in the genome editing process more efficient. We have created a toolkit of template-mediated repair cassettes that contain an antibiotic resistance gene to select for worms carrying the repair template and a fluorescent visual marker that facilitates identification of bona fide recombinant animals. Homozygous animals can be identified as early as 4-5 days post-injection, and minimal genotyping by PCR is required. We demonstrate that our toolkit of dual-marker vectors can generate targeted disruptions, deletions, and endogenous tagging with fluorescent proteins and epitopes. This strategy should be useful for a wide variety of additional applications and will provide researchers with increased flexibility when designing genome editing experiments. PMID:26232410

  8. Efficient Genome Editing in Caenorhabditis elegans with a Toolkit of Dual-Marker Selection Cassettes

    PubMed Central

    Norris, Adam D.; Kim, Hyun-Min; Colaiácovo, Mónica P.; Calarco, John A.

    2015-01-01

    Use of the CRISPR/Cas9 RNA-guided endonuclease complex has recently enabled the generation of double-strand breaks virtually anywhere in the C. elegans genome. Here, we present an improved strategy that makes all steps in the genome editing process more efficient. We have created a toolkit of template-mediated repair cassettes that contain an antibiotic resistance gene to select for worms carrying the repair template and a fluorescent visual marker that facilitates identification of bona fide recombinant animals. Homozygous animals can be identified as early as 4–5 days post-injection, and minimal genotyping by PCR is required. We demonstrate that our toolkit of dual-marker vectors can generate targeted disruptions, deletions, and endogenous tagging with fluorescent proteins and epitopes. This strategy should be useful for a wide variety of additional applications and will provide researchers with increased flexibility when designing genome editing experiments. PMID:26232410

  9. Measuring the Environmental Dimensions of Human Migration: The Demographer’s Toolkit

    PubMed Central

    Hunter, Lori M.; Gray, Clark L.

    2014-01-01

    In recent years, the empirical literature linking environmental factors and human migration has grown rapidly and gained increasing visibility among scholars and the policy community. Still, this body of research uses a wide range of methodological approaches for assessing environment-migration relationships. Without comparable data and measures across a range of contexts, it is impossible to make generalizations that would facilitate the development of future migration scenarios. Demographic researchers have a large methodological toolkit for measuring migration as well as modeling its drivers. This toolkit includes population censuses, household surveys, survival analysis and multi-level modeling. This paper’s purpose is to introduce climate change researchers to demographic data and methods and to review exemplary studies of the environmental dimensions of human migration. Our intention is to foster interdisciplinary understanding and scholarship, and to promote high quality research on environment and migration that will lead toward broader knowledge of this association. PMID:25177108

  10. Tripal: a construction toolkit for online genome databases.

    PubMed

    Ficklin, Stephen P; Sanderson, Lacey-Anne; Cheng, Chun-Huai; Staton, Margaret E; Lee, Taein; Cho, Il-Hyung; Jung, Sook; Bett, Kirstin E; Main, Doreen

    2011-01-01

    As the availability, affordability and magnitude of genomics and genetics research increase, so does the need to provide online access to resulting data and analyses. Availability of a tailored online database is the desire of many investigators and research communities; however, managing the Information Technology infrastructure needed to create such a database can be an undesired distraction from primary research or potentially cost prohibitive. Tripal provides simplified site development by merging the power of Drupal, a popular web Content Management System, with that of Chado, a community-derived database schema for storage of genomic, genetic and other related biological data. Tripal provides an interface that extends the content management features of Drupal to the data housed in Chado. Furthermore, Tripal provides a web-based Chado installer, genomic data loaders, and web-based editing of data for organisms, genomic features, biological libraries, controlled vocabularies and stock collections. Also available are Tripal extensions that support loading and visualization of NCBI BLAST, InterPro, Kyoto Encyclopedia of Genes and Genomes and Gene Ontology analyses, as well as an extension that provides integration of Tripal with GBrowse, a popular GMOD tool. An Application Programming Interface is available to allow creation of custom extensions by site developers, and the look-and-feel of the site is completely customizable through Drupal-based PHP template files. Addition of non-biological content and user management is afforded through Drupal. Tripal is an open source and freely available software package found at http://tripal.sourceforge.net. PMID:21959868

  11. 11 CFR 100.19 - File, filed or filing (2 U.S.C. 434(a)).

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... file electronically under 11 CFR 104.18, may file 24-hour reports using the Commission's website's on... to be filed under 11 CFR parts 101, 102, 104, 105, 107, 108, and 109, and any modifications or... required by 11 CFR part 105, by the close of business on the prescribed filing date. (b) Timely filed....

  12. 11 CFR 100.19 - File, filed or filing (2 U.S.C. 434(a)).

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... file electronically under 11 CFR 104.18, may file 24-hour reports using the Commission's website's on... to be filed under 11 CFR parts 101, 102, 104, 105, 107, 108, and 109, and any modifications or... required by 11 CFR part 105, by the close of business on the prescribed filing date. (b) Timely filed....

  13. 75 FR 30017 - Electronic Tariff Filings; Notice of Posting Regarding Filing Procedures for Electronically Filed...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-28

    ... Energy Regulatory Commission Electronic Tariff Filings; Notice of Posting Regarding Filing Procedures for Electronically Filed Tariffs May 21, 2010. Take Notice that the attached document ``Filing Procedures For Electronically Filed Tariffs, Rate Schedules And Jurisdictional Agreements'' has been posted on the eTariff...

  14. Aggregated Recommendation through Random Forests

    PubMed Central

    2014-01-01

Aggregated recommendation refers to the process of suggesting one kind of items to a group of users. Compared to user-oriented or item-oriented approaches, it is more general and, therefore, more appropriate for cold-start recommendation. In this paper, we propose a random forest approach to create aggregated recommender systems. The approach is used to predict the rating of a group of users to a kind of items. In the preprocessing stage, we merge user, item, and rating information to construct an aggregated decision table, where rating information serves as the decision attribute. We also model the data conversion process corresponding to the new-user, new-item, and both-new problems. In the training stage, a forest is built for the aggregated training set, where each leaf is assigned a distribution of discrete ratings. In the testing stage, we present four prediction approaches to compute evaluation values based on the distribution of each tree. Experimental results on the well-known MovieLens dataset show that the aggregated approach maintains an acceptable level of accuracy. PMID:25180204
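The preprocessing step described above, merging user, item, and rating information into a single decision table with the rating as the decision attribute, can be sketched in plain Python. The attribute names and toy data below are our own illustrative assumptions, not taken from the paper.

```python
# Hypothetical sketch of the "aggregated decision table" preprocessing step:
# each row joins the attributes of one user and one item with the observed
# rating, which serves as the decision attribute for the random forest.

def build_decision_table(users, items, ratings):
    """Merge user, item, and rating information into one table of rows."""
    table = []
    for (uid, iid), rating in ratings.items():
        row = {**users[uid], **items[iid], "rating": rating}
        table.append(row)
    return table

# Illustrative toy data (attribute names are assumptions).
users = {1: {"age_group": "18-24", "gender": "F"},
         2: {"age_group": "25-34", "gender": "M"}}
items = {10: {"genre": "comedy"}, 20: {"genre": "drama"}}
ratings = {(1, 10): 4, (1, 20): 3, (2, 10): 5}

table = build_decision_table(users, items, ratings)
print(len(table))          # one row per observed rating: 3
print(table[0]["rating"])  # 4
```

A forest trained on such a table would then store, at each leaf, a distribution over the discrete rating values rather than a single prediction.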

  15. Aggregated recommendation through random forests.

    PubMed

    Zhang, Heng-Ru; Min, Fan; He, Xu

    2014-01-01

Aggregated recommendation refers to the process of suggesting one kind of items to a group of users. Compared to user-oriented or item-oriented approaches, it is more general and, therefore, more appropriate for cold-start recommendation. In this paper, we propose a random forest approach to create aggregated recommender systems. The approach is used to predict the rating of a group of users to a kind of items. In the preprocessing stage, we merge user, item, and rating information to construct an aggregated decision table, where rating information serves as the decision attribute. We also model the data conversion process corresponding to the new-user, new-item, and both-new problems. In the training stage, a forest is built for the aggregated training set, where each leaf is assigned a distribution of discrete ratings. In the testing stage, we present four prediction approaches to compute evaluation values based on the distribution of each tree. Experimental results on the well-known MovieLens dataset show that the aggregated approach maintains an acceptable level of accuracy. PMID:25180204

  16. The Trick Simulation Toolkit: A NASA/Open source Framework for Running Time Based Physics Models

    NASA Technical Reports Server (NTRS)

    Penn, John M.; Lin, Alexander S.

    2016-01-01

This paper describes the design and use of the Trick Simulation Toolkit, a simulation development environment for creating high-fidelity training and engineering simulations at the NASA Johnson Space Center and many other NASA facilities. It describes Trick's design goals and how the development environment attempts to achieve them. It describes how Trick is used in some of the many training and engineering simulations at NASA. Finally, it describes the Trick NASA/Open source project on Github.

  17. The SpeX Prism Library Analysis Toolkit: Design Considerations and First Results

    NASA Astrophysics Data System (ADS)

    Burgasser, Adam J.; Aganze, Christian; Escala, Ivana; Lopez, Mike; Choban, Caleb; Jin, Yuhui; Iyer, Aishwarya; Tallis, Melisa; Suarez, Adrian; Sahi, Maitrayee

    2016-01-01

Various observational and theoretical spectral libraries now exist for galaxies, stars, planets and other objects, which have proven useful for classification, interpretation, simulation and model development. Effective use of these libraries relies on analysis tools, which are often left to users to develop. In this poster, we describe a program to develop a combined spectral data repository and Python-based analysis toolkit for low-resolution spectra of very low mass dwarfs (late M, L and T dwarfs), which enables visualization, spectral index analysis, classification, atmosphere model comparison, and binary modeling for nearly 2000 library spectra and user-submitted data. The SpeX Prism Library Analysis Toolkit (SPLAT) is being constructed as a collaborative, student-centered, learning-through-research model with high school, undergraduate and graduate students and regional science teachers, who populate the database and build the analysis tools through quarterly challenge exercises and summer research projects. We describe the design considerations of the toolkit, its current status and development plan, and report the first published results led by undergraduate students. The combined data and analysis tools are ideal for characterizing cool stellar and exoplanetary atmospheres (including direct exoplanetary spectra observations by Gemini/GPI, VLT/SPHERE, and JWST), and the toolkit design can be readily adapted for other spectral datasets as well. This material is based upon work supported by the National Aeronautics and Space Administration under Grant No. NNX15AI75G. SPLAT code can be found at https://github.com/aburgasser/splat.

  18. Use of OHRA Toolkit in the QRA work in Norway and UK

    SciTech Connect

    Skramstad, E.; Hundseid, H.

    1995-12-31

The Offshore Hazard and Risk Analysis (OHRA) Toolkit is a comprehensive computer program developed for quantitative risk analyses of offshore installations. Use of OHRAT in QRA (Quantitative Risk Analysis) work in Norway and the UK is increasing rapidly. Being a flexible tool, OHRAT imposes no fixed approach to performing a QRA. The purpose of this paper is to present typical approaches and experience from its use by operators in the Norwegian part of the North Sea.

  19. The doctor-patient relationship as a toolkit for uncertain clinical decisions.

    PubMed

    Diamond-Brown, Lauren

    2016-06-01

    Medical uncertainty is a well-recognized problem in healthcare, yet how doctors make decisions in the face of uncertainty remains to be understood. This article draws on interdisciplinary literature on uncertainty and physician decision-making to examine a specific physician response to uncertainty: using the doctor-patient relationship as a toolkit. Additionally, I ask what happens to this process when the doctor-patient relationship becomes fragmented. I answer these questions by examining obstetrician-gynecologists' narratives regarding how they make decisions when faced with uncertainty in childbirth. Between 2013 and 2014, I performed 21 semi-structured interviews with obstetricians in the United States. Obstetricians were selected to maximize variation in relevant physician, hospital, and practice characteristics. I began with grounded theory and moved to analytical coding of themes in relation to relevant literature. My analysis renders it evident that some physicians use the doctor-patient relationship as a toolkit for dealing with uncertainty. I analyze how this process varies for physicians in different models of care by comparing doctors' experiences in models with continuous versus fragmented doctor-patient relationships. My key findings are that obstetricians in both models appealed to the ideal of patient-centered decision-making to cope with uncertain decisions, but in practice physicians in fragmented care faced a number of challenges to using the doctor-patient relationship as a toolkit for decision-making. These challenges led to additional uncertainties and in some cases to poor outcomes for doctors and/or patients; they also raised concerns about the reproduction of inequality. Thus organization of care delivery mitigates the efficacy of doctors' use of the doctor-patient relationship toolkit for uncertain decisions. 
These findings have implications for theorizing about decision-making under conditions of medical uncertainty, for understanding

  20. A Molecular Toolkit to Visualize Native Protein Assemblies in the Context of Human Disease

    PubMed Central

    Gilmore, Brian L.; Winton, Carly E.; Demmert, Andrew C.; Tanner, Justin R.; Bowman, Sam; Karageorge, Vasilea; Patel, Kaya; Sheng, Zhi; Kelly, Deborah F.

    2015-01-01

    We present a new molecular toolkit to investigate protein assemblies natively formed in the context of human disease. The system employs tunable microchips that can be decorated with switchable adaptor molecules to select for target proteins of interest and analyze them using molecular microscopy. Implementing our new streamlined microchip approach, we could directly visualize BRCA1 gene regulatory complexes from patient-derived cancer cells for the first time. PMID:26395823

  1. Aggregation operations for multiaspect fuzzy soft sets

    NASA Astrophysics Data System (ADS)

    Sulaiman, Nor Hashimah; Mohamad, Daud

    2015-10-01

    Multiaspect fuzzy soft set (MAFSS) is one of the generalized forms of fuzzy soft sets. In this paper, we introduce two types of aggregation operations for MAFSSs, namely the weighted arithmetic mean (WAM)-based MAFSS aggregation, and the ordered weighted aggregation (OWA)-based MAFSS aggregation. The applicability of the two MAFSS-aggregation operations is illustrated with numerical examples in group decision making.
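The distinction between the two operators can be sketched on plain membership values, leaving aside the MAFSS machinery itself; the function names below are ours, and the example values are illustrative only. In a WAM, each value carries its own weight, whereas in an OWA the weights attach to rank positions, so the values are sorted before weighting.

```python
# Minimal sketch of WAM- vs OWA-style aggregation on plain numeric values.

def wam(values, weights):
    """Weighted arithmetic mean: weight i applies to value i."""
    return sum(w * v for w, v in zip(weights, values))

def owa(values, weights):
    """Ordered weighted aggregation: weight i applies to the i-th largest
    value, so values are sorted in descending order first."""
    return sum(w * v for w, v in zip(weights, sorted(values, reverse=True)))

vals = [0.3, 0.9, 0.6]
w = [0.5, 0.3, 0.2]
print(wam(vals, w))  # 0.3*0.5 + 0.9*0.3 + 0.6*0.2 = 0.54
print(owa(vals, w))  # 0.9*0.5 + 0.6*0.3 + 0.3*0.2 = 0.69
```

With the same weight vector, OWA emphasizes the larger (or smaller) inputs depending on how the positional weights are chosen, which is what makes it useful for "optimistic" or "pessimistic" group decision making.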

  2. Evaporation effects in elastocapillary aggregation

    NASA Astrophysics Data System (ADS)

    Vella, Dominic; Hadjittofis, Andreas; Singh, Kiran; Lister, John

    2015-11-01

    We consider the effect of evaporation on the aggregation of a number of elastic objects due to a liquid's surface tension. In particular, we consider an array of spring-block elements in which the gaps between blocks are filled by thin liquid films that evaporate during the course of an experiment. Using lubrication theory to account for the fluid flow within the gaps, we study the dynamics of aggregation. We find that a non-zero evaporation rate causes the elements to aggregate more quickly and, indeed, to contact within finite time. However, we also show that the number of elements within each cluster decreases as the evaporation rate increases. We explain these results quantitatively by comparison with the corresponding two-body problem and discuss their relevance for controlling pattern formation in carbon nanotube forests.

  3. Molecular Aggregation in Disodium Cromoglycate

    NASA Astrophysics Data System (ADS)

    Singh, Gautam; Agra-Kooijman, D.; Collings, P. J.; Kumar, Satyendra

    2012-02-01

Details of molecular aggregation in the mesophases of the anti-asthmatic drug disodium cromoglycate (DSCG) have been studied using x-ray synchrotron scattering. The results show two reflections, one at wide angles corresponding to π-π stacking (3.32 Å) of molecules, and the other at small angles, which is perpendicular to the direction of molecular stacking and corresponds to the distance between the molecular aggregates. The latter varies from 35-41 Å in the nematic (N) phase and 27-32 Å in the columnar (M) phase. The temperature evolution of the stack height, positional order correlations in the lateral direction, and orientational order parameter were determined in the N, M, and biphasic regions. The structure of the N and M phases and the nature of the molecular aggregation, together with their dependence on temperature and concentration, will be presented.

  4. Global kinetic analysis of seeded BSA aggregation.

    PubMed

    Sahin, Ziya; Demir, Yusuf Kemal; Kayser, Veysel

    2016-04-30

Accelerated aggregation studies were conducted around the melting temperature (Tm) to elucidate the kinetics of seeded BSA aggregation. Aggregation was tracked by SEC-HPLC and intrinsic fluorescence spectroscopy. The time evolution of monomer, dimer and soluble aggregate concentrations was globally analysed to reliably deduce mechanistic details pertinent to the process. Results showed that BSA aggregated irreversibly through both sequential monomer addition and aggregate-aggregate interactions. Sequential monomer addition proceeded only via non-native monomers and began to occur only 1-2°C below the Tm. Aggregate-aggregate interactions were the dominant mechanism below the Tm due to the initial presence of small aggregates that acted as seeds. Aggregate-aggregate interactions were also significant above the Tm, particularly at later stages of aggregation when sequential monomer addition seemed to cease, leading in some cases to insoluble aggregate formation. The adherence (or lack thereof) of these mechanisms to Arrhenius kinetics is discussed alongside possible implications of seeding for biopharmaceutical shelf-life and for the interpretation of spectroscopic data, the latter of which was found to often be overlooked in BSA aggregation studies. PMID:26970282

  5. Hopper File Management Tool

    SciTech Connect

    Long, J W; O'Neill, N J; Smith, N G; Springmeyer, R R; Remmele, S; Richards, D A; Southon, J

    2004-11-15

    Hopper is a powerful interactive tool that allows users to transfer and manipulate files and directories by means of a graphical user interface. Users can connect to and manage resources using the major file transfer protocols. Implemented in Java, Hopper can be run almost anywhere: from an individual's desktop machine to large production machines. In a high-performance computing environment, managing files can become a difficult and time-consuming task that distracts from scientific work. Users must deal with multiple file transfer protocols, transferring enormous amounts of files between computer platforms, repeated authentication, organizing massive amounts of data, and other detailed but necessary tasks. This is often accomplished with a set of several different tools, each with its own interface and idiosyncrasies. Our goal is to develop tools for a more automated approach to file management that substantially improves users' ability to transfer, organize, search, and operate on collections of files. This paper describes the Hopper tool for advanced file management, including the software architecture, the functionality, and the user interface.

  6. NASA Uniform Files Index

    NASA Technical Reports Server (NTRS)

    1987-01-01

    This handbook is a guide for the use of all personnel engaged in handling NASA files. It is issued in accordance with the regulations of the National Archives and Records Administration, in the Code of Federal Regulations Title 36, Part 1224, Files Management; and the Federal Information Resources Management Regulation, Subpart 201-45.108, Files Management. It is intended to provide a standardized classification and filing scheme to achieve maximum uniformity and ease in maintaining and using agency records. It is a framework for consistent organization of information in an arrangement that will be useful to current and future researchers. The NASA Uniform Files Index coding structure is composed of the subject classification table used for NASA management directives and the subject groups in the NASA scientific and technical information system. It is designed to correlate files throughout NASA and it is anticipated that it may be useful with automated filing systems. It is expected that in the conversion of current files to this arrangement it will be necessary to add tertiary subjects and make further subdivisions under the existing categories. Established primary and secondary subject categories may not be changed arbitrarily. Proposals for additional subject categories of NASA-wide applicability, and suggestions for improvement in this handbook, should be addressed to the Records Program Manager at the pertinent installation who will forward it to the NASA Records Management Office, Code NTR, for approval. This handbook is issued in loose-leaf form and will be revised by page changes.

  7. JENDL Dosimetry File 99.

    Energy Science and Technology Software Center (ESTSC)

    2001-01-22

    Version 00 JENDL/D-99 contains information for 47 nuclides and 67 reactions in the SAND-II group structure (although it was observed by RSICC that not all of the processed files are in the SAND-II group structure) and as 0K preprocessed pointwise files.

  8. Environmentalism and natural aggregate mining

    USGS Publications Warehouse

    Drew, L.J.; Langer, W.H.; Sachs, J.S.

    2002-01-01

Sustaining a developed economy and expanding a developing one require the use of large volumes of natural aggregate. Almost all human activity (commercial, recreational, or leisure) is transacted in or on facilities constructed from natural aggregate. In our urban and suburban worlds, we are almost totally dependent on supplies of water collected behind dams and transported through aqueducts made from concrete. Natural aggregate is essential to the facilities that produce energy: hydroelectric dams and coal-fired powerplants. Ironically, the utility created for mankind by the use of natural aggregate is rarely compared favorably with the environmental impacts of mining it. Instead, the empty quarries and pits are seen as large negative environmental consequences. At the root of this disassociation is the philosophy of environmentalism, which flavors our perceptions of the excavation, processing, and distribution of natural aggregate. The two end-member ideas in this philosophy are ecocentrism and anthropocentrism. Ecocentrism takes the position that the natural world is an organism whose arteries are the rivers; their flow must not be altered. The soil is another vital organ and must not be covered with concrete and asphalt. The motto of the ecocentrist is "man must live more lightly on the land." The anthropocentrist wants clean water and air and an uncluttered landscape for human use. Mining is allowed and even encouraged, but dust and noise from quarry and pit operations must be minimized. The large volume of truck traffic is viewed as a real menace to human life and should be regulated and isolated. The environmental problems that the producers of natural aggregate (crushed stone and sand and gravel) face today are mostly difficult social and political concerns associated with the large holes dug in the ground and the large volume of heavy truck traffic associated with quarry and pit operations.
These concerns have increased in recent years as society's demand for

  9. Register file soft error recovery

    DOEpatents

    Fleischer, Bruce M.; Fox, Thomas W.; Wait, Charles D.; Muff, Adam J.; Watson, III, Alfred T.

    2013-10-15

    Register file soft error recovery including a system that includes a first register file and a second register file that mirrors the first register file. The system also includes an arithmetic pipeline for receiving data read from the first register file, and error detection circuitry to detect whether the data read from the first register file includes corrupted data. The system further includes error recovery circuitry to insert an error recovery instruction into the arithmetic pipeline in response to detecting the corrupted data. The inserted error recovery instruction replaces the corrupted data in the first register file with a copy of the data from the second register file.
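An illustrative software model of the scheme (a sketch, not the patented circuit) might pair per-entry error detection with restore-from-mirror recovery; here simple parity stands in for whatever detection circuitry the hardware uses, and all names are our own.

```python
# Toy model of a mirrored register file with soft-error detection and
# recovery: every write updates both copies plus a parity bit; a read that
# fails the parity check is repaired from the mirror, standing in for the
# inserted error-recovery instruction described above.

def parity(word):
    """Even/odd parity of a word's set bits."""
    return bin(word).count("1") % 2

class MirroredRegisterFile:
    def __init__(self, size):
        self.primary = [0] * size
        self.mirror = [0] * size
        self.check = [0] * size

    def write(self, idx, value):
        self.primary[idx] = value
        self.mirror[idx] = value        # mirror tracks every write
        self.check[idx] = parity(value)

    def read(self, idx):
        value = self.primary[idx]
        if parity(value) != self.check[idx]:  # soft error detected
            value = self.mirror[idx]          # recover from the mirror
            self.primary[idx] = value         # and repair the primary copy
        return value

rf = MirroredRegisterFile(4)
rf.write(2, 0b1011)
rf.primary[2] ^= 0b0001          # inject a single-bit upset
print(rf.read(2) == 0b1011)      # True: corrupted entry repaired from mirror
```

A single parity bit detects any single-bit upset (which is the dominant soft-error mode), though unlike real ECC it cannot localize the flipped bit; the mirror copy is what makes correction possible here.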

  10. Verification & Validation Toolkit to Assess Codes: Is it Theory Limitation, Numerical Method Inadequacy, Bug in the Code or a Serious Flaw?

    NASA Astrophysics Data System (ADS)

    Bombardelli, F. A.; Zamani, K.

    2014-12-01

We introduce and discuss an open-source, user-friendly numerical post-processing software package for assessing the reliability of modeling results from environmental fluid mechanics codes. Verification and Validation, Uncertainty Quantification (VAVUQ) is a toolkit developed in Matlab© for general V&V purposes. In this work, the VAVUQ implementation of V&V techniques and its user interfaces are discussed. VAVUQ reads Excel, Matlab, ASCII, and binary files and produces a log of the results in txt format. Each capability of the code is then illustrated through an example: the first example is code verification of a sediment transport code, developed with the Finite Volume Method, via the Method of Exact Solutions (MES). The second example is solution verification of a groundwater flow code, developed with the Boundary Element Method, via MES. The third example is solution verification of a mixed-order, Compact Difference Method heat transfer code via the Method of Manufactured Solutions (MMS). The fourth example is solution verification of a 2-D, Finite Difference Method floodplain analysis code via Complete Richardson Extrapolation. In turn, the application of VAVUQ in quantitative model skill assessment studies (validation) of environmental codes is illustrated through two examples: validation of a two-phase flow computational model of air entrainment in a free-surface flow against lab measurements, and heat transfer modeling at the earth's surface against field measurements. Finally, we discuss practical considerations and common pitfalls in the interpretation of V&V results.
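One building block of Richardson-extrapolation-style solution verification is the observed order of accuracy computed from solutions on three systematically refined grids. A minimal sketch follows; the function name and manufactured test values are our illustrative assumptions, not VAVUQ's actual API.

```python
# Sketch of an observed-order-of-accuracy check: given solutions on three
# grids with a constant refinement ratio r, the convergence order p is
# recovered from the ratio of successive solution differences.

import math

def observed_order(f_coarse, f_medium, f_fine, r):
    """p = log(|f_c - f_m| / |f_m - f_f|) / log(r)."""
    return math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(r)

# Manufactured check: a scheme with error C*h**2 should yield p close to 2.
f_exact, C = 1.0, 0.1
f = lambda h: f_exact + C * h**2
p = observed_order(f(0.4), f(0.2), f(0.1), r=2.0)
print(round(p, 6))  # close to 2.0 for this second-order error model
```

Comparing the observed p against the scheme's formal order is a standard way to tell theory limitations, discretization inadequacy, and outright bugs apart, which is the diagnostic question posed in the title above.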

  11. Mesoscale Simulation of Asphaltene Aggregation.

    PubMed

    Wang, Jiang; Ferguson, Andrew L

    2016-08-18

    Asphaltenes constitute a heavy aromatic crude oil fraction with a propensity to aggregate and precipitate out of solution during petroleum processing. Aggregation is thought to proceed according to the Yen-Mullins hierarchy, but the molecular mechanisms underlying mesoscopic assembly remain poorly understood. By combining coarse-grained molecular models parametrized using all-atom data with high-performance GPU hardware, we have performed molecular dynamics simulations of the aggregation of hundreds of asphaltenes over microsecond time scales. Our simulations reveal a hierarchical self-assembly mechanism consistent with the Yen-Mullins model, but the details are sensitive and depend on asphaltene chemistry and environment. At low concentrations asphaltenes exist predominantly as dispersed monomers. Upon increasing concentration, we first observe parallel stacking into 1D rod-like nanoaggregates, followed by the formation of clusters of nanoaggregates associated by offset, T-shaped, and edge-edge stacking. Asphaltenes possessing long aliphatic side chains cannot form nanoaggregate clusters due to steric repulsions between their aliphatic coronae. At very high concentrations, we observe a porous percolating network of rod-like nanoaggregates suspended in a sea of interpenetrating aliphatic side chains with a fractal dimension of ∼2. The lifetime of the rod-like aggregates is described by an exponential distribution reflecting a dynamic equilibrium between coagulation and fragmentation. PMID:27455391

  12. RAGG - R EPISODIC AGGREGATION PACKAGE

    EPA Science Inventory

    The RAGG package is an R implementation of the CMAQ episodic model aggregation method developed by Constella Group and the Environmental Protection Agency. RAGG is a tool to provide climatological seasonal and annual deposition of sulphur and nitrogen for multimedia management. ...

  13. An Aggregation Advisor for Ligand Discovery.

    PubMed

    Irwin, John J; Duan, Da; Torosyan, Hayarpi; Doak, Allison K; Ziebart, Kristin T; Sterling, Teague; Tumanian, Gurgen; Shoichet, Brian K

    2015-09-10

    Colloidal aggregation of organic molecules is the dominant mechanism for artifactual inhibition of proteins, and controls against it are widely deployed. Notwithstanding an increasingly detailed understanding of this phenomenon, a method to reliably predict aggregation has remained elusive. Correspondingly, active molecules that act via aggregation continue to be found in early discovery campaigns and remain common in the literature. Over the past decade, over 12 thousand aggregating organic molecules have been identified, potentially enabling a precedent-based approach to match known aggregators with new molecules that may be expected to aggregate and lead to artifacts. We investigate an approach that uses lipophilicity, affinity, and similarity to known aggregators to advise on the likelihood that a candidate compound is an aggregator. In prospective experimental testing, five of seven new molecules with Tanimoto coefficients (Tc's) between 0.95 and 0.99 to known aggregators aggregated at relevant concentrations. Ten of 19 with Tc's between 0.94 and 0.90 and three of seven with Tc's between 0.89 and 0.85 also aggregated. Another three of the predicted compounds aggregated at higher concentrations. This method finds that 61 827 or 5.1% of the ligands acting in the 0.1 to 10 μM range in the medicinal chemistry literature are at least 85% similar to a known aggregator with these physical properties and may aggregate at relevant concentrations. Intriguingly, only 0.73% of all drug-like commercially available compounds resemble the known aggregators, suggesting that colloidal aggregators are enriched in the literature. As a percentage of the literature, aggregator-like compounds have increased 9-fold since 1995, partly reflecting the advent of high-throughput and virtual screens against molecular targets. Emerging from this study is an aggregator advisor database and tool ( http://advisor.bkslab.org ), free to the community, that may help distinguish between
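The Tanimoto coefficient (Tc) underlying these similarity thresholds can be computed directly. In the sketch below, fingerprints are represented simply as sets of "on" bit positions, which is our simplification; real cheminformatics pipelines use fixed-length fingerprint bit vectors, and the example fingerprints are made up.

```python
# Minimal Tanimoto coefficient on fingerprints modeled as sets of set-bit
# positions: Tc = |A ∩ B| / |A ∪ B|.

def tanimoto(fp_a, fp_b):
    """Tanimoto similarity of two fingerprint bit sets."""
    if not fp_a and not fp_b:
        return 1.0  # convention: two empty fingerprints are identical
    return len(fp_a & fp_b) / len(fp_a | fp_b)

known_aggregator = {1, 4, 7, 9, 12, 15}   # illustrative bit positions
candidate        = {1, 4, 7, 9, 12, 18}
tc = tanimoto(known_aggregator, candidate)
print(tc)  # 5 shared bits / 7 total bits, about 0.714
```

Under the advisor's reported thresholds, a candidate like this (Tc below 0.85 to any known aggregator) would not be flagged on similarity alone, whereas Tc at or above 0.85, together with the lipophilicity and affinity criteria, would raise a warning.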

  14. Registered File Support for Critical Operations Files at SIRTF

    NASA Astrophysics Data System (ADS)

    Turek, G.; Handley, T.; Jacobson, J.; Rector, J.

The SIRTF Science Center's (SSC) Science Operations System (SOS) must manage nearly one hundred critical operations files, which it does through comprehensive file management services. The management is accomplished via the registered file system (otherwise known as TFS), which manages these files in a registered file repository composed of a virtual file system accessible via a TFS server and a file registration database. The TFS server provides controlled, reliable, and secure file transfer and storage by registering all file transactions and meta-data in the file registration database. An API is provided for application programs to communicate with TFS servers and the repository. A command line client implementing this API has been developed as a client tool. This paper describes the architecture, current implementation, and more importantly, the evolution of these services based on evolving community use cases and emerging information system technology.

  15. Problem posing and cultural tailoring: developing an HIV/AIDS health literacy toolkit with the African American community.

    PubMed

    Rikard, R V; Thompson, Maxine S; Head, Rachel; McNeil, Carlotta; White, Caressa

    2012-09-01

The rate of HIV infection among African Americans is disproportionately higher than for other racial groups in the United States. Previous research suggests that a low level of health literacy (HL) is an underlying factor in racial disparities in the prevalence and incidence of HIV/AIDS. The present research describes a community and university project to develop a culturally tailored HIV/AIDS HL toolkit in the African American community. Paulo Freire's pedagogical philosophy and problem-posing methodology served as the guiding framework throughout the development process. Developing the HIV/AIDS HL toolkit occurred in a two-stage process. In Stage 1, a nonprofit organization and research team established a collaborative partnership to develop a culturally tailored HIV/AIDS HL toolkit. In Stage 2, African American community members participated in focus groups conducted as Freirian cultural circles to further refine the HIV/AIDS HL toolkit. In both stages, problem posing engaged participants' knowledge, experiences, and concerns to evaluate a working draft toolkit. The discussion and implications highlight how Freire's pedagogical philosophy and methodology enhance the development of culturally tailored health information. PMID:22102601

  16. Aggregation of metallochlorophylls - Examination by spectroscopy

    NASA Technical Reports Server (NTRS)

    Boucher, L. J.; Katz, J. J.

    1969-01-01

    Nuclear magnetic resonance measurements determine which metallochlorophylls, besides magnesium-containing chlorophylls, possess coordination aggregation properties. Infrared spectroscopy reveals that only zinc pheophytin and zinc methyl pheophorbide showed significant coordination aggregation, whereas divalent nickel and copper did not.

  17. Oligomeric baroeffect and gas aggregation states

    NASA Technical Reports Server (NTRS)

    Noever, David A.

    1992-01-01

    The baroeffect is analyzed to include a gas that aggregates into higher-order polymers or oligomers. The resulting pressure change is found to vary independently of the molecular weight of the gas components and to depend only on the aggregation or oligomeric order of the gas. With increasing aggregation, diffusive slip velocities are found to increase. The calculations are extended to include general counterdiffusion of two distinct aggregation states (k-, j-mer) for the gas, and the pressure change is derived as a function that is independent of both molecular weight and the absolute aggregation. The only parameter that determines the baroeffect is the ratio of aggregated states, beta = k/j. For gases that reversibly aggregate, possible oscillatory behavior and complex dynamics for pressure are discussed. Gas aggregation may play a role for low-temperature crystal-growth conditions in which vapor concentrations of one (or more) species are high.

  18. MAWST file generator (MFG)

    SciTech Connect

    Henriksen, P.W.; Hurdle, S.; Hafer, J.F.

    1993-12-01

    The software program MAWST was developed as a tool to deal with common materials accounting problems. The key to successful usage of this program is in the generation of input files for measurement values, measurement errors, and measurement methods. The program MFG was developed as an aid to creating input files for MAWST. MFG contains three commonly used measurements -- nondestructive assay, (G-T)*C, and V*C -- and a GENERIC measurement as models for data entry for the measurement value file. Sufficient data is collected from the user to produce the measurement error and measurement method files. This report is written as a tutorial presenting and explaining all the options available in MFG by giving examples of execution and the resulting screens that MFG produces.

  19. 11 CFR 100.19 - File, filed or filing (2 U.S.C. 434(a)).

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... file electronically under 11 CFR 104.18, may file 24-hour reports using the Commission's website's on... to be filed under 11 CFR parts 101, 102, 104, 105, 107, 108, and 109, and any modifications or...., Washington, DC 20510 as required by 11 CFR part 105, by the close of business on the prescribed filing...

  20. Protein aggregation in salt solutions

    PubMed Central

    Kastelic, Miha; Kalyuzhnyi, Yurij V.; Hribar-Lee, Barbara; Dill, Ken A.; Vlachy, Vojko

    2015-01-01

    Protein aggregation is broadly important in diseases and in formulations of biological drugs. Here, we develop a theoretical model for reversible protein–protein aggregation in salt solutions. We treat proteins as hard spheres having square-well-energy binding sites, using Wertheim’s thermodynamic perturbation theory. The necessary condition for such modeling to be realistic is that the proteins remain in their compact form in solution during the experiment. Within this limitation our model gives accurate liquid–liquid coexistence curves for lysozyme and γ IIIa-crystallin solutions in respective buffers. It provides good fits to the cloud-point curves of lysozyme in buffer–salt mixtures as a function of the type and concentration of salt. It then predicts full coexistence curves, osmotic compressibilities, and second virial coefficients under such conditions. This treatment may also be relevant to protein crystallization. PMID:25964322
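
    As background for the quantities named above, the second virial coefficient enters through the standard osmotic virial expansion; these are textbook relations, not the paper's specific Wertheim expressions:

```latex
% Osmotic virial expansion in protein number density \rho:
\[
\frac{\Pi}{\rho k_{\mathrm{B}} T} = 1 + B_2\,\rho + \mathcal{O}(\rho^2),
\qquad
B_2 = -2\pi \int_0^{\infty} \left( e^{-u(r)/k_{\mathrm{B}}T} - 1 \right) r^{2}\,dr,
\]
% where u(r) is the spherically averaged protein-protein pair
% potential; B_2 < 0 signals net attraction and correlates with
% aggregation and crystallization propensity.
```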

  1. Evolving the US Climate Resilience Toolkit to Support a Climate-Smart Nation

    NASA Astrophysics Data System (ADS)

    Tilmes, C.; Niepold, F., III; Fox, J. F.; Herring, D.; Dahlman, L. E.; Hall, N.; Gardiner, N.

    2015-12-01

    Communities, businesses, resource managers, and decision-makers at all levels of government need information to understand and ameliorate climate-related risks. Likewise, climate information can expose latent opportunities. Moving from climate science to social and economic decisions raises complex questions about how to communicate the causes and impacts of climate variability and change; how to characterize and quantify vulnerabilities, risks, and opportunities faced by communities and businesses; and how to make and implement "win-win" adaptation plans at local, regional, and national scales. A broad coalition of federal agencies launched the U.S. Climate Resilience Toolkit (toolkit.climate.gov) in November 2014 to help our nation build resilience to climate-related extreme events. The site's primary audience is planners and decision makers in business, resource management, and government (at all levels) who seek science-based climate information and tools to help them in their near- and long-term planning. The Executive Office of the President assembled a task force of dozens of subject experts from across the 13 agencies of the U.S. Global Change Research Program to guide the site's development. The site's ongoing evolution is driven by feedback from the target audience. For example, based on feedback, climate projections will soon play a more prominent role in the site's "Climate Explorer" tool and case studies. The site's five-step adaptation planning process is being improved to better facilitate people getting started and to provide clear benchmarks for evaluating progress along the way. In this session, we will share lessons learned from a series of user engagements around the nation and evidence that the Toolkit couples climate information with actionable decision-making processes in ways that are helping Americans build resilience to climate-related stressors.

  2. The genome of Romanomermis culicivorax: revealing fundamental changes in the core developmental genetic toolkit in Nematoda

    PubMed Central

    2013-01-01

    Background The genetics of development in the nematode Caenorhabditis elegans has been described in exquisite detail. The phylum Nematoda has two classes: Chromadorea (which includes C. elegans) and the Enoplea. While the development of many chromadorean species closely resembles that of C. elegans, enoplean nematodes show markedly different patterns of early cell division and cell fate assignment. Embryogenesis of the enoplean Romanomermis culicivorax has been studied in detail, but the genetic circuitry underpinning development in this species has not been explored. Results We generated a draft genome for R. culicivorax and compared its gene content with that of C. elegans, a second enoplean, the vertebrate parasite Trichinella spiralis, and a representative arthropod, Tribolium castaneum. This comparison revealed that R. culicivorax has retained components of the conserved ecdysozoan developmental gene toolkit that have been lost in C. elegans. T. spiralis has independently lost even more of this toolkit than has C. elegans. However, the C. elegans toolkit is not simply depauperate, as many novel genes essential for embryogenesis in C. elegans are not found in, or have only extremely divergent homologues in, R. culicivorax and T. spiralis. Our data imply fundamental differences in the genetic programmes not only for early cell specification but also for other processes such as vulva formation and sex determination. Conclusions Despite the apparent morphological conservatism, major differences in the molecular logic of development have evolved within the phylum Nematoda. R. culicivorax serves as a tractable system to contrast with C. elegans and to understand how divergent genomic and thus regulatory backgrounds nevertheless generate a conserved phenotype. The R. culicivorax draft genome will promote use of this species as a research model. PMID:24373391

  3. Authors' Submission Toolkit: a practical guide to getting your research published.

    PubMed

    Chipperfield, Leighton; Citrome, Leslie; Clark, Juli; David, Frank S; Enck, Robert; Evangelista, Michelle; Gonzalez, John; Groves, Trish; Magrann, Jay; Mansi, Bernadette; Miller, Charles; Mooney, LaVerne A; Murphy, Ann; Shelton, John; Walson, Philip D; Weigel, Al

    2010-08-01

    Biomedical journals and the pharmaceutical industry share the goals of enhancing transparency and expanding access to peer-reviewed research; both industries have recently instituted new policies and guidelines to effect this change. However, while increasing transparency may elevate standards and bring benefits to readers, it will drive a significant increase in manuscript volume, posing challenges to both the journals and industry sponsors. As a result, there is a need to: (1) increase efficiency in the submission process to accommodate the rising manuscript volume and reduce the resource demands on journals, peer reviewers, and authors; and (2) identify suitable venues to publish this research. These shared goals can only be accomplished through close collaboration among stakeholders in the process. In an effort to foster mutual collaboration, members of the pharmaceutical industry and the International Society for Medical Publication Professionals founded a unique collaborative venture in 2008 - the Medical Publishing Insights and Practices initiative (MPIP). At an MPIP roundtable meeting in September 2009, journal editors, publishers and industry representatives identified and prioritized opportunities to streamline the submission process and requirements, and to support prompt publication and dissemination of clinical trial results in the face of increasing manuscript volume. Journal and sponsor participants agreed that more author education on manuscript preparation and submission was needed to increase efficiency and enhance quality and transparency in the publication of industry-sponsored research. They suggested an authors' guide to help bridge the gap between author practices and editor expectations. To address this unmet educational need, MPIP supported development of an Authors' Submission Toolkit to compile best practices in the preparation and submission of manuscripts describing sponsored research. The Toolkit represents a unique collaboration between

  4. Conservation and modification of genetic and physiological toolkits underpinning diapause in bumble bee queens.

    PubMed

    Amsalem, Etya; Galbraith, David A; Cnaani, Jonathan; Teal, Peter E A; Grozinger, Christina M

    2015-11-01

    Diapause is the key adaptation allowing insects to survive unfavourable conditions and inhabit an array of environments. Physiological changes during diapause are largely conserved across species and are hypothesized to be regulated by a conserved suite of genes (a 'toolkit'). Furthermore, it is hypothesized that in social insects, this toolkit was co-opted to mediate caste differentiation between long-lived, reproductive, diapause-capable queens and short-lived, sterile workers. Using Bombus terrestris queens, we examined the physiological and transcriptomic changes associated with diapause and CO2 treatment, which causes queens to bypass diapause. We performed comparative analyses with genes previously identified to be associated with diapause in the Dipteran Sarcophaga crassipalpis and with caste differentiation in bumble bees. As in Diptera, diapause in bumble bees is associated with physiological and transcriptional changes related to nutrient storage, stress resistance and core metabolic pathways. There is a significant overlap, both at the level of transcript and gene ontology, between the genetic mechanisms mediating diapause in B. terrestris and S. crassipalpis, reaffirming the existence of a conserved insect diapause genetic toolkit. However, a substantial proportion (10%) of the differentially regulated transcripts in diapausing queens have no clear orthologs in other species, and key players regulating diapause in Diptera (juvenile hormone and vitellogenin) appear to have distinct functions in bumble bees. We also found a substantial overlap between genes related to caste determination and diapause in bumble bees. Thus, our studies demonstrate an intriguing interplay between pathways underpinning adaptation to environmental extremes and the evolution of sociality in insects. PMID:26453894

  5. A Grassroots Remote Sensing Toolkit Using Live Coding, Smartphones, Kites and Lightweight Drones.

    PubMed

    Anderson, K; Griffiths, D; DeBell, L; Hancock, S; Duffy, J P; Shutler, J D; Reinhardt, W J; Griffiths, A

    2016-01-01

    This manuscript describes the development of an android-based smartphone application for capturing aerial photographs and spatial metadata automatically, for use in grassroots mapping applications. The aim of the project was to exploit the plethora of on-board sensors within modern smartphones (accelerometer, GPS, compass, camera) to generate ready-to-use spatial data from lightweight aerial platforms such as drones or kites. A visual coding 'scheme blocks' framework was used to build the application ('app'), so that users could customise their own data capture tools in the field. The paper reports on the coding framework, then shows the results of test flights from kites and lightweight drones and finally shows how open-source geospatial toolkits were used to generate geographical information system (GIS)-ready GeoTIFF images from the metadata stored by the app. Two Android smartphones were used in testing: a high-specification OnePlus One handset and a lower-cost Acer Liquid Z3 handset, to test the operational limits of the app on phones with different sensor sets. We demonstrate that best results were obtained when the phone was attached to a stable single line kite or to a gliding drone. Results show that engine or motor vibrations from powered aircraft required dampening to ensure capture of high quality images. We demonstrate how the products generated from the open-source processing workflow are easily used in GIS. The app can be downloaded freely from the Google store by searching for 'UAV toolkit' (UAV toolkit 2016), and used wherever an Android smartphone and aerial platform are available to deliver rapid spatial data (e.g. in supporting decision-making in humanitarian disaster-relief zones, in teaching or for grassroots remote sensing and democratic mapping). PMID:27144310

  6. A versatile valving toolkit for automating fluidic operations in paper microfluidic devices

    PubMed Central

    Toley, Bhushan J.; Wang, Jessica A.; Gupta, Mayuri; Buser, Joshua R.; Lafleur, Lisa K.; Lutz, Barry R.; Fu, Elain; Yager, Paul

    2015-01-01

    Failure to utilize valving and automation techniques has restricted the complexity of fluidic operations that can be performed in paper microfluidic devices. We developed a toolkit of paper microfluidic valves and methods for automatic valve actuation using movable paper strips and fluid-triggered expanding elements. To the best of our knowledge, this is the first functional demonstration of this valving strategy in paper microfluidics. After introduction of fluids on devices, valves can actuate automatically a) after a certain period of time, or b) after the passage of a certain volume of fluid. Timing of valve actuation can be tuned with greater than 8.5% accuracy by changing lengths of timing wicks, and we present timed on-valves, off-valves, and diversion (channel-switching) valves. The actuators require ~30 μl fluid to actuate and the time required to switch from one state to another ranges from ~5 s for short wicks to ~50 s for longer ones. For volume-metered actuation, the size of a metering pad can be adjusted to tune actuation volume; both methods can achieve greater than 9% accuracy. Finally, we demonstrate the use of these valves in a device that conducts a multi-step assay for the detection of the malaria protein PfHRP2. Although slightly more complex than devices that do not have moving parts, this valving and automation toolkit considerably expands the capabilities of paper microfluidic devices. Components of this toolkit can be used to conduct arbitrarily complex, multi-step fluidic operations on paper-based devices, as demonstrated in the malaria assay device. PMID:25606810

  7. Aggregated Authentication (AMAC) Using Universal Hash Functions

    NASA Astrophysics Data System (ADS)

    Znaidi, Wassim; Minier, Marine; Lauradoux, Cédric

    Aggregation is a very important issue to reduce the energy consumption in Wireless Sensors Networks (WSNs). There is currently a lack of cryptographic primitives for authentication of aggregated data. The theoretical background for Aggregated Message Authentication Codes (AMACs) has been proposed by Chan and Castelluccia at ISIT 08.
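
    The record above only names the primitive, but the general shape of an aggregated MAC can be sketched with the well-known XOR-aggregation construction of Katz and Lindell, here instantiated with HMAC-SHA256 for concreteness rather than the universal-hash families the paper proposes; the keys and sensor readings below are illustrative:

```python
import hmac
import hashlib

def tag(key: bytes, message: bytes) -> bytes:
    """Per-sensor authentication tag (HMAC-SHA256 stands in for the MAC)."""
    return hmac.new(key, message, hashlib.sha256).digest()

def aggregate(tags):
    """Fold many equal-length tags into one fixed-size tag by XOR."""
    agg = bytes(len(tags[0]))
    for t in tags:
        agg = bytes(a ^ b for a, b in zip(agg, t))
    return agg

def verify(keys, messages, agg_tag: bytes) -> bool:
    """The sink, which knows every key, recomputes each tag and checks the XOR."""
    expected = aggregate([tag(k, m) for k, m in zip(keys, messages)])
    return hmac.compare_digest(expected, agg_tag)

keys = [b"sensor-key-1", b"sensor-key-2", b"sensor-key-3"]
msgs = [b"t=21.5C", b"t=22.0C", b"t=20.9C"]
agg = aggregate([tag(k, m) for k, m in zip(keys, msgs)])
assert verify(keys, msgs, agg)                                        # authentic readings pass
assert not verify(keys, [b"t=21.5C", b"t=99.9C", b"t=20.9C"], agg)    # a tampered reading fails
```

    The aggregate stays the size of a single tag no matter how many sensors contribute, which is the bandwidth saving that motivates AMACs in WSNs.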

  8. Mineral resource of the month: aggregates

    USGS Publications Warehouse

    Willett, Jason C.

    2012-01-01

    Crushed stone and construction sand and gravel, the two major types of natural aggregates, are among the most abundant and accessible natural resources on the planet. The earliest civilizations used aggregates for various purposes, mainly construction. Today aggregates provide the basic raw materials for the foundation of modern society.

  9. 28 CFR 2.5 - Sentence aggregation.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 28 Judicial Administration 1 2010-07-01 2010-07-01 false Sentence aggregation. 2.5 Section 2.5 Judicial Administration DEPARTMENT OF JUSTICE PAROLE, RELEASE, SUPERVISION AND RECOMMITMENT OF PRISONERS... aggregation. When multiple sentences are aggregated by the Bureau of Prisons pursuant to 18 U.S.C. 4161...

  10. 76 FR 62798 - Combined Notice of Filings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-11

    ... filing per 154.402: GSS LSS SS-2 S-2 2011 TGPL ACA Tracker Filing to be effective 10/1/2011. Filed Date... Tracker Filing to be effective 11/1/ 2011. Filed Date: 09/30/2011. Accession Number: 20110930-5048.... Description: Texas Gas Transmission, LLC submits tariff filing per 154.403(d)(2): 2011 Fuel Tracker Filing...

  11. Cytotoxic effects of aggregated nanomaterials.

    PubMed

    Soto, Karla; Garza, K M; Murr, L E

    2007-05-01

    This study deals with cytotoxicity assays performed on an array of commercially manufactured inorganic nanoparticulate materials, including Ag, TiO(2), Fe(2)O(3), Al(2)O(3), ZrO(2), Si(3)N(4), naturally occurring mineral chrysotile asbestos and carbonaceous nanoparticulate materials such as multiwall carbon nanotube aggregates and black carbon aggregates. The nanomaterials were characterized by TEM, as the primary particles, aggregates or long fiber dimensions ranged from 2 nm to 20 μm. Cytotoxicological assays of these nanomaterials were performed utilizing a murine alveolar macrophage cell line and human macrophage and epithelial lung cell lines as comparators. The nanoparticulate materials exhibited varying degrees of cytotoxicity for all cell lines and the general trends were similar for both the murine and human macrophage cell lines. These findings suggest that representative cytotoxic responses for humans might be obtained by nanoparticulate exposures to simple murine macrophage cell line assays. Moreover, these results illustrate the utility in performing rapid in vitro assays for cytotoxicity assessments of nanoparticulate materials as a general inquiry of potential respiratory health risks in humans. PMID:17275430

  12. Aggregation of Heterogeneously Charged Colloids.

    PubMed

    Dempster, Joshua M; Olvera de la Cruz, Monica

    2016-06-28

    Patchy colloids are attractive as programmable building blocks for metamaterials. Inverse patchy colloids, in which a charged surface is decorated with patches of the opposite charge, are additionally noteworthy as models for heterogeneously charged biological materials such as proteins. We study the phases and aggregation behavior of a single charged patch in an oppositely charged colloid with a single-site model. This single-patch inverse patchy colloid model shows a large number of phases when varying patch size. For large patch sizes we find ferroelectric crystals, while small patch sizes produce cross-linked gels. Intermediate values produce monodisperse clusters and unusual worm structures that preserve finite ratios of area to volume. The polarization observed at large patch sizes is robust under extreme disorder in patch size and shape. We examine phase-temperature dependence and coexistence curves and find that large patch sizes produce polarized liquids, in contrast to mean-field predictions. Finally, we introduce small numbers of unpatched charged colloids. These can either suppress or encourage aggregation depending on their concentration and the size of the patches on the patched colloids. These effects can be exploited to control aggregation and to measure effective patch size. PMID:27253725

  13. Converting CSV Files to RKSML Files

    NASA Technical Reports Server (NTRS)

    Trebi-Ollennu, Ashitey; Liebersbach, Robert

    2009-01-01

    A computer program converts, into a format suitable for processing on Earth, files of downlinked telemetric data pertaining to the operation of the Instrument Deployment Device (IDD), which is a robot arm on either of the Mars Explorer Rovers (MERs). The raw downlinked data files are in comma-separated- value (CSV) format. The present program converts the files into Rover Kinematics State Markup Language (RKSML), which is an Extensible Markup Language (XML) format that facilitates representation of operations of the IDD and enables analysis of the operations by means of the Rover Sequencing Validation Program (RSVP), which is used to build sequences of commanded operations for the MERs. After conversion by means of the present program, the downlinked data can be processed by RSVP, enabling the MER downlink operations team to play back the actual IDD activity represented by the telemetric data against the planned IDD activity. Thus, the present program enhances the diagnosis of anomalies that manifest themselves as differences between actual and planned IDD activities.
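
    The conversion the abstract describes can be pictured with a minimal sketch: read CSV rows and emit them as nested XML elements. The element names (`RKSML`, `State`) and the sample telemetry are placeholders, since the actual RKSML schema is not reproduced here:

```python
import csv
import io
import xml.etree.ElementTree as ET

def csv_to_xml(csv_text: str, root_tag: str = "RKSML", row_tag: str = "State") -> str:
    """Convert CSV telemetry rows into a simple XML document.

    The real RKSML schema is not given in the abstract, so the element
    names used here are illustrative placeholders.
    """
    reader = csv.DictReader(io.StringIO(csv_text))
    root = ET.Element(root_tag)
    for row in reader:
        state = ET.SubElement(root, row_tag)
        for field, value in row.items():
            # One child element per CSV column, named after the header.
            ET.SubElement(state, field).text = value
    return ET.tostring(root, encoding="unicode")

# Toy downlink: two samples of two hypothetical IDD joint angles.
telemetry = "time,joint1,joint2\n0.0,1.57,0.12\n0.5,1.60,0.15\n"
print(csv_to_xml(telemetry))
```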

  14. Advancements in Wind Integration Study Input Data Modeling: The Wind Integration National Dataset (WIND) Toolkit

    NASA Astrophysics Data System (ADS)

    Hodge, B.; Orwig, K.; McCaa, J. R.; Harrold, S.; Draxl, C.; Jones, W.; Searight, K.; Getman, D.

    2013-12-01

    projects to develop updated datasets: the Wind Integration National Dataset (WIND) Toolkit and the Solar Integration National Dataset (SIND) Toolkit. The WIND Toolkit spans 2007-2013 using advanced NWP methods run on a nationwide 2-km grid with 5-minute resolution, and includes over 110,000 onshore and offshore wind power production sites. This paper and presentation will discuss an overview of the WIND Toolkit modeling advancements, site selection, data accessibility, and validation results.

  15. Performance analysis of the Globus Toolkit Monitoring and Discovery Service, MDS2.

    SciTech Connect

    Zhang, X.; Schopf, J. M.; Mathematics and Computer Science; Univ. of Chicago

    2004-01-01

    Monitoring and information services form a key component of a distributed system, or grid. A quantitative study of such services can aid in understanding the performance limitations, advise in the deployment of the monitoring system, and help evaluate future development work. To this end, we examined the performance of the Globus Toolkit® Monitoring and Discovery Service (MDS2) by instrumenting its main services using NetLogger. Our study shows a strong advantage to caching or prefetching the data, as well as the need to have primary components at well-connected sites.

  16. The early bird catches the worm: new technologies for the Caenorhabditis elegans toolkit

    PubMed Central

    Xu, Xiao; Kim, Stuart K.

    2014-01-01

    The inherent simplicity of Caenorhabditis elegans and its extensive genetic toolkit make it ideal for studying complex biological processes. Recent developments further increase the usefulness of the worm, including new methods for: altering gene expression, altering physiology using optogenetics, manipulating large numbers of worms, automating laborious processes and processing high-resolution images. These developments both enhance the worm as a model for studying processes such as development and ageing and make it an attractive model in areas such as neurobiology and behaviour. PMID:21969037

  17. Nuclear fragmentation reactions in extended media studied with Geant4 toolkit

    NASA Astrophysics Data System (ADS)

    Pshenichnov, Igor; Botvina, Alexander; Mishustin, Igor; Greiner, Walter

    2010-03-01

    It is well-known from numerous experiments that nuclear multifragmentation is a dominating mechanism for production of intermediate mass fragments in nucleus-nucleus collisions at energies above 100A MeV. In this paper we investigate the validity and performance of the Fermi break-up model and the statistical multifragmentation model implemented as parts of the Geant4 toolkit. We study the impact of violent nuclear disintegration reactions on the depth-dose profiles and yields of secondary fragments for beams of light and medium-weight nuclei propagating in extended media. Implications for ion-beam cancer therapy and shielding from cosmic radiation are discussed.

  18. The development of a standard training toolkit for research studies that recruit pregnant women in labour

    PubMed Central

    2013-01-01

    Recruitment of pregnant women in labour to clinical trials poses particular challenges. Interpretation of regulation lacks consistency or clarity and variation occurs as to the training required by clinicians to safely contribute to the conduct of intrapartum studies. The Royal College of Obstetricians and Gynaecologists Intrapartum Clinical Study Group initiated the development of a pragmatic, proportionate and standardised toolkit for training clinical staff that complies with both regulatory and clinician requirements and has been peer-reviewed. This approach may be useful to researchers in acute care settings that necessitate the integration of research, routine clinical practice and compliance with regulation. PMID:24171801

  19. Testability, Test Automation and Test Driven Development for the Trick Simulation Toolkit

    NASA Technical Reports Server (NTRS)

    Penn, John

    2014-01-01

    This paper describes the adoption of a Test Driven Development approach and a Continuous Integration System in the development of the Trick Simulation Toolkit, a generic simulation development environment for creating high fidelity training and engineering simulations at the NASA Johnson Space Center and many other NASA facilities. It describes the approach, and the significant benefits seen, such as fast, thorough and clear test feedback every time code is checked into the code repository. It also describes an approach that encourages development of code that is testable and adaptable.

  20. Atlas Toolkit: Fast registration of 3D morphological datasets in the absence of landmarks

    PubMed Central

    Grocott, Timothy; Thomas, Paul; Münsterberg, Andrea E.

    2016-01-01

    Image registration is a gateway technology for Developmental Systems Biology, enabling computational analysis of related datasets within a shared coordinate system. Many registration tools rely on landmarks to ensure that datasets are correctly aligned; yet suitable landmarks are not present in many datasets. Atlas Toolkit is a Fiji/ImageJ plugin collection offering elastic group-wise registration of 3D morphological datasets, guided by segmentation of the interesting morphology. We demonstrate the method by combinatorial mapping of cell signalling events in the developing eyes of chick embryos, and use the integrated datasets to predictively enumerate Gene Regulatory Network states. PMID:26864723

  1. A MultiSite Gateway Toolkit for Rapid Cloning of Vertebrate Expression Constructs with Diverse Research Applications

    PubMed Central

    Fowler, Daniel K.; Stewart, Scott; Seredick, Steve; Eisen, Judith S.

    2016-01-01

    Recombination-based cloning is a quick and efficient way to generate expression vectors. Recent advancements have provided powerful recombinant DNA methods for molecular manipulations. Here, we describe a novel collection of three-fragment MultiSite Gateway cloning system-compatible vectors providing expanded molecular tools for vertebrate research. The components of this toolkit encompass a broad range of uses such as fluorescent imaging, dual gene expression, RNA interference, tandem affinity purification, chemically-inducible dimerization and lentiviral production. We demonstrate examples highlighting the utility of this toolkit for producing multi-component vertebrate expression vectors with diverse primary research applications. The vectors presented here are compatible with other Gateway toolkits and collections, facilitating the rapid generation of a broad range of innovative DNA constructs for biological research. PMID:27500400

  2. Toolkit of Resources for Engaging Parents and Community as Partners in Education. Part I: Building an Understanding of Family and Community Engagement

    ERIC Educational Resources Information Center

    Regional Educational Laboratory Pacific, 2014

    2014-01-01

    This toolkit is designed to guide school staff in strengthening partnerships with families and community members to support student learning. This toolkit offers an integrated approach to family and community engagement, bringing together research, promising practices, and a wide range of useful tools and resources with explanations and directions…

  3. Toolkit of Resources for Engaging Parents and Community as Partners in Education. Part 3: Building Trusting Relationships with Families & Community through Effective Communication

    ERIC Educational Resources Information Center

    Regional Educational Laboratory Pacific, 2015

    2015-01-01

    This toolkit is designed to guide school staff in strengthening partnerships with families and community members to support student learning. This toolkit offers an integrated approach to family and community engagement, bringing together research, promising practices, and a wide range of useful tools and resources with explanations and directions…

  4. Aggregation and Aggregate Carbon in a Forested Southeastern Coastal Plain Spodosol

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Soil aggregation is influenced by the soil environment and is a factor in soil carbon sequestration. Sandy Coastal Plain soils often do not have the clay to promote aggregation nor have been considered soils with high levels of aggregation. This study was conducted to examine the aggregate morpholog...

  5. “Weaving Balance into Life”: Development and cultural adaptation of a cancer symptom management toolkit for Southwest American Indians

    PubMed Central

    Itty, Tracy Line; Cadogan, Mary P.; Martinez, Fernando

    2012-01-01

    Introduction Self-management of cancer symptoms has the potential to decrease the suffering of cancer survivors while improving their health and quality of life. For many racial/ethnic groups, culturally appropriate self-management instruction is not readily available. This paper reports on the first symptom management toolkit developed for American Indian cancer survivors. Methods Part of a larger research study, a three-phase project tested a cancer symptom self-management toolkit to be responsive to the unique learning and communication needs of American Indians in the Southwest USA. American Indian cancer survivors and family members participated in 13 focus groups to identify cultural concepts of cancer and illness beliefs, communication styles, barriers, and recommendations for self-management techniques. Sessions were audiotaped and transcriptions were coded using Grounded Theory. Results Participants expressed a need for an overview of cancer, tips on management of common symptoms, resources in their communities, and suggestions for how to communicate with providers and others. The “Weaving Balance into Life” toolkit is comprised of a self-help guide, resource directory, and video. Preferred presentation style and content for the toolkit were pilot tested. Discussion/conclusions American Indian survivors favor educational materials that provide information on symptom management and are tailored to their culture and beliefs. Suggestions for adapting the toolkit materials for other American Indian populations are made. Implications for cancer survivors Many cancer survivors lack effective self-management techniques for symptoms, such as pain, fatigue, and depression. The toolkit promotes self-management strategies for survivors and provides family members/caregivers tangible ways to offer support. PMID:22160662

  6. The recognition of collagen and triple-helical toolkit peptides by MMP-13: sequence specificity for binding and cleavage.

    PubMed

    Howes, Joanna-Marie; Bihan, Dominique; Slatter, David A; Hamaia, Samir W; Packman, Len C; Knauper, Vera; Visse, Robert; Farndale, Richard W

    2014-08-29

    Remodeling of collagen by matrix metalloproteinases (MMPs) is crucial to tissue homeostasis and repair. MMP-13 is a collagenase with a substrate preference for collagen II over collagens I and III. It recognizes a specific, well-known site in the tropocollagen molecule where its binding locally perturbs the triple helix, allowing the catalytic domain of the active enzyme to cleave the collagen α chains sequentially, at Gly(775)-Leu(776) in collagen II. However, the specific residues upon which collagen recognition depends within and surrounding this locus have not been systematically mapped. Using our triple-helical peptide Collagen Toolkit libraries in solid-phase binding assays, we found that MMP-13 shows little affinity for Collagen Toolkit III, but binds selectively to two triple-helical peptides of Toolkit II. We have identified the residues required for the adhesion of both proMMP-13 and MMP-13 to one of these, Toolkit peptide II-44, which contains the canonical collagenase cleavage site. MMP-13 was unable to bind to a linear peptide of the same sequence as II-44. We also discovered a second binding site near the N terminus of collagen II (starting at helix residue 127) in Toolkit peptide II-8. The pattern of binding of the free hemopexin domain of MMP-13 was similar to that of the full-length enzyme, but the free catalytic subunit bound none of our peptides. The susceptibility of Toolkit peptides to proteolysis in solution was independent of the very specific recognition of immobilized peptides by MMP-13; the enzyme proved able to cleave a range of dissolved collagen peptides. PMID:25008319

  7. [AGGREGATION OF METABOLICALLY DEPLETED HUMAN ERYTHROCYTES].

    PubMed

    Sheremet'ev, Yu A; Popovicheva, A N; Rogozin, M M; Levin, G Ya

    2016-01-01

    Aggregation of erythrocytes in autologous plasma after blood storage for 14 days at 4 °C was studied using photometry and light microscopy. A decrease in ATP content, the formation of echinocytes and spheroechinocytes, and a decline in the rouleaux form of erythrocyte aggregation were observed during storage. At the same time, aggregates of echinocytes formed in the stored blood. The addition of plasma from fresh blood did not restore the normal discocytic shape or aggregation of erythrocytes in the stored blood. Possible mechanisms of erythrocyte and echinocyte aggregation are discussed. PMID:27220249

  8. [Lysophosphatidic acid and human erythrocyte aggregation].

    PubMed

    Sheremet'ev, Iu A; Popovicheva, A N; Levin, G Ia

    2014-01-01

    The effects of lysophosphatidic acid on the morphology and aggregation of human erythrocytes have been studied. The morphology of erythrocytes and their aggregates was examined by light microscopy. It has been shown that lysophosphatidic acid changes the shape of red blood cells: discocytes become echinocytes. Aggregation of red blood cells (rouleaux formation) was significantly reduced in autologous plasma. At the same time, there was strong aggregation of echinocytes, accompanied by the formation of microvesicles. Adding normal plasma to echinocytes restores the discocytic shape and rouleaux aggregation of red blood cells. A possible mechanism of action of lysophosphatidic acid on erythrocytes is discussed. PMID:25509147

  9. NeuroPigPen: A Scalable Toolkit for Processing Electrophysiological Signal Data in Neuroscience Applications Using Apache Pig.

    PubMed

    Sahoo, Satya S; Wei, Annan; Valdez, Joshua; Wang, Li; Zonjy, Bilal; Tatsuoka, Curtis; Loparo, Kenneth A; Lhatoo, Samden D

    2016-01-01

    The recent advances in neurological imaging and sensing technologies have led to a rapid increase in the volume, rate of data generation, and variety of neuroscience data. This "neuroscience Big data" represents a significant opportunity for the biomedical research community to design experiments using data with greater timescales, larger numbers of attributes, and statistically significant data sizes. The results from these new data-driven research techniques can advance our understanding of complex neurological disorders, help model the long-term effects of brain injuries, and provide new insights into the dynamics of brain networks. However, many existing neuroinformatics data processing and analysis tools were not built to manage large volumes of data, which makes it difficult for researchers to effectively leverage the available data to advance their research. We introduce a new toolkit called NeuroPigPen, developed using Apache Hadoop and the Pig data flow language to address the challenges posed by large-scale electrophysiological signal data. NeuroPigPen is a modular toolkit that can process large volumes of electrophysiological signal data, such as Electroencephalogram (EEG), Electrocardiogram (ECG), and blood oxygen levels (SpO2), using a new distributed storage model called Cloudwave Signal Format (CSF) that supports easy partitioning and storage of signal data on commodity hardware. NeuroPigPen was developed with three design principles: (a) Scalability-the ability to efficiently process increasing volumes of data; (b) Adaptability-the toolkit can be deployed across different computing configurations; and (c) Ease of programming-the toolkit can be easily used to compose multi-step data processing pipelines using high-level programming constructs. The NeuroPigPen toolkit was evaluated using 750 GB of electrophysiological signal data over a variety of Hadoop cluster configurations ranging from 3 to 30 data nodes. The evaluation results demonstrate that the toolkit
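    Design principle (c), composing multi-step processing pipelines from named, reusable stages, can be illustrated in miniature. The sketch below is purely illustrative: the names partition_signal, mean_amplitude, and the segment length are invented here, and the real toolkit expresses such stages as Pig Latin scripts executed in parallel on a Hadoop cluster, not as in-process Python.

```python
def partition_signal(samples, segment_len):
    """Stage 1: split a raw signal into fixed-length segments,
    loosely mimicking partitioned storage of signal epochs."""
    return [samples[i:i + segment_len]
            for i in range(0, len(samples) - segment_len + 1, segment_len)]

def mean_amplitude(segment):
    """Stage 2: a per-segment feature (mean absolute amplitude)."""
    return sum(abs(s) for s in segment) / len(segment)

def pipeline(samples, segment_len=256):
    """Compose the stages: partition, then map a feature over segments.
    In a Pig script, each stage would instead produce a named relation."""
    return [mean_amplitude(seg) for seg in partition_signal(samples, segment_len)]
```

    In the distributed setting each stage would run in parallel over data blocks; the point here is only the composition of small, named stages into a multi-step pipeline.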

  10. NeuroPigPen: A Scalable Toolkit for Processing Electrophysiological Signal Data in Neuroscience Applications Using Apache Pig

    PubMed Central

    Sahoo, Satya S.; Wei, Annan; Valdez, Joshua; Wang, Li; Zonjy, Bilal; Tatsuoka, Curtis; Loparo, Kenneth A.; Lhatoo, Samden D.

    2016-01-01

    The recent advances in neurological imaging and sensing technologies have led to a rapid increase in the volume, rate of data generation, and variety of neuroscience data. This “neuroscience Big data” represents a significant opportunity for the biomedical research community to design experiments using data with greater timescales, larger numbers of attributes, and statistically significant data sizes. The results from these new data-driven research techniques can advance our understanding of complex neurological disorders, help model the long-term effects of brain injuries, and provide new insights into the dynamics of brain networks. However, many existing neuroinformatics data processing and analysis tools were not built to manage large volumes of data, which makes it difficult for researchers to effectively leverage the available data to advance their research. We introduce a new toolkit called NeuroPigPen, developed using Apache Hadoop and the Pig data flow language to address the challenges posed by large-scale electrophysiological signal data. NeuroPigPen is a modular toolkit that can process large volumes of electrophysiological signal data, such as Electroencephalogram (EEG), Electrocardiogram (ECG), and blood oxygen levels (SpO2), using a new distributed storage model called Cloudwave Signal Format (CSF) that supports easy partitioning and storage of signal data on commodity hardware. NeuroPigPen was developed with three design principles: (a) Scalability—the ability to efficiently process increasing volumes of data; (b) Adaptability—the toolkit can be deployed across different computing configurations; and (c) Ease of programming—the toolkit can be easily used to compose multi-step data processing pipelines using high-level programming constructs. The NeuroPigPen toolkit was evaluated using 750 GB of electrophysiological signal data over a variety of Hadoop cluster configurations ranging from 3 to 30 data nodes. The evaluation results demonstrate that

  11. 75 FR 15479 - Self-Regulatory Organizations; The National Securities Clearing Corporation; Notice of Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-29

    ... From the Federal Register Online via the Government Publishing Office: SECURITIES AND EXCHANGE COMMISSION Self-Regulatory Organizations; The National Securities Clearing Corporation; Notice of Filing and Immediate Effectiveness of Proposed Rule Change to Aggregate Obligations in Certain Securities Transactions Designated for Settlement on...

  12. Microwave extinction characteristics of nanoparticle aggregates

    NASA Astrophysics Data System (ADS)

    Wu, Y. P.; Cheng, J. X.; Liu, X. X.; Wang, H. X.; Zhao, F. T.; Wen, W. W.

    2016-07-01

    The structure of nanoparticle aggregates plays an important role in microwave extinction capacity. The diffusion-limited aggregation (DLA) model for fractal growth is utilized to explore the possible structures of nanoparticle aggregates by computer simulation. Based on the discrete dipole approximation (DDA) method, the microwave extinction performance of different nano-carborundum aggregates is numerically analyzed. The effects of particle quantity, original diameter, fractal structure, and orientation on microwave extinction are investigated, and the extinction characteristics of the aggregates are compared with those of a spherical nanoparticle of the same volume. Numerical results show that proper aggregation of nanoparticles is beneficial to microwave extinction capacity, and that the microwave extinction cross section of aggregated granules exceeds that of a solid sphere of the same volume.
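    The diffusion-limited aggregation growth process used above is simple to sketch. The following is a minimal on-lattice 2-D DLA simulation; the particle count, launch radius, and escape radius are arbitrary illustrative choices, not parameters from the paper. Walkers are released on a circle just outside the cluster and random-walk until they touch an occupied site and stick, or wander past an escape radius and are relaunched.

```python
import math
import random

def dla_cluster(n_particles=150, seed=1):
    """Grow a 2-D on-lattice diffusion-limited aggregation (DLA) cluster."""
    random.seed(seed)
    occupied = {(0, 0)}          # seed particle at the origin
    r_max = 0.0                  # radius of the current cluster
    neighbours = ((1, 0), (-1, 0), (0, 1), (0, -1))
    while len(occupied) < n_particles:
        # launch a walker on a circle just outside the cluster
        ang = random.uniform(0.0, 2.0 * math.pi)
        r0 = r_max + 5.0
        x = int(round(r0 * math.cos(ang)))
        y = int(round(r0 * math.sin(ang)))
        while True:
            dx, dy = random.choice(neighbours)   # one random-walk step
            x, y = x + dx, y + dy
            if x * x + y * y > (r_max + 20.0) ** 2:
                break            # walker escaped; relaunch a new one
            if any((x + ex, y + ey) in occupied for ex, ey in neighbours):
                occupied.add((x, y))   # touched the cluster: it sticks
                r_max = max(r_max, math.hypot(x, y))
                break
    return occupied
```

    Because each walker sticks only when adjacent to an occupied site, the resulting cluster is connected and develops the branched, fractal morphology the DLA model is known for.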

  13. Simulation of J-aggregate microcavity photoluminescence

    NASA Astrophysics Data System (ADS)

    Michetti, Paolo; La Rocca, Giuseppe C.

    2008-05-01

    We have developed a model in order to account for the photoexcitation dynamics of J-aggregate films and strongly coupled J-aggregate microcavities. The J aggregates are described as a disordered Frenkel exciton system in which relaxation occurs due to the presence of a thermal bath of molecular vibrations. The correspondence between the photophysics in J-aggregate films and that in J-aggregate microcavities is obtained by introducing a model polariton wave function mixing cavity photon modes and J-aggregate super-radiant excitons. With the same description of the material properties, we have calculated both absorption and luminescence spectra for the J-aggregate film and the photoluminescence of strongly coupled organic microcavities. The model is able to account for the fast relaxation dynamics in organic microcavities following nonresonant pumping and explains the temperature dependence of the ratio between the upper polariton and the lower polariton luminescence.

  14. Backtracking behaviour in lost ants: an additional strategy in their navigational toolkit.

    PubMed

    Wystrach, Antoine; Schwarz, Sebastian; Baniel, Alice; Cheng, Ken

    2013-10-22

    Ants use multiple sources of information to navigate, but do not integrate all this information into a unified representation of the world. Rather, the available information appears to serve three distinct main navigational systems: path integration, systematic search and the use of learnt information--mainly via vision. Here, we report on an additional behaviour that suggests a supplemental system in the ant's navigational toolkit: 'backtracking'. Homing ants that had almost reached their nest but were suddenly displaced to unfamiliar areas did not show the characteristic undirected headings of systematic searches. Instead, these ants backtracked in the compass direction opposite to the path that they had just travelled. The ecological function of this behaviour is clear, as we show that it increases the chances of returning to familiar terrain. Importantly, the mechanistic implications of this behaviour stress an extra level of cognitive complexity in ant navigation. Our results imply: (i) the presence of a type of 'memory of the current trip' allowing lost ants to take into account the familiar view recently experienced, and (ii) direct sharing of information across different navigational systems. We propose a revised architecture of the ant's navigational toolkit illustrating how the different systems may interact to produce adaptive behaviours. PMID:23966644

  15. SAPPHIRE: a toolkit for building efficient stream programs for medical video analysis.

    PubMed

    Stanek, Sean R; Tavanapong, Wallapak; Wong, Johnny; Oh, JungHwan; Nawarathna, Ruwan D; Muthukudage, Jayantha; de Groen, Piet C

    2013-12-01

    This paper describes the design and implementation of SAPPHIRE--a novel middleware and software development kit for stream programming on a heterogeneous system of multi-core multi-CPUs with optional hardware accelerators such as graphics processing units (GPUs). A stream program consists of a set of tasks where the same tasks are repeated over multiple iterations of data (e.g., video frames). Examples of such programs are video analysis applications for computer-aided diagnosis and computer-assisted surgeries. Our design goal is to reduce the implementation effort and ease collaborative software development of stream programs while supporting efficient execution of the programs on the target hardware. To validate the toolkit, we implemented EM-Automated-RT software with the toolkit and report our experience. EM-Automated-RT performs real-time video analysis of the quality of a colonoscopy procedure and provides visual feedback to assist the endoscopist in achieving optimal inspection of the colon during the procedure. The software has been deployed in a hospital setting to conduct a clinical trial. PMID:24001925

  16. eVITAL: A Preliminary Taxonomy and Electronic Toolkit of Health-Related Habits and Lifestyle

    PubMed Central

    Salvador-Carulla, Luis; Olson Walsh, Carolyn; Alonso, Federico; Gómez, Rafael; de Teresa, Carlos; Cabo-Soler, José Ricardo; Cano, Antonio; Ruiz, Mencía

    2012-01-01

    Objectives. To create a preliminary taxonomy and related toolkit of health-related habits (HrH) following a person-centered approach with a focus on primary care. Methods. From 2003 to 2009, a working group (n = 6 physicians) defined the knowledge base, created a framing document, and selected evaluation tools using an iterative process. Multidisciplinary focus groups (n = 29 health professionals) revised the document and evaluation protocol and participated in a feasibility study and review of the model based on a demonstration study with 11 adult volunteers in Antequera, Spain. Results. The preliminary taxonomy contains 6 domains of HrH and 1 domain of additional health descriptors, 3 subdomains, 43 dimensions, and 141 subdimensions. The evaluation tool was completed by the 11 volunteers. The eVITAL toolkit contains history and examination items for 4 levels of engagement: self-assessment, basic primary care, extended primary care, and specialty care. There was positive feedback from the volunteers and experts, but concern about the length of the evaluation. Conclusions. We present the first taxonomy of HrH, which may aid the development of new models of care such as the personal contextual factors of the International Classification of Functioning (ICF) and the positive and negative components of the multilevel person-centered integrative diagnosis model. PMID:22545016

  17. Iterative user centered design for development of a patient-centered fall prevention toolkit.

    PubMed

    Katsulis, Zachary; Ergai, Awatef; Leung, Wai Yin; Schenkel, Laura; Rai, Amisha; Adelman, Jason; Benneyan, James; Bates, David W; Dykes, Patricia C

    2016-09-01

    Due to the large number of falls that occur in hospital settings, inpatient fall prevention is a topic of great interest to patients and health care providers. The use of electronic decision support that tailors fall prevention strategy to patient-specific risk factors, known as Fall T.I.P.S. (Tailoring Interventions for Patient Safety), has proven to be an effective approach for decreasing hospital falls. A paper version of the Fall T.I.P.S. toolkit was developed primarily for hospitals that do not have the resources to implement the electronic solution; however, more work is needed to optimize the effectiveness of the paper version of this tool. We examined the use of human factors techniques in the redesign of the existing paper fall prevention tool with the goal of increasing ease of use and decreasing inpatient falls. Patients and clinical staff were included in the redesign of the existing tool to increase adoption of the tool and fall prevention best practices. The redesigned paper Fall T.I.P.S. toolkit showcased a built-in clinical decision support system and increased ease of use over the existing version. PMID:27184319

  18. VisDock: A Toolkit for Cross-Cutting Interactions in Visualization.

    PubMed

    Choi, Jungu; Park, Deok Gun; Wong, Yuet Ling; Fisher, Eli; Elmqvist, Niklas

    2015-09-01

    Standard user applications provide a range of cross-cutting interaction techniques that are common to virtually all such tools: selection, filtering, navigation, layer management, and cut-and-paste. We present VisDock, a JavaScript mixin library that provides a core set of these cross-cutting interaction techniques for visualization, including selection (lasso, paths, shape selection, etc.), layer management (visibility, transparency, set operations, etc.), navigation (pan, zoom, overview, magnifying lenses, etc.), and annotation (point-based, region-based, data-space based, etc.). To showcase the utility of the library, we have released it as Open Source and integrated it with a large number of existing web-based visualizations. Furthermore, we have evaluated VisDock using qualitative studies with both developers utilizing the toolkit to build new web-based visualizations, as well as with end-users utilizing it to explore movie ratings data. Results from these studies highlight the usability and effectiveness of the toolkit from both developer and end-user perspectives. PMID:26357289

  19. Backtracking behaviour in lost ants: an additional strategy in their navigational toolkit

    PubMed Central

    Wystrach, Antoine; Schwarz, Sebastian; Baniel, Alice; Cheng, Ken

    2013-01-01

    Ants use multiple sources of information to navigate, but do not integrate all this information into a unified representation of the world. Rather, the available information appears to serve three distinct main navigational systems: path integration, systematic search and the use of learnt information—mainly via vision. Here, we report on an additional behaviour that suggests a supplemental system in the ant's navigational toolkit: ‘backtracking’. Homing ants that had almost reached their nest but were suddenly displaced to unfamiliar areas did not show the characteristic undirected headings of systematic searches. Instead, these ants backtracked in the compass direction opposite to the path that they had just travelled. The ecological function of this behaviour is clear, as we show that it increases the chances of returning to familiar terrain. Importantly, the mechanistic implications of this behaviour stress an extra level of cognitive complexity in ant navigation. Our results imply: (i) the presence of a type of ‘memory of the current trip’ allowing lost ants to take into account the familiar view recently experienced, and (ii) direct sharing of information across different navigational systems. We propose a revised architecture of the ant's navigational toolkit illustrating how the different systems may interact to produce adaptive behaviours. PMID:23966644

  20. A toolkit to promote fidelity to health promotion interventions in afterschool programs.

    PubMed

    Wiecha, Jean L; Hannon, Cynthia; Meyer, Kimberly

    2013-05-01

    Community-based obesity prevention efforts are an essential component of a public health approach to obesity and chronic disease risk reduction. Afterschool programs can participate by providing healthy snacks and regular physical activity. Although efficacious obesity prevention strategies have been identified, they have not been widely implemented. The authors describe the development of A+, a quality improvement (QI) toolkit designed to help YMCA afterschool programs implement healthy eating and physical activity guidelines. YMCA of the USA Health Promotion Standards for afterschool sites specify eliminating sugar-sweetened beverages and trans fats; providing fruits, vegetables, and water; and ensuring at least 30 minutes of physical activity daily. Field tests of A+ indicated that a QI toolkit for community-based afterschool programs can be implemented by a program director across multiple program sites, responds to programmatic needs, appropriately identifies barriers to improvement, and permits development of locally appropriate improvement plans. The QI approach holds promise for public health efforts and for field research to evaluate promising interventions by helping ensure full implementation of health promotion strategies. PMID:22982705