Science.gov

Sample records for files aggregation toolkit

  1. The ALFA (Activity Log Files Aggregation) Toolkit: A Method for Precise Observation of the Consultation

    PubMed Central

    2008-01-01

    Background: There is a lack of tools to evaluate and compare electronic patient record (EPR) systems to inform a rational choice or development agenda. Objective: To develop a toolkit to measure the impact of different EPR system features on the consultation. Methods: We first developed a specification to overcome the limitations of existing methods. We divided this into work packages: (1) developing a method to display multichannel video of the consultation; (2) coding and measuring activities, including computer use and verbal interactions; (3) automating the capture of nonverbal interactions; (4) aggregating multiple observations into a single navigable output; and (5) producing an output interpretable by software developers. We piloted this method by filming live consultations (n = 22) by 4 general practitioners (GPs) using different EPR systems. We compared the time taken and variations during coded data entry, prescribing, and blood pressure (BP) recording. We used nonparametric tests to make statistical comparisons. We contrasted methods of BP recording using Unified Modeling Language (UML) sequence diagrams. Results: We found that 4 channels of video were optimal. We identified an existing application for manual coding of video output. We developed in-house tools for capturing use of keyboard and mouse and for time-stamping speech; the transcript is then typed within this time stamp. Although we managed to capture body language using pattern recognition software, we were unable to use these data quantitatively. We loaded these observational outputs into our aggregation tool, which allows simultaneous navigation and viewing of multiple files. This also creates a single exportable file in XML format, which we used to develop UML sequence diagrams. In our pilot, the GP using the EMIS LV (Egton Medical Information Systems Limited, Leeds, UK) system took the longest time to code data (mean 11.5 s, 95% CI 8.7-14.2). 
Nonparametric comparison of EMIS LV with the other systems showed

  2. Small file aggregation in a parallel computing system

    DOEpatents

    Faibish, Sorin; Bent, John M.; Tzelnic, Percy; Grider, Gary; Zhang, Jingwang

    2014-09-02

    Techniques are provided for small file aggregation in a parallel computing system. An exemplary method for storing a plurality of files generated by a plurality of processes in a parallel computing system comprises aggregating the plurality of files into a single aggregated file; and generating metadata for the single aggregated file. The metadata comprises an offset and a length of each of the plurality of files in the single aggregated file. The metadata can be used to unpack one or more of the files from the single aggregated file.
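
    The offset/length metadata scheme the abstract describes can be sketched in a few lines. The function names and the in-memory index layout below are illustrative only, not the patented implementation:

```python
def aggregate(files):
    """Pack {name: bytes} into one blob plus an offset/length index."""
    blob = bytearray()
    index = {}
    for name, data in files.items():
        index[name] = (len(blob), len(data))  # metadata: (offset, length)
        blob.extend(data)
    return bytes(blob), index

def unpack(blob, index, name):
    """Recover one original file from the aggregated blob via its metadata."""
    offset, length = index[name]
    return blob[offset:offset + length]
```

    In a parallel file system the point of this pattern is that many small writes become one large sequential write, while the index still lets readers extract any single file.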

  3. Recent College Graduates Study, 1987 (RCGS-1987). Combined File of Survey and Aggregate Transcript Data [machine-readable data file].

    ERIC Educational Resources Information Center

    National Center for Education Statistics (ED), Washington, DC.

    The 1987 Recent College Graduates Study (RCGS) machine-readable data file, RECENT.GRADS.COMBINED.A8586, is the third of three data files produced from the study and contains information about 1985-86 bachelor's degree graduates for whom both questionnaire and transcript data were collected. The combined survey and aggregate transcript data file…

  4. Basic Internet Software Toolkit.

    ERIC Educational Resources Information Center

    Buchanan, Larry

    1998-01-01

    Once schools are connected to the Internet, the next step is getting network workstations configured for Internet access. This article describes a basic toolkit comprising software currently available on the Internet for free or modest cost. Lists URLs for Web browser, Telnet, FTP, file decompression, portable document format (PDF) reader,…

  5. Geospatial Toolkit

    SciTech Connect

    2010-10-14

    The Geospatial Toolkit is an NREL-developed map-based software application that integrates resource data and other geographic information systems (GIS) data for integrated resource assessment. The non-resource, country-specific data for each toolkit comes from a variety of agencies within each country as well as from global datasets. Originally developed in 2005, the Geospatial Toolkit was completely redesigned and re-released in November 2010 to provide a more modern, easier-to-use interface with considerably faster analytical querying capabilities. The revised version of the Geospatial Toolkit has been released for all original toolkit countries/regions and each software package is made available on NREL's website,

  6. Geospatial Toolkit

    Energy Science and Technology Software Center (ESTSC)

    2010-10-14

    The Geospatial Toolkit is an NREL-developed map-based software application that integrates resource data and other geographic information systems (GIS) data for integrated resource assessment. The non-resource, country-specific data for each toolkit comes from a variety of agencies within each country as well as from global datasets. Originally developed in 2005, the Geospatial Toolkit was completely redesigned and re-released in November 2010 to provide a more modern, easier-to-use interface with considerably faster analytical querying capabilities. The revised version of the Geospatial Toolkit has been released for all original toolkit countries/regions and each software package is made available on NREL's website,

  7. Molecular techniques in ecohealth research toolkit: facilitating estimation of aggregate gastroenteritis burden in an irrigated periurban landscape.

    PubMed

    Tserendorj, Ariuntuya; Anceno, Alfredo J; Houpt, Eric R; Icenhour, Crystal R; Sethabutr, Orntipa; Mason, Carl S; Shipin, Oleg V

    2011-09-01

    Assessment of microbial hazards associated with certain environmental matrices, livelihood strategies, and food handling practices is constrained by time-consuming conventional microbiological techniques that lead to health risk assessments of narrow geographic or time scope, often targeting very few pathogens. Health risk assessment based on one or few indicator organisms underestimates true disease burden due to a number of coexisting causative pathogens. Here, we employed molecular techniques in a survey of Cryptosporidium parvum, Giardia lamblia, Campylobacter jejuni, Escherichia coli O157:H7, Listeria monocytogenes, Salmonella spp., Shigella spp., Vibrio cholerae, and Rotavirus A densities in canal water with respect to seasonality and spatial distribution of point-nonpoint pollution sources. Three irrigation canals stretching across a nearly 150-km² periurban landscape, traditionally used for agricultural irrigation but functioning in recent years as a vital part of municipal wastewater stabilization, were investigated. Compiled stochastic data (pathogen concentration, susceptible populations) and literature-obtained deterministic data (pathogen dose-response model parameter values) were used in estimating waterborne gastroenteritis burden. Exposure scenarios include swimming or fishing, consuming canal water-irrigated vegetables, and ingesting or inhaling water aerosols while working in canal water-irrigated fields. The estimated annual gastroenteritis burden due to individual pathogens among the sampling points was -10.6 log10 to -2.2 log10 DALYs. The aggregated annual gastroenteritis burden due to all the target pathogens per sampling point was -3.1 log10 to -1.9 log10 DALYs, far exceeding the WHO acceptable limit of -6.0 log10 DALYs. The present approach will facilitate the comprehensive collection of surface water microbiological baseline data and the setting of benchmarks for interventions aimed at reducing microbial hazards in similar landscapes worldwide. PMID:22146856

  8. Literacy Toolkit

    ERIC Educational Resources Information Center

    Center for Best Practices in Early Childhood Education, 2005

    2005-01-01

    The toolkit contains print and electronic resources, including (1) "eMERGing Literacy and Technology: Working Together", a 492-page curriculum guide; (2) "LitTECH Interactive Presents: The Beginning of Literacy", a DVD that provides an overview linking technology to the concepts of emerging literacy; (3) "Your Preschool Classroom Computer Center:…

  9. Local Toolkit

    Energy Science and Technology Software Center (ESTSC)

    2007-05-31

    The LOCAL Toolkit contains tools and libraries developed under the LLNL LOCAL LDRD project for managing and processing large unstructured data sets, primarily from parallel numerical simulations, such as triangular, tetrahedral, and hexahedral meshes, point sets, and graphs. The tools have three main functionalities: cache-coherent, linear ordering of multidimensional data; lossy and lossless data compression optimized for different data types; and an out-of-core streaming I/O library with simple processing modules for unstructured data.
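
    A standard technique for cache-coherent linear ordering of multidimensional data is the Z-order (Morton) curve, which interleaves coordinate bits so that spatially nearby points land near each other in the linear order. The sketch below is a generic illustration of the idea, not the LOCAL implementation:

```python
def morton2d(x, y, bits=16):
    """Interleave the bits of (x, y) into a single Z-order index."""
    code = 0
    for i in range(bits):
        code |= ((x >> i) & 1) << (2 * i)      # x bits -> even positions
        code |= ((y >> i) & 1) << (2 * i + 1)  # y bits -> odd positions
    return code

# Sorting points by Morton code clusters spatial neighbors in memory,
# which improves cache behavior for out-of-core traversals.
points = [(3, 1), (0, 0), (1, 1), (2, 2)]
ordered = sorted(points, key=lambda p: morton2d(*p))
```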

  10. Tracker Toolkit

    NASA Technical Reports Server (NTRS)

    Lewis, Steven J.; Palacios, David M.

    2013-01-01

    This software can track multiple moving objects within a video stream simultaneously, use visual features to aid in the tracking, and initiate tracks based on object detection in a subregion. A simple programmatic interface allows plugging into larger image chain modeling suites. It extracts unique visual features for aid in tracking and later analysis, and includes sub-functionality for extracting visual features about an object identified within an image frame. Tracker Toolkit utilizes a feature extraction algorithm to tag each object with metadata features about its size, shape, color, and movement. Its functionality is independent of the scale of objects within a scene. The only assumption made on the tracked objects is that they move. There are no constraints on size within the scene, shape, or type of movement. The Tracker Toolkit is also capable of following an arbitrary number of objects in the same scene, identifying and propagating the track of each object from frame to frame. Target objects may be specified for tracking beforehand, or may be dynamically discovered within a tripwire region. Initialization of the Tracker Toolkit algorithm includes two steps: Initializing the data structures for tracked target objects, including targets preselected for tracking; and initializing the tripwire region. If no tripwire region is desired, this step is skipped. The tripwire region is an area within the frames that is always checked for new objects, and all new objects discovered within the region will be tracked until lost (by leaving the frame, stopping, or blending in to the background).
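
    The frame-to-frame track propagation described above can be approximated with nearest-centroid association: match each existing track to the closest new detection, and start fresh tracks for unmatched detections (as a tripwire region would). This is a generic sketch of that idea, not the Tracker Toolkit's actual algorithm:

```python
import math

def propagate_tracks(tracks, detections, max_dist=10.0):
    """Associate each track (id -> (x, y)) with its nearest detection;
    unmatched detections become new tracks."""
    updated, used = {}, set()
    next_id = max(tracks, default=-1) + 1
    for tid, pos in tracks.items():
        best, best_d = None, max_dist
        for i, det in enumerate(detections):
            if i in used:
                continue
            d = math.dist(pos, det)
            if d < best_d:
                best, best_d = i, d
        if best is not None:
            updated[tid] = detections[best]
            used.add(best)
    for i, det in enumerate(detections):
        if i not in used:          # new object entering the scene
            updated[next_id] = det
            next_id += 1
    return updated
```

    A production tracker would add velocity prediction and track termination for objects that leave the frame; the association step, however, has this basic shape.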

  11. A new open-source Python-based Space Weather data access, visualization, and analysis toolkit

    NASA Astrophysics Data System (ADS)

    de Larquier, S.; Ribeiro, A.; Frissell, N. A.; Spaleta, J.; Kunduri, B.; Thomas, E. G.; Ruohoniemi, J.; Baker, J. B.

    2013-12-01

    Space weather research relies heavily on combining and comparing data from multiple observational platforms. Current frameworks exist to aggregate some of the data sources, most based on file downloads via web or ftp interfaces. Empirical models are mostly Fortran-based and lack interfaces with more useful scripting languages. In an effort to improve data and model access, the SuperDARN community has been developing a Python-based Space Science Data Visualization Toolkit (DaViTpy). At the center of this development was a redesign of how our data (from 30 years of SuperDARN radars) was made available. Several access solutions are now wrapped into one convenient Python interface which probes local directories, a new remote NoSQL database, and an FTP server to retrieve the requested data based on availability. Motivated by the efficiency of this interface and the inherent need for data from multiple instruments, we implemented similar modules for other space science datasets (POES, OMNI, Kp, AE...), and also included fundamental empirical models with Python interfaces to enhance data analysis (IRI, HWM, MSIS...). All these modules and more are gathered in a single convenient toolkit, which is collaboratively developed and distributed using Github and continues to grow. While still in its early stages, we expect this toolkit will facilitate multi-instrument space weather research and improve scientific productivity.
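
    The availability-based retrieval described above (local directories, then a remote database, then an FTP server) is essentially a fallback chain. The sketch below illustrates that pattern with placeholder stub sources; it is not the actual DaViTpy API:

```python
def fetch(key, sources):
    """Try each (name, lookup) source in order and return the first hit,
    mirroring availability-based retrieval across storage tiers."""
    for name, lookup in sources:
        data = lookup(key)
        if data is not None:
            return name, data
    raise KeyError(f"{key!r} not found in any source")

# Stubs standing in for a local cache, a remote NoSQL DB, and an FTP mirror.
local = {"2012-01-01": None}                 # cache miss
database = {"2012-01-01": "radar records"}   # hit in the database tier
sources = [("local", local.get), ("db", database.get), ("ftp", lambda k: None)]
```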

  12. Teacher Quality Toolkit

    ERIC Educational Resources Information Center

    Lauer, Patricia A.; Dean, Ceri B.

    2004-01-01

    This Teacher Quality Toolkit aims to support the continuum of teacher learning by providing tools that institutions of higher education, districts, and schools can use to improve both preservice and inservice teacher education. The toolkit incorporates McREL?s accumulated knowledge and experience related to teacher quality and standards-based…

  13. Community Schools Evaluation Toolkit

    ERIC Educational Resources Information Center

    Shah, Shital C.; Brink, Katrina; London, Rebecca; Masur, Shelly; Quihuis, Gisell

    2009-01-01

    This toolkit is designed to help community schools evaluate their efforts so that they learn from their successes, identify current challenges, and plan future efforts. It provides a step-by-step process for planning and conducting an evaluation at your community school site(s). The toolkit is a practical, hands-on guide that makes it possible for…

  14. Student Success Center Toolkit

    ERIC Educational Resources Information Center

    Jobs For the Future, 2014

    2014-01-01

    "Student Success Center Toolkit" is a compilation of materials organized to assist Student Success Center directors as they staff, launch, operate, and sustain Centers. The toolkit features materials created and used by existing Centers, such as staffing and budgeting templates, launch materials, sample meeting agendas, and fundraising…

  15. TOOLKIT, Version 2. 0

    SciTech Connect

    Schroeder, E.; Bagot, B.; McNeill, R.L.

    1990-05-09

    The purpose of this User's Guide is to show by example many of the features of Toolkit II. Some examples will be copies of screens as they appear while running the Toolkit. Other examples will show what the user should enter in various situations; in these instances, what the computer asserts will be in boldface and what the user responds will be in regular type. The User's Guide is divided into four sections. The first section, "FOCUS Databases", will give a broad overview of the Focus administrative databases that are available on the VAX; easy-to-use reports are available for most of them in the Toolkit. The second section, "Getting Started", will cover the steps necessary to log onto the Computer Center VAX cluster and how to start Focus and the Toolkit. The third section, "Using the Toolkit", will discuss some of the features in the Toolkit -- the available reports and how to access them, as well as some utilities. The fourth section, "Helpful Hints", will cover some useful facts about the VAX and Focus as well as some of the more common problems that can occur. The Toolkit is not set in concrete but is continually being revised and improved. If you have any opinions as to changes that you would like to see made to the Toolkit or new features that you would like included, please let us know. Since we do try to respond to the needs of the user and make periodic improvements to the Toolkit, this User's Guide may not correspond exactly to what is available in the computer. In general, changes are made to provide new options or features; rarely is an existing feature deleted.

  16. JAVA Stereo Display Toolkit

    NASA Technical Reports Server (NTRS)

    Edmonds, Karina

    2008-01-01

    This toolkit provides a common interface for displaying graphical user interface (GUI) components in stereo using either specialized stereo display hardware (e.g., liquid crystal shutter or polarized glasses) or anaglyph display (red/blue glasses) on standard workstation displays. An application using this toolkit will work without modification in either environment, allowing stereo software to reach a wider audience without sacrificing high-quality display on dedicated hardware. The toolkit is written in Java for use with the Swing GUI Toolkit and has cross-platform compatibility. It hooks into the graphics system, allowing any standard Swing component to be displayed in stereo. It uses the OpenGL graphics library to control the stereo hardware and to perform the rendering. It also supports anaglyph and special stereo hardware using the same API (application-program interface), and has the ability to simulate color stereo in anaglyph mode by combining the red band of the left image with the green/blue bands of the right image. This is a low-level toolkit that simply accomplishes the display of components (including the JadeDisplay image display component). It does not include higher-level functions such as disparity adjustment, 3D cursor, or overlays, all of which can be built using this toolkit.
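
    The color-anaglyph trick mentioned above, taking the red band from the left eye's image and the green/blue bands from the right eye's, reduces to a per-pixel channel merge. A minimal sketch on plain RGB tuples, not the toolkit's OpenGL rendering path:

```python
def anaglyph(left, right):
    """Merge two same-sized RGB images (nested lists of (r, g, b) tuples):
    red from the left view, green/blue from the right view."""
    return [
        [(lp[0], rp[1], rp[2]) for lp, rp in zip(lrow, rrow)]
        for lrow, rrow in zip(left, right)
    ]
```

    Viewed through red/blue glasses, each eye then sees (approximately) its own image, which is what produces the depth illusion.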

  17. System Design Toolkit for Integrated Modular Avionics for Space

    NASA Astrophysics Data System (ADS)

    Hann, Mark; Balbastre Betoret, Patricia; Simo Ten, Jose Enrique; De Ferluc, Regis; Ramachandran, Jinesh

    2015-09-01

    The IMA-SP development process identified that tools were needed to perform the activities of: i) partitioning and resource allocation and ii) system feasibility assessment. This paper describes the definition, design, implementation, and testing of the tool support required to perform the IMA-SP development process activities. This includes the definition of a data model, with associated files and file formats, describing the complete setup of a partitioned system and allowing system feasibility assessment; the development of a prototype of the tool set, called the IMA-SP System Design Toolkit (SDT); and the demonstration of the toolkit on a case study.
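
    In its simplest form, a feasibility assessment for a partitioned system checks that each partition's applications fit within that partition's CPU budget and that the budgets together do not oversubscribe the processor. The data layout below is a hypothetical illustration, not the SDT's actual data model:

```python
def feasible(partitions):
    """partitions: {name: (cpu_budget, [app_utilizations])}.
    Feasible only if every partition's apps fit inside its budget
    and the budgets sum to at most the full processor (1.0)."""
    total = 0.0
    for budget, apps in partitions.values():
        if sum(apps) > budget:
            return False       # partition internally overloaded
        total += budget
    return total <= 1.0        # partitions must share one CPU
```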

  18. The Einstein Toolkit

    NASA Astrophysics Data System (ADS)

    Löffler, Frank

    2012-03-01

    The Einstein Toolkit Consortium is developing and supporting open software for relativistic astrophysics. Its aim is to provide the core computational tools that can enable new science, broaden our community, facilitate interdisciplinary research and take advantage of petascale computers and advanced cyberinfrastructure. The Einstein Toolkit currently consists of an open set of over 100 modules for the Cactus framework, primarily for computational relativity along with associated tools for simulation management and visualization. The toolkit includes solvers for vacuum spacetimes as well as relativistic magneto-hydrodynamics, along with modules for initial data, analysis and computational infrastructure. These modules have been developed and improved over many years by many different researchers. The Einstein Toolkit is supported by a distributed model, combining core support of software, tools, and documentation in its own repositories and through partnerships with other developers who contribute open software and coordinate together on development. As of January 2012 it has 68 registered members from 30 research groups world-wide. This talk will present the current capabilities of the Einstein Toolkit and will point to information on how to leverage it for future research.

  19. Knowledge information management toolkit and method

    DOEpatents

    Hempstead, Antoinette R.; Brown, Kenneth L.

    2006-08-15

    A system is provided for managing user entry and/or modification of knowledge information into a knowledge base file having an integrator support component and a data source access support component. The system includes processing circuitry, memory, a user interface, and a knowledge base toolkit. The memory communicates with the processing circuitry and is configured to store at least one knowledge base. The user interface communicates with the processing circuitry and is configured for user entry and/or modification of knowledge pieces within a knowledge base. The knowledge base toolkit is configured for converting knowledge in at least one knowledge base from a first knowledge base form into a second knowledge base form. A method is also provided.

  20. Water Security Toolkit

    Energy Science and Technology Software Center (ESTSC)

    2012-09-11

    The Water Security Toolkit (WST) provides software for modeling and analyzing water distribution systems to minimize the potential impact of contamination incidents. WST wraps capabilities for contaminant transport, impact assessment, and sensor network design with response action plans, including source identification, rerouting, and decontamination, to provide a range of water security planning and real-time applications.

  1. THE EPANET PROGRAMMER'S TOOLKIT FOR ANALYSIS OF WATER DISTRIBUTION SYSTEMS

    EPA Science Inventory

    The EPANET Programmer's Toolkit is a collection of functions that helps simplify computer programming of water distribution network analyses. The functions can be used to read in a pipe network description file, modify selected component properties, run multiple hydraulic and wa...

  2. Parallel Climate Analysis Toolkit (ParCAT)

    Energy Science and Technology Software Center (ESTSC)

    2013-06-30

    The parallel analysis toolkit (ParCAT) provides parallel statistical processing of large climate model simulation datasets. ParCAT provides parallel point-wise average calculations, frequency distributions, sums/differences of two datasets, and difference-of-average and average-of-difference for two datasets for arbitrary subsets of simulation time. ParCAT is a command-line utility that can be easily integrated into scripts or embedded in other applications. ParCAT supports CMIP5 post-processed datasets as well as non-CMIP5 post-processed datasets. ParCAT reads and writes standard netCDF files.

  3. Self-assessment toolkit.

    PubMed

    2016-09-01

    A new health and integration toolkit has been launched by NHS Clinical Commissioners, in partnership with the Local Government Association, NHS Confederation and the Association of Directors of Adult Services. The self-assessment tool is designed to help local health and care leaders, through health and well-being boards, to assess their ambition, capability, capacity and readiness to integrate local health and social care services. PMID:27581897

  4. Network algorithms for information analysis using the Titan Toolkit.

    SciTech Connect

    McLendon, William Clarence, III; Baumes, Jeffrey; Wilson, Andrew T.; Wylie, Brian Neil; Shead, Timothy M.

    2010-07-01

    The analysis of networked activities is dramatically more challenging than many traditional kinds of analysis. A network is defined by a set of entities (people, organizations, banks, computers, etc.) linked by various types of relationships. These entities and relationships are often uninteresting alone, and only become significant in aggregate. The analysis and visualization of these networks is one of the driving factors behind the creation of the Titan Toolkit. Given the broad set of problem domains and the wide ranging databases in use by the information analysis community, the Titan Toolkit's flexible, component based pipeline provides an excellent platform for constructing specific combinations of network algorithms and visualizations.
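
    The point that entities and relationships "only become significant in aggregate" can be illustrated with a minimal relationship graph: each individual link looks routine, but aggregating degree counts exposes hub entities. A generic sketch, not Titan's component pipeline:

```python
from collections import Counter

def degree_counts(edges):
    """Count how many relationships touch each entity."""
    deg = Counter()
    for a, b in edges:
        deg[a] += 1
        deg[b] += 1
    return deg

# Any single transfer is unremarkable; in aggregate, bank_X is a hub.
edges = [("alice", "bank_X"), ("bob", "bank_X"),
         ("carol", "bank_X"), ("alice", "bob")]
hub, hub_degree = degree_counts(edges).most_common(1)[0]
```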

  5. Molecular model generator toolkit

    SciTech Connect

    Schneider, R.D.

    1994-07-01

    This report is a user manual for an ASCII file of Fortran source code which must be compiled before use. The software will assist in creating plastic models of molecules whose specifications are described in the Brookhaven Protein Databank. Other data files can be used if they are in the same format as the files in the databank. The output file is a program for a 3-D Systems Stereolithography Apparatus and the program is run on a SGI Indigo workstation.

  6. The Weather and Climate Toolkit

    NASA Astrophysics Data System (ADS)

    Ansari, S.; Del Greco, S.; Hankins, B.

    2010-12-01

    The Weather and Climate Toolkit (WCT) is free, platform-independent software distributed from NOAA's National Climatic Data Center (NCDC). The WCT allows the visualization and data export of weather and climate data, including radar, satellite, and model data. By leveraging the NetCDF for Java library and Common Data Model, the WCT is extremely scalable and capable of supporting many new datasets in the future. Gridded NetCDF files (regular and irregularly spaced, using Climate and Forecast (CF) conventions) are supported, along with many other formats including GRIB. The WCT provides tools for custom data overlays, Web Map Service (WMS) background maps, animations and basic filtering. The export of images and movies is provided in multiple formats. The WCT Data Export Wizard allows for data export in both vector polygon/point (Shapefile, Well-Known Text) and raster (GeoTIFF, ESRI Grid, VTK, Gridded NetCDF) formats. These data export features promote the interoperability of weather and climate information with various scientific communities and common software packages such as ArcGIS, Google Earth, MatLAB, GrADS and R. The WCT also supports an embedded, integrated Google Earth instance. The Google Earth Browser Plugin allows seamless visualization of data on a native 3-D Google Earth instance linked to the standard 2-D map. (Figure: Level-II NEXRAD data for Hurricane Katrina and the GPCP Global Precipitation Product, visualized in 2-D and in the internal Google Earth view.)

  7. Radiation source search toolkit

    NASA Astrophysics Data System (ADS)

    Young, Jason S.

    The newly developed Radiation Source Search Toolkit (RSST) is a toolkit for generating gamma-ray spectroscopy data for use in the testing of source search algorithms. RSST is designed in a modular fashion to allow for ease of use while still maintaining accuracy in developing the output spectra. Users are allowed to define a real-world path for mobile radiation detectors to travel as well as radiation sources for possible detection. RSST can accept measured or simulated radiation spectrum data for generation into a source search simulation. RSST handles traversing the path, computing distance related attenuation, and generating the final output spectra. RSST also has the ability to simulate anisotropic shielding as well as traffic conditions that would impede a ground-based detection platform in a real-world scenario. RSST provides a novel fusion between spectral data and geospatial source search data generation. By utilizing the RSST, researchers can easily generate multiple datasets for testing detection algorithms without the need for actual radiation sources and mobile detector platforms.
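
    The distance-related attenuation RSST computes along the detector path can be modeled, to first order, with the inverse-square law (real spectra also need air attenuation and detector response). The path and source position below are hypothetical:

```python
def counts_along_path(path, source, strength):
    """Inverse-square count rate at each detector position.
    path: list of (x, y) positions; source: (x, y); strength: rate at 1 m."""
    rates = []
    for x, y in path:
        r2 = (x - source[0]) ** 2 + (y - source[1]) ** 2
        rates.append(strength / r2 if r2 > 0 else float("inf"))
    return rates

# A mobile detector drives past a source at (0, 2); the count rate
# peaks at the point of closest approach.
path = [(-4.0, 0.0), (0.0, 0.0), (4.0, 0.0)]
rates = counts_along_path(path, (0.0, 2.0), strength=1000.0)
```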

  8. The MIS Pipeline Toolkit

    NASA Astrophysics Data System (ADS)

    Teuben, Peter J.; Pound, M. W.; Storm, S.; Mundy, L. G.; Salter, D. M.; Lee, K.; Kwon, W.; Fernandez Lopez, M.; Plunkett, A.

    2013-01-01

    A pipeline toolkit was developed to help organize, reduce, and analyze a large number of near-identical datasets. This is a very general problem, for which many different solutions have been implemented. In this poster we present one such solution that lends itself to users of the Unix command line, using the Unix "make" utility, and adapts itself easily to observational as well as theoretical projects. Two examples are given, one from the CARMA CLASSy survey, and another from a simulated kinematic survey of early galaxy-forming disks. The CLASSy survey (discussed in more detail in three accompanying posters) consists of 5 different star-forming regions, observed with CARMA, each containing roughly 10-20 datasets in continuum and 3 different molecular lines, that need to be combined into final data cubes and maps. The strength of such a pipeline toolkit shows itself as new data are accumulated: the data reduction steps are improved and easily re-applied to previously taken data. For this we employed a master script that was run nightly, and collaborators submitted improved scripts and/or the pipeline parameters that control them. MIS is freely available for download.

  9. Mission Simulation Toolkit

    NASA Technical Reports Server (NTRS)

    Pisaich, Gregory; Flueckiger, Lorenzo; Neukom, Christian; Wagner, Mike; Buchanan, Eric; Plice, Laura

    2007-01-01

    The Mission Simulation Toolkit (MST) is a flexible software system for autonomy research. It was developed as part of the Mission Simulation Facility (MSF) project that was started in 2001 to facilitate the development of autonomous planetary robotic missions. Autonomy is a key enabling factor for robotic exploration. There has been a large gap between autonomy software (at the research level), and software that is ready for insertion into near-term space missions. The MST bridges this gap by providing a simulation framework and a suite of tools for supporting research and maturation of autonomy. MST uses a distributed framework based on the High Level Architecture (HLA) standard. A key feature of the MST framework is the ability to plug in new models to replace existing ones with the same services. This enables significant simulation flexibility, particularly the mixing and control of fidelity level. In addition, the MST provides automatic code generation from robot interfaces defined with the Unified Modeling Language (UML), methods for maintaining synchronization across distributed simulation systems, XML-based robot description, and an environment server. Finally, the MSF supports a number of third-party products including dynamic models and terrain databases. Although the communication objects and some of the simulation components that are provided with this toolkit are specifically designed for terrestrial surface rovers, the MST can be applied to any other domain, such as aerial, aquatic, or space.

  10. Multiphysics Application Coupling Toolkit

    Energy Science and Technology Software Center (ESTSC)

    2013-12-02

    This particular consortium implementation of the software integration infrastructure will, in large part, refactor portions of the Rocstar multiphysics infrastructure. Development of this infrastructure originated at the University of Illinois DOE ASCI Center for Simulation of Advanced Rockets (CSAR) to support the center's massively parallel multiphysics simulation application, Rocstar, and has continued at IllinoisRocstar, a small company formed near the end of the University-based program. IllinoisRocstar is now licensing these new developments as free, open source, in hopes to help improve their own and others' access to infrastructure which can be readily utilized in developing coupled or composite software systems; with particular attention to more rapid production and utilization of multiphysics applications in the HPC environment. There are two major pieces to the consortium implementation, the Application Component Toolkit (ACT), and the Multiphysics Application Coupling Toolkit (MPACT). The current development focus is the ACT, which is (will be) the substrate for MPACT. The ACT itself is built up from the components described in the technical approach. In particular, the ACT has the following major components: 1.The Component Object Manager (COM): The COM package provides encapsulation of user applications, and their data. COM also provides the inter-component function call mechanism. 2.The System Integration Manager (SIM): The SIM package provides constructs and mechanisms for orchestrating composite systems of multiply integrated pieces.

  11. NAIF Toolkit - Extended

    NASA Technical Reports Server (NTRS)

    Acton, Charles H., Jr.; Bachman, Nathaniel J.; Semenov, Boris V.; Wright, Edward D.

    2010-01-01

    The Navigation Ancillary Information Facility (NAIF) at JPL, acting under the direction of NASA's Office of Space Science, has built a data system named SPICE (Spacecraft Planet Instrument C-matrix Events) to assist scientists in planning and interpreting scientific observations (see figure). SPICE provides geometric and some other ancillary information needed to recover the full value of science instrument data, including correlation of individual instrument data sets with data from other instruments on the same or other spacecraft. This data system is used to produce space mission observation geometry data sets known as SPICE kernels. It is also used to read SPICE kernels and to compute derived quantities such as positions, orientations, lighting angles, etc. The SPICE toolkit consists of a subroutine/function library, executable programs (both large applications and simple utilities that focus on kernel management), and simple examples of using SPICE toolkit subroutines. This software is very accurate, thoroughly tested, and portable to all computers. It is extremely stable and reusable on all missions. Since the previous version, three significant capabilities have been added: an Interactive Data Language (IDL) interface, a MATLAB interface, and a geometric event finder subsystem.
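    The derived geometric quantities mentioned above (positions, orientations, lighting angles) can be illustrated with a small stand-alone sketch. The function below computes a phase (lighting) angle at a target body from two made-up position vectors; it is a conceptual illustration only, not a call into the real SPICE/NAIF library, and the vector values are invented for the example.

    ```python
    import math

    def phase_angle(sun_to_target, observer_to_target):
        """Angle at the target between the directions to the Sun and to the
        observer -- the kind of derived lighting angle SPICE computes from
        kernel data (illustrative inputs, not real SPICE output)."""
        # Reverse both vectors so they point *away from* the target.
        a = [-c for c in sun_to_target]
        b = [-c for c in observer_to_target]
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(x * x for x in b))
        return math.degrees(math.acos(dot / (na * nb)))

    # Sun on the +X side of the target, observer on the +Y side: 90 degrees.
    print(round(phase_angle([-1.0, 0.0, 0.0], [0.0, -1.0, 0.0]), 1))
    ```

    In the real toolkit, the two vectors would come from kernel lookups rather than literals.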

  12. Multiphysics Application Coupling Toolkit

    SciTech Connect

    Campbell, Michael T.

    2013-12-02

    This particular consortium implementation of the software integration infrastructure will, in large part, refactor portions of the Rocstar multiphysics infrastructure. Development of this infrastructure originated at the University of Illinois DOE ASCI Center for Simulation of Advanced Rockets (CSAR) to support the center's massively parallel multiphysics simulation application, Rocstar, and has continued at IllinoisRocstar, a small company formed near the end of the University-based program. IllinoisRocstar is now licensing these new developments as free, open source, in hopes of helping improve their own and others' access to infrastructure that can be readily utilized in developing coupled or composite software systems, with particular attention to more rapid production and utilization of multiphysics applications in the HPC environment. There are two major pieces to the consortium implementation: the Application Component Toolkit (ACT) and the Multiphysics Application Coupling Toolkit (MPACT). The current development focus is the ACT, which is (will be) the substrate for MPACT. The ACT itself is built up from the components described in the technical approach. In particular, the ACT has the following major components: 1. The Component Object Manager (COM): The COM package provides encapsulation of user applications and their data. COM also provides the inter-component function call mechanism. 2. The System Integration Manager (SIM): The SIM package provides constructs and mechanisms for orchestrating composite systems of multiply integrated pieces.

  13. Einstein Toolkit for Relativistic Astrophysics

    NASA Astrophysics Data System (ADS)

    Collaborative Effort

    2011-02-01

    The Einstein Toolkit is a collection of software components and tools for simulating and analyzing general relativistic astrophysical systems. Such systems include gravitational wave space-times, collisions of compact objects such as black holes or neutron stars, accretion onto compact objects, core collapse supernovae and Gamma-Ray Bursts. The Einstein Toolkit builds on numerous software efforts in the numerical relativity community including CactusEinstein, Whisky, and Carpet. The Einstein Toolkit currently uses the Cactus Framework as the underlying computational infrastructure that provides large-scale parallelization, general computational components, and a model for collaborative, portable code development.

  14. A Prototype Search Toolkit

    NASA Astrophysics Data System (ADS)

    Knepper, Margaret M.; Fox, Kevin L.; Frieder, Ophir

    Information overload is now a reality. We no longer worry about obtaining a sufficient volume of data; we now are concerned with sifting and understanding the massive volumes of data available to us. To do so, we developed an integrated information processing toolkit that provides the user with a variety of ways to view their information. The views include keyword search results; a domain-specific ranking system that allows for adaptively capturing topic vocabularies to customize and focus the search results; navigation pages for browsing; and a geospatial and temporal component to visualize results in time and space and provide “what if” scenario playing. Integrating the information from different tools and sources gives the user additional information and another way to analyze the data. An example of the integration is illustrated on reports of avian influenza (bird flu).
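    The domain-specific ranking idea described above can be sketched as scoring documents against a weighted topic vocabulary. This is a toy stand-in: the scoring rule (weighted term counts), the vocabulary, and the sample records are all assumptions, not the toolkit's actual algorithm.

    ```python
    from collections import Counter

    def rank_documents(docs, topic_vocabulary):
        """Rank documents by weighted overlap with a topic vocabulary
        (a hypothetical scoring rule, illustrative only)."""
        scores = []
        for doc_id, text in docs.items():
            counts = Counter(text.lower().split())
            score = sum(counts[term] * weight
                        for term, weight in topic_vocabulary.items())
            scores.append((score, doc_id))
        # Highest score first.
        return [doc_id for score, doc_id in sorted(scores, reverse=True)]

    docs = {
        "r1": "avian influenza outbreak reported in poultry",
        "r2": "seasonal weather report",
        "r3": "influenza cases in birds rise as avian flu spreads",
    }
    vocab = {"avian": 2.0, "influenza": 2.0, "flu": 1.5, "poultry": 1.0}
    print(rank_documents(docs, vocab))  # most topic-relevant record first
    ```

    An adaptive system would update the vocabulary weights as the user refines the topic.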

  15. Mesh Quality Improvement Toolkit

    Energy Science and Technology Software Center (ESTSC)

    2002-11-15

    MESQUITE is a linkable software library to be used by simulation and mesh generation tools to improve the quality of meshes. Mesh quality is improved by node movement and/or local topological modifications. Various aspects of mesh quality, such as smoothness, element shape, size, and orientation, are controlled by choosing the appropriate mesh quality metric, objective function template, and numerical optimization solver. MESQUITE uses the TSTT mesh interface specification to provide an interoperable toolkit that can be used by applications which adopt the standard. A flexible code design makes it easy for meshing researchers to add additional mesh quality metrics, templates, and solvers to develop new quality improvement algorithms by making use of the MESQUITE infrastructure.
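    A concrete example of a mesh quality metric of the kind MESQUITE optimizes is the classic triangle shape measure 4√3·area / (sum of squared edge lengths), which equals 1 for an equilateral element and tends to 0 as the element degenerates. The sketch below is a generic metric from the meshing literature, not code taken from MESQUITE.

    ```python
    import math

    def triangle_shape_quality(p0, p1, p2):
        """Shape metric 4*sqrt(3)*area / (sum of squared edge lengths):
        1.0 for an equilateral triangle, approaching 0 for a sliver."""
        def d2(a, b):
            return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2
        area = 0.5 * abs((p1[0] - p0[0]) * (p2[1] - p0[1])
                         - (p2[0] - p0[0]) * (p1[1] - p0[1]))
        return 4.0 * math.sqrt(3.0) * area / (d2(p0, p1) + d2(p1, p2) + d2(p2, p0))

    equilateral = ((0.0, 0.0), (1.0, 0.0), (0.5, math.sqrt(3.0) / 2.0))
    sliver = ((0.0, 0.0), (1.0, 0.0), (0.5, 0.05))
    print(round(triangle_shape_quality(*equilateral), 3))  # 1.0
    print(round(triangle_shape_quality(*sliver), 3))       # far below 1
    ```

    A smoother would move nodes to maximize such a metric (or minimize an objective built from it) over the whole mesh.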

  16. TOOLKIT FOR ADVANCED OPTIMIZATION

    Energy Science and Technology Software Center (ESTSC)

    2000-10-13

    The TAO project focuses on the development of software for large-scale optimization problems. TAO uses an object-oriented design to create a flexible toolkit with strong emphasis on the reuse of external tools where appropriate. Our design enables bi-directional connection to lower-level linear algebra support (for example, parallel sparse matrix data structures) as well as higher-level application frameworks. The Toolkit for Advanced Optimization (TAO) is aimed at the solution of large-scale optimization problems on high-performance architectures. Our main goals are portability, performance, scalable parallelism, and an interface independent of the architecture. TAO is suitable for both single-processor and massively parallel architectures. The current version of TAO has algorithms for unconstrained and bound-constrained optimization.
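    Bound-constrained optimization, one of the problem classes mentioned above, can be illustrated with a minimal projected gradient descent loop: take a gradient step, then clip each variable back into its bounds. This is a toy serial method, not one of TAO's actual (far more sophisticated, parallel) solvers.

    ```python
    def projected_gradient_descent(grad, x0, lower, upper, step=0.1, iters=500):
        """Minimize a smooth function subject to simple bounds by gradient
        steps followed by projection onto the box [lower, upper]."""
        x = list(x0)
        for _ in range(iters):
            g = grad(x)
            x = [min(max(xi - step * gi, lo), hi)
                 for xi, gi, lo, hi in zip(x, g, lower, upper)]
        return x

    # Minimize (x-3)^2 + (y+1)^2 subject to 0 <= x <= 2, 0 <= y <= 2.
    grad = lambda v: [2.0 * (v[0] - 3.0), 2.0 * (v[1] + 1.0)]
    x = projected_gradient_descent(grad, [1.0, 1.0], [0.0, 0.0], [2.0, 2.0])
    print([round(c, 3) for c in x])  # the constrained minimizer is (2, 0)
    ```

    The unconstrained minimum (3, -1) lies outside the box, so the projection pins the solution to the boundary at (2, 0).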

  17. ParCAT: Parallel Climate Analysis Toolkit

    SciTech Connect

    Smith, Brian E.; Steed, Chad A.; Shipman, Galen M.; Ricciuto, Daniel M.; Thornton, Peter E.; Wehner, Michael; Williams, Dean N.

    2013-01-01

    Climate science is employing increasingly complex models and simulations to analyze the past and predict the future of Earth's climate. This growth in complexity is creating a widening gap between the data being produced and the ability to analyze the datasets. Parallel computing tools are necessary to analyze, compare, and interpret the simulation data. The Parallel Climate Analysis Toolkit (ParCAT) provides basic tools to efficiently use parallel computing techniques to make analysis of these datasets manageable. The toolkit provides the ability to compute spatio-temporal means, differences between runs or differences between averages of runs, and histograms of the values in a data set. ParCAT is implemented as a command-line utility written in C. This allows for easy integration in other tools and allows for use in scripts. This also makes it possible to run ParCAT on many platforms from laptops to supercomputers. ParCAT outputs NetCDF files so it is compatible with existing utilities such as Panoply and UV-CDAT. This paper describes ParCAT and presents results from some example runs on the Titan system at ORNL.
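    The basic operations listed above (spatio-temporal means and run differences) can be sketched in a few lines of serial Python. This illustrates what ParCAT computes, not its actual C command-line interface; the in-memory `field[time][lat][lon]` layout is an assumption standing in for NetCDF input, and the values are invented.

    ```python
    from statistics import mean

    def spatio_temporal_mean(field):
        """Mean over all time steps and grid cells of field[time][lat][lon]."""
        return mean(v for t in field for row in t for v in row)

    def run_difference(field_a, field_b):
        """Cell-wise difference between two model runs (serial stand-in for
        the parallel, NetCDF-based operation)."""
        return [[[a - b for a, b in zip(ra, rb)]
                 for ra, rb in zip(ta, tb)]
                for ta, tb in zip(field_a, field_b)]

    # Two tiny runs: 2 time steps over a 2x2 grid.
    run1 = [[[1.0, 2.0], [3.0, 4.0]], [[5.0, 6.0], [7.0, 8.0]]]
    run2 = [[[0.0, 1.0], [2.0, 3.0]], [[4.0, 5.0], [6.0, 7.0]]]
    print(spatio_temporal_mean(run1))                        # 4.5
    print(spatio_temporal_mean(run_difference(run1, run2)))  # 1.0
    ```

    The parallel version distributes the grid across ranks and reduces the partial sums.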

  18. Introducing the Ginga FITS Viewer and Toolkit

    NASA Astrophysics Data System (ADS)

    Jeschke, E.; Inagaki, T.; Kackley, R.

    2013-10-01

    We introduce Ginga, a new open-source FITS viewer and toolkit based on Python astronomical packages such as pyfits, numpy, scipy, matplotlib, and pywcs. For developers, we present a set of Python classes for viewing FITS files under the modern Gtk and Qt widget sets and a more full-featured viewer that has a plugin architecture. We further describe how plugins can be written to extend the viewer with many different capabilities. The software may be of interest to software developers who are looking for a solution for integrating FITS visualization into their Python programs and end users interested in a new and different FITS viewer that is not based on Tcl/Tk widget technology. The software has been released under a BSD license.

  19. ParCAT: A Parallel Climate Analysis Toolkit

    NASA Astrophysics Data System (ADS)

    Haugen, B.; Smith, B.; Steed, C.; Ricciuto, D. M.; Thornton, P. E.; Shipman, G.

    2012-12-01

    Climate science has employed increasingly complex models and simulations to analyze the past and predict the future of our climate. The size and dimensionality of climate simulation data has been growing with the complexity of the models. This growth in data is creating a widening gap between the data being produced and the tools necessary to analyze large, high dimensional data sets. With single run data sets increasing into 10s, 100s, and even 1000s of gigabytes, parallel computing tools are becoming a necessity in order to analyze and compare climate simulation data. The Parallel Climate Analysis Toolkit (ParCAT) provides basic tools that efficiently use parallel computing techniques to narrow the gap between data set size and analysis tools. ParCAT was created as a collaborative effort between climate scientists and computer scientists in order to provide efficient parallel implementations of the computing tools that are of use to climate scientists. Some of the basic functionalities included in the toolkit are the ability to compute spatio-temporal means and variances, differences between two runs and histograms of the values in a data set. ParCAT is designed to facilitate the "heavy lifting" that is required for large, multidimensional data sets. The toolkit does not focus on performing the final visualizations and presentation of results but rather on reducing large data sets to smaller, more manageable summaries. The output from ParCAT is provided in commonly used file formats (NetCDF, CSV, ASCII) to allow for simple integration with other tools. The toolkit is currently implemented as a command line utility, but will likely also provide a C library for developers interested in tighter software integration. Elements of the toolkit are already being incorporated into projects such as UV-CDAT and CMDX. There is also an effort underway to implement portions of the CCSM Land Model Diagnostics package using ParCAT in conjunction with Python and gnuplot.

  20. Simplifying operations with an uplink/downlink integration toolkit

    NASA Technical Reports Server (NTRS)

    Murphy, Susan C.; Miller, Kevin J.; Guerrero, Ana Maria; Joe, Chester; Louie, John J.; Aguilera, Christine

    1994-01-01

    The Operations Engineering Lab (OEL) at JPL has developed a simple, generic toolkit to integrate the uplink/downlink processes (often called "closing the loop") in JPL's Multimission Ground Data System. This toolkit provides capabilities for integrating telemetry verification points with predicted spacecraft commands and ground events in the Mission Sequence Of Events (SOE) document. In the JPL ground data system, the uplink processing functions and the downlink processing functions are separate subsystems that are not well integrated because of the nature of planetary missions with large one-way light times for spacecraft-to-ground communication. Our new closed-loop monitoring tool allows an analyst or mission controller to view and save uplink commands and ground events with their corresponding downlinked telemetry values regardless of the delay in downlink telemetry and without requiring real-time intervention by the user. An SOE document is a time-ordered list of all the planned ground and spacecraft events, including all commands, sequence loads, ground events, significant mission activities, spacecraft status, and resource allocations. The SOE document is generated by expansion and integration of spacecraft sequence files, ground station allocations, navigation files, and other ground event files. This SOE generation process has been automated within the OEL and includes a graphical, object-oriented SOE editor and real-time viewing tool running under X/Motif. The SOE toolkit was used as the framework for the integrated implementation. The SOE is used by flight engineers to coordinate their operations tasks, serving as a predict data set in ground operations and mission control. The closed-loop SOE toolkit allows simple, automated integration of predicted uplink events with correlated telemetry points in a single SOE document for on-screen viewing and archiving. It automatically interfaces with existing real-time or non real-time sources of information, to

  1. A Scalable Analysis Toolkit

    NASA Technical Reports Server (NTRS)

    Aiken, Alexander

    2001-01-01

    The Scalable Analysis Toolkit (SAT) project aimed to demonstrate that it is feasible and useful to statically detect software bugs in very large systems. The technical focus of the project was on a relatively new class of constraint-based techniques for analysis software, where the desired facts about programs (e.g., the presence of a particular bug) are phrased as constraint problems to be solved. At the beginning of this project, the most successful forms of formal software analysis were limited forms of automatic theorem proving (as exemplified by the analyses used in language type systems and optimizing compilers), semi-automatic theorem proving for full verification, and model checking. With a few notable exceptions these approaches had not been demonstrated to scale to software systems of even 50,000 lines of code. Realistic approaches to large-scale software analysis cannot hope to make every conceivable formal method scale. Thus, the SAT approach is to mix different methods in one application by using coarse and fast but still adequate methods at the largest scales, and reserving the use of more precise but also more expensive methods at smaller scales for critical aspects (that is, aspects critical to the analysis problem under consideration) of a software system. The principled method proposed for combining a heterogeneous collection of formal systems with different scalability characteristics is mixed constraints. This idea had been used previously in small-scale applications with encouraging results: using mostly coarse methods and narrowly targeted precise methods, useful information (meaning the discovery of bugs in real programs) was obtained with excellent scalability.

  2. Pizza.py Toolkit

    Energy Science and Technology Software Center (ESTSC)

    2006-01-01

    Pizza.py is a loosely integrated collection of tools, many of which provide support for the LAMMPS molecular dynamics and ChemCell cell modeling packages. There are tools to create input files, convert between file formats, process log and dump files, create plots, and visualize and animate simulation snapshots. Software packages that are wrapped by Pizza.py, so they can be invoked from within Python, include GnuPlot, MatLab, Raster3d, and RasMol. Pizza.py is written in Python and runs on any platform that supports Python. Pizza.py enhances the standard Python interpreter in a few simple ways. Its tools are Python modules which can be invoked interactively, from scripts, or from GUIs when appropriate. Some of the tools require additional Python packages to be installed as part of the user's Python. Others are wrappers on software packages (as listed above) which must be available on the user's system. It is easy to modify or extend Pizza.py with new functionality or new tools, which need not have anything to do with LAMMPS or ChemCell.

  3. Pizza.py Toolkit

    SciTech Connect

    Plimpton, Steve; Jones, Matt; Crozier, Paul

    2006-01-01

    Pizza.py is a loosely integrated collection of tools, many of which provide support for the LAMMPS molecular dynamics and ChemCell cell modeling packages. There are tools to create input files, convert between file formats, process log and dump files, create plots, and visualize and animate simulation snapshots. Software packages that are wrapped by Pizza.py, so they can be invoked from within Python, include GnuPlot, MatLab, Raster3d, and RasMol. Pizza.py is written in Python and runs on any platform that supports Python. Pizza.py enhances the standard Python interpreter in a few simple ways. Its tools are Python modules which can be invoked interactively, from scripts, or from GUIs when appropriate. Some of the tools require additional Python packages to be installed as part of the user's Python. Others are wrappers on software packages (as listed above) which must be available on the user's system. It is easy to modify or extend Pizza.py with new functionality or new tools, which need not have anything to do with LAMMPS or ChemCell.

  4. Sierra Toolkit Computational Mesh Conceptual Model

    SciTech Connect

    Baur, David G.; Edwards, Harold Carter; Cochran, William K.; Williams, Alan B.; Sjaardema, Gregory D.

    2010-03-01

    The Sierra Toolkit computational mesh is a software library intended to support massively parallel multi-physics computations on dynamically changing unstructured meshes. This domain of intended use is inherently complex due to distributed memory parallelism, parallel scalability, heterogeneity of physics, heterogeneous discretization of an unstructured mesh, and runtime adaptation of the mesh. Management of this inherent complexity begins with a conceptual analysis and modeling of this domain of intended use; i.e., development of a domain model. The Sierra Toolkit computational mesh software library is designed and implemented based upon this domain model. Software developers using, maintaining, or extending the Sierra Toolkit computational mesh library must be familiar with the concepts/domain model presented in this report.

  5. ProtoMD: A prototyping toolkit for multiscale molecular dynamics

    NASA Astrophysics Data System (ADS)

    Somogyi, Endre; Mansour, Andrew Abi; Ortoleva, Peter J.

    2016-05-01

    ProtoMD is a toolkit that facilitates the development of algorithms for multiscale molecular dynamics (MD) simulations. It is designed for multiscale methods which capture the dynamic transfer of information across multiple spatial scales, such as the atomic to the mesoscopic scale, via coevolving microscopic and coarse-grained (CG) variables. ProtoMD can also be used to calibrate parameters needed in traditional CG-MD methods. The toolkit integrates 'GROMACS wrapper' to initiate MD simulations, and 'MDAnalysis' to analyze and manipulate trajectory files. It facilitates experimentation with a spectrum of coarse-grained variables, prototyping rare events (such as chemical reactions), or simulating nanocharacterization experiments such as terahertz spectroscopy, AFM, nanopore, and time-of-flight mass spectroscopy. ProtoMD is written in Python and is freely available under the GNU General Public License from github.com/CTCNano/proto_md.

  6. ProtoMD: A prototyping toolkit for multiscale molecular dynamics

    NASA Astrophysics Data System (ADS)

    Somogyi, Endre; Mansour, Andrew Abi; Ortoleva, Peter J.

    2016-05-01

    ProtoMD is a toolkit that facilitates the development of algorithms for multiscale molecular dynamics (MD) simulations. It is designed for multiscale methods which capture the dynamic transfer of information across multiple spatial scales, such as the atomic to the mesoscopic scale, via coevolving microscopic and coarse-grained (CG) variables. ProtoMD can also be used to calibrate parameters needed in traditional CG-MD methods. The toolkit integrates 'GROMACS wrapper' to initiate MD simulations, and 'MDAnalysis' to analyze and manipulate trajectory files. It facilitates experimentation with a spectrum of coarse-grained variables, prototyping rare events (such as chemical reactions), or simulating nanocharacterization experiments such as terahertz spectroscopy, AFM, nanopore, and time-of-flight mass spectroscopy. ProtoMD is written in Python and is freely available under the GNU General Public License from github.com/CTCNano/proto_md.

  7. MCS Systems Administration Toolkit

    Energy Science and Technology Software Center (ESTSC)

    2001-09-30

    This package contains a number of systems administration utilities to assist a team of system administrators in managing a computer environment by automating routine tasks and centralizing information. Included are utilities to help install software on a network of computers and programs to make an image of a disk drive, to manage and distribute configuration files for a number of systems, and to run self-tests on systems, as well as an example of using a database to manage host information and various utilities.

  8. Web-based Toolkit for Dynamic Generation of Data Processors

    NASA Astrophysics Data System (ADS)

    Patel, J.; Dascalu, S.; Harris, F. C.; Benedict, K. K.; Gollberg, G.; Sheneman, L.

    2011-12-01

    All computation-intensive scientific research uses structured datasets, including hydrology and all other types of climate-related research. When it comes to testing their hypotheses, researchers might use the same dataset differently, and modify, transform, or convert it to meet their research needs. Currently, many researchers spend a good amount of time performing data processing and building tools to speed up this process. They might routinely repeat the same process activities for new research projects, spending precious time that otherwise could be dedicated to analyzing and interpreting the data. Numerous tools are available to run tests on prepared datasets, and many of them work with datasets in different formats. However, there is still a significant need for applications that can comprehensively handle data transformation and conversion activities and help prepare the various processed datasets required by the researchers. We propose a web-based application (a software toolkit) that dynamically generates data processors capable of performing data conversions, transformations, and customizations based on user-defined mappings and selections. As a first step, the proposed solution allows the users to define various data structures and, in the next step, to select various file formats and data conversions for their datasets of interest. In a simple scenario, the core of the proposed web-based toolkit allows the users to define direct mappings between input and output data structures. The toolkit will also support defining complex mappings involving the use of pre-defined sets of mathematical, statistical, date/time, and text manipulation functions. Furthermore, the users will be allowed to define logical cases for input data filtering and sampling. At the end of the process, the toolkit is designed to generate reusable source code and executable binary files for download and use by the scientists. The application is also designed to store all data
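    The user-defined mapping between input and output structures described above can be sketched as building a record processor from (input field, transform) pairs. The real toolkit emits downloadable source code from such mappings; this stand-in just returns a closure, and every field name and transform here is hypothetical.

    ```python
    def make_processor(mapping):
        """Build a record processor from a user-defined field mapping.
        mapping: {output_field: (input_field, transform_function)}."""
        def process(record):
            return {out: fn(record[src]) for out, (src, fn) in mapping.items()}
        return process

    # Hypothetical mapping: a unit conversion and a rename-with-cleanup.
    mapping = {
        "temp_f": ("temp_c", lambda c: c * 9.0 / 5.0 + 32.0),
        "station": ("site_id", str.upper),
    }
    convert = make_processor(mapping)
    print(convert({"temp_c": 100.0, "site_id": "abq01"}))
    ```

    Filtering and sampling cases would wrap `process` with predicates applied before the mapping runs.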

  9. STAR: Software Toolkit for Analysis Research

    SciTech Connect

    Doak, J.E.; Prommel, J.M.; Whiteson, R.; Hoffbauer, B.L.; Thomas, T.R.; Helman, P.

    1993-08-01

    Analyzing vast quantities of data from diverse information sources is an increasingly important element for nonproliferation and arms control analysis. Much of the work in this area has used human analysts to assimilate, integrate, and interpret complex information gathered from various sources. With the advent of fast computers, we now have the capability to automate this process, thereby shifting this burden away from humans. In addition, there now exist huge data storage capabilities which have made it possible to formulate large integrated databases comprising many terabytes of information spanning a variety of subjects. We are currently designing a Software Toolkit for Analysis Research (STAR) to address these issues. The goal of STAR is to produce a research tool that facilitates the development and interchange of algorithms for locating phenomena of interest to nonproliferation and arms control experts. One major component deals with the preparation of information. The ability to manage and effectively transform raw data into a meaningful form is a prerequisite for analysis by any methodology. The relevant information to be analyzed can be either unstructured text, structured data, signals, or images. Text can be numerical and/or character, stored in raw data files, databases, or streams of bytes, or compressed into bits, in formats ranging from fixed, to character-delimited, to a count followed by content. The data can be analyzed in real-time or batch mode. Once the data are preprocessed, different analysis techniques can be applied. Some are built using expert knowledge. Others are trained using data collected over a period of time. Currently, we are considering three classes of analyzers for use in our software toolkit: (1) traditional machine learning techniques, (2) purely statistical systems, and (3) expert systems.

  10. A Toolkit for Teacher Engagement

    ERIC Educational Resources Information Center

    Grantmakers for Education, 2014

    2014-01-01

    Teachers are critical to the success of education grantmaking strategies, yet in talking with them we discovered that the world of philanthropy is often a mystery. GFE's Toolkit for Teacher Engagement aims to assist funders in authentically and effectively involving teachers in the education reform and innovation process. Built directly from the…

  11. Build an Assistive Technology Toolkit

    ERIC Educational Resources Information Center

    Ahrens, Kelly

    2011-01-01

    Assistive technology (AT) by its very nature consists of a variety of personal and customized tools for multiple learning styles and physical challenges. The author not only encourages students, parents, and educators to advocate for AT services, she also wants them to go a step further and build their own AT toolkits that can instill independence…

  12. pypet: A Python Toolkit for Data Management of Parameter Explorations.

    PubMed

    Meyer, Robert; Obermayer, Klaus

    2016-01-01

    pypet (Python parameter exploration toolkit) is a new multi-platform Python toolkit for managing numerical simulations. Sampling the space of model parameters is a key aspect of simulations and numerical experiments. pypet is designed to allow easy and arbitrary sampling of trajectories through a parameter space beyond simple grid searches. pypet collects and stores both simulation parameters and results in a single HDF5 file. This collective storage allows fast and convenient loading of data for further analyses. pypet provides various additional features such as multiprocessing and parallelization of simulations, dynamic loading of data, integration of git version control, and supervision of experiments via the electronic lab notebook Sumatra. pypet supports a rich set of data formats, including native Python types, Numpy and Scipy data, Pandas DataFrames, and BRIAN(2) quantities. Besides these formats, users can easily extend the toolkit to allow customized data types. pypet is a flexible tool suited for both short Python scripts and large scale projects. pypet's various features, especially the tight link between parameters and results, promote reproducible research in computational neuroscience and simulation-based disciplines. PMID:27610080
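    The core pattern pypet automates, running a simulation over a sampled parameter space and keeping each parameter set stored alongside its result, can be sketched with plain dictionaries. This is a conceptual stand-in, not pypet's actual API (which adds HDF5 storage, multiprocessing, git integration, and much richer trajectory sampling than the simple grid shown here).

    ```python
    from itertools import product

    def explore(parameter_space, simulation):
        """Run a simulation over the cartesian product of a parameter space,
        keeping parameters and results together per run."""
        names = sorted(parameter_space)
        runs = []
        for values in product(*(parameter_space[n] for n in names)):
            params = dict(zip(names, values))
            runs.append({"parameters": params, "result": simulation(**params)})
        return runs

    # Hypothetical 2x2 grid over parameters x and y of a toy simulation.
    runs = explore({"x": [1.0, 2.0], "y": [3.0, 4.0]}, lambda x, y: x * y)
    print(len(runs))  # 4 runs: the full grid
    print(runs[0]["parameters"], runs[0]["result"])
    ```

    The tight parameter-result link is what makes later analysis and reproduction of individual runs straightforward.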

  13. pypet: A Python Toolkit for Data Management of Parameter Explorations

    PubMed Central

    Meyer, Robert; Obermayer, Klaus

    2016-01-01

    pypet (Python parameter exploration toolkit) is a new multi-platform Python toolkit for managing numerical simulations. Sampling the space of model parameters is a key aspect of simulations and numerical experiments. pypet is designed to allow easy and arbitrary sampling of trajectories through a parameter space beyond simple grid searches. pypet collects and stores both simulation parameters and results in a single HDF5 file. This collective storage allows fast and convenient loading of data for further analyses. pypet provides various additional features such as multiprocessing and parallelization of simulations, dynamic loading of data, integration of git version control, and supervision of experiments via the electronic lab notebook Sumatra. pypet supports a rich set of data formats, including native Python types, Numpy and Scipy data, Pandas DataFrames, and BRIAN(2) quantities. Besides these formats, users can easily extend the toolkit to allow customized data types. pypet is a flexible tool suited for both short Python scripts and large scale projects. pypet's various features, especially the tight link between parameters and results, promote reproducible research in computational neuroscience and simulation-based disciplines. PMID:27610080

  14. Parallel Power Grid Simulation Toolkit

    SciTech Connect

    Smith, Steve; Kelley, Brian; Banks, Lawrence; Top, Philip; Woodward, Carol

    2015-09-14

    ParGrid is a 'wrapper' that integrates a coupled power grid simulation toolkit, consisting of a library to manage the synchronization and communication of independent simulations. The included library code in ParGrid, named FSKIT, is intended to support the coupling of multiple continuous and discrete event parallel simulations. The code is designed using modern object-oriented C++ methods, utilizing C++11 and current Boost libraries to ensure compatibility with multiple operating systems and environments.
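    The synchronization of coupled independent simulations can be illustrated with a minimal lockstep loop: each component advances one step using the other's last published state, then both publish together. This serial Python sketch (with invented toy components) only illustrates the coordination pattern; FSKIT itself manages genuinely parallel, distributed C++ simulations.

    ```python
    def run_coupled(sim_a, sim_b, steps):
        """Lockstep coupling of two components exchanging state at a shared
        synchronization point each step (serial illustration only)."""
        a_state, b_state = sim_a["init"], sim_b["init"]
        for _ in range(steps):
            # Each component sees only the other's *previous* state.
            a_next = sim_a["step"](a_state, b_state)
            b_next = sim_b["step"](b_state, a_state)
            a_state, b_state = a_next, b_next  # synchronize after both finish
        return a_state, b_state

    # Two toy components that relax toward each other's value.
    sim_a = {"init": 0.0, "step": lambda s, other: s + 0.5 * (other - s)}
    sim_b = {"init": 10.0, "step": lambda s, other: s + 0.5 * (other - s)}
    print(run_coupled(sim_a, sim_b, 50))  # both converge to 5.0
    ```

    Discrete event coupling replaces the fixed step with negotiated next-event times, but the publish-then-advance discipline is the same.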

  15. User's Guide for SDDS Toolkit Version 1.4

    SciTech Connect

    Borland, M.

    1995-07-06

    The Self Describing Data Sets (SDDS) file protocol is the basis for a powerful and expanding toolkit of over 40 generic programs. These programs are used for simulation postprocessing, graphics, data preparation, program interfacing, and experimental data analysis. This document describes Version 1.4 of the SDDS command-line toolkit. Those wishing to write programs using SDDS should consult the Application Programmer's Guide for SDDS Version 1.4. The first section of the present document is shared with this reference. This document does not describe SDDS-compliant EPICS applications, of which there are presently 25.
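    The self-describing idea, carrying column names and types in a header so generic programs can process any file, can be sketched with a toy text format. To be clear, this header syntax is invented for illustration and is NOT the actual SDDS protocol.

    ```python
    import io

    def write_table(stream, columns, rows):
        """Write a toy self-describing table: a header naming each column
        and its type, then whitespace-separated data rows."""
        stream.write("&columns " + " ".join(f"{n}:{t}" for n, t in columns) + "\n")
        for row in rows:
            stream.write(" ".join(str(v) for v in row) + "\n")

    def read_table(stream):
        """Read the table back using only the header for interpretation."""
        header = stream.readline().split()[1:]
        columns = [tuple(h.split(":")) for h in header]
        casts = {"float": float, "int": int, "str": str}
        rows = [tuple(casts[t](v) for (n, t), v in zip(columns, line.split()))
                for line in stream if line.strip()]
        return columns, rows

    buf = io.StringIO()
    write_table(buf, [("s", "float"), ("x", "float")], [(0.0, 1.5), (0.1, 1.7)])
    buf.seek(0)
    cols, rows = read_table(buf)
    print(cols)
    print(rows)
    ```

    Because the reader needs no file-specific knowledge, one generic program can postprocess any file written this way, which is the property the SDDS toolkit exploits.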

  16. The Connectome Viewer Toolkit: An Open Source Framework to Manage, Analyze, and Visualize Connectomes

    PubMed Central

    Gerhard, Stephan; Daducci, Alessandro; Lemkaddem, Alia; Meuli, Reto; Thiran, Jean-Philippe; Hagmann, Patric

    2011-01-01

    Advanced neuroinformatics tools are required for methods of connectome mapping, analysis, and visualization. The inherent multi-modality of connectome datasets poses new challenges for data organization, integration, and sharing. We have designed and implemented the Connectome Viewer Toolkit – a set of free and extensible open source neuroimaging tools written in Python. The key components of the toolkit are as follows: (1) The Connectome File Format is an XML-based container format to standardize multi-modal data integration and structured metadata annotation. (2) The Connectome File Format Library enables management and sharing of connectome files. (3) The Connectome Viewer is an integrated research and development environment for visualization and analysis of multi-modal connectome data. The Connectome Viewer's plugin architecture supports extensions with network analysis packages and an interactive scripting shell, to enable easy development and community contributions. Integration with tools from the scientific Python community allows the leveraging of numerous existing libraries for powerful connectome data mining, exploration, and comparison. We demonstrate the applicability of the Connectome Viewer Toolkit using Diffusion MRI datasets processed by the Connectome Mapper. The Connectome Viewer Toolkit is available from http://www.cmtk.org/ PMID:21713110

  17. Design Optimization Toolkit: Users' Manual

    SciTech Connect

    Aguilo Valentin, Miguel Alejandro

    2014-07-01

    The Design Optimization Toolkit (DOTk) is a stand-alone C++ software package intended to solve complex design optimization problems. The DOTk software package provides a range of solution methods suited for gradient- and nongradient-based optimization, large-scale constrained optimization, and topology optimization. DOTk was designed with a flexible user interface that allows easy access to DOTk solution methods from external engineering software packages, making it minimally intrusive to those packages. As part of this flexibility, DOTk provides an easy-to-use MATLAB interface that enables users to call DOTk solution methods directly from the MATLAB command window.
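
    To make the gradient-based solution methods mentioned above concrete, here is a minimal steepest-descent loop of the kind such packages implement internally. This is a generic textbook sketch, not the DOTk API; the function and parameter names are illustrative.

```python
# Minimal steepest-descent sketch of a gradient-based solve; illustrative
# only, uses none of the actual DOTk API.

def steepest_descent(grad, x0, step=0.1, tol=1e-8, max_iter=1000):
    """Iterate x <- x - step * grad(x) until the gradient norm is small."""
    x = list(x0)
    for _ in range(max_iter):
        g = grad(x)
        if sum(gi * gi for gi in g) ** 0.5 < tol:
            break
        x = [xi - step * gi for xi, gi in zip(x, g)]
    return x

# Objective f(x) = (x0 - 3)^2 + (x1 + 1)^2; its gradient:
grad = lambda x: [2 * (x[0] - 3), 2 * (x[1] + 1)]
x_star = steepest_descent(grad, [0.0, 0.0])
```

    A production toolkit layers line searches, constraints, and large-scale linear algebra on top of this basic iteration.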

  18. Python-ARM Radar Toolkit

    SciTech Connect

    Jonathan Helmus, Scott Collis

    2013-03-17

    The Python-ARM Radar Toolkit (Py-ART) is a collection of radar quality-control and retrieval codes which all work on two unifying Python objects: the PyRadar and PyGrid objects. By building ingests for several popular radar formats and then abstracting the interface, Py-ART greatly simplifies data processing compared with several other available utilities. In addition, Py-ART uses NumPy arrays as its primary storage mechanism, enabling use of existing and extensive community software tools.

  19. The REACH Youth Program Learning Toolkit

    ERIC Educational Resources Information Center

    Sierra Health Foundation, 2011

    2011-01-01

    Believing in the value of using video documentaries and data as learning tools, members of the REACH technical assistance team collaborated to develop this toolkit. The learning toolkit was designed using and/or incorporating components of the "Engaging Youth in Community Change: Outcomes and Lessons Learned from Sierra Health Foundation's REACH…

  20. An Introduction to the Einstein Toolkit

    NASA Astrophysics Data System (ADS)

    Zilhão, Miguel; Löffler, Frank

    2013-09-01

    We give an introduction to the Einstein Toolkit, a mature, open-source computational infrastructure for numerical relativity based on the Cactus Framework, aimed at new users. The toolkit is composed of several different modules, is developed by researchers from different institutions throughout the world, and is in active continuous development. Documentation for the toolkit and its several modules is often scattered across different locations, a difficulty new users may struggle with. Scientific papers exist describing the toolkit and its methods in detail, but they might be overwhelming at first. With these lecture notes we hope to provide an initial overview for new users. We cover how to obtain, compile and run the toolkit, and give an overview of some of the tools and modules provided with it.

  1. The SCRAM tool-kit

    NASA Technical Reports Server (NTRS)

    Tamir, David; Flanigan, Lee A.; Weeks, Jack L.; Siewert, Thomas A.; Kimbrough, Andrew G.; Mcclure, Sidney R.

    1994-01-01

    This paper proposes a new series of on-orbit capabilities to support the near-term Hubble Space Telescope, Extended Duration Orbiter, Long Duration Orbiter, Space Station Freedom, other orbital platforms, and even future manned Lunar/Mars missions. These proposed capabilities form a toolkit termed Space Construction, Repair, and Maintenance (SCRAM). SCRAM addresses both Intra-Vehicular Activity (IVA) and Extra-Vehicular Activity (EVA) needs. SCRAM provides a variety of tools which enable welding, brazing, cutting, coating, heating, and cleaning, as well as corresponding nondestructive examination. Near-term IVA-SCRAM applications include repair and modification of fluid lines, structure, and laboratory equipment inside a shirt-sleeve environment (i.e., inside Spacelab or Space Station). Near-term EVA-SCRAM applications include construction of fluid lines and structural members, repair of punctures by orbital debris, and refurbishment of surfaces eroded by contaminants. The SCRAM tool-kit also promises future EVA applications involving mass-production tasks automated by robotics and artificial intelligence, for construction of large truss, aerobrake, and nuclear reactor shadow shield structures. The leading candidate tool processes for SCRAM, currently undergoing research and development, include Electron Beam, Gas Tungsten Arc, Plasma Arc, and Laser Beam. A series of strategic space flight experiments would make SCRAM available to help conquer the space frontier.

  2. ADMIT: ALMA Data Mining Toolkit

    NASA Astrophysics Data System (ADS)

    Friedel, Douglas N.; Xu, Lisa; Looney, Leslie; Teuben, Peter J.; Pound, Marc W.; Rauch, Kevin P.; Mundy, Lee G.; Kern, Jeffrey S.

    2015-01-01

    ADMIT (ALMA Data Mining Toolkit) is a toolkit for the creation and analysis of new science products from ALMA data. ADMIT is an ALMA Development Project written purely in Python. While specifically targeted for ALMA science and production use after the ALMA pipeline, it is designed to be generally applicable to radio-astronomical data. ADMIT quickly provides users with a detailed overview of their science products: line identifications, line 'cutout' cubes, moment maps, emission type analysis (e.g., feature detection), etc. Users can download the small ADMIT pipeline product (< 20MB), analyze the results, then fine-tune and re-run the ADMIT pipeline (or any part thereof) on their own machines and interactively inspect the results. ADMIT will have both a GUI and command line interface available for this purpose. By analyzing multiple data cubes simultaneously, data mining between many astronomical sources and line transitions will be possible. Users will also be able to enhance the capabilities of ADMIT by creating customized ADMIT tasks satisfying any special processing needs. Future implementations of ADMIT may include EVLA and other instruments.

  3. Admit: Alma Data Mining Toolkit

    NASA Astrophysics Data System (ADS)

    Friedel, Douglas; Looney, Leslie; Xu, Lisa; Pound, Marc W.; Teuben, Peter J.; Rauch, Kevin P.; Mundy, Lee; Kern, Jeffrey S.

    2015-06-01

    ADMIT (ALMA Data Mining Toolkit) is a toolkit for the creation and analysis of new science products from ALMA data. ADMIT is an ALMA Development Project written purely in Python. While specifically targeted for ALMA science and production use after the ALMA pipeline, it is designed to be generally applicable to radio-astronomical data. ADMIT quickly provides users with a detailed overview of their science products: line identifications, line 'cutout' cubes, moment maps, emission type analysis (e.g., feature detection), etc. Users can download the small ADMIT pipeline product (<20MB), analyze the results, then fine-tune and re-run the ADMIT pipeline (or any part thereof) on their own machines and interactively inspect the results. ADMIT will have both a GUI and command line interface available for this purpose. By analyzing multiple data cubes simultaneously, data mining between many astronomical sources and line transitions will be possible. Users will also be able to enhance the capabilities of ADMIT by creating customized ADMIT tasks satisfying any special processing needs. Future implementations of ADMIT may include EVLA and other instruments.

  4. The NetLogger Toolkit V2.0

    SciTech Connect

    Gunter, Dan; Lee, Jason; Stoufer, Martin; Tierney, Brian

    2003-03-28

    The NetLogger Toolkit is designed to monitor, under actual operating conditions, the behavior of all the elements of the application-to-application communication path in order to determine exactly where time is spent within a complex system. Using NetLogger, distributed application components are modified to produce timestamped logs of "interesting" events at all the critical points of the distributed system. Events from each component are correlated, which allows one to characterize the performance of all aspects of the system and network in detail. The NetLogger Toolkit itself consists of four components: an API and library of functions to simplify the generation of application-level event logs, a set of tools for collecting and sorting log files, an event archive system, and a tool for visualization and analysis of the log files. In order to instrument an application to produce event logs, the application developer inserts calls to the NetLogger API at all the critical points in the code, then links the application with the NetLogger library. All the tools in the NetLogger Toolkit share a common log format, and assume the existence of accurate and synchronized system clocks. NetLogger messages can be logged using an easy-to-read text-based format based on the IETF-proposed ULM format, or a binary format that can still be used through the same API but that is several times faster and smaller, with performance comparable to or better than binary message formats such as MPI, XDR, SDDF-Binary, and PBIO. The NetLogger binary format is both highly efficient and self-describing, thus optimized for the dynamic message construction and parsing of application instrumentation. NetLogger includes an "activation" API that allows NetLogger logging to be turned on, off, or modified by changing an external file. This is useful for activating logging in daemons/services (e.g. GridFTP server). The NetLogger reliability API provides the ability to specify backup logging locations and
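
    The core idea described above is simple: every component emits timestamped events in a common text format, and a collector merges the logs by timestamp so one can see where time goes across the whole path. The sketch below approximates that pattern with a loosely ULM-like key=value format; it is an illustration, not the NetLogger API.

```python
# Sketch of NetLogger-style instrumentation: components emit timestamped
# key=value events, and a collector merges the per-component logs by
# timestamp (assuming synchronized clocks, as NetLogger does). Loosely
# ULM-like format for illustration; not NetLogger's actual API.

def log_event(buf, ts, prog, event, **fields):
    extras = " ".join(f"{k}={v}" for k, v in sorted(fields.items()))
    buf.append(f"DATE={ts:.6f} PROG={prog} EVENT={event} {extras}".strip())

def merge_logs(*bufs):
    merged = [line for buf in bufs for line in buf]
    # Sort on the DATE field to build one cross-component timeline.
    return sorted(merged, key=lambda l: float(l.split()[0].split("=")[1]))

client, server = [], []
log_event(client, 1.000, "client", "send.start", nbytes=4096)
log_event(server, 1.002, "server", "recv.start")
log_event(server, 1.010, "server", "recv.end")
log_event(client, 1.004, "client", "send.end")
timeline = merge_logs(client, server)
```

    Pairing the merged events (send.start/recv.end, etc.) then reveals which hop in the path contributes the latency.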

  5. The DLESE Evaluation Toolkit Project

    NASA Astrophysics Data System (ADS)

    Buhr, S. M.; Barker, L. J.; Marlino, M.

    2002-12-01

    The Evaluation Toolkit and Community project is a new Digital Library for Earth System Education (DLESE) collection designed to raise awareness of project evaluation within the geoscience education community, and to enable principal investigators, teachers, and evaluators to implement project evaluation more readily. This new resource is grounded in the needs of geoscience educators, and will provide a virtual home for a geoscience education evaluation community. The goals of the project are to 1) provide a robust collection of evaluation resources useful for Earth systems educators, 2) establish a forum and community for evaluation dialogue within DLESE, and 3) disseminate the resources through the DLESE infrastructure and through professional society workshops and proceedings. Collaboration and expertise in education, geoscience and evaluation are necessary if we are to conduct the best possible geoscience education. The Toolkit allows users to engage in evaluation at whichever level best suits their needs, get more evaluation professional development if desired, and access the expertise of other segments of the community. To date, a test web site has been built and populated, initial community feedback from the DLESE and broader community is being garnered, and we have begun to heighten awareness of geoscience education evaluation within our community. The web site contains features that allow users to access professional development about evaluation, search and find evaluation resources, submit resources, find or offer evaluation services, sign up for upcoming workshops, take the user survey, and submit calendar items. The evaluation resource matrix currently contains resources that have met our initial review. The resources are currently organized by type; they will become searchable on multiple dimensions of project type, audience, objectives and evaluation resource type as efforts to develop a collection-specific search engine mature. The peer review

  6. Electronic toolkit for nursing education.

    PubMed

    Trangenstein, Patricia A

    2008-12-01

    In an increasingly hectic and mobile society, Web-based instructional tools can enhance and supplement student learning, improve communication and collaboration among participants, give rapid feedback on one's progress, and address diverse ways of learning. Web-based formats offer distinct advantages by allowing learners to view course materials whenever they choose, from any Internet connection, and as often as they want. The challenge for nurse educators is to assimilate the knowledge and expertise to understand and appropriately use these tools. A variety of Web-based instructional tools are described in this article. As nurse educators increase their awareness of these potential adjuncts, they can select appropriate applications that are supported by their institution to construct their own "toolkit." PMID:18940410

  7. Flightspeed Integral Image Analysis Toolkit

    NASA Technical Reports Server (NTRS)

    Thompson, David R.

    2009-01-01

    The Flightspeed Integral Image Analysis Toolkit (FIIAT) is a C library that provides image analysis functions in a single, portable package. It provides basic low-level filtering, texture analysis, and subwindow descriptors for applications dealing with image interpretation and object recognition. Designed with spaceflight in mind, it addresses: ease of integration (minimal external dependencies); fast, real-time operation using integer arithmetic where possible (useful for platforms lacking a dedicated floating-point processor); implementation entirely in C (easily modified); mostly static memory allocation; and 8-bit image data. The basic goal of the FIIAT library is to compute meaningful numerical descriptors for images or rectangular image regions. These n-vectors can then be used directly for novelty detection or pattern recognition, or as a feature space for higher-level pattern recognition tasks. The library provides routines for leveraging training data to derive descriptors that are most useful for a specific data set. Its runtime algorithms exploit a structure known as the "integral image." This is a caching method that permits fast summation of values within rectangular regions of an image. This integral frame facilitates a wide range of fast image-processing functions. This toolkit has applicability to a wide range of autonomous image analysis tasks in the spaceflight domain, including novelty detection, object and scene classification, target detection for autonomous instrument placement, and science analysis of geomorphology. It makes real-time texture and pattern recognition possible for platforms with severe computational constraints. The software provides an order-of-magnitude speed increase over alternative software libraries currently in use by the research community. FIIAT can commercially support intelligent video cameras used in intelligent surveillance. It is also useful for object recognition by robots or other autonomous vehicles.
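
    The integral-image structure the abstract relies on is standard: precompute cumulative sums once, and the sum of any rectangular region then costs four array lookups instead of a loop over its pixels. A plain-Python sketch for clarity (FIIAT itself is written in C):

```python
# Integral image (summed-area table): ii[y][x] holds the sum of all pixels
# above and to the left of (y, x). Any rectangle sum then takes four lookups.

def integral_image(img):
    h, w = len(img), len(img[0])
    ii = [[0] * (w + 1) for _ in range(h + 1)]
    for y in range(h):
        for x in range(w):
            ii[y + 1][x + 1] = (img[y][x] + ii[y][x + 1]
                                + ii[y + 1][x] - ii[y][x])
    return ii

def rect_sum(ii, top, left, bottom, right):
    """Sum over rows top..bottom-1, cols left..right-1, in O(1)."""
    return ii[bottom][right] - ii[top][right] - ii[bottom][left] + ii[top][left]

img = [[1, 2, 3],
       [4, 5, 6],
       [7, 8, 9]]
ii = integral_image(img)
total = rect_sum(ii, 0, 0, 3, 3)   # sum of the whole image
```

    This O(1) region sum is what makes fast box filtering and texture descriptors feasible on processors with tight computational budgets.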

  8. Pydpiper: a flexible toolkit for constructing novel registration pipelines

    PubMed Central

    Friedel, Miriam; van Eede, Matthijs C.; Pipitone, Jon; Chakravarty, M. Mallar; Lerch, Jason P.

    2014-01-01

    Using neuroimaging technologies to elucidate the relationship between genotype and phenotype and brain and behavior will be a key contribution to biomedical research in the twenty-first century. Among the many methods for analyzing neuroimaging data, image registration deserves particular attention due to its wide range of applications. Finding strategies to register together many images and analyze the differences between them can be a challenge, particularly given that different experimental designs require different registration strategies. Moreover, writing software that can handle different types of image registration pipelines in a flexible, reusable and extensible way can be challenging. In response to this challenge, we have created Pydpiper, a neuroimaging registration toolkit written in Python. Pydpiper is an open-source, freely available software package that provides multiple modules for various image registration applications. Pydpiper offers five key innovations. Specifically: (1) a robust file handling class that allows access to outputs from all stages of registration at any point in the pipeline; (2) the ability of the framework to eliminate duplicate stages; (3) reusable, easy to subclass modules; (4) a development toolkit written for non-developers; (5) four complete applications that run complex image registration pipelines “out-of-the-box.” In this paper, we will discuss both the general Pydpiper framework and the various ways in which component modules can be pieced together to easily create new registration pipelines. This will include a discussion of the core principles motivating code development and a comparison of Pydpiper with other available toolkits. We also provide a comprehensive, line-by-line example to orient users with limited programming knowledge and highlight some of the most useful features of Pydpiper. In addition, we will present the four current applications of the code. PMID:25126069

  9. Pydpiper: a flexible toolkit for constructing novel registration pipelines.

    PubMed

    Friedel, Miriam; van Eede, Matthijs C; Pipitone, Jon; Chakravarty, M Mallar; Lerch, Jason P

    2014-01-01

    Using neuroimaging technologies to elucidate the relationship between genotype and phenotype and brain and behavior will be a key contribution to biomedical research in the twenty-first century. Among the many methods for analyzing neuroimaging data, image registration deserves particular attention due to its wide range of applications. Finding strategies to register together many images and analyze the differences between them can be a challenge, particularly given that different experimental designs require different registration strategies. Moreover, writing software that can handle different types of image registration pipelines in a flexible, reusable and extensible way can be challenging. In response to this challenge, we have created Pydpiper, a neuroimaging registration toolkit written in Python. Pydpiper is an open-source, freely available software package that provides multiple modules for various image registration applications. Pydpiper offers five key innovations. Specifically: (1) a robust file handling class that allows access to outputs from all stages of registration at any point in the pipeline; (2) the ability of the framework to eliminate duplicate stages; (3) reusable, easy to subclass modules; (4) a development toolkit written for non-developers; (5) four complete applications that run complex image registration pipelines "out-of-the-box." In this paper, we will discuss both the general Pydpiper framework and the various ways in which component modules can be pieced together to easily create new registration pipelines. This will include a discussion of the core principles motivating code development and a comparison of Pydpiper with other available toolkits. We also provide a comprehensive, line-by-line example to orient users with limited programming knowledge and highlight some of the most useful features of Pydpiper. In addition, we will present the four current applications of the code. PMID:25126069
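
    One of the innovations listed above, eliminating duplicate stages, can be done by identifying each stage with its full command signature so that registering the same computation twice yields a single executed stage. The sketch below illustrates that idea generically; it is not Pydpiper's actual implementation, and the command names are just examples.

```python
# Hypothetical sketch of duplicate-stage elimination in a pipeline: stages
# are keyed by (command, inputs, outputs), so re-adding an identical stage
# returns the existing one. Not Pydpiper's actual code.

class Pipeline:
    def __init__(self):
        self.stages = {}            # signature -> stage id

    def add_stage(self, cmd, inputs, outputs):
        signature = (cmd, tuple(inputs), tuple(outputs))
        if signature not in self.stages:
            self.stages[signature] = len(self.stages)
        return self.stages[signature]

p = Pipeline()
a = p.add_stage("blur", ["img1.mnc"], ["img1_blur.mnc"])
b = p.add_stage("blur", ["img1.mnc"], ["img1_blur.mnc"])   # duplicate
c = p.add_stage("register", ["img1_blur.mnc", "img2.mnc"], ["xfm.xfm"])
```

    With this scheme, two registration chains that share a preprocessing step automatically share its single stage rather than recomputing it.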

  10. Development of an Integrated Human Factors Toolkit

    NASA Technical Reports Server (NTRS)

    Resnick, Marc L.

    2003-01-01

    An effective integration of human abilities and limitations is crucial to the success of all NASA missions. The Integrated Human Factors Toolkit facilitates this integration by assisting system designers and analysts to select the human factors tools that are most appropriate for the needs of each project. The HF Toolkit contains information about a broad variety of human factors tools addressing human requirements in the physical, information processing and human reliability domains. Analysis of each tool includes consideration of the most appropriate design stage, the amount of expertise in human factors that is required, the amount of experience with the tool and the target job tasks that are needed, and other factors that are critical for successful use of the tool. The benefits of the Toolkit include improved safety, reliability and effectiveness of NASA systems throughout the agency. This report outlines the initial stages of development for the Integrated Human Factors Toolkit.

  11. Jefferson Lab Plotting Toolkit for accelerator controls

    SciTech Connect

    Chen, J; Keesee, M; Larrieu, C; Lei, G

    1999-03-01

    Experimental physics generates numerous data sets that scientists analyze using plots, graphs, etc. The Jefferson Lab Plotting Toolkit, JPT, a graphical user interface toolkit, was developed at Jefferson Lab to do data plotting. JPT provides data structures for sets of data, analyzes the range of the data, calculates the reasonable maximum, minimum, and scale of axes, sets line styles and marker styles, plots curves and fills areas.
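
    The axis-range calculation the abstract mentions is commonly done by rounding the data range out to a "nice" tick step (1, 2, or 5 times a power of ten). The sketch below shows that generic approach; it is not JPT's actual algorithm.

```python
# Generic "nice numbers" axis calculation of the kind a plotting toolkit
# performs: pick a tick step of 1, 2, or 5 times a power of ten, then round
# the data limits out to multiples of it. Illustrative, not JPT's code.
import math

def nice_step(raw_step):
    exp = math.floor(math.log10(raw_step))
    frac = raw_step / 10 ** exp
    nice = 1 if frac <= 1 else 2 if frac <= 2 else 5 if frac <= 5 else 10
    return nice * 10 ** exp

def axis_limits(data, ticks=5):
    lo, hi = min(data), max(data)
    step = nice_step((hi - lo) / max(ticks - 1, 1))
    return step * math.floor(lo / step), step * math.ceil(hi / step), step

lo, hi, step = axis_limits([0.3, 4.2, 9.7])
```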

  12. SlideToolkit: An Assistive Toolset for the Histological Quantification of Whole Slide Images

    PubMed Central

    Nelissen, Bastiaan G. L.; van Herwaarden, Joost A.; Moll, Frans L.; van Diest, Paul J.; Pasterkamp, Gerard

    2014-01-01

    The demand for accurate and reproducible phenotyping of a disease trait increases with the rising number of biobanks and genome-wide association studies. Detailed analysis of histology is a powerful way of phenotyping human tissues. Nonetheless, purely visual assessment of histological slides is time-consuming and liable to sampling variation, optical illusions and thereby observer variation, and external validation may be cumbersome. Therefore, within our own biobank, computerized quantification of digitized histological slides is often preferred as a more precise and reproducible, and sometimes more sensitive, approach. Relatively few free toolkits are, however, available for fully digitized microscopic slides, usually known as whole slide images. To meet this need, we developed the slideToolkit as a fast method to handle large quantities of low-contrast whole slide images using advanced cell-detection algorithms. The slideToolkit has been developed for modern personal computers and high-performance clusters (HPCs) and is available as an open-source project on github.com. We here illustrate the power of slideToolkit by a repeated measurement of 303 digital slides containing CD3-stained (DAB) abdominal aortic aneurysm tissue from a tissue biobank. Our workflow consists of four consecutive steps. In the first step (acquisition), whole slide images are collected and converted to TIFF files. In the second step (preparation), files are organized. The third step (tiles) creates multiple manageable tiles to count. In the fourth step (analysis), tissue is analyzed and results are stored in a data set. Using this method, two consecutive measurements of 303 slides showed an intraclass correlation of 0.99. In conclusion, slideToolkit provides a free, powerful and versatile collection of tools for automated feature analysis of whole slide images to create reproducible and meaningful phenotypic data sets. PMID:25372389
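
    The "tiles" step of such a workflow amounts to covering a very large image with fixed-size boxes that can be analyzed independently. A generic sketch of the tile-coordinate computation (hypothetical dimensions; not slideToolkit's actual implementation):

```python
# Sketch of the tiling step in a whole-slide workflow: split a huge image
# into manageable fixed-size tiles, clipping the tiles at the right and
# bottom edges. Generic illustration, not slideToolkit's code.

def tile_grid(width, height, tile=2048):
    """Yield (x, y, w, h) boxes covering the image, edge tiles clipped."""
    for y in range(0, height, tile):
        for x in range(0, width, tile):
            yield x, y, min(tile, width - x), min(tile, height - y)

# A hypothetical 5000x3000 slide level split into 2048-pixel tiles:
tiles = list(tile_grid(5000, 3000))
```

    Because each tile is independent, the per-tile analysis parallelizes trivially across the cores of a workstation or the nodes of an HPC cluster.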

  13. ISO/IEEE 11073 PHD message generation toolkit to standardize healthcare device.

    PubMed

    Lim, Joon-Ho; Park, Chanyong; Park, Soo-Jun; Lee, Kyu-Chul

    2011-01-01

    As the senior population increases, various healthcare devices and services are being developed, such as fall detection devices and home hypertension management services. However, to vitalize the market for healthcare devices and services, standardization for interoperability between devices and services must come first. To achieve this standardization goal, the IEEE 11073 Personal Health Device (PHD) group has standardized many healthcare devices, but until now there are few devices compatible with the PHD standard. One of the main reasons is that it is not easy for device manufacturers to implement a standard communication module by analyzing standard documents of over 600 pages. In this paper, we propose a standard message generation toolkit to easily standardize existing non-standard healthcare devices. The proposed toolkit generates standard PHD messages from the inputted device information, and the generated messages are adapted to the device with the standard state machine file. For the experiments, we developed reference hardware and tested the proposed toolkit with three healthcare devices: a blood pressure monitor, a weighing scale, and a glucose meter. The proposed toolkit has the advantage that even if the user does not know the standard in detail, the user can easily standardize non-standard healthcare devices. PMID:22254521

  14. The detector simulation toolkit HORUS

    NASA Astrophysics Data System (ADS)

    Becker, J.; Pennicard, D.; Graafsma, H.

    2012-10-01

    In recent years, X-ray detectors used and developed at synchrotron sources and Free Electron Lasers (FELs) have become increasingly powerful and versatile. However, as the capabilities of modern X-ray cameras grew, so did their complexity, and therefore their response functions are far from trivial. Since understanding the detecting system and its behavior is vital for any physical experiment, the need for dedicated, powerful simulation tools arose. The HPAD Output Response fUnction Simulator (HORUS) was originally developed to analyze the performance implications of certain design choices for the Adaptive Gain Integrating Pixel Detector (AGIPD) and over the years grew into a more universal detector simulation toolkit covering the relevant physics in the energy range from below 1 keV to a few hundred keV. HORUS has already been used to study possible improvements of the AGIPD for X-ray Photon Correlation Spectroscopy (XPCS) at the European XFEL and its performance at low beam energies. It is currently being used to study the optimum detector layout for Coherent Diffraction Imaging (CDI) at the European XFEL. Simulations of the charge summing mode of the Medipix3 chip have been essential for the improvements of the charge summing mode in the Medipix3 RX chip. HORUS is general enough to support arbitrary hybrid pixel detector systems (within limitations). To date, the following detector systems are predefined within HORUS: the AGIPD, the Large Pixel Detector (LPD), the Cornell-Stanford Pixel Array Detector (CSPAD), the Mixed-Mode (MMPAD) and KEKPAD, and the Medipix2, Medipix3 and Medipix3 RX chips.

  15. Cluster-based parallel image processing toolkit

    NASA Astrophysics Data System (ADS)

    Squyres, Jeffery M.; Lumsdaine, Andrew; Stevenson, Robert L.

    1995-03-01

    Many image processing tasks exhibit a high degree of data locality and parallelism and map quite readily to specialized massively parallel computing hardware. However, as network technologies continue to mature, workstation clusters are becoming a viable and economical parallel computing resource, so it is important to understand how to use these environments for parallel image processing as well. In this paper we discuss our implementation of a parallel image processing software library (the Parallel Image Processing Toolkit). The Toolkit uses a message-passing model of parallelism designed around the Message Passing Interface (MPI) standard. Experimental results are presented to demonstrate the parallel speedup obtained with the Parallel Image Processing Toolkit in a typical workstation cluster over a wide variety of image processing tasks. We also discuss load balancing and the potential for parallelizing portions of image processing tasks that seem to be inherently sequential, such as visualization and data I/O.
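
    A message-passing image library typically begins by decomposing the image into contiguous row blocks, one per worker, with any remainder rows spread evenly so the load stays balanced. The partition computation can be sketched as follows (illustrative only; not the Toolkit's actual MPI code):

```python
# Sketch of the data decomposition behind a message-passing image toolkit:
# split image rows into nearly equal contiguous blocks, one per worker.
# Illustrative; not the Parallel Image Processing Toolkit's MPI code.

def partition_rows(n_rows, n_workers):
    base, extra = divmod(n_rows, n_workers)
    blocks, start = [], 0
    for rank in range(n_workers):
        count = base + (1 if rank < extra else 0)   # spread remainder rows
        blocks.append((start, start + count))       # [start, stop) row range
        start += count
    return blocks

blocks = partition_rows(1000, 3)
```

    Each MPI rank would then receive its block, apply the filter locally, and return the result, with only block boundaries (for neighborhood operations) needing communication.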

  16. Integrated Systems Health Management (ISHM) Toolkit

    NASA Technical Reports Server (NTRS)

    Venkatesh, Meera; Kapadia, Ravi; Walker, Mark; Wilkins, Kim

    2013-01-01

    A framework of software components has been implemented to facilitate the development of ISHM systems according to a methodology based on Reliability Centered Maintenance (RCM). This framework is collectively referred to as the Toolkit and was developed using General Atomics' Health MAP (TM) technology. The toolkit is intended to provide assistance to software developers of mission-critical system health monitoring applications in the specification, implementation, configuration, and deployment of such applications. In addition to software tools designed to facilitate these objectives, the toolkit also provides direction to software developers in accordance with an ISHM specification and development methodology. The development tools are based on an RCM approach for the development of ISHM systems. This approach focuses on defining, detecting, and predicting the likelihood of system functional failures and their undesirable consequences.

  17. Desensitized Optimal Filtering and Sensor Fusion Toolkit

    NASA Technical Reports Server (NTRS)

    Karlgaard, Christopher D.

    2015-01-01

    Analytical Mechanics Associates, Inc., has developed a software toolkit that filters and processes navigational data from multiple sensor sources. A key component of the toolkit is a trajectory optimization technique that reduces the sensitivity of Kalman filters with respect to model parameter uncertainties. The sensor fusion toolkit also integrates recent advances in adaptive Kalman and sigma-point filters for problems with non-Gaussian error statistics. This Phase II effort provides new filtering and sensor fusion techniques in a convenient package that can be used as a stand-alone application for ground support and/or onboard use. Its modular architecture enables ready integration with existing tools. A suite of sensor models and noise distributions, as well as a Monte Carlo analysis capability, is included to enable statistical performance evaluations.
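
    For readers unfamiliar with the underlying filter, here is the textbook scalar Kalman predict/update cycle such toolkits build on; the desensitized variant adds sensitivity penalties on top of this, which is not shown. A sketch, not AMA's implementation.

```python
# Scalar Kalman filter for a constant-state model: predict inflates the
# variance by the process noise, update blends in each measurement with a
# gain set by the relative variances. Textbook sketch only.

def kalman_step(x, p, z, q, r):
    """x, p: state estimate and variance; z: measurement;
    q, r: process and measurement noise variances."""
    p = p + q                     # predict (state itself unchanged)
    k = p / (p + r)               # Kalman gain
    x = x + k * (z - x)           # update with the measurement residual
    p = (1 - k) * p
    return x, p

x, p = 0.0, 1.0                   # poor initial guess, high uncertainty
for z in [1.2, 0.8, 1.1, 0.9, 1.0]:
    x, p = kalman_step(x, p, z, q=1e-4, r=0.25)
```

    The estimate converges toward the true value near 1.0 while the variance shrinks, which is exactly the behavior whose sensitivity to mis-modeled q and r the desensitized technique targets.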

  18. The Ames MER microscopic imager toolkit

    USGS Publications Warehouse

    Sargent, R.; Deans, Matthew; Kunz, C.; Sims, M.; Herkenhoff, K.

    2005-01-01

    The Mars Exploration Rovers, Spirit and Opportunity, have spent several successful months on Mars, returning gigabytes of images and spectral data to scientists on Earth. One of the instruments on the MER rovers, the Athena Microscopic Imager (MI), is a fixed focus, megapixel camera providing a ±3 mm depth of field and a 31×31 mm field of view at a working distance of 63 mm from the lens to the object being imaged. In order to maximize the science return from this instrument, we developed the Ames MI Toolkit and supported its use during the primary mission. The MI Toolkit is a set of programs that operate on collections of MI images, with the goal of making the data more understandable to the scientists on the ground. Because of the limited depth of field of the camera, and the often highly variable topography of the terrain being imaged, MI images of a given rock are often taken as a stack, with the Instrument Deployment Device (IDD) moving along a computed normal vector, pausing every few millimeters for the MI to acquire an image. The MI Toolkit provides image registration and focal section merging, which combine these images to form a single, maximally in-focus image, while compensating for changes in lighting as well as parallax due to the motion of the camera. The MI Toolkit also provides a 3-D reconstruction of the surface being imaged using stereo and can embed 2-D MI images as texture maps into 3-D meshes produced by other imagers on board the rover to provide context. The 2-D images and 3-D meshes output from the Toolkit are easily viewed by scientists using other mission tools, such as Viz or the MI Browser. This paper describes the MI Toolkit in detail, as well as our experience using it with scientists at JPL during the primary MER mission. © 2005 IEEE.

  19. The Ames MER Microscopic Imager Toolkit

    NASA Technical Reports Server (NTRS)

    Sargent, Randy; Deans, Matthew; Kunz, Clayton; Sims, Michael; Herkenhoff, Ken

    2005-01-01

    The Mars Exploration Rovers, Spirit and Opportunity, have spent several successful months on Mars, returning gigabytes of images and spectral data to scientists on Earth. One of the instruments on the MER rovers, the Athena Microscopic Imager (MI), is a fixed focus, megapixel camera providing a plus or minus 3 mm depth of field and a 31x31mm field of view at a working distance of 63 mm from the lens to the object being imaged. In order to maximize the science return from this instrument, we developed the Ames MI Toolkit and supported its use during the primary mission. The MI Toolkit is a set of programs that operate on collections of MI images, with the goal of making the data more understandable to the scientists on the ground. Because of the limited depth of field of the camera, and the often highly variable topography of the terrain being imaged, MI images of a given rock are often taken as a stack, with the Instrument Deployment Device (IDD) moving along a computed normal vector, pausing every few millimeters for the MI to acquire an image. The MI Toolkit provides image registration and focal section merging, which combine these images to form a single, maximally in-focus image, while compensating for changes in lighting as well as parallax due to the motion of the camera. The MI Toolkit also provides a 3-D reconstruction of the surface being imaged using stereo and can embed 2-D MI images as texture maps into 3-D meshes produced by other imagers on board the rover to provide context. The 2-D images and 3-D meshes output from the Toolkit are easily viewed by scientists using other mission tools, such as Viz or the MI Browser. This paper describes the MI Toolkit in detail, as well as our experience using it with scientists at JPL during the primary MER mission.
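
    A common way to perform the focal section merging described above is to pick, for each pixel, the value from whichever frame in the stack is locally sharpest, scored for instance by Laplacian magnitude. The toy sketch below shows that selection rule on tiny arrays; it is a generic focus-stacking illustration, and the Ames Toolkit's registration and lighting compensation are not shown.

```python
# Toy focus stacking: per pixel, keep the value from the frame with the
# highest local sharpness, scored by a simple Laplacian magnitude. Generic
# illustration, not the Ames MI Toolkit's implementation.

def laplacian(img, y, x):
    h, w = len(img), len(img[0])
    nbrs = [img[ny][nx]
            for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
            if 0 <= ny < h and 0 <= nx < w]
    return abs(len(nbrs) * img[y][x] - sum(nbrs))

def merge_focal_stack(stack):
    h, w = len(stack[0]), len(stack[0][0])
    return [[max(stack, key=lambda f: laplacian(f, y, x))[y][x]
             for x in range(w)] for y in range(h)]

blurry = [[5, 5, 5],
          [5, 5, 5],
          [5, 5, 5]]          # flat frame: no local detail anywhere
sharp  = [[0, 9, 0],
          [9, 0, 9],
          [0, 9, 0]]          # strong local contrast everywhere
merged = merge_focal_stack([blurry, sharp])
```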

  20. "Handy Manny" and the Emergent Literacy Technology Toolkit

    ERIC Educational Resources Information Center

    Hourcade, Jack J.; Parette, Howard P., Jr.; Boeckmann, Nichole; Blum, Craig

    2010-01-01

    This paper outlines the use of a technology toolkit to support emergent literacy curriculum and instruction in early childhood education settings. Components of the toolkit include hardware and software that can facilitate key emergent literacy skills. Implementation of the comprehensive technology toolkit enhances the development of these…

  1. SIERRA Toolkit v. 1.0

    Energy Science and Technology Software Center (ESTSC)

    2010-02-24

    The SIERRA Toolkit is a collection of libraries to facilitate the development of parallel engineering analysis applications. These libraries supply basic core services that an engineering analysis application may need, such as a parallel distributed and dynamic mesh database (for unstructured meshes), mechanics algorithm support (parallel infrastructure only), interfaces to parallel solvers, parallel mesh and data I/O, and various utilities (timers, diagnostic tools, etc.). The toolkit is intended to reduce the effort required to develop an engineering analysis application by removing the need to develop core capabilities that nearly every application would require.

  2. Anchor Toolkit - a secure mobile agent system

    SciTech Connect

    Mudumbai, Srilekha S.; Johnston, William; Essiari, Abdelilah

    1999-05-19

    Mobile agent technology facilitates intelligent operation in software systems with less human interaction. Major challenges to the deployment of mobile agents include secure transmission of agents and preventing unauthorized access to resources between interacting systems, as either hosts, or agents, or both can act maliciously. The Anchor toolkit, designed by LBNL, handles the transmission and secure management of mobile agents in a heterogeneous distributed computing environment. It provides users with the option of incorporating their own security managers. This paper concentrates on the architecture, features, access control, and deployment of the Anchor toolkit. Application of this toolkit in a secure distributed CVS environment is discussed as a case study.

  3. Autism Speaks Toolkits: Resources for Busy Physicians.

    PubMed

    Bellando, Jayne; Fussell, Jill J; Lopez, Maya

    2016-02-01

    Given the increased prevalence of autism spectrum disorders (ASD), it is likely that busy primary care providers (PCPs) are providing care to individuals with ASD in their practice. Autism Speaks provides a wealth of educational, medical, and treatment/intervention information resources for PCPs and families, including at least 32 toolkits. This article serves to familiarize PCPs and families with the different toolkits that are available on the Autism Speaks website. This article is intended to increase physicians' knowledge of the issues that families with children with ASD frequently encounter, and to increase their ability to share evidence-based information to guide treatment and care for affected families in their practice. PMID:26149848

  4. TRSkit: A Simple Digital Library Toolkit

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.; Esler, Sandra L.

    1997-01-01

    This paper introduces TRSkit, a simple and effective toolkit for building digital libraries on the World Wide Web. The toolkit was developed for the creation of the Langley Technical Report Server and the NASA Technical Report Server, but is applicable to most simple distribution paradigms. TRSkit contains a handful of freely available software components designed to be run under the UNIX operating system and served via the World Wide Web. The intended customer is the person that must continuously and synchronously distribute anywhere from 100 - 100,000's of information units and does not have extensive resources to devote to the problem.

  5. Medical Applications of the Geant4 Toolkit

    NASA Astrophysics Data System (ADS)

    Agostinelli, S.; Chauvie, S.; Foppiano, F.; Garelli, S.; Marchetto, F.; Pia, M. G.; Nieminen, P.; Rolando, V.; Solano, A.

    A powerful and suitable tool for attacking the problem of the production and transport of different beams in biological matter is offered by the Geant4 Simulation Toolkit. Various activities in progress in the domain of medical applications are presented: studies on calibration of brachytherapy sources and thermoluminescent dosimeters, studies of a complete 3-D inline dosimeter, development of general tools for CT interface for treatment planning, studies involving neutron transport, etc. A novel approach, based on the Geant4 Toolkit, for the study of radiation damage at the cellular and DNA level is also presented.

  6. Construction aggregates

    USGS Publications Warehouse

    Tepordei, V.V.

    1995-01-01

    Part of the 1994 Industrial Minerals Review. The production, consumption, and applications of construction aggregates are reviewed. In 1994, the production of construction aggregates, which includes crushed stone and construction sand and gravel combined, increased 7.7 percent to 2.14 Gt compared with the previous year. These record production levels are mostly a result of funding for highway construction work provided by the Intermodal Surface Transportation Efficiency Act of 1991. Demand is expected to increase for construction aggregates in 1995.

  7. The NetLogger Toolkit V2.0

    Energy Science and Technology Software Center (ESTSC)

    2003-03-28

    The NetLogger Toolkit is designed to monitor, under actual operating conditions, the behavior of all the elements of the application-to-application communication path in order to determine exactly where time is spent within a complex system. Using NetLogger, distributed application components are modified to produce timestamped logs of "interesting" events at all the critical points of the distributed system. Events from each component are correlated, which allows one to characterize the performance of all aspects of the system and network in detail. The NetLogger Toolkit itself consists of four components: an API and library of functions to simplify the generation of application-level event logs, a set of tools for collecting and sorting log files, an event archive system, and a tool for visualization and analysis of the log files. In order to instrument an application to produce event logs, the application developer inserts calls to the NetLogger API at all the critical points in the code, then links the application with the NetLogger library. All the tools in the NetLogger Toolkit share a common log format, and assume the existence of accurate and synchronized system clocks. NetLogger messages can be logged using an easy-to-read text-based format based on the IETF-proposed ULM format, or a binary format that can still be used through the same API but that is several times faster and smaller, with performance comparable to or better than binary message formats such as MPI, XDR, SDDF-Binary, and PBIO. The NetLogger binary format is both highly efficient and self-describing, thus optimized for the dynamic message construction and parsing of application instrumentation. NetLogger includes an "activation" API that allows NetLogger logging to be turned on, off, or modified by changing an external file. This is useful for activating logging in daemons/services (e.g., the GridFTP server). The NetLogger reliability API provides the ability to specify backup logging locations and
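
    The instrumentation pattern described, timestamped event records written in a common text format at the critical points of a distributed path, can be sketched as follows. The key=value layout and field names here are illustrative only, not NetLogger's actual ULM or binary formats.

```python
import time

def log_event(stream, event, **fields):
    """Append one timestamped event record as a single line of
    key=value pairs (illustrative; not NetLogger's real format)."""
    parts = [f"TS={time.time():.6f}", f"EVENT={event}"]
    parts += [f"{k.upper()}={v}" for k, v in sorted(fields.items())]
    stream.write(" ".join(parts) + "\n")
```

    Instrumenting, say, the start and end of a transfer with such calls yields logs that can later be merged and sorted by timestamp to reconstruct where time was spent along the application-to-application path, which is why the toolkit assumes synchronized clocks.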

  8. Services development toolkit for Open Research Data (Promarket)

    NASA Astrophysics Data System (ADS)

    Som de Cerff, W.; Schwichtenberg, H.; Gemünd, A.; Claus, S.; Reichwald, J.; Denvil, S.; Mazetti, P.; Nativi, S.

    2012-04-01

    According to the declaration of the Organisation for Economic Co-operation and Development (OECD) on Open Access, "OECD Principles and Guidelines for Access to Research Data from Public Funding," research data should be available to everyone, and Europe follows these directions (Digital Agenda, N. Kroes). Data being 'open' does not mean it is directly usable: research data are often complex to use and difficult to interpret for non-experts. Also, if extra services are needed, e.g. certain delivery guarantees, SLAs need to be negotiated. Comparable to OSS development models, where software is open and services and support are paid for, there is large potential for commercial activities and services around this open and free research data. For example, climate, weather, or instrument data can generate business value when offered as easy and reliable services for app integration. The project will design a toolkit for developers in research data centres. The tools will allow them to develop services that provide research data and to map business processes, e.g. automatic service level agreements, onto their services, making open research data attractive for commercial and academic use by the centre and others. It will enable building and developing open, reliable, and scalable services and end products, e.g. accessible from end-user devices such as smartphones. Researchers, interested citizens, or company developers will be enabled to access open data as an "easy-to-use" service and aggregate it with other services. The project will address a broad range of developers and give them a toolkit in well-known settings: portable, scalable, open, and usable in public development environments and tools. This topic will be addressed technically by utilizing service-oriented approaches based on open standards and protocols, combined with new programming models and techniques.

  9. Plus 50: Business Community Outreach Toolkit

    ERIC Educational Resources Information Center

    American Association of Community Colleges (NJ1), 2009

    2009-01-01

    This toolkit is designed to support you in building partnerships with the business community. It includes a series of fact sheets you can distribute to employers that discuss the value in hiring plus 50 workers. Individual sections contain footnotes. (Contains 5 web resources.)

  10. Portable Extensible Toolkit for Scientific Computation

    Energy Science and Technology Software Center (ESTSC)

    1995-06-30

    PETSc 2.0 is a software toolkit for the portable, parallel (and serial) numerical solution of partial differential equations and minimization problems. It includes software for the solution of linear and nonlinear systems of equations. These codes are written in a data-structure-neutral manner to enable easy reuse and flexibility.
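
    The data-structure-neutral style mentioned above can be illustrated by a solver that touches the matrix only through a user-supplied matrix-vector product, so any storage format works. This conjugate-gradient sketch is a hedged illustration of the idea, not PETSc's actual API.

```python
import numpy as np

def conjugate_gradient(matvec, b, tol=1e-10, maxiter=200):
    """Solve A x = b for symmetric positive-definite A, given only a
    callable computing A @ v. The solver never sees how A is stored,
    which is the essence of a data-structure-neutral interface."""
    x = np.zeros_like(b, dtype=float)
    r = b - matvec(x)          # initial residual
    p = r.copy()               # initial search direction
    rs = r @ r
    for _ in range(maxiter):
        Ap = matvec(p)
        alpha = rs / (p @ Ap)  # step length along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x
```

    The same `matvec` callable could wrap a dense array, a sparse format, or a matrix-free operator without changing the solver.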

  11. Teacher Quality Toolkit. 2nd Edition

    ERIC Educational Resources Information Center

    Lauer, Patricia A.; Dean, Ceri B.; Martin-Glenn, Mya L.; Asensio, Margaret L.

    2005-01-01

    The Teacher Quality Toolkit addresses the continuum of teacher learning by providing tools that can be used to improve both preservice and inservice teacher education. Each chapter provides self-assessment tools that can guide progress toward improved teacher quality and describes resources for designing exemplary programs and practices. Chapters…

  12. A toolkit for building earth system models

    SciTech Connect

    Foster, I.

    1993-03-01

    An earth system model is a computer code designed to simulate the interrelated processes that determine the earth's weather and climate, such as atmospheric circulation, atmospheric physics, atmospheric chemistry, oceanic circulation, and biosphere. I propose a toolkit that would support a modular, or object-oriented, approach to the implementation of such models.

  14. A Toolkit for the Effective Teaching Assistant

    ERIC Educational Resources Information Center

    Tyrer, Richard; Gunn, Stuart; Lee, Chris; Parker, Maureen; Pittman, Mary; Townsend, Mark

    2004-01-01

    This book offers the notion of a "toolkit" to allow Teaching Assistants (TAs) and colleagues to review and revise their thinking and practice about real issues and challenges in managing individuals, groups, colleagues and themselves in school. In a rapidly changing educational environment the book focuses on combining the underpinning knowledge…

  15. Healthy People 2010: Oral Health Toolkit

    ERIC Educational Resources Information Center

    Isman, Beverly

    2007-01-01

    The purpose of this Toolkit is to provide guidance, technical tools, and resources to help states, territories, tribes and communities develop and implement successful oral health components of Healthy People 2010 plans as well as other oral health plans. These plans are useful for: (1) promoting, implementing and tracking oral health objectives;…

  16. Media Toolkit for Anti-Drug Action.

    ERIC Educational Resources Information Center

    Office of National Drug Control Policy, Washington, DC.

    This toolkit provides proven methods, models, and templates for tying anti-drug efforts to the National Youth Anti-Drug Media Campaign. It helps organizations deliver the Campaign's messages to the media and to other groups and individuals who care about keeping the nation's youth drug free. Eight sections focus on: (1) "Campaign Overview"…

  17. Ready, Set, Respect! GLSEN's Elementary School Toolkit

    ERIC Educational Resources Information Center

    Gay, Lesbian and Straight Education Network (GLSEN), 2012

    2012-01-01

    "Ready, Set, Respect!" provides a set of tools to help elementary school educators ensure that all students feel safe and respected and develop respectful attitudes and behaviors. It is not a program to be followed but instead is designed to help educators prepare themselves for teaching about and modeling respect. The toolkit responds to…

  18. The Two-Way Immersion Toolkit

    ERIC Educational Resources Information Center

    Howard, Elizabeth; Sugarman, Julie; Perdomo, Marleny; Adger, Carolyn Temple

    2005-01-01

    This Toolkit is meant to be a resource for teachers, parents, and administrators involved with two-way immersion (TWI) programs, particularly those at the elementary level. Two-way immersion is a form of dual language instruction that brings together students from two native language groups for language, literacy, and academic content instruction…

  19. Integrated System Health Management Development Toolkit

    NASA Technical Reports Server (NTRS)

    Figueroa, Jorge; Smith, Harvey; Morris, Jon

    2009-01-01

    This software toolkit is designed to model complex systems for the implementation of embedded Integrated System Health Management (ISHM) capability, which focuses on determining the condition (health) of every element in a complex system (detect anomalies, diagnose causes, and predict future anomalies), and to provide data, information, and knowledge (DIaK) to control systems for safe and effective operation.

  20. Fragment Impact Toolkit: A Toolkit for Modeling Fragment Generation and Impacts on Targets

    NASA Astrophysics Data System (ADS)

    Shevitz, Daniel

    2005-07-01

    In this talk we will detail the status of the Fragment Impact Toolkit. The toolkit is used to model nearby explosion problems and assess probabilities of user-specified outcomes. The toolkit offers a framework, without locking the user into any particular set of states, assumptions, or constraints. The toolkit breaks a fragment impact problem into five components, all of which are extendable: (1) source description that includes the geometry of the source; (2) fragment generation that comprises the fragmentation process, including fragment size distributions (if required) and assignment of initial conditions, such as velocity; (3) fragment flight that includes what occurs to fragments while airborne; (4) target intersection that includes specification of target geometry, position, and orientation; and (5) target consequence that includes what occurs when fragments hit a target. Two notable contributions of the toolkit are the ability to have sources that break up with position-dependent and user-specifiable size probability distributions and then impact targets of arbitrary complexity. In this paper we will show examples of how to use the toolkit and simulate targets, including airplanes and stacks of munitions.

  1. Weighted aggregation

    NASA Technical Reports Server (NTRS)

    Feiveson, A. H. (Principal Investigator)

    1979-01-01

    The use of a weighted aggregation technique to improve the precision of the overall LACIE estimate is considered. The manner in which a weighted aggregation technique is implemented given a set of weights is described. The problem of variance estimation is discussed and the question of how to obtain the weights in an operational environment is addressed.
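
    The mechanics of such a weighted aggregation can be sketched as follows (an illustrative formula, not the actual LACIE implementation): with normalized weights and independent stratum estimates, the aggregate is the weighted sum of the estimates and its variance is the weight-squared sum of the stratum variances.

```python
import numpy as np

def weighted_aggregate(estimates, variances, weights):
    """Combine independent stratum estimates into one aggregate.
    Returns (aggregate, variance); with independent strata,
    Var(sum w_i y_i) = sum w_i^2 Var(y_i)."""
    y = np.asarray(estimates, dtype=float)
    v = np.asarray(variances, dtype=float)
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                 # normalize weights to sum to 1
    return w @ y, (w**2) @ v
```

    The variance formula above is what makes precision estimation tractable once the weights are fixed; the harder operational question the abstract raises is how those weights are obtained in the first place.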

  2. Global Arrays Parallel Programming Toolkit

    SciTech Connect

    Nieplocha, Jaroslaw; Krishnan, Manoj Kumar; Palmer, Bruce J.; Tipparaju, Vinod; Harrison, Robert J.; Chavarría-Miranda, Daniel

    2011-01-01

    The two predominant classes of programming models for parallel computing are distributed memory and shared memory. Both shared memory and distributed memory models have advantages and shortcomings. The shared memory model is much easier to use, but it ignores data locality/placement. Given the hierarchical nature of the memory subsystems in modern computers, this characteristic can have a negative impact on performance and scalability. Careful code restructuring to increase data reuse and replacing fine-grain load/stores with block access to shared data can address the problem and yield performance for shared memory that is competitive with message passing. However, this performance comes at the cost of compromising the ease of use that the shared memory model advertises. Distributed memory models, such as message passing or one-sided communication, offer performance and scalability but they are difficult to program. The Global Arrays toolkit attempts to offer the best features of both models. It implements a shared-memory programming model in which data locality is managed by the programmer. This management is achieved by calls to functions that transfer data between a global address space (a distributed array) and local storage. In this respect, the GA model has similarities to the distributed shared-memory models that provide an explicit acquire/release protocol. However, the GA model acknowledges that remote data is slower to access than local data and allows data locality to be specified by the programmer and hence managed. GA is related to the global address space languages such as UPC, Titanium, and, to a lesser extent, Co-Array Fortran. In addition, by providing a set of data-parallel operations, GA is also related to data-parallel languages such as HPF, ZPL, and Data Parallel C.
However, the Global Array programming model is implemented as a library that works with most languages used for technical computing and does not rely on compiler technology for achieving
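
    The get-compute-put style described above, explicit transfers between a global address space and local storage, can be sketched with a toy serial stand-in. The class and method names here are assumptions for illustration, not the real Global Arrays API, and a real GA runs distributed across processes.

```python
import numpy as np

class GlobalArray:
    """Toy stand-in for a distributed global array: data locality is
    explicit because all computation happens on local copies moved
    in and out with get/put (schematic; not the real GA library)."""
    def __init__(self, shape):
        self._data = np.zeros(shape)
    def get(self, lo, hi):
        """Copy a patch of the global array into local storage."""
        return self._data[lo[0]:hi[0], lo[1]:hi[1]].copy()
    def put(self, lo, hi, local):
        """Write a locally computed patch back to the global array."""
        self._data[lo[0]:hi[0], lo[1]:hi[1]] = local

# typical usage pattern: fetch a patch, compute locally, write it back
ga = GlobalArray((4, 4))
patch = ga.get((0, 0), (2, 2))   # explicit global -> local transfer
patch += 1.0                      # purely local computation
ga.put((0, 0), (2, 2), patch)    # explicit local -> global transfer
```

    The point of the pattern is that the programmer always knows which accesses are local and which cross the (slower) global address space, which is exactly the locality awareness the abstract attributes to the GA model.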

  3. The Reconstruction Toolkit (RTK), an open-source cone-beam CT reconstruction toolkit based on the Insight Toolkit (ITK)

    NASA Astrophysics Data System (ADS)

    Rit, S.; Vila Oliva, M.; Brousmiche, S.; Labarbe, R.; Sarrut, D.; Sharp, G. C.

    2014-03-01

    We propose the Reconstruction Toolkit (RTK, http://www.openrtk.org), an open-source toolkit for fast cone-beam CT reconstruction, based on the Insight Toolkit (ITK) and using GPU code extracted from Plastimatch. RTK is developed by an open consortium (see affiliations) under the non-contaminating Apache 2.0 license. The quality of the platform is daily checked with regression tests in partnership with Kitware, the company supporting ITK. Several features are already available: Elekta, Varian and IBA inputs, multi-threaded Feldkamp-Davis-Kress reconstruction on CPU and GPU, Parker short scan weighting, multi-threaded CPU and GPU forward projectors, etc. Each feature is either accessible through command line tools or C++ classes that can be included in independent software. A MIDAS community has been opened to share CatPhan datasets of several vendors (Elekta, Varian and IBA). RTK will be used in the upcoming cone-beam CT scanner developed by IBA for proton therapy rooms. Many features are under development: new input format support, iterative reconstruction, hybrid Monte Carlo / deterministic CBCT simulation, etc. RTK has been built to freely share tomographic reconstruction developments between researchers and is open for new contributions.

  4. Construction aggregates

    USGS Publications Warehouse

    Langer, W.H.; Tepordei, V.V.; Bolen, W.P.

    2000-01-01

    Construction aggregates consist primarily of crushed stone and construction sand and gravel. Total estimated production of construction aggregates increased in 1999 by about 2% to 2.39 Gt (2.64 billion st) compared with 1998. This record production level continued an expansion that began in 1992. By commodities, crushed stone production increased 3.3%, while sand and gravel production increased by about 0.5%.

  5. Construction aggregates

    USGS Publications Warehouse

    Tepordei, V.V.

    1994-01-01

    Part of a special section on industrial minerals in 1993. The 1993 production of construction aggregates increased 6.3 percent over the 1992 figure, to reach 2.01 Gt. This represents the highest estimated annual production of combined crushed stone and construction sand and gravel ever recorded in the U.S. The outlook for construction aggregates and the issues facing the industry are discussed.

  6. A toolkit for detecting technical surprise.

    SciTech Connect

    Trahan, Michael Wayne; Foehse, Mark C.

    2010-10-01

    The detection of a scientific or technological surprise within a secretive country or institute is very difficult. The ability to detect such surprises would allow analysts to identify the capabilities that could be a military or economic threat to national security. Sandia's current approach utilizing ThreatView has been successful in revealing potential technological surprises. However, as data sets become larger, it becomes critical to use algorithms as filters along with the visualization environments. Our two-year LDRD had two primary goals. First, we developed a tool, a Self-Organizing Map (SOM), to extend ThreatView and improve our understanding of the issues involved in working with textual data sets. Second, we developed a toolkit for detecting indicators of technical surprise in textual data sets. Our toolkit has been successfully used to perform technology assessments for the Science & Technology Intelligence (S&TI) program.

  7. Early origin of the bilaterian developmental toolkit

    PubMed Central

    Erwin, Douglas H.

    2009-01-01

    Whole-genome sequences from the choanoflagellate Monosiga brevicollis, the placozoan Trichoplax adhaerens and the cnidarian Nematostella vectensis have confirmed results from comparative evolutionary developmental studies that much of the developmental toolkit once thought to be characteristic of bilaterians appeared much earlier in the evolution of animals. The diversity of transcription factors and signalling pathway genes in animals with a limited number of cell types and a restricted developmental repertoire is puzzling, particularly in light of claims that such highly conserved elements among bilaterians provide evidence of a morphologically complex protostome–deuterostome ancestor. Here, I explore the early origination of elements of what became the bilaterian toolkit, and suggest that placozoans and cnidarians represent a depauperate residue of a once more diverse assemblage of early animals, some of which may be represented in the Ediacaran fauna (c. 585–542 Myr ago). PMID:19571245

  8. The Interactive Learning Toolkit: supporting interactive classrooms

    NASA Astrophysics Data System (ADS)

    Dutta, S.; McCauley, V.; Mazur, E.

    2004-05-01

    Research-based interactive learning techniques have dramatically improved student understanding. We have created the 'Interactive Learning Toolkit' (ILT), a web-based learning management system, to help implement two such pedagogies: Just-in-Time Teaching and Peer Instruction. Our main goal in developing this toolkit is to save the instructor time and effort and to use technology to facilitate the interaction between the students and the instructor (and between students themselves). After a brief review of both pedagogies, we will demonstrate the many exciting new features of the ILT. We will show how technology can not only implement, but also supplement and improve these pedagogies. We would like to acknowledge grants from NSF and DEAS, Harvard University.

  9. HVAC Fault Detection and Diagnosis Toolkit

    Energy Science and Technology Software Center (ESTSC)

    2004-12-31

    This toolkit supports component-level model-based fault detection methods in commercial building HVAC systems. The toolbox consists of five basic modules: a parameter estimator for model calibration, a preprocessor, an AHU model simulator, a steady-state detector, and a comparator. Each of these modules and the fuzzy logic rules for fault diagnosis are described in detail. The toolbox is written in C++ and also invokes the SPARK simulation program.
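
    The steady-state detector module can be sketched as a trailing-window variability test: a sample counts as steady when the recent standard deviation of the monitored signal falls below a threshold. This is an illustrative simplification; the toolkit's actual detector and its fuzzy diagnosis rules are more involved, and the window and threshold here are assumed values.

```python
import numpy as np

def steady_state(signal, window=5, threshold=0.05):
    """Flag each sample as steady when the standard deviation over
    the trailing `window` samples is below `threshold`. The first
    window-1 samples are left unflagged for lack of history."""
    x = np.asarray(signal, dtype=float)
    flags = np.zeros(len(x), dtype=bool)
    for i in range(window - 1, len(x)):
        flags[i] = x[i - window + 1:i + 1].std() < threshold
    return flags
```

    In a model-based FDD loop, such flags gate the comparator: residuals between the AHU model and measurements are only trusted as fault evidence while the system is steady, since transients would otherwise trigger false alarms.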

  10. chemf: A purely functional chemistry toolkit

    PubMed Central

    2012-01-01

    Background Although programming in a type-safe and referentially transparent style offers several advantages over working with mutable data structures and side effects, this style of programming has not seen much use in chemistry-related software. Since functional programming languages were designed with referential transparency in mind, these languages offer a lot of support when writing immutable data structures and side-effects free code. We therefore started implementing our own toolkit based on the above programming paradigms in a modern, versatile programming language. Results We present our initial results with functional programming in chemistry by first describing an immutable data structure for molecular graphs together with a couple of simple algorithms to calculate basic molecular properties before writing a complete SMILES parser in accordance with the OpenSMILES specification. Along the way we show how to deal with input validation, error handling, bulk operations, and parallelization in a purely functional way. At the end we also analyze and improve our algorithms and data structures in terms of performance and compare it to existing toolkits both object-oriented and purely functional. All code was written in Scala, a modern multi-paradigm programming language with a strong support for functional programming and a highly sophisticated type system. Conclusions We have successfully made the first important steps towards a purely functional chemistry toolkit. The data structures and algorithms presented in this article perform well while at the same time they can be safely used in parallelized applications, such as computer aided drug design experiments, without further adjustments. This stands in contrast to existing object-oriented toolkits where thread safety of data structures and algorithms is a deliberate design decision that can be hard to implement. 
Finally, the level of type-safety achieved by Scala highly increased the reliability of our code
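
    The immutable-data-structure idea at the heart of the article can be sketched in Python (chemf itself is written in Scala, and this class with its methods is a hypothetical illustration, not the toolkit's API): every "modification" returns a new molecule value, so shared instances are safe in parallel code without locking.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Molecule:
    """Immutable molecular graph: atoms as element symbols, bonds as
    (i, j, order) triples. frozen=True forbids in-place mutation."""
    atoms: tuple
    bonds: tuple

    def add_atom(self, element, bond_to=None, order=1):
        """Return a NEW molecule with one more atom, optionally
        bonded to an existing atom; self is left untouched."""
        atoms = self.atoms + (element,)
        bonds = self.bonds
        if bond_to is not None:
            bonds = bonds + ((bond_to, len(atoms) - 1, order),)
        return Molecule(atoms, bonds)

    def degree(self, i):
        """Number of bonds incident on atom i."""
        return sum(1 for a, b, _ in self.bonds if i in (a, b))

# the heavy-atom skeleton of ethanol (C-C-O), built immutably
m = Molecule((), ()).add_atom("C").add_atom("C", 0).add_atom("O", 1)
```

    This is the trade-off the conclusions describe: thread safety falls out of the representation itself rather than being a deliberate, and often fragile, design decision layered onto mutable objects.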

  11. Application experiences with the Globus toolkit.

    SciTech Connect

    Brunett, S.

    1998-06-09

    The Globus grid toolkit is a collection of software components designed to support the development of applications for high-performance distributed computing environments, or "computational grids" [14]. The Globus toolkit is an implementation of a "bag of services" architecture, which provides application and tool developers not with a monolithic system but rather with a set of stand-alone services. Each Globus component provides a basic service, such as authentication, resource allocation, information, communication, fault detection, and remote data access. Different applications and tools can combine these services in different ways to construct "grid-enabled" systems. The Globus toolkit has been used to construct the Globus Ubiquitous Supercomputing Testbed, or GUSTO: a large-scale testbed spanning 20 sites and including over 4000 compute nodes for a total compute power of over 2 TFLOPS. Over the past six months, we and others have used this testbed to conduct a variety of application experiments, including multi-user collaborative environments (tele-immersion), computational steering, distributed supercomputing, and high-throughput computing. The goal of this paper is to review what has been learned from these experiments regarding the effectiveness of the toolkit approach. To this end, we describe two of the application experiments in detail, noting what worked well and what worked less well. The two applications are a distributed supercomputing application, SF-Express, in which multiple supercomputers are harnessed to perform large distributed interactive simulations; and a tele-immersion application, CAVERNsoft, in which the focus is on connecting multiple people to a distributed simulated world.

  12. Mission Operations and Navigation Toolkit Environment

    NASA Technical Reports Server (NTRS)

    Sunseri, Richard F.; Wu, Hsi-Cheng; Hanna, Robert A.; Mossey, Michael P.; Duncan, Courtney B.; Evans, Scott E.; Evans, James R.; Drain, Theodore R.; Guevara, Michelle M.; Martin Mur, Tomas J.; Attiyah, Ahlam A.

    2009-01-01

    MONTE (Mission Operations and Navigation Toolkit Environment) Release 7.3 is an extensible software system designed to support trajectory and navigation analysis/design for space missions. MONTE is intended to replace the current navigation and trajectory analysis software systems, which, at the time of this reporting, are used by JPL's Navigation and Mission Design section. The software provides an integrated, simplified, and flexible system that can be easily maintained to serve the needs of future missions in need of navigation services.

  13. VIDE: The Void IDentification and Examination toolkit

    NASA Astrophysics Data System (ADS)

    Sutter, P. M.; Lavaux, G.; Hamaus, N.; Pisani, A.; Wandelt, B. D.; Warren, M.; Villaescusa-Navarro, F.; Zivick, P.; Mao, Q.; Thompson, B. B.

    2015-03-01

    We present VIDE, the Void IDentification and Examination toolkit, an open-source Python/C++ code for finding cosmic voids in galaxy redshift surveys and N-body simulations, characterizing their properties, and providing a platform for more detailed analysis. At its core, VIDE uses a substantially enhanced version of ZOBOV (Neyrinck 2008) to calculate a Voronoi tessellation for estimating the density field and performing a watershed transform to construct voids. Additionally, VIDE provides significant functionality for both pre- and post-processing: for example, VIDE can work with volume- or magnitude-limited galaxy samples with arbitrary survey geometries, or dark matter particles or halo catalogs in a variety of common formats. It can also randomly subsample inputs and includes a Halo Occupation Distribution model for constructing mock galaxy populations. VIDE uses the watershed levels to place voids in a hierarchical tree, outputs a summary of void properties in plain ASCII, and provides a Python API to perform many analysis tasks, such as loading and manipulating void catalogs and particle members, filtering, plotting, computing clustering statistics, stacking, comparing catalogs, and fitting density profiles. While centered around ZOBOV, the toolkit is designed to be as modular as possible and accommodate other void finders. VIDE has been in development for several years and has already been used to produce a wealth of results, which we summarize in this work to highlight the capabilities of the toolkit. VIDE is publicly available at

  14. An Overview of the Geant4 Toolkit

    SciTech Connect

    Apostolakis, John; Wright, Dennis H.

    2007-03-19

    Geant4 is a toolkit for the simulation of the transport of radiation through matter. With a flexible kernel and a choice among different physics modeling options, it has been tailored to the requirements of a wide range of applications. With the toolkit a user can describe a setup's or detector's geometry and materials, navigate inside it, simulate the physical interactions using a choice of physics engines, underlying physics cross-sections and models, visualise and store results. Physics models describing electromagnetic and hadronic interactions are provided, as are decays and processes for optical photons. Several models, with different precision and performance, are available for many processes. The toolkit includes coherent physics model configurations, which are called physics lists. Users can choose an existing physics list or create their own, depending on their requirements and the application area. A clear structure and readable code enable the user to investigate the origin of physics results. Application areas include detector simulation and background simulation in High Energy Physics experiments, simulation of accelerator setups, studies in medical imaging and treatment, and the study of the effects of solar radiation on spacecraft instruments.

  15. An Overview of the GEANT4 Toolkit

    SciTech Connect

    Apostolakis, John; Wright, Dennis H.; /SLAC

    2007-10-05

    Geant4 is a toolkit for the simulation of the transport of radiation through matter. With a flexible kernel and a choice among different physics modeling options, it has been tailored to the requirements of a wide range of applications. With the toolkit a user can describe a setup's or detector's geometry and materials, navigate inside it, simulate the physical interactions using a choice of physics engines, underlying physics cross-sections and models, visualize and store results. Physics models describing electromagnetic and hadronic interactions are provided, as are decays and processes for optical photons. Several models, with different precision and performance, are available for many processes. The toolkit includes coherent physics model configurations, which are called physics lists. Users can choose an existing physics list or create their own, depending on their requirements and the application area. A clear structure and readable code enable the user to investigate the origin of physics results. Application areas include detector simulation and background simulation in High Energy Physics experiments, simulation of accelerator setups, studies in medical imaging and treatment, and the study of the effects of solar radiation on spacecraft instruments.

  16. Capturing Petascale Application Characteristics with the Sequoia Toolkit

    SciTech Connect

    Vetter, Jeffrey S; Bhatia, Nikhil; Grobelny, Eric M; Roth, Philip C

    2005-09-01

    Characterization of the computation, communication, memory, and I/O demands of current scientific applications is crucial for identifying which technologies will enable petascale scientific computing. In this paper, we present the Sequoia Toolkit for characterizing HPC applications. The Sequoia Toolkit consists of the Sequoia trace capture library and the Sequoia Event Analysis Library, or SEAL, that facilitates the development of tools for analyzing Sequoia event traces. Using the Sequoia Toolkit, we have characterized the behavior of application runs with up to 2048 application processes. To illustrate the use of the Sequoia Toolkit, we present a preliminary characterization of LAMMPS, a molecular dynamics application of great interest to the computational biology community.

  17. An evaluation capacity building toolkit for principal investigators of undergraduate research experiences: A demonstration of transforming theory into practice.

    PubMed

    Rorrer, Audrey S

    2016-04-01

    This paper describes the approach and process undertaken to develop evaluation capacity among the leaders of a federally funded undergraduate research program. An evaluation toolkit was developed for Computer and Information Sciences and Engineering Research Experiences for Undergraduates (CISE REU) programs to address the ongoing need for evaluation capacity among principal investigators who manage program evaluation. The toolkit was the result of collaboration within the CISE REU community, with the purpose of providing targeted instructional resources and tools for quality program evaluation. Challenges were to balance the desire for standardized assessment with the responsibility to account for individual program contexts. Toolkit contents included instructional materials about evaluation practice, a standardized applicant management tool, and a modulated outcomes measure. Benefits resulting from toolkit deployment included cost-effective, sustainable evaluation tools, a community evaluation forum, and aggregate measurement of key program outcomes for the national program. Lessons learned included the imperative of understanding the evaluation context, engaging stakeholders, and building stakeholder trust. Results from project measures are presented along with a discussion of guidelines for facilitating evaluation capacity building that will serve a variety of contexts. PMID:26788814

  18. Water Security Toolkit User Manual Version 1.2.

    SciTech Connect

    Klise, Katherine A.; Siirola, John Daniel; Hart, David; Hart, William Eugene; Phillips, Cynthia Ann; Haxton, Terranna; Murray, Regan; Janke, Robert; Taxon, Thomas; Laird, Carl; Seth, Arpan; Hackebeil, Gabriel; McGee, Shawn; Mann, Angelica

    2014-08-01

    The Water Security Toolkit (WST) is a suite of open source software tools that can be used by water utilities to create response strategies to reduce the impact of contamination in a water distribution network. WST includes hydraulic and water quality modeling software, optimization methodologies, and visualization tools to identify: (1) sensor locations to detect contamination, (2) locations in the network in which the contamination was introduced, (3) hydrants to remove contaminated water from the distribution system, (4) locations in the network to inject decontamination agents to inactivate, remove, or destroy contaminants, (5) locations in the network to take grab samples to help identify the source of contamination, and (6) valves to close in order to isolate contaminated areas of the network. This user manual describes the different components of WST, along with examples and case studies. License Notice The Water Security Toolkit (WST) v.1.2 Copyright © 2012 Sandia Corporation. Under the terms of Contract DE-AC04-94AL85000, there is a non-exclusive license for use of this work by or on behalf of the U.S. government. This software is distributed under the Revised BSD License (see below).
    In addition, WST leverages a variety of third-party software packages, which have separate licensing policies: Acro (Revised BSD License); argparse (Python Software Foundation License); Boost (Boost Software License); Coopr (Revised BSD License); Coverage (BSD License); Distribute (Python Software Foundation License / Zope Public License); EPANET (Public Domain); EPANET-ERD (Revised BSD License); EPANET-MSX (GNU Lesser General Public License (LGPL) v.3); gcovr (Revised BSD License); GRASP (AT&T Commercial License for noncommercial use; includes randomsample and sideconstraints executable files); LZMA SDK (Public Domain); nose (GNU Lesser General Public License (LGPL) v.2.1); ordereddict (MIT License); pip (MIT License); PLY (BSD License); PyEPANET (Revised BSD License); Pyro (MIT License); PyUtilib (Revised BSD License); Py

  19. NGS QC Toolkit: A Toolkit for Quality Control of Next Generation Sequencing Data

    PubMed Central

    Patel, Ravi K.; Jain, Mukesh

    2012-01-01

    Next generation sequencing (NGS) technologies provide a high-throughput means to generate large amounts of sequence data. However, quality control (QC) of the sequence data generated by these technologies is extremely important for meaningful downstream analysis. Further, highly efficient and fast processing tools are required to handle the large volume of datasets. Here, we have developed an application, NGS QC Toolkit, for quality checking and filtering of sequence data to retain high-quality reads. This toolkit is a standalone and open source application freely available at http://www.nipgr.res.in/ngsqctoolkit.html. All the tools in the application are implemented in the Perl programming language. The toolkit comprises user-friendly tools for QC of sequencing data generated using the Roche 454 and Illumina platforms, additional tools to aid QC (sequence format converter and trimming tools), and analysis tools (statistics tools). A variety of options is provided to facilitate QC with user-defined parameters. The toolkit is expected to be very useful for the QC of NGS data to facilitate better downstream analysis. PMID:22312429
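    The kind of read filtering such a toolkit performs can be illustrated with a minimal sketch. This is not the NGS QC Toolkit itself (which is written in Perl and exposes many more options); the thresholds and helper names below are hypothetical, using the standard Phred+33 quality encoding of Illumina 1.8+ FASTQ files.

```python
# Minimal illustration of Phred-score-based read filtering, the core idea of
# NGS QC tools. Thresholds and function names here are hypothetical.

def phred_scores(qual_line, offset=33):
    """Decode a Phred+33 quality string into integer scores."""
    return [ord(c) - offset for c in qual_line]

def is_high_quality(qual_line, min_score=20, min_fraction=0.7):
    """Keep a read if >= min_fraction of its bases have Phred >= min_score."""
    scores = phred_scores(qual_line)
    good = sum(1 for s in scores if s >= min_score)
    return good / len(scores) >= min_fraction

def filter_fastq(records, **kwargs):
    """records: iterable of (header, sequence, plus, quality) tuples."""
    return [r for r in records if is_high_quality(r[3], **kwargs)]

reads = [
    ("@r1", "ACGT", "+", "IIII"),  # 'I' = Phred 40: high quality, kept
    ("@r2", "ACGT", "+", "!!!!"),  # '!' = Phred 0: low quality, dropped
]
kept = filter_fastq(reads)
print([r[0] for r in kept])  # ['@r1']
```

Real QC tools additionally trim low-quality read ends, strip adapter sequences, and report summary statistics, but the per-read decision follows this same score-and-threshold pattern.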

  20. An Incident Management Preparedness and Coordination Toolkit

    SciTech Connect

    Koch, Daniel B; Payne, Patricia W

    2012-01-01

    Although the use of Geographic Information Systems (GIS) by centrally-located operations staff is well established in the area of emergency response, utilization by first responders in the field is uneven. Cost, complexity, and connectivity are often the deciding factors preventing wider adoption. For the past several years, Oak Ridge National Laboratory (ORNL) has been developing a mobile GIS solution using free and open-source software targeting the needs of front-line personnel. Termed IMPACT, for Incident Management Preparedness and Coordination Toolkit, this ORNL application can complement existing GIS infrastructure and extend its power and capabilities to responders first on the scene of a natural or man-made disaster.

  1. Accelerator physics analysis with an integrated toolkit

    SciTech Connect

    Holt, J.A.; Michelotti, L.; Satogata, T.

    1992-08-01

    Work is in progress on an integrated software toolkit for linear and nonlinear accelerator design, analysis, and simulation. As a first application, the "beamline" and "MXYZPTLK" (differential algebra) class libraries were used with an X Windows graphics library to build a user-friendly, interactive phase space tracker which, additionally, finds periodic orbits. This program was used to analyse a theoretical lattice containing octupoles and decapoles, to find the 20th-order stable and unstable periodic orbits, and to explore the local phase space structure.

  2. Tips from the toolkit: 1 - know yourself.

    PubMed

    Steer, Neville

    2010-01-01

    High performance organisations review their strategy and business processes as part of usual business operations. If you are new to the field of general practice, do you have a career plan for the next 5-10 years? If you are an experienced general practitioner, are you using much the same business model and processes as when you started out? The following article sets out some ideas you might use to have a fresh approach to your professional career. It is based on The Royal Australian College of General Practitioners' 'General practice management toolkit'. PMID:20369141

  3. MCS Large Cluster Systems Software Toolkit

    Energy Science and Technology Software Center (ESTSC)

    2002-11-01

    This package contains a number of systems utilities for managing a set of computers joined in a "cluster". The utilities assist a team of systems administrators in managing the cluster by automating routine tasks, centralizing information, and monitoring individual computers within the cluster. Included in the toolkit are scripts used to boot a computer from a floppy, a program to turn on and off the power to a system, and a system for using a database to organize cluster information.

  4. Graph algorithms in the Titan toolkit.

    SciTech Connect

    McLendon, William Clarence, III; Wylie, Brian Neil

    2009-10-01

    Graph algorithms are a key component in a wide variety of intelligence analysis activities. The Graph-Based Informatics for Non-Proliferation and Counter-Terrorism project addresses the critical need of making these graph algorithms accessible to Sandia analysts in a manner that is both intuitive and effective. Specifically we describe the design and implementation of an open source toolkit for doing graph analysis, informatics, and visualization that provides Sandia with novel analysis capability for non-proliferation and counter-terrorism.

  5. Quality Assurance Toolkit for Distance Higher Education Institutions and Programmes

    ERIC Educational Resources Information Center

    Rama, Kondapalli, Ed.; Hope, Andrea, Ed.

    2009-01-01

    The Commonwealth of Learning is proud to partner with the Sri Lankan Ministry of Higher Education and UNESCO to produce this "Quality Assurance Toolkit for Distance Higher Education Institutions and Programmes". The Toolkit has been prepared with three features. First, it is a generic document on quality assurance, complete with a glossary of…

  6. Veterinary Immunology Committee Toolkit Workshop 2010: Progress and plans

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The Third Veterinary Immunology Committee (VIC) Toolkit Workshop took place at the Ninth International Veterinary Immunology Symposium (IVIS) in Tokyo, Japan on August 18, 2010. The Workshop built on previous Toolkit Workshops and covered various aspects of reagent development, commercialisation an...

  7. 77 FR 73022 - U.S. Environmental Solutions Toolkit

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-07

    ... International Trade Administration U.S. Environmental Solutions Toolkit AGENCY: International Trade... web- based U.S. Environmental Solutions Toolkit to be used by foreign environmental officials and foreign end-users of environmental technologies that will outline U.S. approaches to a series...

  8. 77 FR 73023 - U.S. Environmental Solutions Toolkit

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-07

    ... International Trade Administration U.S. Environmental Solutions Toolkit AGENCY: International Trade.... Environmental Solutions Toolkit to be used by foreign environmental officials and foreign end-users of environmental technologies that will outline U.S. ] approaches to a series of environmental problems...

  9. Construction aggregates

    USGS Publications Warehouse

    Tepordei, V.V.

    1993-01-01

    Part of a special section on the market performance of industrial minerals in 1992. Production of construction aggregates increased by 4.6 percent in 1992. This increase was due, in part, to the increased funding for transportation and infrastructure projects. The U.S. produced about 1.05 Gt of crushed stone and an estimated 734 Mt of construction sand and gravel in 1992. Demand is expected to increase by about 5 percent in 1993.

  10. Construction aggregates

    USGS Publications Warehouse

    Tepordei, V.V.

    1996-01-01

    Part of the Annual Commodities Review 1995. Production of construction aggregates such as crushed stone and construction sand and gravel showed a marginal increase in 1995. Most of the 1995 increases were due to funding for highway construction work. The major areas of concern to the industry included issues relating to wetlands classification and the classification of crystalline silica as a probable human carcinogen. Despite this, an increase in demand is anticipated for 1996.

  11. Construction aggregates

    USGS Publications Warehouse

    Nelson, T.I.; Bolen, W.P.

    2007-01-01

    Construction aggregates, primarily stone, sand and gravel, are recovered from widespread naturally occurring mineral deposits and processed for use primarily in the construction industry. They are mined, crushed, sorted by size and sold loose or combined with portland cement or asphaltic cement to make concrete products to build roads, houses, buildings, and other structures. Much smaller quantities are used in agriculture, cement manufacture, chemical and metallurgical processes, glass production and many other products.

  12. MIS: A Miriad Interferometry Singledish Toolkit

    NASA Astrophysics Data System (ADS)

    Pound, Marc; Teuben, Peter

    2011-10-01

    MIS is a pipeline toolkit using the package MIRIAD to combine Interferometric and Single Dish data. This was prompted by our observations made with the Combined Array For Research in Millimeter-wave Astronomy (CARMA) interferometer of the star-forming region NGC 1333, a large survey highlighting the new 23-element and singledish observing modes. The project consists of 20 CARMA datasets each containing interferometric as well as simultaneously obtained single dish data, for 3 molecular spectral lines and continuum, in 527 different pointings, covering an area of about 8 by 11 arcminutes. A small group of collaborators then shared this toolkit and their parameters via CVS, and scripts were developed to ensure uniform data reduction across the group. The pipeline was run end-to-end each night that new observations were obtained, producing maps that contained all the data to date. This approach could serve as a model for repeated calibration and mapping of large mixed-mode correlation datasets from ALMA.

  13. MIS: A MIRIAD Interferometry Singledish Toolkit

    NASA Astrophysics Data System (ADS)

    Pound, M. W.; Teuben, P.

    2012-09-01

    Building on the “drPACS” contribution at ADASS XX of a simple Unix pipeline infrastructure, we implemented a pipeline toolkit using the package MIRIAD to combine Interferometric and Single Dish data (MIS). This was prompted by our observations made with the Combined Array For Research in Millimeter-wave Astronomy (CARMA) interferometer of the star-forming region NGC 1333, a large survey highlighting the new 23-element and singledish observing modes. The project consists of 20 CARMA datasets each containing interferometric as well as simultaneously obtained single dish data, for 3 molecular spectral lines and continuum, in 527 different pointings, covering an area of about 8 by 11 arcminutes. A small group of collaborators then shared this toolkit and their parameters via CVS, and scripts were developed to ensure uniform data reduction across the group. The pipeline was run end-to-end each night as new observations were obtained, producing maps that contained all the data to date. We will show examples of the scripts and data products. This approach could serve as a model for repeated calibration and mapping of large mixed-mode correlation datasets from ALMA.

  14. The Best Ever Alarm System Toolkit

    SciTech Connect

    Kasemir, Kay; Chen, Xihui; Danilova, Katia

    2009-01-01

    Learning from our experience with the standard Experimental Physics and Industrial Control System (EPICS) alarm handler (ALH) as well as a similar intermediate approach based on script-generated operator screens, we developed the Best Ever Alarm System Toolkit (BEAST). It is based on Java and Eclipse on the Control System Studio (CSS) platform, using a relational database (RDB) to store the configuration and log actions. It employs a Java Message Service (JMS) for communication between the modular pieces of the toolkit, which include an Alarm Server to maintain the current alarm state, an arbitrary number of Alarm Client user interfaces (GUIs), and tools to annunciate alarms or log alarm-related actions. Web reports allow us to monitor the alarm system performance and spot deficiencies in the alarm configuration. The Alarm Client GUI not only gives end users various ways to view alarms in tree and table form, but also makes it easy to access guidance information, related operator displays, and other CSS tools. It also allows the configuration to be modified online directly from the GUI. Coupled with a good "alarm philosophy" on how to provide useful alarms, we can finally improve the configuration to achieve an effective alarm system.

  15. UTGB toolkit for personalized genome browsers

    PubMed Central

    Saito, Taro L.; Yoshimura, Jun; Sasaki, Shin; Ahsan, Budrul; Sasaki, Atsushi; Kuroshu, Reginaldo; Morishita, Shinichi

    2009-01-01

    The advent of high-throughput DNA sequencers has increased the pace of collecting enormous amounts of genomic information, yielding billions of nucleotides on a weekly basis. This advance represents an improvement of two orders of magnitude over traditional Sanger sequencers in terms of the number of nucleotides per unit time, allowing even small groups of researchers to obtain huge volumes of genomic data over a fairly short period. Consequently, a pressing need exists for the development of personalized genome browsers for analyzing these immense amounts of locally stored data. The UTGB (University of Tokyo Genome Browser) Toolkit is designed to meet three major requirements for personalization of genome browsers: easy installation of the system with minimal effort, browsing of locally stored data, and rapid interactive design of web interfaces tailored to individual needs. The UTGB Toolkit is licensed under an open source license. Availability: The software is freely available at http://utgenome.org/. Contact: moris@cb.k.u-tokyo.ac.jp PMID:19497937

  16. ADMIT: The ALMA Data Mining Toolkit

    NASA Astrophysics Data System (ADS)

    Teuben, P.; Pound, M.; Mundy, L.; Rauch, K.; Friedel, D.; Looney, L.; Xu, L.; Kern, J.

    2015-09-01

    ADMIT (ALMA Data Mining ToolkiT), a toolkit for the creation of new science products from ALMA data, is being developed as an ALMA Development Project. It is written in Python and, while specifically targeted for a uniform analysis of the ALMA science products that come out of the ALMA pipeline, it is designed to be generally applicable to (radio) astronomical data. It first provides users with a detailed view of their science products created by ADMIT inside the ALMA pipeline: line identifications, line 'cutout' cubes, moment maps, emission type analysis (e.g., feature detection). Using descriptor vectors, the ALMA data archive is enriched with useful information to make archive data mining possible. Users can also opt to download the (small) ADMIT pipeline product, then fine-tune and re-run the pipeline and inspect their hopefully improved data. By running many projects in a parallel fashion, data mining between many astronomical sources and line transitions will also be possible. Future implementations of ADMIT may include EVLA and other instruments.

  17. Construction aggregates

    USGS Publications Warehouse

    Bolen, W.P.; Tepordei, V.V.

    2001-01-01

    The estimated production during 2000 of construction aggregates, crushed stone, and construction sand and gravel increased by about 2.6% to 2.7 Gt (3 billion st), compared with 1999. The expansion that started in 1992 continued with record production levels for the ninth consecutive year. By commodity, construction sand and gravel production increased by 4.5% to 1.16 Gt (1.28 billion st), while crushed stone production increased by 1.3% to 1.56 Gt (1.72 billion st).

  18. The Image-Guided Surgery ToolKit IGSTK: an open source C++ software toolkit

    NASA Astrophysics Data System (ADS)

    Cheng, Peng; Ibanez, Luis; Gobbi, David; Gary, Kevin; Aylward, Stephen; Jomier, Julien; Enquobahrie, Andinet; Zhang, Hui; Kim, Hee-su; Blake, M. Brian; Cleary, Kevin

    2007-03-01

    The Image-Guided Surgery Toolkit (IGSTK) is an open source C++ software library that provides the basic components needed to develop image-guided surgery applications. The focus of the toolkit is on robustness using a state machine architecture. This paper presents an overview of the project based on a recent book which can be downloaded from igstk.org. The paper includes an introduction to open source projects, a discussion of our software development process and the best practices that were developed, and an overview of requirements. The paper also presents the architecture framework and main components. This presentation is followed by a discussion of the state machine model that was incorporated and the associated rationale. The paper concludes with an example application.

  19. The image-guided surgery toolkit IGSTK: an open source C++ software toolkit.

    PubMed

    Enquobahrie, Andinet; Cheng, Patrick; Gary, Kevin; Ibanez, Luis; Gobbi, David; Lindseth, Frank; Yaniv, Ziv; Aylward, Stephen; Jomier, Julien; Cleary, Kevin

    2007-11-01

    This paper presents an overview of the image-guided surgery toolkit (IGSTK). IGSTK is an open source C++ software library that provides the basic components needed to develop image-guided surgery applications. It is intended for fast prototyping and development of image-guided surgery applications. The toolkit was developed through a collaboration between academic and industry partners. Because IGSTK was designed for safety-critical applications, the development team has adopted lightweight software processes that emphasize safety and robustness while, at the same time, supporting geographically separated developers. A software process that is philosophically similar to agile software methods was adopted, emphasizing iterative, incremental, and test-driven development principles. The guiding principle in the architecture design of IGSTK is patient safety. The IGSTK team implemented a component-based architecture and used state machine software design methodologies to improve the reliability and safety of the components. Every IGSTK component has a well-defined set of features that are governed by state machines. The state machine ensures that the component is always in a valid state and that all state transitions are valid and meaningful. Realizing that the continued success and viability of an open source toolkit depends on a strong user community, the IGSTK team is following several key strategies to build an active user community. These include maintaining a users and developers' mailing list, providing documentation (application programming interface reference document and book), presenting demonstration applications, and delivering tutorial sessions at relevant scientific conferences. PMID:17703338
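    The state-machine discipline described in this abstract can be sketched with a minimal example. This is a hypothetical toy component, not IGSTK code (IGSTK is C++ and its actual states, events, and APIs differ): the component only changes state through an explicit transition table, so it can never enter an invalid state.

```python
# Minimal sketch of a state-machine-governed component in the spirit of
# IGSTK's safety design. States, events, and names here are hypothetical.

class TrackerComponent:
    # Only the transitions listed here are legal; anything else is rejected.
    TRANSITIONS = {
        ("Idle", "open"): "Connected",
        ("Connected", "start"): "Tracking",
        ("Tracking", "stop"): "Connected",
        ("Connected", "close"): "Idle",
    }

    def __init__(self):
        self.state = "Idle"

    def handle(self, event):
        """Apply an event; invalid events leave the state unchanged."""
        key = (self.state, event)
        if key in self.TRANSITIONS:
            self.state = self.TRANSITIONS[key]
            return True
        return False  # rejected: the component stays in a valid state

t = TrackerComponent()
t.handle("open")
t.handle("start")
print(t.state)           # Tracking
print(t.handle("open"))  # False: 'open' is not valid while Tracking
```

Because every transition is checked against the table, misuse of the component (e.g., opening a tracker that is already tracking) is rejected rather than producing undefined behavior, which is the property the abstract credits for IGSTK's reliability.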

  20. TEVA-SPOT Toolkit 1.2

    Energy Science and Technology Software Center (ESTSC)

    2007-07-26

    The TEVA-SPOT Toolkit (SPOT) supports the design of contaminant warning systems (CWSs) that use real-time sensors to detect contaminants in municipal water distribution networks. Specifically, SPOT provides the capability to select the locations for installing sensors in order to maximize the utility and effectiveness of the CWS. SPOT models the sensor placement process as an optimization problem, and the user can specify a wide range of performance objectives for contaminant warning system design, including population health effects, time to detection, extent of contamination, volume consumed, and number of failed detections. For example, a SPOT user can integrate expert knowledge during the design process by specifying required sensor placements or designating network locations as forbidden. Further, cost considerations can be integrated by limiting the design with user-specified installation costs at each location.
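    Sensor placement of this kind is commonly cast as a coverage-maximization problem, for which a greedy heuristic is a standard baseline. The sketch below is illustrative only, not SPOT's actual algorithm, objectives, or data formats: each contamination scenario is reduced to the set of nodes at which it would be detectable, and sensors are chosen one at a time by marginal coverage.

```python
# Greedy sensor placement on a toy network: pick up to `budget` locations
# that maximize the number of contamination scenarios detected. Illustrative
# only; SPOT supports richer objectives (health effects, detection time, cost).

def greedy_placement(scenarios, candidate_nodes, budget):
    """scenarios: dict scenario_id -> set of nodes where it is detectable.
    Returns up to `budget` nodes chosen greedily by marginal coverage."""
    chosen, undetected = [], set(scenarios)
    for _ in range(budget):
        best, best_gain = None, 0
        for node in candidate_nodes:
            gain = sum(1 for s in undetected if node in scenarios[s])
            if gain > best_gain:
                best, best_gain = node, gain
        if best is None:
            break  # no remaining node detects anything new
        chosen.append(best)
        undetected = {s for s in undetected if best not in scenarios[s]}
    return chosen

scenarios = {
    "s1": {"n1", "n2"},   # scenario s1 is detectable at nodes n1 and n2
    "s2": {"n2", "n3"},
    "s3": {"n4"},
}
print(greedy_placement(scenarios, ["n1", "n2", "n3", "n4"], budget=2))
# ['n2', 'n4']: n2 covers s1 and s2, then n4 covers s3
```

Because coverage objectives of this form are submodular, the greedy choice carries a well-known approximation guarantee, which is one reason such heuristics are practical for large networks; production tools like SPOT also offer exact mixed-integer formulations.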

  1. Introduction to the Geant4 Simulation toolkit

    NASA Astrophysics Data System (ADS)

    Guatelli, S.; Cutajar, D.; Oborn, B.; Rosenfeld, A. B.

    2011-05-01

    Geant4 is a Monte Carlo simulation Toolkit, describing the interactions of particles with matter. Geant4 is widely used in radiation physics research, from High Energy Physics to medical physics and space science, thanks to its sophisticated physics component, coupled with advanced functionality in geometry description. Geant4 is widely used at the Centre for Medical Radiation Physics (CMRP), at the University of Wollongong, to characterise and optimise novel detector concepts, radiotherapy treatments, and imaging solutions. This lecture consists of an introduction to the Monte Carlo method and to Geant4. Particular attention is devoted to the Geant4 physics component and to the physics models describing electromagnetic and hadronic interactions. The second part of the lecture focuses on the methodology for developing a Geant4 simulation application.

  2. Monitoring Extreme-scale Lustre Toolkit

    SciTech Connect

    Brim, Michael J; Lothian, Josh

    2015-01-01

    We discuss the design and ongoing development of the Monitoring Extreme-scale Lustre Toolkit (MELT), a unified Lustre performance monitoring and analysis infrastructure that provides continuous, low-overhead summary information on the health and performance of Lustre, as well as on-demand, in-depth problem diagnosis and root-cause analysis. The MELT infrastructure leverages a distributed overlay network to enable monitoring of center-wide Lustre filesystems where clients are located across many network domains. We preview interactive command-line utilities that help administrators and users to observe Lustre performance at various levels of resolution, from individual servers or clients to whole filesystems, including job-level reporting. Finally, we discuss our future plans for automating the root-cause analysis of common Lustre performance problems.

  3. Introduction to the Geant4 Simulation toolkit

    SciTech Connect

    Guatelli, S.; Cutajar, D.; Rosenfeld, A. B.; Oborn, B.

    2011-05-05

    Geant4 is a Monte Carlo simulation Toolkit, describing the interactions of particles with matter. Geant4 is widely used in radiation physics research, from High Energy Physics to medical physics and space science, thanks to its sophisticated physics component, coupled with advanced functionality in geometry description. Geant4 is widely used at the Centre for Medical Radiation Physics (CMRP), at the University of Wollongong, to characterise and optimise novel detector concepts, radiotherapy treatments, and imaging solutions. This lecture consists of an introduction to the Monte Carlo method and to Geant4. Particular attention is devoted to the Geant4 physics component and to the physics models describing electromagnetic and hadronic interactions. The second part of the lecture focuses on the methodology for developing a Geant4 simulation application.

  4. NBII-SAIN Data Management Toolkit

    USGS Publications Warehouse

    Burley, Thomas E.; Peine, John D.

    2009-01-01

    percent of the cost of a spatial information system is associated with spatial data collection and management (U.S. General Accounting Office, 2003). These figures indicate that the resources (time, personnel, money) of many agencies and organizations could be used more efficiently and effectively. Dedicated and conscientious data management coordination and documentation is critical for reducing such redundancy. Substantial cost savings and increased efficiency are direct results of a pro-active data management approach. In addition, details of projects as well as data and information are frequently lost as a result of real-world occurrences such as the passing of time, job turnover, and equipment changes and failure. A standardized, well documented database allows resource managers to identify issues, analyze options, and ultimately make better decisions in the context of adaptive management (National Land and Water Resources Audit and the Australia New Zealand Land Information Council on behalf of the Australian National Government, 2003). Many environmentally focused, scientific, or natural resource management organizations collect and create both spatial and non-spatial data in some form. Data management appropriate for those data will be contingent upon the project goal(s) and objectives and thus will vary on a case-by-case basis. This project and the resulting Data Management Toolkit, hereafter referred to as the Toolkit, is therefore not intended to be comprehensive in terms of addressing all of the data management needs of all projects that contain biological, geospatial, and other types of data. The Toolkit emphasizes the idea of connecting a project's data and the related management needs to the defined project goals and objectives from the outset. In that context, the Toolkit presents and describes the fundamental components of sound data and information management that are common to projects involving biological, geospatial, and other related data

  5. Data Exploration Toolkit for serial diffraction experiments

    PubMed Central

    Zeldin, Oliver B.; Brewster, Aaron S.; Hattne, Johan; Uervirojnangkoorn, Monarin; Lyubimov, Artem Y.; Zhou, Qiangjun; Zhao, Minglei; Weis, William I.; Sauter, Nicholas K.; Brunger, Axel T.

    2015-01-01

    Ultrafast diffraction at X-ray free-electron lasers (XFELs) has the potential to yield new insights into important biological systems that produce radiation-sensitive crystals. An unavoidable feature of the ‘diffraction before destruction’ nature of these experiments is that images are obtained from many distinct crystals and/or different regions of the same crystal. Combined with other sources of XFEL shot-to-shot variation, this introduces significant heterogeneity into the diffraction data, complicating processing and interpretation. To enable researchers to get the most from their collected data, a toolkit is presented that provides insights into the quality of, and the variation present in, serial crystallography data sets. These tools operate on the unmerged, partial intensity integration results from many individual crystals, and can be used on two levels: firstly to guide the experimental strategy during data collection, and secondly to help users make informed choices during data processing. PMID:25664746
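The per-crystal diagnostics the abstract describes can be illustrated with a short sketch: compute a summary statistic for each crystal's unmerged partial intensities and flag crystals that deviate strongly from the ensemble. The data layout, crystal names, and threshold below are invented for illustration; the actual toolkit operates on real XFEL integration results:

```python
import statistics

# Hypothetical unmerged, per-crystal partial intensities:
# crystal id -> list of integrated intensities (illustrative layout only).
observations = {
    "xtal_001": [105.0, 98.0, 110.0, 102.0],
    "xtal_002": [99.0, 101.0, 97.0, 104.0],
    "xtal_003": [310.0, 295.0, 305.0, 300.0],  # much brighter: candidate outlier
}

def per_crystal_stats(obs):
    """Mean intensity per crystal, a crude per-shot scale indicator."""
    return {cid: statistics.mean(vals) for cid, vals in obs.items()}

def flag_outliers(stats, n_sigma=2.0):
    """Flag crystals whose mean intensity deviates strongly from the ensemble."""
    means = list(stats.values())
    mu = statistics.mean(means)
    sigma = statistics.stdev(means)
    return [cid for cid, m in stats.items() if abs(m - mu) > n_sigma * sigma]

stats = per_crystal_stats(observations)
outliers = flag_outliers(stats, n_sigma=1.0)
```

In practice such flags would feed back into the experimental strategy (discard or rescale anomalous shots) rather than being hard rejections.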

  6. Data Exploration Toolkit for serial diffraction experiments

    DOE PAGESBeta

    Zeldin, Oliver B.; Brewster, Aaron S.; Hattne, Johan; Uervirojnangkoorn, Monarin; Lyubimov, Artem Y.; Zhou, Qiangjun; Zhao, Minglei; Weis, William I.; Sauter, Nicholas K.; Brunger, Axel T.

    2015-01-23

    Ultrafast diffraction at X-ray free-electron lasers (XFELs) has the potential to yield new insights into important biological systems that produce radiation-sensitive crystals. An unavoidable feature of the 'diffraction before destruction' nature of these experiments is that images are obtained from many distinct crystals and/or different regions of the same crystal. Combined with other sources of XFEL shot-to-shot variation, this introduces significant heterogeneity into the diffraction data, complicating processing and interpretation. To enable researchers to get the most from their collected data, a toolkit is presented that provides insights into the quality of, and the variation present in, serial crystallography data sets. These tools operate on the unmerged, partial intensity integration results from many individual crystals, and can be used on two levels: firstly to guide the experimental strategy during data collection, and secondly to help users make informed choices during data processing.

  7. A Perl toolkit for LIMS development

    PubMed Central

    Morris, James A; Gayther, Simon A; Jacobs, Ian J; Jones, Christopher

    2008-01-01

    Background High throughput laboratory techniques generate huge quantities of scientific data. Laboratory Information Management Systems (LIMS) are a necessary requirement, dealing with sample tracking, data storage and data reporting. Commercial LIMS solutions are available, but these can be both costly and overly complex for the task. The development of bespoke LIMS solutions offers a number of advantages, including the flexibility to fulfil all of a laboratory's requirements at a fraction of the price of a commercial system. The programming language Perl is well suited to LIMS development because of its powerful yet simple-to-use database and web interaction; it is also well known for enabling rapid application development and deployment, and it boasts a very active and helpful developer community. Developing an in-house LIMS from scratch, however, can take considerable time and resources, so programming tools that enable the rapid development of LIMS applications are essential; at present, no such tools exist for Perl. Results We have developed ArrayPipeline, a Perl toolkit providing object oriented methods that facilitate the rapid development of bespoke LIMS applications. The toolkit includes Perl objects that encapsulate key components of a LIMS, providing methods for creating interactive web pages, interacting with databases, error tracking and reporting, and user and session management. The MT_Plate object provides methods for the manipulation and management of microtitre plates, while a given LIMS can be encapsulated by extension of the core modules, providing system-specific methods for database interaction and web page management. Conclusion This important addition to the Perl developer's library will make the development of in-house LIMS applications quicker and easier, encouraging laboratories to create bespoke LIMS applications to meet their specific data management requirements. PMID:18353174

  8. Handheld access to radiology teaching files: an automated system for format conversion and content creation

    NASA Astrophysics Data System (ADS)

    Raman, Raghav; Raman, Lalithakala; Raman, Bhargav; Gold, Garry; Beaulieu, Christopher F.

    2002-05-01

    Current handheld Personal Digital Assistants (PDAs) can be used to view radiology teaching files. We have developed a toolkit that allows rapid creation of radiology teaching files in handheld formats from existing repositories. Our toolkit incorporated a desktop converter, a web conversion server and an application programming interface (API). Our API was integrated with an existing pilot teaching file database. We evaluated our system by obtaining test DICOM and JPEG images from our PACS system, our pilot database and from personal collections and converting them on a Windows workstation (Microsoft, Redmond, CA) and on other platforms using the web server. Our toolkit anonymized, annotated and categorized images using DICOM header information and data entered by the authors. Image processing was automatically customized for the target handheld device. We used freeware handheld image viewers as well as our custom applications that allowed window/level manipulation and viewing of additional textual information. Our toolkit provides desktop and web access to image conversion tools to produce organized handheld teaching file packages for most handheld devices and our API allows existing teaching file databases to incorporate handheld compatibility. The distribution of radiology teaching files on PDAs can increase the accessibility to radiology teaching.

  9. The Galley Parallel File System

    NASA Technical Reports Server (NTRS)

    Nieuwejaar, Nils; Kotz, David

    1996-01-01

    Most current multiprocessor file systems are designed to use multiple disks in parallel, using the high aggregate bandwidth to meet the growing I/O requirements of parallel scientific applications. Many multiprocessor file systems provide applications with a conventional Unix-like interface, allowing the application to access multiple disks transparently. This interface conceals the parallelism within the file system, increasing the ease of programmability, but making it difficult or impossible for sophisticated programmers and libraries to use knowledge about their I/O needs to exploit that parallelism. In addition to providing an insufficient interface, most current multiprocessor file systems are optimized for a different workload than they are being asked to support. We introduce Galley, a new parallel file system that is intended to efficiently support realistic scientific multiprocessor workloads. We discuss Galley's file structure and application interface, as well as the performance advantages offered by that interface.

  10. The Radar Software Toolkit: Analysis software for the ITM community

    NASA Astrophysics Data System (ADS)

    Barnes, R. J.; Greenwald, R.

    2005-05-01

    The Radar Software Toolkit is a collection of data analysis, modelling and visualization tools originally developed for the SuperDARN project. It has evolved over the years into a robust, multi-platform software toolkit for working with a variety of ITM data sets including data from the Polar, TIMED and ACE spacecraft, ground based magnetometers, Incoherent Scatter Radars, and SuperDARN. The toolkit includes implementations of the Altitude Adjusted Coordinate System (AACGM), the International Geomagnetic Reference Field (IGRF), SGP4 and a set of coordinate transform functions. It also includes a sophisticated XML based data visualization system. The toolkit is written using a combination of ANSI C, Java and the Interactive Data Language (IDL) and has been tested on a variety of platforms.

  11. Toolkit for evaluating impacts of public participation in scientific research

    NASA Astrophysics Data System (ADS)

    Bonney, R.; Phillips, T.

    2011-12-01

    The Toolkit for Evaluating Impacts of Public Participation in Scientific Research is being developed to meet a major need in the field of visitor studies: to provide project developers and other professionals, especially those with limited knowledge or understanding of evaluation techniques, with a systematic method for assessing project impact that facilitates longitudinal and cross-project comparisons. The need for the toolkit was first identified at the Citizen Science workshop held at the Cornell Lab of Ornithology in 2007 (McEver et al. 2007) and reaffirmed by a CAISE inquiry group that produced the recent report: "Public Participation in Scientific Research: Defining the Field and Assessing its Potential for Informal Science Education" (Bonney et al. 2009). This presentation will introduce the Toolkit, show how it is intended to be used, and describe ways that project directors can use their programmatic goals and toolkit materials to outline a plan for evaluating the impacts of their project.

  12. General relativistic magneto-hydrodynamics with the Einstein Toolkit

    NASA Astrophysics Data System (ADS)

    Moesta, Philipp; Mundim, Bruno; Faber, Joshua; Noble, Scott; Bode, Tanja; Haas, Roland; Loeffler, Frank; Ott, Christian; Reisswig, Christian; Schnetter, Erik

    2013-04-01

    The Einstein Toolkit Consortium is developing and supporting open software for relativistic astrophysics. Its aim is to provide the core computational tools that can enable new science, broaden our community, facilitate interdisciplinary research and take advantage of petascale computers and advanced cyberinfrastructure. The Einstein Toolkit currently consists of an open set of over 100 modules for the Cactus framework, primarily for computational relativity along with associated tools for simulation management and visualization. The toolkit includes solvers for vacuum spacetimes as well as relativistic magneto-hydrodynamics. This talk will present the current capabilities of the Einstein Toolkit, with a particular focus on recent improvements made to the general relativistic magneto-hydrodynamics modeling, and will point to information on how to leverage it for future research.

  13. Charon Message-Passing Toolkit for Scientific Computations

    NASA Technical Reports Server (NTRS)

    VanderWijngaart, Rob F.; Saini, Subhash (Technical Monitor)

    1998-01-01

    The Charon toolkit for piecemeal development of high-efficiency parallel programs for scientific computing is described. The portable toolkit, callable from C and Fortran, provides flexible domain decompositions and high-level distributed constructs for easy translation of serial legacy code or design to distributed environments. Gradual tuning can subsequently be applied to obtain high performance, possibly by using explicit message passing. Charon also features general structured communications that support stencil-based computations with complex recurrences. Through the separation of partitioning and distribution, the toolkit can also be used for blocking of uni-processor code, and for debugging of parallel algorithms on serial machines. An elaborate review of recent parallelization aids is presented to highlight the need for a toolkit like Charon. Some performance results of parallelizing the NAS Parallel Benchmark SP program using Charon are given, showing good scalability.
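The "separation of partitioning and distribution" that Charon describes rests on a simple block-decomposition idea, sketched below under assumptions of our own (a 1-D index space and a hypothetical `partition` helper; this is not Charon's actual API):

```python
def partition(n_cells, n_ranks):
    """Contiguous block decomposition of n_cells among n_ranks.

    Returns one (start, stop) half-open range per rank; any remainder
    cells are spread over the first few ranks so block sizes differ
    by at most one.
    """
    base, extra = divmod(n_cells, n_ranks)
    ranges, start = [], 0
    for r in range(n_ranks):
        size = base + (1 if r < extra else 0)
        ranges.append((start, start + size))
        start += size
    return ranges
```

Because the partitioning step is independent of where blocks actually live, the same routine can drive a distributed run or blocking of uniprocessor code, which is the point the abstract makes.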

  14. Charon Message-Passing Toolkit for Scientific Computations

    NASA Technical Reports Server (NTRS)

    VanderWijngaart, Rob F.; Saini, Subhash (Technical Monitor)

    1998-01-01

    The Charon toolkit for piecemeal development of high-efficiency parallel programs for scientific computing is described. The portable toolkit, callable from C and Fortran, provides flexible domain decompositions and high-level distributed constructs for easy translation of serial legacy code or design to distributed environments. Gradual tuning can subsequently be applied to obtain high performance, possibly by using explicit message passing. Charon also features general structured communications that support stencil-based computations with complex recurrences. Through the separation of partitioning and distribution, the toolkit can also be used for blocking of uni-processor code, and for debugging of parallel algorithms on serial machines. An elaborate review of recent parallelization aids is presented to highlight the need for a toolkit like Charon. Some performance results of parallelizing the NAS Parallel Benchmark SP program using Charon are given, showing good scalability.

  15. Integrating legacy software toolkits into China-VO system

    NASA Astrophysics Data System (ADS)

    Wang, Xiao-Qian; Cui, Chen-Zhou; Zhao, Yong-Heng

    2005-12-01

    The Virtual Observatory (VO) is a collection of data archives and software toolkits. It aims to provide astronomers with research resources through uniform interfaces, using advanced information technologies. In this article, we first discuss the necessity and feasibility of integrating legacy software toolkits into the China-VO system, and then analyse the appropriate granularity of integration. Three general integration methods are described in detail. Finally, we present the integration of "Image Magick", a software package for image processing, as an example, and discuss VO integration more broadly.

  16. Multi-Physics Demonstration Problem with the SHARP Reactor Simulation Toolkit

    SciTech Connect

    Merzari, E.; Shemon, E. R.; Yu, Y. Q.; Thomas, J. W.; Obabko, A.; Jain, Rajeev; Mahadevan, Vijay; Tautges, Timothy; Solberg, Jerome; Ferencz, Robert Mark; Whitesides, R.

    2015-12-21

    This report describes the use of SHARP to perform a first-of-a-kind analysis of the core radial expansion phenomenon in an SFR. This effort required significant advances in the framework used to drive the coupled simulations, manipulate the mesh in response to the deformation of the geometry, and generate the necessary modified mesh files. Furthermore, the model geometry is fairly complex, and consistent mesh generation for the three physics modules required significant effort. Fully-integrated simulations of a 7-assembly mini-core test problem have been performed, and the results are presented here. Physics models of a full-core model of the Advanced Burner Test Reactor have also been developed for each of the three physics modules. Standalone results of each of the three physics modules for the ABTR are presented here, which provides a demonstration of the feasibility of the fully-integrated simulation.

  17. The Trick Simulation Toolkit: A NASA/Opensource Framework for Running Time Based Physics Models

    NASA Technical Reports Server (NTRS)

    Penn, John M.

    2016-01-01

    The Trick Simulation Toolkit is a simulation development environment used to create high fidelity training and engineering simulations at the NASA Johnson Space Center and many other NASA facilities. Its purpose is to generate a simulation executable from a collection of user-supplied models and a simulation definition file. For each Trick-based simulation, Trick automatically provides job scheduling, numerical integration, the ability to write and restore human readable checkpoints, data recording, interactive variable manipulation, a run-time interpreter, and many other commonly needed capabilities. This allows simulation developers to concentrate on their domain expertise and the algorithms and equations of their models. Also included in Trick are tools for plotting recorded data and various other supporting utilities and libraries. Trick is written in C/C++ and Java and supports both Linux and MacOSX computer operating systems. This paper describes Trick's design and use at NASA Johnson Space Center.
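The scheduling-plus-integration loop that Trick automates can be caricatured in a few lines. Everything here (the `SimObject` class, the forward-Euler step, the logging "job") is a hypothetical sketch of the pattern, not Trick's actual C/C++ API:

```python
import math

class SimObject:
    """A toy time-stepped model: exponential decay dx/dt = -k * x."""
    def __init__(self, x0, k):
        self.x, self.k = x0, k

    def deriv(self):
        return -self.k * self.x

def run(sim, dt, t_end):
    """Minimal fixed-step scheduler: run the integration job, then the
    data-recording job, once per frame until t_end."""
    t, log = 0.0, []
    while t < t_end - 1e-12:
        sim.x += dt * sim.deriv()          # integration job (forward Euler)
        t += dt
        log.append((round(t, 6), sim.x))   # data-recording job
    return log

sim = SimObject(x0=1.0, k=1.0)
history = run(sim, dt=0.001, t_end=1.0)
final = history[-1][1]   # should approach exp(-1) for this model
```

A real Trick simulation declares such jobs in a simulation definition file and lets the framework supply the scheduler, integrators, and checkpointing; the sketch only shows the shape of the main loop.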

  18. Biomechanical ToolKit: Open-source framework to visualize and process biomechanical data.

    PubMed

    Barre, Arnaud; Armand, Stéphane

    2014-04-01

    C3D file format is widely used in the biomechanical field by companies and laboratories to store motion capture systems data. However, few software packages can visualize and modify the integrality of the data in the C3D file. Our objective was to develop an open-source and multi-platform framework to read, write, modify and visualize data from any motion analysis systems using standard (C3D) and proprietary file formats (used by many companies producing motion capture systems). The Biomechanical ToolKit (BTK) was developed to provide cost-effective and efficient tools for the biomechanical community to easily deal with motion analysis data. A large panel of operations is available to read, modify and process data through C++ API, bindings for high-level languages (Matlab, Octave, and Python), and standalone application (Mokka). All these tools are open-source and cross-platform and run on all major operating systems (Windows, Linux, MacOS X). PMID:24548899

  19. CART—a chemical annotation retrieval toolkit

    PubMed Central

    Deghou, Samy; Zeller, Georg; Iskar, Murat; Driessen, Marja; Castillo, Mercedes; van Noort, Vera; Bork, Peer

    2016-01-01

    Motivation: Data on bioactivities of drug-like chemicals are rapidly accumulating in public repositories, creating new opportunities for research in computational systems pharmacology. However, integrative analysis of these data sets is difficult due to prevailing ambiguity between chemical names and identifiers and a lack of cross-references between databases. Results: To address this challenge, we have developed CART, a Chemical Annotation Retrieval Toolkit. As a key functionality, it matches an input list of chemical names into a comprehensive reference space to assign unambiguous chemical identifiers. In this unified space, bioactivity annotations can be easily retrieved from databases covering a wide variety of chemical effects on biological systems. Subsequently, CART can determine annotations enriched in the input set of chemicals and display these in tabular format and interactive network visualizations, thereby facilitating integrative analysis of chemical bioactivity data. Availability and Implementation: CART is available as a Galaxy web service (cart.embl.de). Source code and an easy-to-install command line tool can also be obtained from the web site. Contact: bork@embl.de Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27256313
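The name-matching step CART performs can be sketched generically: normalize free-text chemical names and look them up in a reference space that maps synonyms to one identifier. The reference table, identifiers, and normalization rule below are invented placeholders; CART's real reference space is far larger:

```python
import re

# Toy reference space mapping normalized chemical names to identifiers.
# Names and IDs here are illustrative stand-ins, not CART's actual data.
REFERENCE = {
    "aspirin": "CID2244",
    "acetylsalicylic acid": "CID2244",
    "caffeine": "CID2519",
}

def normalize(name):
    """Case-fold and strip punctuation so common name variants collide."""
    return re.sub(r"[^a-z0-9 ]", "", name.lower()).strip()

def match_names(names):
    """Map each input name to an unambiguous identifier, or None if unmatched."""
    return {n: REFERENCE.get(normalize(n)) for n in names}

hits = match_names(["Aspirin", "Acetylsalicylic Acid", "unknownium"])
```

Once every input name resolves to one identifier, annotation retrieval and enrichment tests can operate on the unified IDs rather than on ambiguous strings.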

  20. Security Assessment Simulation Toolkit (SAST) Final Report

    SciTech Connect

    Meitzler, Wayne D.; Ouderkirk, Steven J.; Hughes, Chad O.

    2009-11-15

    The Department of Defense Technical Support Working Group (DoD TSWG) investment in the Pacific Northwest National Laboratory (PNNL) Security Assessment Simulation Toolkit (SAST) research planted a technology seed that germinated into a suite of follow-on Research and Development (R&D) projects culminating in software that is used by multiple DoD organizations. The DoD TSWG technology transfer goal for SAST is already in progress. The Defense Information Systems Agency (DISA), the Defense-wide Information Assurance Program (DIAP), the Marine Corps, the Office of Naval Research (ONR) National Center for Advanced Secure Systems Research (NCASSR) and the Office of the Secretary of Defense International Exercise Program (OSD NII) are currently investing to take SAST to the next level. PNNL currently distributes the software to over 6 government organizations and 30 DoD users. For the past five DoD-wide Bulwark Defender exercises, the adoption of this new technology created an expanding role for SAST. In 2009, SAST was also used in the OSD NII International Exercise and is currently scheduled for use in 2010.

  1. UQ Toolkit v 2.0

    SciTech Connect

    2013-10-03

    The Uncertainty Quantification (UQ) Toolkit is a software library for the characterization and propagation of uncertainties in computational models. For the characterization of uncertainties, Bayesian inference tools are provided to infer uncertain model parameters, as well as Bayesian compressive sensing methods for discovering sparse representations of high-dimensional input-output response surfaces, and also Karhunen-Loève expansions for representing stochastic processes. Uncertain parameters are treated as random variables and represented with Polynomial Chaos expansions (PCEs). The library implements several spectral basis function types (e.g. Hermite basis functions in terms of Gaussian random variables or Legendre basis functions in terms of uniform random variables) that can be used to represent random variables with PCEs. For propagation of uncertainty, tools are provided to propagate PCEs that describe the input uncertainty through the computational model using either intrusive methods (Galerkin projection of equations onto basis functions) or non-intrusive methods (performing deterministic operations at sampled values of the random variables and projecting the obtained results onto basis functions).
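The non-intrusive projection route can be sketched for a one-dimensional Hermite PCE: estimate each coefficient as c_k = E[f(X) He_k(X)] / k! for a standard normal X. The sketch below uses Monte Carlo sampling in place of the quadrature a production library would use, and all names are hypothetical, not the UQ Toolkit's API:

```python
import math
import random

def he(k, x):
    """Probabilists' Hermite polynomials He_0, He_1, He_2."""
    return [1.0, x, x * x - 1.0][k]

def pce_coeffs(f, order=2, n=100_000, seed=0):
    """Non-intrusive projection for X ~ N(0,1):
    c_k = E[f(X) He_k(X)] / k!, since E[He_k(X)^2] = k!.
    Monte Carlo stands in for the sparse quadrature a real library uses."""
    rng = random.Random(seed)
    sums = [0.0] * (order + 1)
    for _ in range(n):
        x = rng.gauss(0.0, 1.0)
        fx = f(x)
        for k in range(order + 1):
            sums[k] += fx * he(k, x)
    return [s / n / math.factorial(k) for k, s in enumerate(sums)]

# f(x) = x^2 has the exact expansion He_0 + He_2, i.e. coefficients [1, 0, 1].
coeffs = pce_coeffs(lambda x: x * x)
```

The model f is treated as a black box here, which is precisely what makes the non-intrusive approach attractive when the deterministic solver cannot be modified.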

  2. Data Exploration Toolkit for serial diffraction experiments

    SciTech Connect

    Zeldin, Oliver B.; Brewster, Aaron S.; Hattne, Johan; Uervirojnangkoorn, Monarin; Lyubimov, Artem Y.; Zhou, Qiangjun; Zhao, Minglei; Weis, William I.; Sauter, Nicholas K.; Brunger, Axel T.

    2015-02-01

    This paper describes a set of tools allowing experimentalists insight into the variation present within large serial data sets. Ultrafast diffraction at X-ray free-electron lasers (XFELs) has the potential to yield new insights into important biological systems that produce radiation-sensitive crystals. An unavoidable feature of the ‘diffraction before destruction’ nature of these experiments is that images are obtained from many distinct crystals and/or different regions of the same crystal. Combined with other sources of XFEL shot-to-shot variation, this introduces significant heterogeneity into the diffraction data, complicating processing and interpretation. To enable researchers to get the most from their collected data, a toolkit is presented that provides insights into the quality of, and the variation present in, serial crystallography data sets. These tools operate on the unmerged, partial intensity integration results from many individual crystals, and can be used on two levels: firstly to guide the experimental strategy during data collection, and secondly to help users make informed choices during data processing.

  3. Expanding the conceptual toolkit of organ gifting.

    PubMed

    M Shaw, Rhonda

    2015-07-01

    In jurisdictions where the sale of body tissue and organs is illegal, organ transplantation is often spoken of as a gift of life. In the social sciences and bioethics this concept has been subject to critique over the course of the last two decades for failing to reflect the complexities of organ and tissue exchange. I suggest that a new ethical model of organ donation and transplantation is needed to capture the range of experiences in this domain. The proposed model is both analytical and empirically oriented, and draws on research findings linking a series of qualitative sociological studies undertaken in New Zealand between 2007 and 2013. The studies were based on document analysis, field notes and 127 semi-structured in-depth interviews with people from different cultural and constituent groups directly involved in organ transfer processes. The aim of the article is to contribute to sociological knowledge about organ exchange and to expand the conceptual toolkit of organ donation to include the unconditional gift, the gift relation, gift exchange, body project, and body work. The rationale for the proposed model is to provide an explanatory framework for organ donors and transplant recipients and to assist the development of ethical guidelines and health policy discourse. PMID:25728628

  4. UQ Toolkit v 2.0

    Energy Science and Technology Software Center (ESTSC)

    2013-10-03

    The Uncertainty Quantification (UQ) Toolkit is a software library for the characterization and propagation of uncertainties in computational models. For the characterization of uncertainties, Bayesian inference tools are provided to infer uncertain model parameters, as well as Bayesian compressive sensing methods for discovering sparse representations of high-dimensional input-output response surfaces, and also Karhunen-Loève expansions for representing stochastic processes. Uncertain parameters are treated as random variables and represented with Polynomial Chaos expansions (PCEs). The library implements several spectral basis function types (e.g. Hermite basis functions in terms of Gaussian random variables or Legendre basis functions in terms of uniform random variables) that can be used to represent random variables with PCEs. For propagation of uncertainty, tools are provided to propagate PCEs that describe the input uncertainty through the computational model using either intrusive methods (Galerkin projection of equations onto basis functions) or non-intrusive methods (performing deterministic operations at sampled values of the random variables and projecting the obtained results onto basis functions).

  5. Asteroids Outreach Toolkit Development: Using Iterative Feedback In Informal Education

    NASA Astrophysics Data System (ADS)

    White, Vivian; Berendsen, M.; Gurton, S.; Dusenbery, P. B.

    2011-01-01

    The Night Sky Network is a collaboration of close to 350 astronomy clubs across the US that actively engage in public outreach within their communities. Since 2004, the Astronomical Society of the Pacific has been creating outreach ToolKits filled with carefully crafted sets of physical materials designed to help these volunteer clubs explain the wonders of the night sky to the public. The effectiveness of the ToolKit activities and demonstrations is the direct result of a thorough testing and vetting process. Find out how this iterative assessment process can help other programs create useful tools for both formal and informal educators. The current Space Rocks Outreach ToolKit focuses on explaining asteroids, comets, and meteorites to the general public using quick, big-picture activities that get audiences involved. Eight previous ToolKits cover a wide range of topics from the Moon to black holes. In each case, amateur astronomers and the public helped direct the development of the activities along the way through surveys, focus groups, and active field-testing. The resulting activities have been embraced by the larger informal learning community and are enthusiastically being delivered to millions of people across the US and around the world. Each ToolKit is delivered free of charge to active Night Sky Network astronomy clubs. All activity write-ups are available free to download at the website listed here. Amateur astronomers receive frequent questions from the public about Earth impacts, meteors, and comets, so this set of activities will help them explain the dynamics of these phenomena to the public. The Space Rocks ToolKit resources complement the Great Balls of Fire museum exhibit produced by Space Science Institute's National Center for Interactive Learning and scheduled for release in 2011. NSF has funded this national traveling exhibition and outreach ToolKit under Grant DRL-0813528.

  6. The Configuration Space Toolkit (C-Space Toolkit or CSTK) Ver. 2.5 beta

    SciTech Connect

    Chen, Pang-Chieh; Hwang, Yong; Xavier, Patrick; Lewis, Christopher; Lafarge, Robert; & Watterberg, Peter

    2010-02-24

    The C-Space Toolkit provides a software library that makes it easier to program motion planning, simulation, robotics, and virtual reality codes using the Configuration Space abstraction. Key functionality (1) enables the user to create special representations of movable and stationary rigid geometric objects, and (2) performs fast distance, interference (clash) detection, collision detection, closest-feature pair, and contact queries in terms of object configuration. Not only can queries be computed at any given point in configuration space, but they can also be done exactly over linear-translational path segments and approximately for rotational path segments. Interference detection and distance computations can be done with respect to the Minkowski sum of the original geometry and a piece of convex geometry. The Toolkit takes as raw model input (1) collections of convex polygons that form the boundaries of models and (2) convex polyhedra, cones, cylinders, and discs that are models and model components. Configurations are given in terms of homogeneous transforms. A simple OpenGL-based system for displaying and animating the geometric objects is included in the implementation. This version, 2.5 Beta, incorporates feature additions and enhancements, improvements in algorithms, improved robustness, bug fixes and cleaned-up source code, better compliance with standards and recent programming conventions, changes to the build process for the software, support for more recent hardware and software platforms, and improvements to documentation and source-code comments.
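The Minkowski-sum query described above can be illustrated in its simplest form: a disc translating along a linear path segment past a stationary disc. Inflating the obstacle by the moving disc's radius reduces the exact swept test to a segment-to-point distance check. The 2-D setting and helper names are assumptions for illustration; CSTK itself handles 3-D rigid bodies:

```python
import math

def seg_point_dist(ax, ay, bx, by, px, py):
    """Distance from point (px, py) to segment (ax, ay)-(bx, by)."""
    abx, aby = bx - ax, by - ay
    denom = abx * abx + aby * aby
    t = 0.0 if denom == 0 else max(
        0.0, min(1.0, ((px - ax) * abx + (py - ay) * aby) / denom))
    cx, cy = ax + t * abx, ay + t * aby
    return math.hypot(px - cx, py - cy)

def swept_disc_clash(start, end, r_moving, obstacle_center, r_obstacle):
    """Exact interference test for a disc translating from start to end.
    Minkowski-sum trick: inflate the obstacle by the moving disc's radius,
    then test the swept center (a segment) against the inflated disc."""
    d = seg_point_dist(*start, *end, *obstacle_center)
    return d <= r_moving + r_obstacle
```

The same reduction is why the toolkit can answer queries exactly over linear-translational path segments: the swept volume of a translating convex shape is itself a simple convex object.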

  7. The Configuration Space Toolkit (C-Space Toolkit or CSTK) Ver. 2.5 beta

    Energy Science and Technology Software Center (ESTSC)

    2010-02-24

    The C-Space Toolkit provides a software library that makes it easier to program motion planning, simulation, robotics, and virtual reality codes using the Configuration Space abstraction. Key functionality (1) enables the user to create special representations of movable and stationary rigid geometric objects, and (2) performs fast distance, interference (clash) detection, collision detection, closest-feature pair, and contact queries in terms of object configuration. Not only can queries be computed at any given point in configuration space, but they can also be done exactly over linear-translational path segments and approximately for rotational path segments. Interference detection and distance computations can be done with respect to the Minkowski sum of the original geometry and a piece of convex geometry. The Toolkit takes as raw model input (1) collections of convex polygons that form the boundaries of models and (2) convex polyhedra, cones, cylinders, and discs that are models and model components. Configurations are given in terms of homogeneous transforms. A simple OpenGL-based system for displaying and animating the geometric objects is included in the implementation. This version, 2.5 Beta, incorporates feature additions and enhancements, improvements in algorithms, improved robustness, bug fixes and cleaned-up source code, better compliance with standards and recent programming conventions, changes to the build process for the software, support for more recent hardware and software platforms, and improvements to documentation and source-code comments.

  8. A Toolkit for Eye Recognition of LAMOST Spectroscopy

    NASA Astrophysics Data System (ADS)

    Yuan, H.; Zhang, H.; Zhang, Y.; Lei, Y.; Dong, Y.; Zhao, Y.

    2014-05-01

    The Large sky Area Multi-Object fiber Spectroscopic Telescope (LAMOST, also named the Guo Shou Jing Telescope) finished its pilot survey and began the normal survey at the end of September 2012. Millions of targets have already been observed, including thousands of quasar candidates. Because automatic identification of quasar spectra is difficult, eye recognition remains necessary and efficient; however, identifying massive numbers of spectra by eye is a huge job. To improve the efficiency and effectiveness of spectral identification, a toolkit for eye recognition of LAMOST spectroscopy was developed. Spectral cross-correlation templates from the Sloan Digital Sky Survey (SDSS) are applied as references, including O star, O/B transition star, B star, A star, F/A transition star, F star, G star, K star, M1 star, M3 star, M5 star, M8 star, L1 star, magnetic white dwarf, carbon star, white dwarf, B white dwarf, low-metallicity K sub-dwarf, "Early-type" galaxy, galaxy, "Later-type" galaxy, Luminous Red Galaxy, QSO, QSO with some BAL activity, and High-luminosity QSO. By adjusting the redshift and flux ratio of the template spectra in an interactive graphical interface, the spectral type of the target can be discriminated in an easy and feasible way, and the redshift is estimated at the same time with a precision of about one part in a thousand. The tool's advantage in dealing with low-quality spectra is indicated. Spectra from the LAMOST pilot survey are applied as examples, and spectra from SDSS are also tested for comparison. Target spectra in both image format and FITS format are supported. For convenience, several spectra-access methods are provided. All the spectra from the LAMOST pilot survey can be located and acquired via the VOTable files on the internet, as suggested by the International Virtual Observatory Alliance (IVOA). After the construction of the Simple Spectral Access Protocol (SSAP) service by the Chinese Astronomical Data Center (CAsDC), spectra can be

  9. The Einstein Toolkit: a community computational infrastructure for relativistic astrophysics

    NASA Astrophysics Data System (ADS)

    Löffler, Frank; Faber, Joshua; Bentivegna, Eloisa; Bode, Tanja; Diener, Peter; Haas, Roland; Hinder, Ian; Mundim, Bruno C.; Ott, Christian D.; Schnetter, Erik; Allen, Gabrielle; Campanelli, Manuela; Laguna, Pablo

    2012-06-01

    We describe the Einstein Toolkit, a community-driven, freely accessible computational infrastructure intended for use in numerical relativity, relativistic astrophysics, and other applications. The toolkit, developed by a collaboration involving researchers from multiple institutions around the world, combines a core set of components needed to simulate astrophysical objects such as black holes, compact objects, and collapsing stars, as well as a full suite of analysis tools. The Einstein Toolkit is currently based on the Cactus framework for high-performance computing and the Carpet adaptive mesh refinement driver. It implements spacetime evolution via the BSSN evolution system and general relativistic hydrodynamics in a finite-volume discretization. The toolkit is under continuous development and contains many new code components that have been publicly released for the first time and are described in this paper. We discuss the motivation behind the release of the toolkit, the philosophy underlying its development, and the goals of the project. A summary of the implemented numerical techniques is included, as are results of numerical tests covering a variety of sample astrophysical problems.
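    The finite-volume discretization mentioned above can be illustrated with a deliberately tiny sketch: first-order upwind advection of a scalar, not the toolkit's relativistic hydrodynamics scheme, and with names of my own choosing.

```python
def upwind_step(u, a, dx, dt):
    """One first-order finite-volume step for du/dt + a*du/dx = 0 (a > 0).
    Each cell average changes only by the difference of its face fluxes;
    u[i-1] wraps around, giving a periodic boundary."""
    return [u[i] - a * dt / dx * (u[i] - u[i - 1]) for i in range(len(u))]

# advect a square pulse to the right at CFL = 0.5
u = [1.0 if 4 <= i < 8 else 0.0 for i in range(32)]
for _ in range(20):
    u = upwind_step(u, a=1.0, dx=1.0, dt=0.5)
print(round(sum(u), 12))   # 4.0  (total is conserved)
```

    Because each cell changes only by the difference of its face fluxes, the total over all cells is conserved exactly, which is the defining property of a finite-volume scheme.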

  10. Cyber Security Audit and Attack Detection Toolkit

    SciTech Connect

    Peterson, Dale

    2012-05-31

    The goal of this project was to develop cyber security audit and attack detection tools for industrial control systems (ICS). Digital Bond developed and released a tool named Bandolier that audits ICS components commonly used in the energy sector against an optimal security configuration. The Portaledge Project developed a capability for the PI Historian, the most widely used Historian in the energy sector, to aggregate security events and detect cyber attacks.

  11. Validation of Power Output for the WIND Toolkit

    SciTech Connect

    King, J.; Clifton, A.; Hodge, B. M.

    2014-09-01

    Renewable energy integration studies require wind data sets of high quality with realistic representations of the variability, ramping characteristics, and forecast performance for current wind power plants. The Wind Integration National Data Set (WIND) Toolkit is meant to be an update for and expansion of the original data sets created for the weather years from 2004 through 2006 during the Western Wind and Solar Integration Study and the Eastern Wind Integration Study. The WIND Toolkit expands these data sets to include the entire continental United States, increasing the total number of sites represented, and it includes the weather years from 2007 through 2012. In addition, the WIND Toolkit has a finer resolution for both the temporal and geographic dimensions. Three separate data sets will be created: a meteorological data set, a wind power data set, and a forecast data set. This report describes the validation of the wind power data set.

  12. Evolution of the plant-microbe symbiotic 'toolkit'.

    PubMed

    Delaux, Pierre-Marc; Séjalon-Delmas, Nathalie; Bécard, Guillaume; Ané, Jean-Michel

    2013-06-01

    Beneficial associations between plants and arbuscular mycorrhizal fungi play a major role in terrestrial environments and in the sustainability of agroecosystems. Proteins, microRNAs, and small molecules have been identified in model angiosperms as required for the establishment of arbuscular mycorrhizal associations and define a symbiotic 'toolkit' used for other interactions such as the rhizobia-legume symbiosis. Based on recent studies, we propose an evolutionary framework for this toolkit. Some components appeared recently in angiosperms, whereas others are highly conserved even in land plants unable to form arbuscular mycorrhizal associations. The exciting finding that some components pre-date the appearance of arbuscular mycorrhizal fungi suggests the existence of unknown roles for this toolkit and even the possibility of symbiotic associations in charophyte green algae. PMID:23462549

  13. The PRIDE (Partnership to Improve Diabetes Education) Toolkit

    PubMed Central

    Wolff, Kathleen; Chambers, Laura; Bumol, Stefan; White, Richard O.; Gregory, Becky Pratt; Davis, Dianne; Rothman, Russell L.

    2016-01-01

    Purpose Patients with low literacy, low numeracy, and/or linguistic needs can experience challenges understanding diabetes information and applying concepts to their self-management. The authors designed a toolkit of education materials that are sensitive to patients' literacy and numeracy levels, language preferences, and cultural norms and that encourage shared goal setting to improve diabetes self-management and health outcomes. The Partnership to Improve Diabetes Education (PRIDE) toolkit was developed to facilitate diabetes self-management education and support. Methods The PRIDE toolkit includes a comprehensive set of 30 interactive education modules in English and Spanish to support diabetes self-management activities. The toolkit builds upon the authors' previously validated Diabetes Literacy and Numeracy Education Toolkit (DLNET) by adding a focus on shared goal setting, addressing the needs of Spanish-speaking patients, and including a broader range of diabetes management topics. Each PRIDE module was evaluated using the Suitability Assessment of Materials (SAM) instrument to determine the material's cultural appropriateness and its sensitivity to the needs of patients with low literacy and low numeracy. Reading grade level was also assessed using the Automated Readability Index (ARI), Coleman-Liau, Flesch-Kincaid, Fry, and SMOG formulas. Conclusions The average reading grade level of the materials was 5.3 (SD 1.0), with a mean SAM of 91.2 (SD 5.4). All of the 30 modules received a “superior” score (SAM >70%) when evaluated by 2 independent raters. The PRIDE toolkit modules can be used by all members of a multidisciplinary team to assist patients with low literacy and low numeracy in managing their diabetes. PMID:26647414
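    Of the readability formulas named above, the Automated Readability Index has a simple closed form that can be computed directly from character, word, and sentence counts (the helper name below is mine, not from the PRIDE materials):

```python
def automated_readability_index(chars, words, sentences):
    """Automated Readability Index: estimates a U.S. reading grade level
    from character, word, and sentence counts."""
    return 4.71 * (chars / words) + 0.5 * (words / sentences) - 21.43

# a short passage: 40 characters across 10 words in 2 sentences
print(round(automated_readability_index(40, 10, 2), 2))   # -0.09
```

    Short words and short sentences drive the index down, which is why materials written for low-literacy audiences score at low grade levels.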

  14. XML Files

    MedlinePlus

    ... nlm.nih.gov/medlineplus/xml.html MedlinePlus XML Files To use the sharing features on this page, ... information on all English and Spanish topic groups. Files generated on July 09, 2016 MedlinePlus Health Topic ...

  15. WIRM: An Open Source Toolkit for Building Biomedical Web Applications

    PubMed Central

    Jakobovits, Rex M.; Rosse, Cornelius; Brinkley, James F.

    2002-01-01

    This article describes an innovative software toolkit that allows the creation of web applications that facilitate the acquisition, integration, and dissemination of multimedia biomedical data over the web, thereby reducing the cost of knowledge sharing. There is a lack of high-level web application development tools suitable for use by researchers, clinicians, and educators who are not skilled programmers. Our Web Interfacing Repository Manager (WIRM) is a software toolkit that reduces the complexity of building custom biomedical web applications. WIRM’s visual modeling tools enable domain experts to describe the structure of their knowledge, from which WIRM automatically generates full-featured, customizable content management systems. PMID:12386108

  16. User's manual for the two-dimensional transputer graphics toolkit

    NASA Technical Reports Server (NTRS)

    Ellis, Graham K.

    1988-01-01

    The user manual for the 2-D graphics toolkit for a transputer based parallel processor is presented. The toolkit consists of a package of 2-D display routines that can be used for the simulation visualizations. It supports multiple windows, double buffered screens for animations, and simple graphics transformations such as translation, rotation, and scaling. The display routines are written in occam to take advantage of the multiprocessing features available on transputers. The package is designed to run on a transputer separate from the graphics board.

  17. RAVE—a Detector-independent vertex reconstruction toolkit

    NASA Astrophysics Data System (ADS)

    Waltenberger, Wolfgang; Mitaroff, Winfried; Moser, Fabian

    2007-10-01

    A detector-independent toolkit for vertex reconstruction (RAVE ) is being developed, along with a standalone framework (VERTIGO ) for testing, analyzing and debugging. The core algorithms represent state of the art for geometric vertex finding and fitting by both linear (Kalman filter) and robust estimation methods. Main design goals are ease of use, flexibility for embedding into existing software frameworks, extensibility, and openness. The implementation is based on modern object-oriented techniques, is coded in C++ with interfaces for Java and Python, and follows an open-source approach. A beta release is available. VERTIGO = "vertex reconstruction toolkit and interface to generic objects".
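    The geometric least-squares fitting mentioned above can be sketched for the simplest case: finding the point that minimizes the summed squared perpendicular distances to a set of straight 2D tracks. This is an illustrative toy, not RAVE's API; a real Kalman-filter vertex fit adds per-track uncertainties and iteration, but the uniform-weight, straight-track case reduces to this linear solve.

```python
import math

def fit_vertex(tracks):
    """Least-squares common vertex of 2D tracks given as (point, direction):
    minimizes summed squared perpendicular distances by solving the 2x2
    normal equations  sum_i (I - d_i d_i^T) v = sum_i (I - d_i d_i^T) p_i."""
    A = [[0.0, 0.0], [0.0, 0.0]]
    b = [0.0, 0.0]
    for (px, py), (dx, dy) in tracks:
        n = math.hypot(dx, dy)
        dx, dy = dx / n, dy / n
        m = [[1 - dx * dx, -dx * dy], [-dx * dy, 1 - dy * dy]]  # projector I - d d^T
        for r in range(2):
            A[r][0] += m[r][0]
            A[r][1] += m[r][1]
            b[r] += m[r][0] * px + m[r][1] * py
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return ((b[0] * A[1][1] - b[1] * A[0][1]) / det,
            (A[0][0] * b[1] - A[1][0] * b[0]) / det)

# three straight tracks that all pass through (1, 2)
tracks = [((1, 2), (1, 0)), ((1, 2), (0, 1)), ((0, 1), (1, 1))]
print(tuple(round(v, 6) for v in fit_vertex(tracks)))   # (1.0, 2.0)
```

    Robust estimation methods of the kind RAVE offers replace the plain sum of squares with weights that down-weight outlier tracks, but the geometric core is the same.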

  18. Incident Management Preparedness and Coordination Toolkit

    SciTech Connect

    Koch, Daniel B.

    2013-04-01

    As with many professions, safety planners and first responders tend to be specialists in certain areas. To be truly useful, tools should be tailored to meet their specific needs. Thus, general software suites aimed at the professional geographic information system (GIS) community might not be the best solution for a first responder with little training in GIS terminology and techniques. On the other hand, commonly used web-based map viewers may not have the capability to be customized for the planning, response, and recovery (PR&R) mission. Data formats should be open and foster easy information flow among local, state, and federal partners. Tools should be free or low-cost to address real-world budget constraints at the local level. They also need to work both with and without a network connection to be robust. The Incident Management Preparedness and Coordination Toolkit (IMPACT) can satisfy many of these needs while working in harmony with established systems at the local, state, and federal levels. The IMPACT software framework, termed the Geospatial Integrated Problem Solving Environment (GIPSE), organizes tasks, tools, and resources for the end user. It uses the concept of software wizards to both customize and extend its functionality. On the Tasks panel are a number of buttons used to initiate various operations. Similar to macros, these task buttons launch scripts that utilize the full functionality of the underlying foundational components such as the SQL spatial database and ORNL-developed map editor. The user is presented with a series of instruction pages which are implemented with HTML for interactivity. On each page are links which initiate specific actions such as creating a map showing various features. Additional tasks may be quickly programmed and added to the panel. The end user can customize the graphical interface to facilitate its use during an emergency. One of the major components of IMPACT is the ORNL Geospatial Viewer (OGV). It is used to
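    The task-button mechanism described above (named tasks bound to scripts that the panel launches) can be sketched in a few lines; the class and task names here are hypothetical, not IMPACT's actual interfaces:

```python
# Named tasks mapped to scripts (callables); hypothetical names, not
# IMPACT's actual classes.
class TaskPanel:
    def __init__(self):
        self.tasks = {}

    def register(self, name, script):
        """Add a button: a task name bound to a script."""
        self.tasks[name] = script

    def run(self, name, **kwargs):
        """Launch the script behind a task button."""
        return self.tasks[name](**kwargs)

panel = TaskPanel()
panel.register("make-map", lambda region: f"map of {region}")
print(panel.run("make-map", region="county"))   # map of county
```

    Registering tasks at runtime is what lets new operations be "quickly programmed and added to the panel" without modifying the framework itself.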

  19. Incident Management Preparedness and Coordination Toolkit

    Energy Science and Technology Software Center (ESTSC)

    2013-04-01

    As with many professions, safety planners and first responders tend to be specialists in certain areas. To be truly useful, tools should be tailored to meet their specific needs. Thus, general software suites aimed at the professional geographic information system (GIS) community might not be the best solution for a first responder with little training in GIS terminology and techniques. On the other hand, commonly used web-based map viewers may not have the capability to be customized for the planning, response, and recovery (PR&R) mission. Data formats should be open and foster easy information flow among local, state, and federal partners. Tools should be free or low-cost to address real-world budget constraints at the local level. They also need to work both with and without a network connection to be robust. The Incident Management Preparedness and Coordination Toolkit (IMPACT) can satisfy many of these needs while working in harmony with established systems at the local, state, and federal levels. The IMPACT software framework, termed the Geospatial Integrated Problem Solving Environment (GIPSE), organizes tasks, tools, and resources for the end user. It uses the concept of software wizards to both customize and extend its functionality. On the Tasks panel are a number of buttons used to initiate various operations. Similar to macros, these task buttons launch scripts that utilize the full functionality of the underlying foundational components such as the SQL spatial database and ORNL-developed map editor. The user is presented with a series of instruction pages which are implemented with HTML for interactivity. On each page are links which initiate specific actions such as creating a map showing various features. Additional tasks may be quickly programmed and added to the panel. The end user can customize the graphical interface to facilitate its use during an emergency. One of the major components of IMPACT is the ORNL Geospatial Viewer (OGV). It is

  20. SatelliteDL: a Toolkit for Analysis of Heterogeneous Satellite Datasets

    NASA Astrophysics Data System (ADS)

    Galloy, M. D.; Fillmore, D.

    2014-12-01

    SatelliteDL is an IDL toolkit for the analysis of satellite Earth observations from a diverse set of platforms and sensors. The core function of the toolkit is the spatial and temporal alignment of satellite swath and geostationary data. The design features an abstraction layer that allows for easy inclusion of new datasets in a modular way. Our overarching objective is to create utilities that automate the mundane aspects of satellite data analysis, are extensible and maintainable, and do not place limitations on the analysis itself. IDL has a powerful suite of statistical and visualization tools that can be used in conjunction with SatelliteDL. Toward this end we have constructed SatelliteDL to include (1) HTML and LaTeX API document generation, (2) a unit test framework, (3) automatic message and error logs, (4) HTML and LaTeX plot and table generation, and (5) several real world examples with bundled datasets available for download. For ease of use, datasets, variables and optional workflows may be specified in a flexible format configuration file. Configuration statements may specify, for example, a region and date range, and the creation of images, plots and statistical summary tables for a long list of variables. SatelliteDL enforces data provenance; all data should be traceable and reproducible. The output NetCDF file metadata holds a complete history of the original datasets and their transformations, and a method exists to reconstruct a configuration file from this information. Release 0.1.0 distributes with ingest methods for GOES, MODIS, VIIRS and CERES radiance data (L1) as well as select 2D atmosphere products (L2) such as aerosol and cloud (MODIS and VIIRS) and radiant flux (CERES). Future releases will provide ingest methods for ocean and land surface products, gridded and time averaged datasets (L3 Daily, Monthly and Yearly), and support for 3D products such as temperature and water vapor profiles. Emphasis will be on NPP Sensor, Environmental and
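    The core spatial-alignment step can be sketched as simple binning: assign each swath sample to a regular grid cell and average the samples per cell. This Python reduction of the idea is illustrative only, not SatelliteDL's IDL implementation:

```python
def grid_average(obs, lat0, lon0, dlat, dlon, nlat, nlon):
    """Average (lat, lon, value) swath samples onto a regular grid:
    each sample is assigned to the cell containing it, then cells are
    averaged. Samples falling outside the grid are dropped."""
    total, count = {}, {}
    for lat, lon, val in obs:
        i = int((lat - lat0) // dlat)
        j = int((lon - lon0) // dlon)
        if 0 <= i < nlat and 0 <= j < nlon:
            total[i, j] = total.get((i, j), 0.0) + val
            count[i, j] = count.get((i, j), 0) + 1
    return {cell: total[cell] / count[cell] for cell in total}

obs = [(10.2, 20.1, 1.0), (10.4, 20.3, 3.0), (11.5, 20.1, 5.0)]
print(grid_average(obs, lat0=10, lon0=20, dlat=1, dlon=1, nlat=2, nlon=2))
# {(0, 0): 2.0, (1, 0): 5.0}
```

    Temporal alignment works the same way with time bins in place of spatial cells; a production toolkit adds geolocation corrections and quality flags on top of this skeleton.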

  1. Geospatial Toolkits and Resource Maps for Selected Countries from the National Renewable Energy Laboratory (NREL)

    DOE Data Explorer

    NREL developed the Geospatial Toolkit (GsT), a map-based software application that integrates resource data and geographic information systems (GIS) for integrated resource assessment. A variety of agencies within countries, along with global datasets, provided country-specific data. Originally developed in 2005, the Geospatial Toolkit was completely redesigned and re-released in November 2010 to provide a more modern, easier-to-use interface with considerably faster analytical querying capabilities. Toolkits are available for 21 countries and each one can be downloaded separately. The source code for the toolkit is also available. [Taken and edited from http://www.nrel.gov/international/geospatial_toolkits.html]

  2. GMH: A Message Passing Toolkit for GPU Clusters

    SciTech Connect

    Jie Chen, W. Watson, Weizhen Mao

    2011-01-01

    Driven by the market demand for high-definition 3D graphics, commodity graphics processing units (GPUs) have evolved into highly parallel, multi-threaded, many-core processors, which are ideal for data parallel computing. Many applications have been ported to run on a single GPU with tremendous speedups using general C-style programming languages such as CUDA. However, large applications require multiple GPUs and demand explicit message passing. This paper presents a message passing toolkit, called GMH (GPU Message Handler), on NVIDIA GPUs. This toolkit utilizes a data-parallel thread group as a way to map multiple GPUs on a single host to an MPI rank, and introduces a notion of virtual GPUs as a way to bind a thread to a GPU automatically. This toolkit provides high performance MPI style point-to-point and collective communication, but more importantly, facilitates event-driven APIs to allow an application to be managed and executed by the toolkit at runtime.
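    MPI-style point-to-point messaging of the kind GMH provides can be sketched with one mailbox per rank. This Python stand-in (threads and queues instead of GPUs) only illustrates the tagged send/recv pattern; the names are mine, not the GMH API:

```python
import queue
import threading

# One mailbox per "rank"; send/recv mimic MPI point-to-point semantics.
N = 2
mailboxes = [queue.Queue() for _ in range(N)]

def send(dest, tag, payload):
    """Point-to-point send of a tagged payload to rank `dest`."""
    mailboxes[dest].put((tag, payload))

def recv(rank):
    """Blocking receive on rank `rank`'s mailbox."""
    return mailboxes[rank].get()

def worker0():
    # rank 0 ships a buffer to rank 1
    send(1, tag=7, payload=[1.0, 2.0, 3.0])

t = threading.Thread(target=worker0)
t.start()
t.join()
tag, data = recv(1)
print(tag, data)   # 7 [1.0, 2.0, 3.0]
```

    GMH's virtual-GPU binding plays the role that the per-rank mailbox plays here: it gives each communicating endpoint a stable identity regardless of which physical device services it.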

  3. Practitioner Toolkit: Working with Adult English Language Learners.

    ERIC Educational Resources Information Center

    Lieshoff, Sylvia Cobos; Aguilar, Noemi; McShane, Susan; Burt, Miriam; Peyton, Joy Kreeft; Terrill, Lynda; Van Duzer, Carol

    2004-01-01

    This document is designed to give support to adult education and family literacy instructors who are new to serving adult English language learners and their families in rural, urban, and faith- and community-based programs. The Toolkit is designed to have a positive impact on the teaching and learning in these programs. The results of two…

  4. The Data Toolkit: Ten Tools for Supporting School Improvement

    ERIC Educational Resources Information Center

    Hess, Robert T.; Robbins, Pam

    2012-01-01

    Using data for school improvement is a key goal of Race to the Top, and now is the time to make data-driven school improvement a priority. However, many educators are drowning in data. Boost your professional learning community's ability to translate data into action with this new book from Pam Robbins and Robert T. Hess. "The Data Toolkit"…

  5. New MISR Toolkit Version 1.4.1 Available

    Atmospheric Science Data Center

    2014-09-03

    ... of the MISR Toolkit (MTK) is now available from The Open Channel Foundation. The MTK is a simplified programming ... HDF-EOS to access MISR Level 1B2, Level 2, and ancillary data products. It also handles the MISR conventional format. The interface ...

  6. A Toolkit to Implement Graduate Attributes in Geography Curricula

    ERIC Educational Resources Information Center

    Spronken-Smith, Rachel; McLean, Angela; Smith, Nell; Bond, Carol; Jenkins, Martin; Marshall, Stephen; Frielick, Stanley

    2016-01-01

    This article uses findings from a project on engagement with graduate outcomes across higher education institutions in New Zealand to produce a toolkit for implementing graduate attributes in geography curricula. Key facets include strong leadership; academic developers to facilitate conversations about graduate attributes and teaching towards…

  7. Resource Toolkit for Working with Education Service Providers

    ERIC Educational Resources Information Center

    National Association of Charter School Authorizers (NJ1), 2005

    2005-01-01

    This resource toolkit for working with education service providers contains four sections. Section 1, "Roles, Responsibilities, and Relationships," contains: (1) "Purchasing Services from an Educational Management Organization," excerpted from "The Charter School Administrative and Governance Guide" (Massachusetts Dept. of Education); (2) ESP Agreement…

  8. Policy to Performance Toolkit: Transitioning Adults to Opportunity

    ERIC Educational Resources Information Center

    Alamprese, Judith A.; Limardo, Chrys

    2012-01-01

    The "Policy to Performance Toolkit" is designed to provide state adult education staff and key stakeholders with guidance and tools to use in developing, implementing, and monitoring state policies and their associated practices that support an effective state adult basic education (ABE) to postsecondary education and training transition…

  9. Manufacturer’s CORBA Interface Testing Toolkit: Overview

    PubMed Central

    Flater, David

    1999-01-01

    The Manufacturer’s CORBA Interface Testing Toolkit (MCITT) is a software package that supports testing of CORBA components and interfaces. It simplifies the testing of complex distributed systems by producing “dummy components” from Interface Testing Language and Component Interaction Specifications and by automating some error-prone programming tasks. It also provides special commands to support conformance, performance, and stress testing.

  10. Cubit Mesh Generation Toolkit V11.1

    Energy Science and Technology Software Center (ESTSC)

    2009-03-25

    CUBIT prepares models to be used in computer-based simulation of real-world events. CUBIT is a full-featured software toolkit for robust generation of two- and three-dimensional finite element meshes (grids) and geometry preparation. Its main goal is to reduce the time to generate meshes, particularly large hex meshes of complicated, interlocking assemblies.
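    Structured meshing of a simple domain, the easiest case of what CUBIT automates, can be sketched directly. The function below (my own naming, not CUBIT's) builds a quad mesh of the unit square with (nx+1)*(ny+1) nodes and nx*ny elements:

```python
def quad_mesh(nx, ny):
    """Structured quad mesh of the unit square: (nx+1)*(ny+1) nodes and
    nx*ny quads; each quad lists its four corner node indices CCW."""
    nodes = [(i / nx, j / ny) for j in range(ny + 1) for i in range(nx + 1)]
    quads = []
    for j in range(ny):
        for i in range(nx):
            n0 = j * (nx + 1) + i   # lower-left node of this quad
            quads.append((n0, n0 + 1, n0 + nx + 2, n0 + nx + 1))
    return nodes, quads

nodes, quads = quad_mesh(2, 3)
print(len(nodes), len(quads))   # 12 6
```

    Hex meshing of interlocking 3D assemblies, CUBIT's hard case, adds the difficulty that such a regular index pattern rarely exists, which is why robust automatic generation is the toolkit's stated goal.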

  11. A Beginning Rural Principal's Toolkit: A Guide for Success

    ERIC Educational Resources Information Center

    Ashton, Brian; Duncan, Heather E.

    2012-01-01

    The purpose of this article is to explore both the challenges and skills needed to effectively assume a leadership position and thus to create an entry plan or "toolkit" for a new rural school leader. The entry plan acts as a guide beginning principals may use to navigate the unavoidable confusion that comes with leadership. It also assists…

  12. Using AASL's "Health and Wellness" and "Crisis Toolkits"

    ERIC Educational Resources Information Center

    Logan, Debra Kay

    2009-01-01

    Whether a school library program is the picture of good health in a state that mandates a professionally staffed library media center in every building or is suffering in a low-wealth district that is facing drastic cuts, the recently launched toolkits by the American Association of School Librarians (AASL) are stocked with useful strategies and…

  13. The Archivists' Toolkit: Another Step toward Streamlined Archival Processing

    ERIC Educational Resources Information Center

    Westbrook, Bradley D.; Mandell, Lee; Shepherd, Kelcy; Stevens, Brian; Varghese, Jason

    2006-01-01

    The Archivists' Toolkit is a software application currently in development and designed to support the creation and management of archival information. This article summarizes the development of the application, including some of the problems the application is designed to resolve. Primary emphasis is placed on describing the application's…

  14. A Guide to Today's Teacher Recruitment Challenges. RNT Toolkit.

    ERIC Educational Resources Information Center

    Simmons, Anne

    This publication is the introductory guide to the Recruiting New Teachers, Inc.'s Toolkit, which is designed to help states and school districts meet their teacher recruitment and retention challenges. The guide provides information on: today's national teacher shortage crisis; how to make a case for stepping up recruitment efforts for diverse…

  15. Roles of the Volunteer in Development: Toolkits for Building Capacity.

    ERIC Educational Resources Information Center

    Slater, Marsha; Allsman, Ava; Savage, Ron; Havens, Lani; Blohm, Judee; Raftery, Kate

    This document, which was developed to assist Peace Corps volunteers and those responsible for training them, presents an introductory booklet and six toolkits for use in the training provided to and by volunteers involved in community development. All the materials emphasize long-term participatory approaches to sustainable development and a…

  16. 78 FR 58520 - U.S. Environmental Solutions Toolkit

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-24

    ... International Trade Administration U.S. Environmental Solutions Toolkit AGENCY: International Trade... from U.S. businesses capable of exporting their goods or services relevant to (a) arsenic removal... wastewater treatment. The Department of Commerce continues to develop the web-based U.S....

  17. Using Toolkits to Achieve STEM Enterprise Learning Outcomes

    ERIC Educational Resources Information Center

    Watts, Carys A.; Wray, Katie

    2012-01-01

    Purpose: The purpose of this paper is to evaluate the effectiveness of using several commercial tools in science, technology, engineering and maths (STEM) subjects for enterprise education at Newcastle University, UK. Design/methodology/approach: The paper provides an overview of existing toolkit use in higher education, before reviewing where and…

  18. The Complete Guide to RTI: An Implementation Toolkit

    ERIC Educational Resources Information Center

    Burton, Dolores; Kappenberg, John

    2012-01-01

    This comprehensive toolkit will bring you up to speed on why RTI is one of the most important educational initiatives in recent history and sets the stage for its future role in teacher education and practice. The authors demonstrate innovative ways to use RTI to inform instruction and guide curriculum development in inclusive classroom settings.…

  19. ELCAT: An E-Learning Content Adaptation Toolkit

    ERIC Educational Resources Information Center

    Clements, Iain; Xu, Zhijie

    2005-01-01

    Purpose: The purpose of this paper is to present an e-learning content adaptation toolkit--ELCAT--that helps to achieve the objectives of the KTP project No. 3509. Design/methodology/approach: The chosen methodology is absolutely practical. The tool was put into motion and results were observed as university and the collaborating company members…

  20. Educating Globally Competent Citizens: A Toolkit. Second Edition

    ERIC Educational Resources Information Center

    Elliott-Gower, Steven; Falk, Dennis R.; Shapiro, Martin

    2012-01-01

    Educating Globally Competent Citizens, a product of AASCU's American Democracy Project and its Global Engagement Initiative, introduces readers to a set of global challenges facing society based on the Center for Strategic and International Studies' 7 Revolutions. The toolkit is designed to aid faculty in incorporating global challenges into new…

  1. Mentoring Immigrant & Refugee Youth: A Toolkit for Program Coordinators

    ERIC Educational Resources Information Center

    MENTOR, 2011

    2011-01-01

    "Mentoring Immigrant Youth: A Toolkit for Program Coordinators" is a comprehensive resource that is designed to offer program staff important background information, promising program practices and strategies to build and sustain high-quality mentoring relationships for different categories of immigrant youth. Included in this resource are five…

  2. Capturing and Using Knowledge about the Use of Visualization Toolkits

    SciTech Connect

    Del Rio, Nicholas R.; Pinheiro da Silva, Paulo

    2012-11-02

    When constructing visualization pipelines using toolkits such as the Visualization Toolkit (VTK) and Generic Mapping Tools (GMT), developers must understand (1) what toolkit operators will transform their data from its raw state to some required view state and (2) what viewers are available to present the generated view. Traditionally, developers learn how to construct visualization pipelines by reading documentation and inspecting code examples, which can be costly in terms of the time and effort expended. Once an initial pipeline is constructed, developers may still have to undergo a trial-and-error process before a satisfactory visualization is generated. This paper presents the Visualization Knowledge Project (VisKo), which is built on a knowledge base of visualization toolkit operators and how they can be piped together to form visualization pipelines. Developers may now rely on VisKo to guide them when constructing visualization pipelines and, in cases where VisKo has complete knowledge about a set of operators (i.e., sequencing and parameter settings), to generate a fully functional visualization pipeline automatically.
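    Pipeline construction of the kind VisKo automates can be modeled as a shortest-path search over an operator graph; the operator and format names below are invented for illustration:

```python
from collections import deque

def find_pipeline(operators, start, goal):
    """Breadth-first search over an operator graph: `operators` maps an
    input format to (operator_name, output_format) edges; returns the
    shortest operator sequence turning `start` into `goal`, or None."""
    frontier = deque([(start, [])])
    seen = {start}
    while frontier:
        fmt, path = frontier.popleft()
        if fmt == goal:
            return path
        for op, out in operators.get(fmt, []):
            if out not in seen:
                seen.add(out)
                frontier.append((out, path + [op]))
    return None

# invented operators: raw CSV -> gridded data -> a renderable view
ops = {
    "csv": [("csv2grid", "grid")],
    "grid": [("grid2contours", "contours"), ("grid2raster", "raster")],
    "contours": [("render", "image")],
    "raster": [("render", "image")],
}
print(find_pipeline(ops, "csv", "image"))   # ['csv2grid', 'grid2contours', 'render']
```

    A knowledge base like VisKo's additionally records parameter settings on each edge, so the returned path can be expanded into an executable pipeline rather than just an operator sequence.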

  3. Dataset of aggregate producers in New Mexico

    USGS Publications Warehouse

    Orris, Greta J.

    2000-01-01

    This report presents data, including latitude and longitude, for aggregate sites in New Mexico that were believed to be active in the period 1997-1999. The data are presented in paper form in Part A of this report and as Microsoft Excel 97 and Data Interchange Format (DIF) files in Part B. The work was undertaken as part of the effort to update information for the National Atlas. This compilation includes data from: the files of U.S. Geological Survey (USGS); company contacts; the New Mexico Bureau of Mines and Mineral Resources, New Mexico Bureau of Mine Inspection, and the Mining and Minerals Division of the New Mexico Energy, Minerals and Natural Resources Department (Hatton and others, 1998); the Bureau of Land Management Information; and direct communications with some of the aggregate operators. Additional information on most of the sites is available in Hatton and others (1998).

  4. Platelet aggregation test

    MedlinePlus

    ... this page: //medlineplus.gov/ency/article/003669.htm Platelet aggregation test To use the sharing features on this page, please enable JavaScript. The platelet aggregation blood test checks how well platelets , a ...

  5. Thermodynamics of Protein Aggregation

    NASA Astrophysics Data System (ADS)

    Osborne, Kenneth L.; Barz, Bogdan; Bachmann, Michael; Strodel, Birgit

    Amyloid protein aggregation characterizes many neurodegenerative disorders, including Alzheimer's, Parkinson's, and Creutzfeldt-Jakob disease. Evidence suggests that amyloid aggregates may share similar aggregation pathways, implying simulation of full-length amyloid proteins is not necessary for understanding amyloid formation. In this study we simulate GNNQQNY, the N-terminal prion-determining domain of the yeast protein Sup35, to investigate the thermodynamics of structural transitions during aggregation. We use a coarse-grained model with replica-exchange molecular dynamics to investigate the association of 3-, 6-, and 12-chain GNNQQNY systems, and we determine the aggregation pathway by studying aggregation states of GNNQQNY. We find that the aggregation of the hydrophilic GNNQQNY sequence is mainly driven by H-bond formation, leading to the formation of β-sheets from the very beginning of the assembly process. Condensation (aggregation) and ordering take place simultaneously, which is underpinned by the occurrence of a single heat-capacity peak only.

  6. Platelet aggregation test

    MedlinePlus

    The platelet aggregation blood test checks how well platelets, a part of blood, clump together and cause blood to clot. ... Decreased platelet aggregation may be due to: Autoimmune ... Fibrin degradation products Inherited platelet function defects ...

  7. Dissemination of Earth Remote Sensing Data for Use in the NOAA/NWS Damage Assessment Toolkit

    NASA Technical Reports Server (NTRS)

    Molthan, Andrew; Burks, Jason; Camp, Parks; McGrath, Kevin; Bell, Jordan

    2015-01-01

    The National Weather Service has developed the Damage Assessment Toolkit (DAT), an application for smartphones and tablets that allows for the collection, geolocation, and aggregation of various damage indicators during storm surveys. The DAT supports the often labor-intensive process where meteorologists venture into the storm-affected area, allowing them to acquire geotagged photos of the observed damage while also assigning estimated EF-scale categories based upon their observations. Once the data are collected, the DAT infrastructure aggregates the observations into a server that allows other meteorologists to perform quality control and other analysis steps before completing their survey and making the resulting data available to the public. In addition to in-person observations, Earth remote sensing from operational, polar-orbiting satellites can support the damage assessment process by identifying portions of damage tracks that may be missed due to road limitations, access to private property, or time constraints. Products resulting from change detection techniques can identify damage to vegetation and the land surface, aiding in the survey process. In addition, higher-resolution commercial imagery can corroborate ground-based surveys. As part of an ongoing collaboration, NASA and NOAA are working to integrate near real-time Earth remote sensing observations into the NOAA/NWS Damage Assessment Toolkit. This presentation will highlight recent developments in a streamlined approach for disseminating Earth remote sensing data via web mapping services and a new menu interface that has been integrated within the DAT. A review of current and future products will be provided, including products derived from MODIS and VIIRS for preliminary track identification, along with conduits for higher-resolution Landsat, ASTER, and commercial imagery as they become available. In addition to tornado damage

  8. Geological hazards: from early warning systems to public health toolkits.

    PubMed

    Samarasundera, Edgar; Hansell, Anna; Leibovici, Didier; Horwell, Claire J; Anand, Suchith; Oppenheimer, Clive

    2014-11-01

    Extreme geological events, such as earthquakes, are a significant global concern and sometimes their consequences can be devastating. Geographic information plays a critical role in health protection regarding hazards, and there are a range of initiatives using geographic information to communicate risk as well as to support early warning systems operated by geologists. Nevertheless, we consider that shortfalls remain in translating information on extreme geological events into health protection tools, and we suggest that social scientists have an important role to play in aiding the development of a new generation of toolkits aimed at public health practitioners. This viewpoint piece reviews the state of the art in this domain and proposes the potential contributions that different stakeholder groups, including social scientists, could make to the development of new toolkits. PMID:25255167

  9. Developing Mixed Reality Educational Applications: The Virtual Touch Toolkit.

    PubMed

    Mateu, Juan; Lasala, María José; Alamán, Xavier

    2015-01-01

    In this paper, we present Virtual Touch, a toolkit that allows the development of educational activities through a mixed reality environment such that, using various tangible elements, the interconnection of a virtual world with the real world is enabled. The main goal of Virtual Touch is to facilitate the installation, configuration and programming of different types of technologies, abstracting the creator of educational applications from the technical details involving the use of tangible interfaces and virtual worlds. Therefore, it is specially designed to enable teachers to themselves create educational activities for their students in a simple way, taking into account that teachers generally lack advanced knowledge in computer programming and electronics. The toolkit has been used to develop various educational applications that have been tested in two secondary education high schools in Spain. PMID:26334275

  10. Toolkit of automated database creation and cross-match

    NASA Astrophysics Data System (ADS)

    Zhang, Yanxia; Zheng, Hongwen; Pei, Tong; Zhao, Yongheng

    2012-09-01

    Astronomy has stepped into a full-wavelength, data-avalanche era. Astronomical data are measured in terabytes, even petabytes. How to store, manage, and analyze such massive data is an important issue in astronomy. In order to free astronomers from the data-processing burden so they can concentrate on science, various valuable and convenient tools (e.g., Aladin, VOSpec, VOPlot) have been developed by VO projects. To meet this requirement, we developed a toolkit that realizes automated database creation, automated database index creation, and cross-matching. The toolkit provides a good interface for users. The cross-match task may be implemented between local databases, between remote databases, or between a local database and a remote database. Large-scale cross-matching is also easily achieved, and its speed is rather satisfactory.
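    The heart of a positional cross-match is finding, for each source in one catalog, the nearest source in another catalog within a search radius. A naive Python sketch is below (illustrative only, not the toolkit described here; a production system would use a spatial index such as HEALPix or a k-d tree rather than an O(n*m) loop):

```python
import math

def angular_sep_deg(ra1, dec1, ra2, dec2):
    """Angular separation in degrees via the spherical law of cosines."""
    r1, d1, r2, d2 = map(math.radians, (ra1, dec1, ra2, dec2))
    cos_sep = (math.sin(d1) * math.sin(d2)
               + math.cos(d1) * math.cos(d2) * math.cos(r1 - r2))
    return math.degrees(math.acos(min(1.0, max(-1.0, cos_sep))))

def cross_match(cat_a, cat_b, radius_deg):
    """Return (i, j) pairs where source i in cat_a has its nearest
    neighbour j in cat_b within radius_deg.  Naive O(n*m) loop."""
    matches = []
    for i, (ra_a, dec_a) in enumerate(cat_a):
        best_j, best_sep = None, radius_deg
        for j, (ra_b, dec_b) in enumerate(cat_b):
            sep = angular_sep_deg(ra_a, dec_a, ra_b, dec_b)
            if sep <= best_sep:
                best_j, best_sep = j, sep
        if best_j is not None:
            matches.append((i, best_j))
    return matches

# Hypothetical catalogs of (RA, Dec) in degrees; 5-arcsecond radius.
a = [(10.0, 20.0), (150.0, -30.0)]
b = [(10.0005, 20.0003), (90.0, 0.0)]
print(cross_match(a, b, radius_deg=5.0 / 3600))  # -> [(0, 0)]
```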

  11. Developing Mixed Reality Educational Applications: The Virtual Touch Toolkit

    PubMed Central

    Mateu, Juan; Lasala, María José; Alamán, Xavier

    2015-01-01

    In this paper, we present Virtual Touch, a toolkit that allows the development of educational activities through a mixed reality environment such that, using various tangible elements, the interconnection of a virtual world with the real world is enabled. The main goal of Virtual Touch is to facilitate the installation, configuration and programming of different types of technologies, abstracting the creator of educational applications from the technical details involving the use of tangible interfaces and virtual worlds. Therefore, it is specially designed to enable teachers to themselves create educational activities for their students in a simple way, taking into account that teachers generally lack advanced knowledge in computer programming and electronics. The toolkit has been used to develop various educational applications that have been tested in two secondary education high schools in Spain. PMID:26334275

  12. Monitoring the grid with the globus toolkit MDS4.

    SciTech Connect

    Schopf, J. M.; Pearlman, L.; Miller, N.; Kesselman, C.; Foster, I.; D'Arcy, M.; Chervenak, A.; Mathematics and Computer Science; Univ. of Chicago; Univ. of Southern California; Univ. of Edinburgh

    2006-01-01

    The Globus Toolkit Monitoring and Discovery System (MDS4) defines and implements mechanisms for service and resource discovery and monitoring in distributed environments. MDS4 is distinguished from previous similar systems by its extensive use of interfaces and behaviors defined in the WS-Resource Framework and WS-Notification specifications, and by its deep integration into essentially every component of the Globus Toolkit. We describe the MDS4 architecture and the Web service interfaces and behaviors that allow users to discover resources and services, monitor resource and service states, receive updates on current status, and visualize monitoring results. We present two current deployments to provide insights into the functionality that can be achieved via the use of these mechanisms.

  13. Pervasive Collaboratorive Computing Environment Jabber Toolkit

    SciTech Connect

    Gunter, Dan; Lee, Jason

    2004-05-15

    PCCE Project background: Our experience in building distributed collaboratories has shown us that there is a growing need for simple, non-intrusive, and flexible ways to stay in touch and work together. Towards this goal we are developing a Pervasive Collaborative Computing Environment (PCCE) within which participants can rendezvous and interact with each other. The PCCE aims to support continuous or ad hoc collaboration, target daily tasks and base connectivity, be easy to use and install across multiple platforms, leverage off of existing components when possible, use standards-based components, and leverage off of Grid services (e.g., security and directory services). A key concept for this work is "incremental trust", which allows the system's "trust" of a given user to change dynamically. PCCE Jabber client software: This leverages Jabber, an open Instant Messaging (IM) protocol, and the related Internet Engineering Task Force (IETF) standards "XMPP" and "XMPP-IM" to allow collaborating parties to chat either one-on-one or in "chat rooms". Standard Jabber clients will work within this framework, but the software will also include extensions to a (multi-platform) GUI client (Gaim) for X.509-based security, search, and incremental trust. This software also includes Web interfaces for managing user registration to a Jabber server. PCCE Jabber server software: Extensions to the code, database, and configuration files for the dominant open-source Jabber server, "jabberd". Extensions for search, X.509 security, and incremental trust. Note that the jabberd software is not included as part of this software.

  14. Pervasive Collaboratorive Computing Environment Jabber Toolkit

    Energy Science and Technology Software Center (ESTSC)

    2004-05-15

    PCCE Project background: Our experience in building distributed collaboratories has shown us that there is a growing need for simple, non-intrusive, and flexible ways to stay in touch and work together. Towards this goal we are developing a Pervasive Collaborative Computing Environment (PCCE) within which participants can rendezvous and interact with each other. The PCCE aims to support continuous or ad hoc collaboration, target daily tasks and base connectivity, be easy to use and install across multiple platforms, leverage off of existing components when possible, use standards-based components, and leverage off of Grid services (e.g., security and directory services). A key concept for this work is "incremental trust", which allows the system's "trust" of a given user to change dynamically. PCCE Jabber client software: This leverages Jabber, an open Instant Messaging (IM) protocol, and the related Internet Engineering Task Force (IETF) standards "XMPP" and "XMPP-IM" to allow collaborating parties to chat either one-on-one or in "chat rooms". Standard Jabber clients will work within this framework, but the software will also include extensions to a (multi-platform) GUI client (Gaim) for X.509-based security, search, and incremental trust. This software also includes Web interfaces for managing user registration to a Jabber server. PCCE Jabber server software: Extensions to the code, database, and configuration files for the dominant open-source Jabber server, "jabberd". Extensions for search, X.509 security, and incremental trust. Note that the jabberd software is not included as part of this software.

  15. A medical imaging and visualization toolkit in Java.

    PubMed

    Huang, Su; Baimouratov, Rafail; Xiao, Pengdong; Ananthasubramaniam, Anand; Nowinski, Wieslaw L

    2006-03-01

    Medical imaging research and clinical applications usually require combination and integration of various techniques ranging from image processing and analysis to realistic visualization to user-friendly interaction. Researchers with different backgrounds coming from diverse areas have been using numerous types of hardware, software, and environments to obtain their results. We also observe that students often build their tools from scratch resulting in redundant work. A generic and flexible medical imaging and visualization toolkit would be helpful in medical research and educational institutes to reduce redundant development work and hence increase research efficiency. This paper presents our experience in developing a Medical Imaging and Visualization Toolkit (BIL-kit) that is a set of comprehensive libraries as well as a number of interactive tools. The BIL-kit covers a wide range of fundamental functions from image conversion and transformation, image segmentation, and analysis to geometric model generation and manipulation, all the way up to 3D visualization and interactive simulation. The toolkit design and implementation emphasize the reusability and flexibility. BIL-kit is implemented in the Java language so that it works in hybrid and dynamic research and educational environments. This also allows the toolkit to extend its usage for the development of Web-based applications. Several BIL-kit-based tools and applications are presented including image converter, image processor, general anatomy model simulator, vascular modeling environment, and volume viewer. BIL-kit is a suitable platform for researchers and students to develop visualization and simulation prototypes, and it can also be used for the development of clinical applications. PMID:16323064

  16. Bayesian Analysis Toolkit: 1.0 and beyond

    NASA Astrophysics Data System (ADS)

    Beaujean, Frederik; Caldwell, Allen; Greenwald, D.; Kluth, S.; Kröninger, Kevin; Schulz, O.

    2015-12-01

    The Bayesian Analysis Toolkit is a C++ package centered around Markov-chain Monte Carlo sampling. It is used in high-energy physics analyses by experimentalists and theorists alike. The software has matured over the last few years. We present new features to enter version 1.0, then summarize some of the software-engineering lessons learned and give an outlook on future versions.

  17. MAVEN IDL Toolkit: Integrated Data Access and Visualization

    NASA Astrophysics Data System (ADS)

    Larsen, K. W.; Martin, J.; De Wolfe, A. W.; Brain, D. A.

    2014-12-01

    The Mars Atmosphere and Volatile EvolutioN (MAVEN) mission has arrived at Mars and begun its investigations into the state of the upper atmosphere. Because atmospheric processes are subject to a wide variety of internal and external variables, understanding the overall forces driving the composition, structure, and dynamics requires an integrated analysis of all the available data. Eight instruments on the spacecraft are collecting in-situ and remote sensing data on the fields and particles, neutral and ionized, that make up Mars' upper atmosphere. As the scientific questions MAVEN is designed to answer require an understanding of the data from multiple instruments, the project has designed a software toolkit to facilitate the access, analysis, and visualization of the disparate data. The toolkit provides mission scientists with easy access to the data from within the IDL environment, designed to ensure that users are always working with the most recent data available and to eliminate the usual difficulties of handling data from a variety of sources and formats. The toolkit also includes 1-D, 2-D, and interactive 3-D visualizations to enable the scientists to examine the interrelations between data from all instruments, as well as from external models. The data and visualizations have been designed to be as flexible and extensible as possible, allowing the scientists to rapidly and easily examine and manipulate the data in the context of the mission and their wider research programs.

  18. Guide to Using the WIND Toolkit Validation Code

    SciTech Connect

    Lieberman-Cribbin, W.; Draxl, C.; Clifton, A.

    2014-12-01

    In response to the U.S. Department of Energy's goal of using 20% wind energy by 2030, the Wind Integration National Dataset (WIND) Toolkit was created to provide information on wind speed, wind direction, temperature, surface air pressure, and air density at more than 126,000 locations across the United States from 2007 to 2013. The numerical weather prediction model output, gridded at 2-km and 5-minute resolution, was further converted to detail the wind power production time series of existing and potential wind facility sites. For users of the dataset, it is important that the information presented in the WIND Toolkit is accurate and that its errors are known, so that corrective steps can be taken. We therefore provide validation code, written in R, that will be made public to give users tools to validate data at their own locations. Validation is based on statistical analyses of wind speed, using error metrics such as bias, root-mean-square error, centered root-mean-square error, mean absolute error, and percent error. Plots of diurnal cycles, annual cycles, wind roses, histograms of wind speed, and quantile-quantile plots are created to visualize how well observational data compare to model data. Ideally, validation will confirm beneficial locations to utilize wind energy and encourage regional wind integration studies using the WIND Toolkit.
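    The error metrics named above have standard definitions. A minimal sketch follows (in Python rather than R, with hypothetical sample values; the actual WIND Toolkit validation code is the R package described in the report):

```python
import math

def validation_metrics(model, obs):
    """Standard model-vs-observation error metrics for wind speed.
    A generic sketch, not the WIND Toolkit validation code itself."""
    n = len(model)
    errors = [m - o for m, o in zip(model, obs)]
    bias = sum(errors) / n
    rmse = math.sqrt(sum(e * e for e in errors) / n)
    mae = sum(abs(e) for e in errors) / n
    # Centered RMSE removes the mean bias before squaring, so that
    # rmse**2 == bias**2 + crmse**2.
    m_mean = sum(model) / n
    o_mean = sum(obs) / n
    crmse = math.sqrt(sum(((m - m_mean) - (o - o_mean)) ** 2
                          for m, o in zip(model, obs)) / n)
    pct_error = 100.0 * bias / o_mean
    return {"bias": bias, "rmse": rmse, "crmse": crmse,
            "mae": mae, "percent_error": pct_error}

# Hypothetical wind speeds in m/s.
metrics = validation_metrics(model=[5.0, 6.0, 7.0], obs=[4.0, 6.0, 8.0])
print(metrics["bias"], metrics["mae"])  # -> 0.0 and about 0.667
```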

  19. On combining computational differentiation and toolkits for parallel scientific computing.

    SciTech Connect

    Bischof, C. H.; Buecker, H. M.; Hovland, P. D.

    2000-06-08

    Automatic differentiation is a powerful technique for evaluating derivatives of functions given in the form of a high-level programming language such as Fortran, C, or C++. The program is treated as a potentially very long sequence of elementary statements to which the chain rule of differential calculus is applied over and over again. Combining automatic differentiation and the organizational structure of toolkits for parallel scientific computing provides a mechanism for evaluating derivatives by exploiting mathematical insight on a higher level. In these toolkits, algorithmic structures such as BLAS-like operations, linear and nonlinear solvers, or integrators for ordinary differential equations can be identified by their standardized interfaces and recognized as high-level mathematical objects rather than as a sequence of elementary statements. In this note, the differentiation of a linear solver with respect to some parameter vector is taken as an example. Mathematical insight is used to reformulate this problem into the solution of multiple linear systems that share the same coefficient matrix but differ in their right-hand sides. The experiments reported here use ADIC, a tool for the automatic differentiation of C programs, and PETSC, an object-oriented toolkit for the parallel solution of scientific problems modeled by partial differential equations.
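    The reformulation described here rests on differentiating A(p)x = b(p) with respect to p, which gives A dx/dp = db/dp - (dA/dp)x: the derivative is obtained from another linear solve with the same coefficient matrix. The sketch below checks this on a tiny 2x2 system in pure Python (illustrative only; it does not use ADIC or PETSc):

```python
def solve2(A, b):
    """Solve a 2x2 linear system by Cramer's rule."""
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return [(b[0] * A[1][1] - b[1] * A[0][1]) / det,
            (A[0][0] * b[1] - A[1][0] * b[0]) / det]

# Hypothetical parameter-dependent system A(p) x = b(p) and its
# hand-coded derivatives dA/dp and db/dp.
def A(p):  return [[2.0 + p, 1.0], [0.0, 3.0]]
def dA(p): return [[1.0, 0.0], [0.0, 0.0]]
def b(p):  return [p * p, 6.0]
def db(p): return [2.0 * p, 0.0]

p = 1.0
x = solve2(A(p), b(p))
# Derivative solve: same matrix A, new right-hand side db - dA*x.
dAx = [dA(p)[0][0] * x[0] + dA(p)[0][1] * x[1],
       dA(p)[1][0] * x[0] + dA(p)[1][1] * x[1]]
rhs = [db(p)[0] - dAx[0], db(p)[1] - dAx[1]]
dx = solve2(A(p), rhs)

# Finite-difference check of dx/dp.
h = 1e-6
x_h = solve2(A(p + h), b(p + h))
fd = [(x_h[k] - x[k]) / h for k in range(2)]
print(dx, fd)  # the two vectors should agree closely
```

    This is why toolkits that expose a linear solver as a single high-level object can differentiate it far more cheaply than by chaining elementary statements: the factorization of A is reused for every right-hand side.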

  20. Risk of Resource Failure and Toolkit Variation in Small-Scale Farmers and Herders

    PubMed Central

    Collard, Mark; Ruttle, April; Buchanan, Briggs; O’Brien, Michael J.

    2012-01-01

    Recent work suggests that global variation in toolkit structure among hunter-gatherers is driven by risk of resource failure such that as risk of resource failure increases, toolkits become more diverse and complex. Here we report a study in which we investigated whether the toolkits of small-scale farmers and herders are influenced by risk of resource failure in the same way. In the study, we applied simple linear and multiple regression analysis to data from 45 small-scale food-producing groups to test the risk hypothesis. Our results were not consistent with the hypothesis; none of the risk variables we examined had a significant impact on toolkit diversity or on toolkit complexity. It appears, therefore, that the drivers of toolkit structure differ between hunter-gatherers and small-scale food-producers. PMID:22844421

  1. Matlab based Toolkits used to Interface with Optical Design Software for NASA's James Webb Space Telescope

    NASA Technical Reports Server (NTRS)

    Howard, Joseph

    2007-01-01

    The viewgraph presentation provides an introduction to the James Webb Space Telescope (JWST). The first part provides a brief overview of Matlab toolkits including CodeV, OSLO, and Zemax Toolkits. The toolkit overview examines purpose, layout, how Matlab gets data from CodeV, function layout, and using cvHELP. The second part provides examples of use with JWST, including wavefront sensitivities and alignment simulations.

  2. GMinterp, A Matlab Based Toolkit for Gravity and Magnetic Data Analysis: Example Application to the Airborne Magnetic Anomalies of Biga Peninsula, NW Turkey

    NASA Astrophysics Data System (ADS)

    Ekinci, Y. L.; Yiǧitbaş, E.

    2012-04-01

    The analysis of gravity and magnetic field data is becoming increasingly significant for the earth sciences as a whole; these potential field methods efficiently assist in working out both shallow and deep geologic problems and play an important role in modeling and interpretation procedures. The main advantage of some gravity and magnetic data processing techniques is to reveal subtle details in the data that are not clearly identified in anomaly maps, without requiring any prior information about the nature of the source bodies. If the data quality permits, many analysis techniques can be carried out that help to build a general understanding of the details and parameters of the shallower or deeper causative body distributions, such as depth, thickness, and lateral and vertical extent. Gravity and magnetic field data are usually analyzed by means of analytic signal (via directional derivatives) methods, linear transformations, regional and residual anomaly separation techniques, spectral methods, filtering, and forward and inverse modeling techniques. Some commercial software packages are commonly used for analyzing potential field data by employing some of the techniques specified above. Additionally, many freeware and open-source codes can be found in the literature, but unfortunately they focus on specialized aspects of potential fields. In this study, a toolkit that performs numerous interpretation and modeling techniques for potential field data is presented. The toolkit, named GMinterp, is MATLAB-based, consisting of a series of linked functions along with a graphical user interface (GUI). GMinterp allows performing complex processing such as transformations and filtering, editing, gridding, mapping, digitizing, extracting cross-sections, forward and inverse modeling, and interpretation tasks. The toolkit can work with both profile and gridded data as an input file. Tests on theoretically produced data showed the reliability of

  3. Practical Aspects of the Cellular Force Inference Toolkit (CellFIT)

    PubMed Central

    Veldhuis, Jim H.; Mashburn, David; Hutson, M. Shane; Brodland, G. Wayne

    2016-01-01

    If we are to fully understand the reasons that cells and tissues move and acquire their distinctive geometries during processes such as embryogenesis and wound healing, we will need detailed maps of the forces involved. One of the best current prospects for obtaining this information is force-from-images techniques such as CellFIT, the Cellular Force Inference Toolkit, whose various steps are discussed here. Like other current quasi-static approaches, this one assumes that cell shapes are produced by interactions between interfacial tensions and intracellular pressures. CellFIT, however, allows cells to have curvilinear boundaries, which can significantly improve inference accuracy and reduce noise sensitivity. The quality of a CellFIT analysis depends on how accurately the junction angles and edge curvatures are measured, and a software tool we describe facilitates determination and evaluation of this information. Special attention is required when edges are crenulated or significantly different in shape from a circular arc. Because the tension and pressure equations are overdetermined, a select number of edges can be removed from the analysis, and these might include edges that are poorly defined in the source image, too short to provide accurate angles or curvatures, or non-circular. The approach works well for aggregates with as many as 1000 cells, and introduced errors have significant effects on only a few adjacent cells. An understanding of these considerations will help CellFIT users to get the most out of this promising new technique. PMID:25640437
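    Because the tension and pressure equations are overdetermined, as the abstract notes, they are typically solved in a least-squares sense. The sketch below shows the generic step for two unknown tensions via the normal equations (illustrative only; CellFIT's actual equations are assembled from junction angles and edge curvatures, and the values here are made up):

```python
def least_squares_2(rows, rhs):
    """Solve an overdetermined system M t = r with two unknowns via the
    normal equations M^T M t = M^T r.  Generic sketch, not CellFIT."""
    a = sum(m[0] * m[0] for m in rows)
    b = sum(m[0] * m[1] for m in rows)
    c = sum(m[1] * m[1] for m in rows)
    r0 = sum(m[0] * r for m, r in zip(rows, rhs))
    r1 = sum(m[1] * r for m, r in zip(rows, rhs))
    det = a * c - b * b
    return [(c * r0 - b * r1) / det, (a * r1 - b * r0) / det]

# Three hypothetical force-balance equations in two unknown tensions:
# consistent but overdetermined, with a little noise in the last row.
rows = [[1.0, -1.0], [1.0, 1.0], [2.0, 1.0]]
rhs = [0.0, 2.0, 3.01]
t = least_squares_2(rows, rhs)
print(t)  # close to [1.0, 1.0]
```

    The overdetermination is what allows poorly imaged edges to be dropped from the analysis, as described above, without making the system unsolvable.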

  4. PAT: a protein analysis toolkit for integrated biocomputing on the web

    PubMed Central

    Gracy, Jérôme; Chiche, Laurent

    2005-01-01

    PAT, for Protein Analysis Toolkit, is an integrated biocomputing server. The main goal of its design was to facilitate the combination of different processing tools for complex protein analyses and to simplify the automation of repetitive tasks. The PAT server provides a standardized web interface to a wide range of protein analysis tools. It is designed as a streamlined analysis environment that implements many features which strongly simplify studies dealing with protein sequences and structures and improve productivity. PAT is able to read and write data in many bioinformatics formats and to create any desired pipeline by seamlessly sending the output of a tool to the input of another tool. PAT can retrieve protein entries from identifier-based queries by using pre-computed database indexes. Users can easily formulate complex queries combining different analysis tools with few mouse clicks, or via a dedicated macro language, and a web session manager provides direct access to any temporary file generated during the user session. PAT is freely accessible on the Internet at . PMID:15980554

  5. FASTAptamer: A Bioinformatic Toolkit for High-throughput Sequence Analysis of Combinatorial Selections

    PubMed Central

    Alam, Khalid K; Chang, Jonathan L; Burke, Donald H

    2015-01-01

    High-throughput sequence (HTS) analysis of combinatorial selection populations accelerates lead discovery and optimization and offers dynamic insight into selection processes. An underlying principle is that selection enriches high-fitness sequences as a fraction of the population, whereas low-fitness sequences are depleted. HTS analysis readily provides the requisite numerical information by tracking the evolutionary trajectory of individual sequences in response to selection pressures. Unlike genomic data, for which a number of software solutions exist, user-friendly tools are not readily available for the combinatorial selections field, leading many users to create custom software. FASTAptamer was designed to address the sequence-level analysis needs of the field. The open source FASTAptamer toolkit counts, normalizes and ranks read counts in a FASTQ file, compares populations for sequence distribution, generates clusters of sequence families, calculates fold-enrichment of sequences throughout the course of a selection and searches for degenerate sequence motifs. While originally designed for aptamer selections, FASTAptamer can be applied to any selection strategy that can utilize next-generation DNA sequencing, such as ribozyme or deoxyribozyme selections, in vivo mutagenesis and various surface display technologies (peptide, antibody fragment, mRNA, etc.). FASTAptamer software, sample data and a user's guide are available for download at http://burkelab.missouri.edu/fastaptamer.html. PMID:25734917
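    The count/normalize/rank step described above can be sketched in a few lines of Python (a generic illustration of the idea, not the FASTAptamer implementation; the sequences are made up):

```python
import io
from collections import Counter

def count_fastq(handle):
    """Count identical reads in a FASTQ stream (4 lines per record) and
    return (sequence, count, reads-per-million, rank) tuples sorted by
    count.  A generic sketch, not the FASTAptamer implementation."""
    counts = Counter()
    for i, line in enumerate(handle):
        if i % 4 == 1:          # the 2nd line of each record is the sequence
            counts[line.strip()] += 1
    total = sum(counts.values())
    return [(seq, n, 1e6 * n / total, rank)
            for rank, (seq, n) in enumerate(counts.most_common(), start=1)]

# Tiny 3-read FASTQ sample with hypothetical sequences.
fastq = ("@r1\nACGTACG\n+\nIIIIIII\n"
         "@r2\nACGTACG\n+\nIIIIIII\n"
         "@r3\nTTTTGGG\n+\nIIIIIII\n")
ranked = count_fastq(io.StringIO(fastq))
print(ranked[0])  # most abundant sequence with its count, RPM, and rank
```

    Normalizing to reads per million is what makes populations of different sequencing depths comparable, which underlies the fold-enrichment calculations between selection rounds.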

  6. The GBIF integrated publishing toolkit: facilitating the efficient publishing of biodiversity data on the internet.

    PubMed

    Robertson, Tim; Döring, Markus; Guralnick, Robert; Bloom, David; Wieczorek, John; Braak, Kyle; Otegui, Javier; Russell, Laura; Desmet, Peter

    2014-01-01

    The planet is experiencing an ongoing global biodiversity crisis. Measuring the magnitude and rate of change more effectively requires access to organized, easily discoverable, and digitally-formatted biodiversity data, both legacy and new, from across the globe. Assembling this coherent digital representation of biodiversity requires the integration of data that have historically been analog, dispersed, and heterogeneous. The Integrated Publishing Toolkit (IPT) is a software package developed to support biodiversity dataset publication in a common format. The IPT's two primary functions are to 1) encode existing species occurrence datasets and checklists, such as records from natural history collections or observations, in the Darwin Core standard to enhance interoperability of data, and 2) publish and archive data and metadata for broad use in a Darwin Core Archive, a set of files following a standard format. Here we discuss the key need for the IPT, how it has developed in response to community input, and how it continues to evolve to streamline and enhance the interoperability, discoverability, and mobilization of new data types beyond basic Darwin Core records. We close with a discussion of how the IPT has impacted the biodiversity research community and how it enhances data publishing in more traditional journal venues, along with new features implemented in the latest version of the IPT and future plans for further enhancements. PMID:25099149

  7. The GBIF Integrated Publishing Toolkit: Facilitating the Efficient Publishing of Biodiversity Data on the Internet

    PubMed Central

    Robertson, Tim; Döring, Markus; Guralnick, Robert; Bloom, David; Wieczorek, John; Braak, Kyle; Otegui, Javier; Russell, Laura; Desmet, Peter

    2014-01-01

    The planet is experiencing an ongoing global biodiversity crisis. Measuring the magnitude and rate of change more effectively requires access to organized, easily discoverable, and digitally-formatted biodiversity data, both legacy and new, from across the globe. Assembling this coherent digital representation of biodiversity requires the integration of data that have historically been analog, dispersed, and heterogeneous. The Integrated Publishing Toolkit (IPT) is a software package developed to support biodiversity dataset publication in a common format. The IPT’s two primary functions are to 1) encode existing species occurrence datasets and checklists, such as records from natural history collections or observations, in the Darwin Core standard to enhance interoperability of data, and 2) publish and archive data and metadata for broad use in a Darwin Core Archive, a set of files following a standard format. Here we discuss the key need for the IPT, how it has developed in response to community input, and how it continues to evolve to streamline and enhance the interoperability, discoverability, and mobilization of new data types beyond basic Darwin Core records. We close with a discussion of how the IPT has impacted the biodiversity research community and how it enhances data publishing in more traditional journal venues, along with new features implemented in the latest version of the IPT and future plans for further enhancements. PMID:25099149

  8. molSimplify: A toolkit for automating discovery in inorganic chemistry.

    PubMed

    Ioannidis, Efthymios I; Gani, Terry Z H; Kulik, Heather J

    2016-08-15

    We present an automated, open source toolkit for the first-principles screening and discovery of new inorganic molecules and intermolecular complexes. Challenges remain in the automatic generation of candidate inorganic molecule structures due to the high variability in coordination and bonding, which we overcome through a divide-and-conquer tactic that flexibly combines force-field preoptimization of organic fragments with alignment to first-principles-trained metal-ligand distances. Exploration of chemical space is enabled through random generation of ligands and intermolecular complexes from large chemical databases. We validate the generated structures with the root mean squared (RMS) gradients evaluated from density functional theory (DFT), which are around 0.02 Ha/au across a large 150 molecule test set. Comparison of molSimplify results to full optimization with the universal force field reveals that RMS DFT gradients are improved by 40%. Seamless generation of input files, preparation and execution of electronic structure calculations, and post-processing for each generated structure aids interpretation of underlying chemical and energetic trends. © 2016 Wiley Periodicals, Inc. PMID:27364957
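
    The validation criterion quoted above (RMS DFT gradients around 0.02 Ha/au) is the root mean square over all Cartesian gradient components; a sketch with made-up gradient values rather than real DFT output:

```python
import math

# The abstract validates generated structures by the RMS of DFT nuclear
# gradients (~0.02 Ha/au). This computes an RMS gradient for an invented
# 3-atom gradient array; the values are illustrative, not from DFT.
gradients = [  # one (gx, gy, gz) tuple per atom, in Ha/au
    (0.010, -0.015, 0.005),
    (-0.020, 0.012, -0.008),
    (0.004, 0.003, 0.025),
]

def rms_gradient(grads):
    """Root mean square over all Cartesian gradient components."""
    comps = [g for atom in grads for g in atom]
    return math.sqrt(sum(g * g for g in comps) / len(comps))

print(f"{rms_gradient(gradients):.4f} Ha/au")
```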

  9. Aggregations in Flatworms.

    ERIC Educational Resources Information Center

    Liffen, C. L.; Hunter, M.

    1980-01-01

    Described is a school project to investigate aggregations in flatworms, which may be influenced by light intensity, temperature, and some form of chemical stimulus released by already aggregating flatworms. Such investigations could be adapted to suit many educational levels of science laboratory activities. (DS)

  10. Using the PhenX Toolkit to Add Standard Measures to a Study.

    PubMed

    Hendershot, Tabitha; Pan, Huaqin; Haines, Jonathan; Harlan, William R; Marazita, Mary L; McCarty, Catherine A; Ramos, Erin M; Hamilton, Carol M

    2015-01-01

    The PhenX (consensus measures for Phenotypes and eXposures) Toolkit (https://www.phenxtoolkit.org/) offers high-quality, well-established measures of phenotypes and exposures for use by the scientific community. The goal is to promote the use of standard measures, enhance data interoperability, and help investigators identify opportunities for collaborative and translational research. The Toolkit contains 395 measures drawn from 22 research domains (fields of research), along with additional collections of measures for Substance Abuse and Addiction (SAA) research, Mental Health Research (MHR), and Tobacco Regulatory Research (TRR). Additional measures for TRR that are expected to be released in 2015 include Obesity, Eating Disorders, and Sickle Cell Disease. Measures are selected by working groups of domain experts using a consensus process that includes input from the scientific community. The Toolkit provides a description of each PhenX measure, the rationale for including it in the Toolkit, protocol(s) for collecting the measure, and supporting documentation. Users can browse measures in the Toolkit or can search the Toolkit using the Smart Query Tool or a full text search. PhenX Toolkit users select measures of interest to add to their Toolkit. Registered Toolkit users can save their Toolkit and return to it later to revise or complete. They then have options to download a customized Data Collection Worksheet that specifies the data to be collected, and a Data Dictionary that describes each variable included in the Data Collection Worksheet. The Toolkit also has a Register Your Study feature that facilitates cross-study collaboration by allowing users to find other investigators using the same PhenX measures. PMID:26132000

  11. Cross-File Searching: How Vendors Help--And Don't Help--Improve Compatibility.

    ERIC Educational Resources Information Center

    Milstead, Jessica L.

    1999-01-01

    Reports how a cross-section of database producers, search services, and aggregators are using vocabulary management to facilitate cross-file searching. Discusses the range of subject areas and audiences; indexing; vocabulary control within databases; machine aids to indexing; and aids to cross-file searching. A chart contains sources of files and…

  12. Census of Population and Housing, 1980: Summary Tape File 1F, School Districts. Technical Documentation.

    ERIC Educational Resources Information Center

    Bureau of the Census (DOC), Washington, DC. Data User Services Div.

    This report provides technical documentation associated with a 1980 Census of Population and Housing Summary Tape File 1F--the School Districts File. The file contains complete-count data of population and housing aggregated by school district. Population items tabulated include age, race (provisional data), sex, marital status, Spanish origin…

  13. 11 CFR 104.5 - Filing dates (2 U.S.C. 434(a)(2)).

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... campaign committee of a candidate for President shall file reports on the dates specified at 11 CFR 104.5(b... committee filing under 11 CFR 104.5(b)(1)(ii) receives contributions aggregating $100,000 or makes... be waived if under 11 CFR 104.5(c)(1)(ii) a pre-election report is required to be filed during...

  14. Adoption of Test Driven Development and Continuous Integration for the Development of the Trick Simulation Toolkit

    NASA Technical Reports Server (NTRS)

    Penn, John M.

    2013-01-01

    This paper describes the adoption of a Test Driven Development approach and a Continuous Integration system in the development of the Trick Simulation Toolkit, a generic simulation development environment for creating high-fidelity training and engineering simulations at the NASA/Johnson Space Center and many other NASA facilities. It describes what was learned and the significant benefits seen, such as fast, thorough, and clear test feedback every time code is checked in to the code repository. It also describes a system that encourages development of code that is much more flexible, maintainable, and reliable. The Trick Simulation Toolkit development environment provides a common architecture for user-defined simulations. Trick builds executable simulations using user-supplied simulation-definition files (S_define) and user-supplied "model code". For each Trick-based simulation, Trick automatically provides job scheduling, checkpoint/restore, data recording, interactive variable manipulation (variable server), and an input processor. Also included are tools for plotting recorded data and various other supporting tools and libraries. Trick is written in C/C++ and Java and supports both Linux and MacOSX. Prior to adopting this new development approach, Trick testing consisted primarily of running a few large simulations, with the hope that their complexity and scale would exercise most of Trick's code and expose any recently introduced bugs. Unsurprisingly, this approach yielded inconsistent results. It was obvious that a more systematic, thorough approach was required. After seeing examples of some Java-based projects that used the JUnit test framework, similar test frameworks for C and C++ were sought. Several were found, all clearly inspired by JUnit. Googletest, a freely available open-source testing framework, was selected as the most appropriate and capable. The new approach was implemented while rewriting the Trick memory management component, to eliminate a
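
    The pattern the paper describes, small fast unit tests run on every check-in, was implemented with Googletest in C++; the sketch below shows the analogous test-driven shape in Python, using a toy MemoryManager that stands in for, and is not, Trick's actual memory management API:

```python
# Analogous sketch of the test-driven pattern the paper describes:
# small, fast unit tests exercising one component, run on every commit.
# The MemoryManager here is a toy stand-in, not Trick's actual API.
class MemoryManager:
    def __init__(self):
        self._allocations = {}

    def declare(self, name, size):
        """Register a named buffer; duplicate names are an error."""
        if name in self._allocations:
            raise ValueError(f"duplicate allocation: {name}")
        self._allocations[name] = bytearray(size)
        return self._allocations[name]

    def lookup(self, name):
        """Return the buffer registered under name, or None."""
        return self._allocations.get(name)

# TDD-style checks, written alongside (or before) the component itself.
mm = MemoryManager()
buf = mm.declare("state_vector", 64)
assert len(buf) == 64
assert mm.lookup("state_vector") is buf
assert mm.lookup("missing") is None
try:
    mm.declare("state_vector", 32)
    raise AssertionError("expected duplicate declaration to fail")
except ValueError:
    pass
print("all tests passed")
```

    In the C++ original, each of these assertions would be a TEST case that the continuous-integration system runs on every check-in.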

  15. The advanced computational testing and simulation toolkit (ACTS)

    SciTech Connect

    Drummond, L.A.; Marques, O.

    2002-05-21

    During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Distinctively, a number of these are important scientific problems ranging in scale from the atomic to the cosmic. For example, ionization is a phenomenon as ubiquitous in modern society as the glow of fluorescent lights and the etching on silicon computer chips; but it was not until 1999 that researchers finally achieved a complete numerical solution to the simplest example of ionization, the collision of a hydrogen atom with an electron. On the opposite scale, cosmologists have long wondered whether the expansion of the Universe, which began with the Big Bang, would ever reverse itself, ending the Universe in a Big Crunch. In 2000, analysis of new measurements of the cosmic microwave background radiation showed that the geometry of the Universe is flat, and thus the Universe will continue expanding forever. Both of these discoveries depended on high performance computer simulations that utilized computational tools included in the Advanced Computational Testing and Simulation (ACTS) Toolkit. The ACTS Toolkit is an umbrella project that brought together a number of general purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools, which have been developed independently, mainly at DOE laboratories, make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. The ACTS Toolkit Project enables the use of these tools by a much wider community of computational scientists, and promotes code portability, reusability, reduction of duplicate efforts

  16. Developing Climate Resilience Toolkit Decision Support Training Section

    NASA Astrophysics Data System (ADS)

    Livezey, M. M.; Herring, D.; Keck, J.; Meyers, J. C.

    2014-12-01

    The Climate Resilience Toolkit (CRT) is a Federal government effort to address the U.S. President's Climate Action Plan and Executive Order for Climate Preparedness. The toolkit will provide access to tools and products useful for climate-sensitive decision making. To optimize the user experience, the toolkit will also provide access to training materials. The National Oceanic and Atmospheric Administration (NOAA) has been building a climate training capability for 15 years. The target audience for the training has historically been mainly NOAA staff, with some modified training programs for external users and stakeholders. NOAA is now using this climate training capacity for the CRT. To organize the CRT training section, we collaborated with the Association of Climate Change Officers to determine the best strategy and identified four additional complementary skills needed for successful decision making: climate literacy, environmental literacy, risk assessment and management, and strategic execution and monitoring. Developing the climate literacy skills requires knowledge of climate variability and change, as well as an introduction to the suite of available products and services. For the development of an environmental literacy category, specific topics needed include knowledge of climate impacts on specific environmental systems. Climate risk assessment and management introduces a process for decision making and provides knowledge on communication of climate information and integration of climate information in planning processes. The strategic execution and monitoring category provides information on use of NOAA climate products, services, and partnership opportunities for decision making. In order to use the existing training modules, it was necessary to assess their level of complexity, catalog them, and develop guidance for users on a curriculum to take advantage of the training resources to enhance their learning experience. With the development of this CRT

  17. Charged Dust Aggregate Interactions

    NASA Astrophysics Data System (ADS)

    Matthews, Lorin; Hyde, Truell

    2015-11-01

    A proper understanding of the behavior of dust particle aggregates immersed in a complex plasma first requires a knowledge of the basic properties of the system. Among the most important of these are the net electrostatic charge and higher multipole moments on the dust aggregate as well as the manner in which the aggregate interacts with the local electrostatic fields. The formation of elongated, fractal-like aggregates levitating in the sheath electric field of a weakly ionized RF generated plasma discharge has recently been observed experimentally. The resulting data has shown that as aggregates approach one another, they can both accelerate and rotate. At equilibrium, aggregates are observed to levitate with regular spacing, rotating about their long axis aligned parallel to the sheath electric field. Since gas drag tends to slow any such rotation, energy must be constantly fed into the system in order to sustain it. A numerical model designed to analyze this motion provides both the electrostatic charge and higher multipole moments of the aggregate while including the forces due to thermophoresis, neutral gas drag, and the ion wakefield. This model will be used to investigate the ambient conditions leading to the observed interactions. This research is funded by NSF Grant 1414523.
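
    As a rough illustration of the quantities named above, the monopole (net charge) and dipole moment of a point-charge model of an aggregate can be computed directly; the grain charges and positions here are invented, not measured values:

```python
# The "net electrostatic charge and higher multipole moments" of a dust
# aggregate, sketched with point charges: the monopole is the charge sum
# and the dipole is the charge-weighted position sum about the origin.
# Charges (C) and positions (m) below are invented example numbers.
grains = [  # (charge, (x, y, z)) for each grain in the aggregate
    (-1.0e-15, (0.0, 0.0, 0.0)),
    (-2.0e-15, (1.0e-6, 0.0, 0.0)),
    (-1.5e-15, (0.5e-6, 0.8e-6, 0.0)),
]

def monopole(grains):
    """Net charge: the zeroth moment of the charge distribution."""
    return sum(q for q, _ in grains)

def dipole(grains):
    """Dipole moment p = sum_i q_i * r_i (about the origin)."""
    return tuple(sum(q * r[k] for q, r in grains) for k in range(3))

print(monopole(grains), dipole(grains))
```

    A nonzero dipole on a net-charged aggregate is one reason such aggregates rotate and align in the sheath electric field rather than behaving as simple point charges.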

  18. Visualization of sphere packs using a dataflow toolkit.

    PubMed

    Walton, J

    1994-12-01

    We describe the construction of a simple application for the visualization of sphere packs, with applications to molecular graphics. Our development environment is IRIS Explorer, one of the new generation of so-called dataflow toolkits. We emphasize particularly the way in which working in such an environment facilitates the design and construction process, paying special attention to tools which aid the importing of data into the application, the design of the user interface, and the extension or modification of existing tools. Some examples of the use of the application in the field of molecular modeling are presented. PMID:7696218

  19. The RAVE/VERTIGO vertex reconstruction toolkit and framework

    NASA Astrophysics Data System (ADS)

    Waltenberger, W.; Mitaroff, W.; Moser, F.; Pflugfelder, B.; Riedel, H. V.

    2008-07-01

    A detector-independent toolkit for vertex reconstruction (RAVE1) is being developed, along with a standalone framework (VERTIGO2) for testing, analyzing and debugging. The core algorithms represent state-of-the-art for geometric vertex finding and fitting by both linear (Kalman filter) and robust estimation methods. Main design goals are ease of use, flexibility for embedding into existing software frameworks, extensibility, and openness. The implementation is based on modern object-oriented techniques, is coded in C++ with interfaces for Java and Python, and follows an open-source approach. A beta release is available.

  20. TECA: A Parallel Toolkit for Extreme Climate Analysis

    SciTech Connect

    Prabhat, Mr; Ruebel, Oliver; Byna, Surendra; Wu, Kesheng; Li, Fuyu; Wehner, Michael; Bethel, E. Wes

    2012-03-12

    We present TECA, a parallel toolkit for detecting extreme events in large climate datasets. Modern climate datasets expose parallelism across a number of dimensions: spatial locations, timesteps and ensemble members. We design TECA to exploit these modes of parallelism and demonstrate a prototype implementation for detecting and tracking three classes of extreme events: tropical cyclones, extra-tropical cyclones and atmospheric rivers. We process a modern TB-sized CAM5 simulation dataset with TECA, and demonstrate good runtime performance for the three case studies.
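
    The timestep-parallel mode TECA exploits can be sketched with a toy detector: apply a per-timestep criterion independently and collect counts. The low-pressure threshold below is purely illustrative and is not TECA's actual tropical-cyclone detection logic:

```python
from concurrent.futures import ThreadPoolExecutor

# TECA parallelizes event detection across timesteps (among other axes).
# This sketch applies a toy criterion (a sea-level-pressure threshold)
# to each timestep's field independently and in parallel. The pressures
# and the 990 hPa threshold are invented, not TECA's detection logic.
timesteps = {
    0: [1012.0, 1008.5, 989.2],   # pressures (hPa) at grid points
    1: [1011.0, 1003.4, 995.7],
    2: [1010.2, 985.9, 988.1],
}

def detect(item, threshold=990.0):
    """Return (timestep, count of grid points below threshold)."""
    t, field = item
    return t, sum(1 for p in field if p < threshold)

# Timesteps are independent, so they map cleanly onto a worker pool.
with ThreadPoolExecutor(max_workers=3) as pool:
    results = dict(pool.map(detect, timesteps.items()))

print(results)
```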

  1. The Wind Integration National Dataset (WIND) toolkit (Presentation)

    SciTech Connect

    Caroline Draxl: NREL

    2014-01-01

    Regional wind integration studies require detailed wind power output data at many locations to perform simulations of how the power system will operate under high penetration scenarios. The wind datasets that serve as inputs into the study must realistically reflect the ramping characteristics, spatial and temporal correlations, and capacity factors of the simulated wind plants, as well as being time synchronized with available load profiles. As described in this presentation, the WIND Toolkit fulfills these requirements by providing a state-of-the-art national (US) wind resource, power production and forecast dataset.
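
    One of the dataset properties the abstract names, capacity factor, is simply mean output divided by rated capacity over the sample period; a small sketch with invented hourly numbers:

```python
# Capacity factor = actual energy produced / energy at full rated output.
# The 100 MW rating and the hourly outputs are invented example numbers,
# not values from the WIND Toolkit itself.
rated_mw = 100.0
hourly_output_mw = [35.0, 62.5, 80.0, 12.5, 0.0, 55.0, 90.0, 45.0]

def capacity_factor(outputs, rated):
    """Mean output divided by rated capacity over the sample period."""
    return sum(outputs) / (len(outputs) * rated)

print(f"{capacity_factor(hourly_output_mw, rated_mw):.3f}")
```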

  2. Building Emergency Contraception Awareness among Adolescents. A Toolkit for Schools and Community-Based Organizations.

    ERIC Educational Resources Information Center

    Simkin, Linda; Radosh, Alice; Nelsesteun, Kari; Silverstein, Stacy

    This toolkit presents emergency contraception (EC) as a method to help adolescent women avoid pregnancy and abortion after unprotected sexual intercourse. The sections of this toolkit are designed to help increase your knowledge of EC and stay up to date. They provide suggestions for increasing EC awareness in the workplace, whether it is a school…

  3. Growing and Sustaining Parent Engagement: A Toolkit for Parents and Community Partners

    ERIC Educational Resources Information Center

    Center for the Study of Social Policy, 2010

    2010-01-01

    The Toolkit is a quick and easy guide to help support and sustain parent engagement. It provides how to's for implementing three powerful strategies communities can use to maintain and grow parent engagement work that is already underway: Creating a Parent Engagement 1) Roadmap, 2) Checklist and 3) Support Network. This toolkit includes…

  4. A Data Audit and Analysis Toolkit To Support Assessment of the First College Year.

    ERIC Educational Resources Information Center

    Paulson, Karen

    This "toolkit" provides a process by which institutions can identify and use information resources to enhance the experiences and outcomes of first-year students. The toolkit contains a "Technical Manual" designed for use by the technical personnel who will be conducting the data audit and associated analyses. Administrators who want more…

  5. Charon Toolkit for Parallel, Implicit Structured-Grid Computations: Functional Design

    NASA Technical Reports Server (NTRS)

    VanderWijngaart, Rob F.; Kutler, Paul (Technical Monitor)

    1997-01-01

    Charon is a software toolkit that enables engineers to develop high-performing message-passing programs in a convenient and piecemeal fashion. Emphasis is on rapid program development and prototyping. In this report a detailed description of the functional design of the toolkit is presented. It is illustrated by the stepwise parallelization of two representative code examples.

  6. Language Access Toolkit: An Organizing and Advocacy Resource for Community-Based Youth Programs

    ERIC Educational Resources Information Center

    Beyersdorf, Mark Ro

    2013-01-01

    Asian American Legal Defense and Education Fund (AALDEF) developed this language access toolkit to share the expertise and experiences of National Asian American Education Advocates Network (NAAEA) member organizations with other community organizations interested in developing language access campaigns. This toolkit includes an overview of…

  7. Toolkit for a Workshop on Building a Culture of Data Use. REL 2015-063

    ERIC Educational Resources Information Center

    Gerzon, Nancy; Guckenburg, Sarah

    2015-01-01

    The Culture of Data Use Workshop Toolkit helps school and district teams apply research to practice as they establish and support a culture of data use in their educational setting. The field-tested workshop toolkit guides teams through a set of structured activities to develop an understanding of data-use research in schools and to analyze…

  8. The Customer Flow Toolkit: A Framework for Designing High Quality Customer Services.

    ERIC Educational Resources Information Center

    New York Association of Training and Employment Professionals, Albany.

    This document presents a toolkit to assist staff involved in the design and development of New York's one-stop system. Section 1 describes the preplanning issues to be addressed and the intended outcomes that serve as the framework for creation of the customer flow toolkit. Section 2 outlines the following strategies to assist in designing local…

  9. The Development of a Curriculum Toolkit with American Indian and Alaska Native Communities

    ERIC Educational Resources Information Center

    Thompson, Nicole L.; Hare, Dwight; Sempier, Tracie T.; Grace, Cathy

    2008-01-01

    This article explains the creation of the "Growing and Learning with Young Native Children" curriculum toolkit. The curriculum toolkit was designed to give American Indian and Alaska Native early childhood educators who work in a variety of settings the framework for developing a research-based, developmentally appropriate, tribally specific…

  10. Toolkit for Professional Developers: Training Targets 3-6 Grade Teachers

    ERIC Educational Resources Information Center

    McMunn, Nancy; Dunnivant, Michael; Williamson, Jan; Reagan, Hope

    2004-01-01

    The professional development CAR Toolkit is focused on the assessment of the reading process at the text level, rather than at the word level. Most students in grades 3-6 generally need support in comprehending text, not just decoding words. While the assessment of reading methods in the CAR Toolkit will help teachers pinpoint difficulties at the word…